Initial Thoughts on 3DMark “2013”

First, let me say that while I understand the reasoning behind eliminating the year/version from the name, I’m going to generally refer to this release as 3DMark 2013, as there will inevitably be another 3DMark in a year or two. With that out of the way, how does this latest release stand up to previous iterations, and is it a useful addition to the benchmark repertoire?

No benchmark is ever perfect, and even “real world gaming benchmarks” can only tell part of the story. As long as we keep that thought at the forefront when looking at the latest 3DMark, the results are completely reasonable. Because the overall score factors in both the Graphics and Physics tests, 3DMark will always favor a fast CPU and GPU working together over a fast GPU paired with a mediocre CPU. I can't say that approach is wrong, though; no matter what some companies might claim, there are always potential uses for more CPU power in games (physics and AI immediately come to mind), even if not every game needs a ton of CPU performance.
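If you're curious about the math, Futuremark's technical documentation describes the overall score as a weighted harmonic mean of the subscores, which is why neither component can fully carry the other. Here's a quick Python sketch of the idea; the weights are my own placeholders, not Futuremark's official values:

    # Sketch of a 3DMark-style overall score: a weighted harmonic mean
    # of the Graphics and Physics subscores. The weights below are
    # illustrative placeholders, not Futuremark's official numbers.
    def overall_score(graphics, physics, w_gpu=0.85, w_cpu=0.15):
        return (w_gpu + w_cpu) / (w_gpu / graphics + w_cpu / physics)

    # A strong GPU with a mediocre CPU still scores well, but a harmonic
    # mean punishes the weak link more than a simple average would.
    print(overall_score(8000, 3000))  # 6400.0
    print(overall_score(3000, 8000))  # ~3310: a fast CPU can't rescue a slow GPU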

In terms of advancing the state of the benchmarking industry, it's good to see the demo modes (cool graphics with sound are more enticing to the average person than a pure graphics benchmark). I also like the addition of graphs that show performance, power, temperatures, etc., though I wish they worked on all hardware rather than on only some platforms. There's now at least the potential to use 3DMark on its own for stress testing without running additional utilities (HWiNFO or similar) in the background.
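For a rough idea of what those graphs capture, here's a minimal Python sketch of a background telemetry logger. It assumes an NVIDIA GPU with nvidia-smi on the PATH (other vendors need their own query tools), so treat it as an illustration rather than a replacement for proper monitoring software:

    import csv
    import subprocess
    import time

    # Minimal telemetry logger in the spirit of 3DMark's built-in graphs.
    # Assumes an NVIDIA GPU and nvidia-smi on the PATH.
    FIELDS = "temperature.gpu,power.draw,utilization.gpu,clocks.gr"

    def sample():
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=" + FIELDS,
             "--format=csv,noheader,nounits"],
            text=True,
        )
        return [field.strip() for field in out.strip().split(",")]

    def log(path="gpu_log.csv", interval=1.0, duration=60.0):
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["time_s", "temp_C", "power_W", "util_pct", "core_MHz"])
            start = time.time()
            while time.time() - start < duration:
                writer.writerow(["%.1f" % (time.time() - start)] + sample())
                time.sleep(interval)

    if __name__ == "__main__":
        log()  # run a benchmark in another window while this records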

What I want to see now is how the various tablet and smartphone offerings stack up against the laptops that I've tested. Some people have mused that ARM and the latest SoCs are going to kill off the low-end laptop market, but we're still a ways from that happening, at least from a performance perspective. As slow as HD 3000 can be compared to discrete GPUs, it's probably still faster than any of the currently shipping SoC GPUs, and HD 4000 is another 50-100% faster than HD 3000. Both also use far more power, but when an iPad 4 includes a 42.5Wh battery that stores as much energy as many budget laptop packs, we're not exactly talking about an insurmountable gulf.

What I really wish is that more than one of the three tests could run on SoCs. Fire Strike is obviously too much for even notebook GPUs right now, but Cloud Gate ought to be able to run on the better SoCs. Ice Storm, on the other hand, runs at frame rates over 1000 on a high-end desktop GPU, so if that's the only point of comparison with SoCs, we're missing quite a bit of detail. Regardless, it will be nice to have another cross-platform benchmark where we can gauge relative performance, and that looks to be exactly what 3DMark provides.

Comments

  • IanCutress - Tuesday, February 5, 2013 - link

    It is worth noting that FM have implemented native-resolution rendering that is then scaled to your monitor. So Fire Strike on Extreme mode is natively rendered at 2560x1440 and then scaled down to the monitor's 1366x768 as required. (source: FM_Jarvis on hwbot.org forums)

    Time to fire up some four-way desktop GPUs and see if it scales :D
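    To illustrate the idea in code: the scene is rendered offscreen at a fixed internal resolution, and the finished frame is scaled to whatever panel is attached, so every machine runs an identical workload. A toy Python sketch, with Pillow standing in for the GPU's scaler (resolutions as in the Fire Strike Extreme example above):

        from PIL import Image  # pip install pillow

        RENDER_RES = (2560, 1440)   # fixed internal resolution
        DISPLAY_RES = (1366, 768)   # whatever panel is actually attached

        def render_frame():
            # Stand-in for the real renderer: every frame is produced at
            # the fixed internal resolution, regardless of the display.
            return Image.new("RGB", RENDER_RES, (30, 30, 60))

        def present(frame):
            # Scale the finished frame to the panel, the way the benchmark
            # scales its natively rendered output to the monitor.
            return frame.resize(DISPLAY_RES, Image.BILINEAR)

        print(present(render_frame()).size)  # (1366, 768)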
  • dj christian - Tuesday, February 5, 2013 - link

    Thanks for a great walkthrough! However, I am wondering about the Llano and Trinity systems. Are those mobile, and if so, what brand and model do they run on?
  • JarredWalton - Tuesday, February 5, 2013 - link

    All of the systems other than the desktop are laptops. As for the AMD Llano and Trinity, those are prototype systems from AMD. Llano is probably not up to snuff, as I can't update drivers, but the Trinity laptop runs well -- it was the highest scoring A10 iGPU of the three I have right now (the Samsung and MSI being the other two).
  • Alexvrb - Wednesday, February 6, 2013 - link

    I'm annoyed with Samsung for their memory configuration. 2GB+4GB? I'd like to see tests with that same laptop running a decent pair of 4GB DDR3-1600 sticks. On top of this, even if you can configure one yourself online... they gouge so bad on storage and RAM upgrades that it makes more sense for me to buy it poorly preconfigured and upgrade it myself. I could throw the unwanted parts in the trash and STILL come out way cheaper, not that I would do so.
  • Tuvok86 - Tuesday, February 5, 2013 - link

    So, does the 3rd test look "next gen", and how does it compare to the best engines?
  • JarredWalton - Tuesday, February 5, 2013 - link

    In terms of the graphics fidelity, these aren't games so it's difficult to compare. I actually find even the Ice Storm test looks decent and makes me yearn for a good space simulation, even though it's clearly the least demanding of the three tests. I remember upgrading from a 286 to a 386 just so I could run the original Wing Commander with Expanded Memory [EMS] and upgraded graphics quality! Tie Fighter, X-Wing, Freespace, and Starlancer all graced my hard drive over the years. Cloud Gate and Fire Strike are more straightforward graphics demos, though I suppose I could see Fire Strike as a fighting game.

    The rendering effects are good, and I'm also glad we're not seeing any rehashing of old benchmarks with updated graphics (3DMark05 and 3DMark06 come to mind). Really, though, if we're talking about games, it's the experience as much as the graphics that matters. Look at indie games like FTL, where the graphics are simplistic and yet plenty of people happily lose hours playing and replaying the game. Ultimately, I see 3DMark more as a way of pushing hardware to extremes that we won't see in most games for a few years, but as graphics demos the tests don't have all the trappings of real games.

    If I were to compare to an actual game, though, even the world of something like Batman: Arkham City looks better in some ways than the overabundant particle effects and shaders in Fire Strike. Not that it looks bad (well, it does at single-digit frame rates on an HD 4000, but that's another matter), but making a nice-looking demo is far different from making a good game. Shattered Horizon is a great example of this, IMO.

    Not sure if any of this helps, but of course you can grab the Basic Edition for free and run it on your own system. Or if you don't have a decent GPU, Futuremark posted videos of all three tests on YouTube I think.
  • euler007 - Tuesday, February 5, 2013 - link

    Reminds me of the days where I had a bunch of batch files to replace my config.sys and autoexec.bat to change my setup depending on what I was doing. I used QEMM back in the days, dunno why I remember that.
  • JarredWalton - Tuesday, February 5, 2013 - link

    QEMM or similar products were necessary until DOS 5.0 basically solved most of the issues. Hahaha... I remember all the Config.SYS tweaking as well. It was the "game before the game"!
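    For anyone who missed that era, the "game before the game" looked something like this CONFIG.SYS (a typical DOS 5.0-style memory setup; paths and driver names varied from machine to machine):

        REM Load the XMS manager, then EMM386 for EMS and upper memory blocks
        DEVICE=C:\DOS\HIMEM.SYS
        DEVICE=C:\DOS\EMM386.EXE RAM
        REM Load DOS high and use UMBs to keep drivers out of the 640K base
        DOS=HIGH,UMB
        DEVICEHIGH=C:\DRIVERS\MOUSE.SYS
        FILES=30
        BUFFERS=20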
  • HisDivineOrder - Tuesday, February 5, 2013 - link

    Kids today have it easy.

    Remember when buying anything other than a Sound Blaster meant PC gaming hell? I mean, MIDI was the best you got if you didn't have a Sound Blaster. And that's if you were lucky. Sometimes, you'd just get nothing. Total non-functional hell.

    I remember my PC screen getting its first taste of (bad) AA with smeared graphics, because you had to run your 2D card through a pass-through to get 3dfx graphics, and the signal degraded a bit along the way.

    I remember having to actually figure out IRQ conflicts. Without much help from either the system or the motherboard. Just had to suss them out. Or tough luck, dude.

    I remember back when you had all these companies working on x86 processors. Or when AMD and Intel chips could be used on the same motherboards. I remember when Intel told us we just HAD to have CPUs that plugged in via a slot, then told us a few years later that the cool kids didn't use slots any more.

    I can remember a day way back when that AMD used PR performance ratings to make up for the fact that Intel wanted to push speed over performance per clock. Ironic compared to how the two play out now on this front.

    I can remember when Plextor made fantastic optical drives of their own design. And I can remember when we had three players in the GPU field.

    I remember the joy of the Celeron 300A and boosting that bad boy to 450MHz on an Abit BH6. That thing flew faster than the fastest Pentium for a brief time there. I remember Abit.

    I remember...

    Dear God, at some point there, I started imagining myself as Rutger Hauer in Blade Runner at the end.

    "I've seen things you wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain. Time to die.”
  • Parhel - Tuesday, February 5, 2013 - link

    I remember I bought the Game Blaster sound card maybe 6 months before the Sound Blaster came out, and how disappointed I was when I saw the Sound Blaster at the College of Dupage computer show. God, how I miss going to the computer show with my grandfather. I looked forward to it all month long. And the Computer Shopper magazine, in the early days. And reading newsgroups . . . Gosh, I'm getting old.
