Initial Thoughts on 3DMark “2013”

First, let me say that while I understand the reasoning behind eliminating the year/version from the name, I’m going to generally refer to this release as 3DMark 2013, as there will inevitably be another 3DMark in a year or two. With that out of the way, how does this latest release stand up to previous iterations, and is it a useful addition to the benchmark repertoire?

No benchmark is ever perfect, and even "real world" gaming benchmarks can only tell part of the story. As long as we keep that thought at the forefront when looking at the latest 3DMark, the results are completely reasonable. Because the overall score factors in both the Graphics and Physics tests, a fast CPU and a fast GPU working together will always score better in 3DMark than a fast GPU paired with a mediocre CPU, but I can't say that approach is wrong. No matter what some companies might claim, there are always potential uses for more CPU power in games (physics and AI immediately come to mind), even if not every game needs a ton of CPU performance.
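To make the CPU/GPU interplay concrete, here's a minimal sketch of how a composite score of this sort behaves. It assumes the overall score is a weighted harmonic mean of the Graphics and Physics sub-scores; the weights and numbers below are illustrative, not Futuremark's actual values.

```python
def overall_score(graphics, physics, w_g=0.85, w_p=0.15):
    """Weighted harmonic mean of two sub-scores (weights are illustrative).

    A harmonic mean is pulled toward its weakest component, which is why
    a fast GPU paired with a mediocre CPU scores lower than a balanced
    CPU+GPU pairing.
    """
    return (w_g + w_p) / (w_g / graphics + w_p / physics)

# Fast GPU + slow CPU vs. a balanced system (illustrative sub-scores):
print(round(overall_score(graphics=9000, physics=3000)))  # ~6923
print(round(overall_score(graphics=7500, physics=7500)))  # 7500
```

The weak sub-score drags the composite down disproportionately, which matches the observation that a balanced system is rewarded over a lopsided one.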

In terms of advancing the state of the benchmarking industry, it's good to see the demo modes (cool graphics with sound are more enticing to the average person than a pure graphics benchmark). I also like the addition of graphs that show performance, power, temperatures, etc., though I wish they worked on all platforms rather than only some. There's at least the potential now to use 3DMark on its own for stress testing, without running additional utilities (HWiNFO or similar) in the background.
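As a rough idea of what that kind of built-in monitoring replaces, here's a minimal logging sketch using the third-party psutil library. The caveats mirror 3DMark's own: sensor names like "coretemp" are platform-dependent, and GPU metrics would need vendor-specific APIs, so treat this as an assumption-laden example rather than a universal tool.

```python
import csv
import time

import psutil  # third-party: pip install psutil

# Log CPU load and temperature once per second, the sort of data
# HWiNFO (or now 3DMark's own graphs) collects during a stress run.
with open("telemetry.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "cpu_percent", "cpu_temp_c"])
    start = time.time()
    for _ in range(60):  # one minute of samples
        temps = psutil.sensors_temperatures()  # {} where unsupported
        cores = temps.get("coretemp", [])      # sensor name varies by platform
        temp_c = cores[0].current if cores else None
        # cpu_percent(interval=1) blocks for 1s, doubling as our sample rate
        writer.writerow([round(time.time() - start, 1),
                         psutil.cpu_percent(interval=1),
                         temp_c])
```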

What I want to see now is how the various tablet and smartphone offerings stack up against the laptops I've tested. Some people have mused that ARM and the latest SoCs are going to kill off the low-end laptop market, but we're still a ways from that happening, at least from a performance perspective. As slow as HD 3000 can be in comparison to discrete GPUs, it's probably still faster than any of the currently shipping SoC GPUs, and HD 4000 is another 50-100% faster than HD 3000. Both also use far more power, but when an iPad 4 includes a battery that holds as much energy as those in many budget laptops, we're not exactly talking about an insurmountable gulf.

What I really wish is that more than one of the three tests could run on SoCs. Fire Strike is obviously too much for even notebook GPUs right now, but Cloud Gate ought to be able to run on the better SoCs. Ice Storm, on the other hand, runs at frame rates over 1000 on a high-end desktop GPU, so if that's the only point of comparison with the SoCs, we're missing quite a bit of detail. Regardless, it will be nice to have another cross-platform benchmark for gauging relative performance, and that looks to be exactly what 3DMark provides.
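A toy frame-cost model shows why a test running that fast stops telling you much about the GPU: at 1000+ fps, fixed per-frame CPU and driver overhead dominates, so large GPU differences compress into small fps differences. All numbers below are made up for illustration.

```python
def fps(cpu_ms, gpu_ms):
    """Toy serial model: each frame pays a fixed CPU/driver cost plus a
    GPU render cost. Real pipelines overlap the two, but the limiting
    behavior at very high frame rates is similar."""
    return 1000.0 / (cpu_ms + gpu_ms)

# Light test (Ice Storm-like): doubling GPU speed barely moves the needle.
print(fps(0.8, 0.2), fps(0.8, 0.1))    # 1000.0 vs ~1111 (+11%)
# Heavy test (Fire Strike-like): the same 2x uplift nearly doubles fps.
print(fps(0.8, 30.0), fps(0.8, 15.0))  # ~32.5 vs ~63.3 (+95%)
```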

Comments

  • JarredWalton - Tuesday, February 5, 2013 - link

    Celery 300a was pretty awesome, wasn't it? I think that's around the time I started reading Tom's Hardware and AnandTech (when it was still on GeoCities!). I had that same Abit BH6 motherboard, I think... I also think I used an Abit IT5H before that, with the bus running at 83.3MHz and my Pentium 200 MMX ripping along at 250MHz! And I had a whopping 64MB of RAM.

    But even better was the good old days before we even had a reasonable Windows setup. Yeah, I recall installing Windows 2.x and it sucked. 3.0 was actually the first usable version, and there were even Windows alternatives way back then -- I had a friend running some alternative Windows OS as well. GEM maybe, or Viewmax? I can't recall, other than that it didn't really work properly for what we were trying to do at the time (play games).
  • Dug - Wednesday, March 13, 2013 - link

    I remember deciding between 8 and 12MB and then getting my 2nd Voodoo card later, running Quake timedemos over and over again after overclocking. The 300 to 450 was fun times too.
  • WeaselITB - Tuesday, February 5, 2013 - link

    RE: Space Sims
    http://www.robertsspaceindustries.com/star-citizen...
  • Parhel - Tuesday, February 5, 2013 - link

    If that ever sees the light of day, I'll buy it in a second. Freelancer was one of my all-time favorite games. I still play it from time to time.
  • silverblue - Wednesday, February 6, 2013 - link

    Seconded... great game with excellent atmosphere.
  • Peanutsrevenge - Wednesday, February 6, 2013 - link

    Just in case you've not heard (somehow).

    Check out Star Citizen.

    Also cool is Diaspora, which is actually playable!
  • alwayssts - Tuesday, February 5, 2013 - link

    It looks exactly like the video. :-P

    I'm not super impressed by how it looks, but appreciate what I think it's saying.

    I am of the mind that the Fire Strike GPU tests are FM saying to the desktop community: 'if it can run test 1 at 30fps, your average framerate in DX11 titles should be OK for the foreseeable future; if it can run test 2 at 30fps, your minimum should be OK.' The combined test obviously shows what the current feature set can do (and how far current hardware is from using its full potential in a realistic manner). Perhaps it's not a guide, but a suggestion and/or an approximation of where games and GPUs are headed, and I think it's a reasonable one at that.

    For reference, test 1 would need ~a stock 670, a highly overclocked 7870, or a slightly overclocked Tahiti LE or 660 Ti; test 2 would need ~a stock 680 or 7970, a highly overclocked 670 or Tahiti LE, or a moderately overclocked 7950 for 1080p. IOW, pretty much what people use.

    The combined score looks to be a rough estimation of what to expect from the 20nm high-end to hit 30fps at 1080p, which also makes sense, as it will likely be the last generation/process to target 1080p before it becomes realistic to see higher-end consumer single screens go 4K (probably around 14nm and 2016-2017ish). Also, the difference in frame rate from the combined test to GPU test 2 is approximately the difference in resolution from 720p to 1080p (2.25x the pixels)... so it works on a few different levels.
  • Dustin Sklavos - Tuesday, February 5, 2013 - link

    An AT editor running Nehalem and only 12GB of RAM?

    What are you, some kind of animal? Was that the only computer you could fit into your cave? ;)
  • JarredWalton - Tuesday, February 5, 2013 - link

    Hey, I only upgraded from Core 2 Quad on my desktop last year! On the bright side, I have a bunch of laptops that are decent.
  • Penti - Tuesday, February 5, 2013 - link

    Nehalem is quite fine. Performance hasn't changed all that much. Servers are still stuck on Nehalem|Westmere/Xeon-"MPs" or Sandy Bridge-EP/Xeon-E5. Should count as a modern desktop :) Really, it's quite nice that a 2-4 year old system isn't ancient, and can still drive games, support lots of memory, or whatever, unlike in the not-so-old days. It's still PCIe 2.0, DDR3, etc. You wouldn't exactly have thought of putting, say, a Radeon 9800 Pro into a 600-800MHz Pentium III Coppermine machine.
