Initial Thoughts on 3DMark “2013”

First, let me say that while I understand the reasoning behind eliminating the year/version from the name, I’m going to generally refer to this release as 3DMark 2013, as there will inevitably be another 3DMark in a year or two. With that out of the way, how does this latest release stand up to previous iterations, and is it a useful addition to the benchmark repertoire?

No benchmark is ever perfect, and even “real world gaming benchmarks” can only tell part of the story. As long as we keep that thought at the forefront when looking at the latest 3DMark, the results are completely reasonable. Because the overall scores factor in both the Graphics and Physics tests, 3DMark will always reward a fast CPU and GPU working together rather than a fast GPU paired with a mediocre CPU. I can’t say that approach is wrong, either; no matter what some companies might claim, there are always potential uses for more CPU power in games (physics and AI immediately come to mind), even if not every game needs a ton of CPU performance.
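It’s worth spelling out why that is: Futuremark’s technical guide describes the overall score as a weighted harmonic mean of the subscores, and a harmonic mean punishes a weak subscore far more than a simple average would. Here’s a minimal Python sketch of the idea; the 0.75/0.25 weights are illustrative placeholders, not the official per-test values.

```python
# Illustrative sketch of a weighted harmonic mean, the form Futuremark
# uses to combine subscores into an overall 3DMark score. The weights
# below are placeholders, not the official per-test values.
def overall_score(graphics, physics, w_graphics=0.75, w_physics=0.25):
    """A weak subscore pulls the harmonic mean down disproportionately,
    so a fast GPU can't fully mask a slow CPU (or vice versa)."""
    return (w_graphics + w_physics) / (
        w_graphics / graphics + w_physics / physics)

# Hypothetical numbers: same GPU score, slow vs. fast CPU.
print(round(overall_score(10000, 3000)))  # ~6316: CPU bottleneck shows
print(round(overall_score(10000, 9000)))  # ~9730: balanced build wins
```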

In terms of advancing the state of the benchmarking industry, it’s good to see the demo modes (cool graphics with sound are more enticing to the average person than a pure graphics benchmark). I also like the addition of graphs that show performance, power, temperatures, etc., though I wish they worked on all hardware rather than only certain platforms. There’s at least the potential to use 3DMark on its own for stress testing without running additional utilities (HWiNFO or similar) in the background.
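For reference, here’s roughly the sort of background logging those built-in graphs make unnecessary, sketched in Python. This assumes the psutil package is installed and that the platform exposes its sensors through it; sensors_temperatures() is essentially Linux-only, which is exactly why Windows users reach for HWiNFO in the first place.

```python
# Hedged sketch: poll temperature sensors while a benchmark runs.
# Assumes psutil is installed; sensors_temperatures() returns {} on
# platforms that don't expose sensors through it (e.g. most Windows
# setups, where HWiNFO or similar remains the practical answer).
import time
import psutil

def log_temps(duration_s=60, interval_s=1.0):
    for _ in range(int(duration_s / interval_s)):
        for chip, entries in psutil.sensors_temperatures().items():
            for e in entries:
                print(f"{time.strftime('%H:%M:%S')} "
                      f"{chip}/{e.label or 'temp'}: {e.current:.1f}C")
        time.sleep(interval_s)

if __name__ == "__main__":
    log_temps(duration_s=10)  # run alongside the benchmark
```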

What I want to see now is how the various tablet and smartphone offerings stack up against the laptops I’ve tested. Some people have mused that ARM and the latest SoCs are going to kill off the low-end laptop market, but we’re still a ways from that happening, at least from a performance perspective. As slow as HD 3000 can be in comparison to discrete GPUs, it’s probably still faster than any of the currently shipping SoC GPUs, and HD 4000 is another 50-100% faster than HD 3000. Both also use far more power, but when the iPad 4 includes a battery that stores as much energy (roughly 42.5Wh) as those in many budget laptops, we’re not exactly talking about an insurmountable gulf.

What I really wish for is more than one of the three tests that can run on SoCs. Fire Strike is obviously too much for even notebook GPUs right now, but Cloud Gate ought to be able to run on the better SoCs. Ice Storm, on the other hand, runs at frame rates over 1,000 on a high-end desktop GPU, so if that’s the only point of comparison with the SoCs we’re missing quite a bit of detail. Regardless, it will be nice to have another cross-platform benchmark for gauging relative performance, and that looks to be exactly what 3DMark provides.

Comments
  • Diagrafeas - Tuesday, February 5, 2013 - link

    2600K - 7970 OC
    http://www.3dmark.com/3dm/15450
    and a video from the run
    http://www.youtube.com/watch?feature=player_embedd...
  • lever_age - Tuesday, February 5, 2013 - link

    I rarely game on this, but I own a system very similar in specs to the ASUS UX51VZ tested (i7-3630QM instead of i7-3612QM, same exact clocks and memory on a GT 650M, same exact clocks and timing on the RAM) and got somewhat better results. Running 310.90 WHQL.

    Left number is my Inspiron 17R SE's result, right is the UX51VZ:

    Ice Storm:
    Overall: 74570 / 58955
    Graphics: 96962 / 72342
    Physics: 41239 / 35781

    Cloud Gate:
    Overall: 9514 / 8892
    Graphics: 11477 / 11056
    Physics: 5952 / 5277

    Fire Strike:
    Overall: 1454 / 1328
    Graphics: 1518 / 1367
    Physics: 8288 / 7382

    Drivers, I guess, or it could be the GPU's dynamic clocks boosting higher. Some of the difference is the i7-3630QM running 300-400 MHz higher than an i7-3612QM, but is that supposed to show up in the graphics subscores?
  • JarredWalton - Tuesday, February 5, 2013 - link

    It could. I tested all systems with either Catalyst 13.1 or NVIDIA 313.96 drivers, with the exception of the Llano system that's running some ancient driver set because nothing newer will install.
  • relztes - Tuesday, February 5, 2013 - link

    Do the graphics and physics tests run sequentially? With integrated graphics they really need to be stressing both components simultaneously. Giving the GPU 17 W and then the CPU 17 W won't reflect the performance in games when the two components have to share 17 W. If the ULV parts hold up a lot better in 3DMark than they do in real games, I think that is to blame.

    Also, how do these test results change running on battery?
  • JarredWalton - Tuesday, February 5, 2013 - link

    That's the point I'm trying to make: other than Ice Storm, ULV IVB is far closer to standard voltage dual-core IVB than I would expect it to be, which suggests the results aren't going to correlate all that well with real games for HD 4000.

    As for running on battery, how the results change depends heavily on what power settings you use, at least with most of the systems. Set everything for maximum performance and you'll get less battery life but should see performance similar to what's in the charts. Drop to Power Saver and optimize for battery life and you'll get anywhere from maybe 20% to half of the performance (e.g. HD 7970M will drop its clocks pretty far if you optimize for power).
  • Spunjji - Thursday, February 7, 2013 - link

    They do indeed run sequentially, except for the final Fire Strike test, which is "combined" and also the most taxing by some margin. So you may be on to something there.

    It's also worth noting that, in fine tradition, the scores appear to be frame-rate based, so they take little account of the awful stop-start, fast-slow chuntering you get from Intel integrated graphics running at its thermal limits.
  • Landspeeder - Tuesday, February 5, 2013 - link

    I would LOVE to see results for the current crop of high-end gaming notebooks. For example:
    The Clevo P370EM3 / Sager NP9370-3D with a single 670M, 670M SLI, 680M, 680M SLI, 7970M, or 7970M CrossFire.
    The suite of tests run in both 2D and 3D.
    The suite of tests run under Windows 7 x64 and Windows 8.
    Mmmm... can you tell what I've been itching to pick up?
  • JarredWalton - Tuesday, February 5, 2013 - link

    Other than the Alienware M17x R4 and the MSI GX60, I don't have any high-end laptops right now (though I'm trying to get something with GTX 680M to use as a comparison for the above). Regardless, some time in the next few months I'm sure we'll have another high-end laptop or two for testing.
  • Landspeeder - Tuesday, February 5, 2013 - link

    Thanks for all that you do Jarred!

    We are transitioning many of our number crunchers at work from self-built desktops to laptops - the shifting focus of reviews to such equipment here at Anandtech has been a fantastic help and I eagerly consume everything you folks put out.

    What city are you located in? If you're near enough I'd be willing to drop off a dual 680M rig as a loaner for a few days when I pull the trigger post-bonus. Have you been able to beg XOTICPC/Sager/AVADirect/etc. for a unit? If not, would you like to enlist my help?

    Given the sheer power available in high-end laptops, I'm planning on replacing my gaming rig, a massively overclocked, watercooled i7-920 with dual GTX 280s. Most of the games I want to play have some amount of support for 3D. Few games are really pushing today's GPUs/CPUs, as the vendors are reined back by console specs, and even the next-gen console specs appear to be a bit underwhelming. I'm fairly confident that Clevo's massive SLI rig fitted with dual 680Ms should last a good many years, hopefully. I plan on utilizing a hefty factory overclock, so I hope the heat doesn't kill the components too quickly. It also helps push this decision that my job now sees me traveling, often with long layovers, and my aging desktop rig is seeing very little use.
  • MrSpadge - Tuesday, February 5, 2013 - link

    That's what the ORB (Futuremark's Online Result Browser) is for. Just wait a few more days and the results should be there.
