Initial 3DMark Notebook Results

While we don’t normally run 3DMark for our CPU and GPU reviews, we do like to run the tests for our system and notebook reviews. The reason is simple: we don’t usually have long-term access to these systems, so in six months or a year when we update benchmarks we don’t have the option of going back and retesting a bunch of hardware to provide current results. That’s not the case with desktop CPUs and GPUs, which explains the seeming discrepancy. 3DMark always has been and always will be a synthetic graphics benchmark, which means the results are not representative of true gaming performance; instead, they’re a ballpark estimate of gaming potential, and as such they correlate well with some titles and not so well with others. This is the reason we benchmark multiple games; mixing up our gaming suite also means that driver teams have to do work for the games people actually play and not just the benchmarks.

The short story here (TL;DR) is that just as Batman: Arkham City, Elder Scrolls: Skyrim, and Far Cry 3 have differing requirements and performance characteristics, 3DMark results can’t tell you exactly how every game will run—the only thing that will tell you how game X truly scales across various platforms is of course to specifically benchmark game X. I’m also more than a little curious to see how performance will change over the coming months as 3DMark and the various GPU drivers are updated, so with version 1.00 and current drivers in hand I ran the benchmarks on a selection of laptops along with my own gaming desktop.

I tried to include the last two generations of hardware, with a variety of AMD, Intel, and NVIDIA parts. Unfortunately, there's only so much I can do in a single day, and right now I don't have any high-end mobile NVIDIA GPUs available. Here’s the short rundown of what I tested:

System Details for Initial 3DMark Results
System | CPU (Clocks) | GPU (Core/RAM Clocks) | RAM (Timings)
Gaming Desktop | Intel Core i7-965X, 4x3.64GHz (no Turbo) | HD 7950 3GB, 900/5000MHz | 6x2GB DDR3-1333, 675MHz@9-9-9-24-2T
Alienware M17x R4 | Intel Core i7-3720QM, 4x2.6-3.6GHz | HD 7970M 2GB, 850/4800MHz | 2GB+4GB DDR3-1600, 800MHz@11-11-11-28-1T
AMD Llano | AMD A8-3500M, 4x1.5-2.4GHz | HD 6620G, 444MHz | 2x2GB DDR3-1333, 673MHz@9-9-9-24
AMD Trinity | AMD A10-4600M, 4x2.3-3.2GHz | HD 7660G, 686MHz | 2x2GB DDR3-1600, 800MHz@11-11-12-28
ASUS N56V | Intel Core i7-3720QM, 4x2.6-3.6GHz | GT 630M 2GB, 800/1800MHz + HD 4000@1.25GHz | 2x4GB DDR3-1600, 800MHz@11-11-11-28-1T
ASUS UX51VZ | Intel Core i7-3612QM, 4x2.1-3.1GHz | GT 650M 2GB, 745-835/4000MHz | 2x4GB DDR3-1600, 800MHz@11-11-11-28-1T
Dell E6430s | Intel Core i5-3360M, 2x2.8-3.5GHz | HD 4000@1.2GHz | 2GB+4GB DDR3-1600, 800MHz@11-11-11-28-1T
Dell XPS 12 | Intel Core i7-3517U, 2x1.9-3.0GHz | HD 4000@1.15GHz | 2x4GB DDR3-1333, 667MHz@9-9-9-24-1T
MSI GX60 | AMD A10-4600M, 4x2.3-3.2GHz | HD 7970M 2GB, 850/4800MHz | 2x4GB DDR3-1600, 800MHz@11-11-12-28
Samsung NP355V4C | AMD A10-4600M, 4x2.3-3.2GHz | HD 7670M 1GB, 600/1800MHz + HD 7660G, 686MHz (Dual Graphics) | 2GB+4GB DDR3-1600, 800MHz@11-11-11-28
Sony VAIO C | Intel Core i5-2410M, 2x2.3-2.9GHz | HD 3000@1.2GHz | 2x2GB DDR3-1333, 666MHz@9-9-9-24-1T

A quick note on the above laptops: I did run several overlapping configurations (e.g. HD 4000 with dual-core, quad-core, and ULV CPUs; A10-4600M with several dGPU options), but I’ve taken the best result on items like the quad-core HD 4000 and the Trinity iGPU. The Samsung laptop also deserves special mention, as it supports AMD Dual Graphics with the HD 7660G and 7670M; my last encounter with Dual Graphics was on the Llano prototype, and things didn’t go so well. 3DMark is so new that I wouldn’t expect optimal performance, but I figured I’d give it a shot. Obviously, some of the laptops in the above list haven’t received a complete review; in most cases those reviews are in progress.

And with that out of the way, here are the results. I’ll start with the Ice Storm tests, followed by Cloud Gate and then Fire Strike.

[3DMark Ice Storm results charts]

[3DMark Cloud Gate results charts]

[3DMark Fire Strike results charts]

As expected, the desktop typically outpaces everything else, but the margins are a bit closer than what I see in actual gaming, where, even with an older Bloomfield CPU, the desktop HD 7950 is generally around 30-60% faster than the mobile HD 7970M. Thanks to Ivy Bridge, the CPU side of the equation is actually pretty close, so the overall scores don’t always reflect the difference, but the graphics tests do. The physics tests even have a few instances of mobile CPUs besting Bloomfield, which is about right: with the latest process technology, Ivy Bridge can certainly keep up with my i7-965X.

Moving to the mobile comparisons, at the high end we have two laptops with HD 7970M, one with Ivy Bridge and one with Trinity. I made a video a while back showing the difference between the two systems running just one game (Batman), and 3DMark again shows that with HD 7970M, Trinity APUs are a bottleneck in many instances. Cloud Gate has the Trinity setup getting closer to the IVB system, and the MSI GX60 actually came out just ahead on the Fire Strike Graphics score, but in the Physics and Overall scores it’s never all that close. Physics in particular shows very disappointing results for the AMD APUs, which is why even Sandy Bridge with HD 3000 is able to match Llano in the Ice Storm benchmark (though not in the Graphics result).
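
Part of the reason the Overall numbers track the weaker component so closely is the way the scores are composed: the overall result is essentially a weighted harmonic mean of the subtest scores (Fire Strike also folds in the Combined test), so a low Physics score drags the total down disproportionately. Here's a minimal sketch of that composition; the 75/25 weighting and the example scores are assumptions for illustration, not Futuremark's published constants or measured results.

```python
# Sketch: compose an overall 3DMark-style score as a weighted harmonic mean.
# The 0.75/0.25 weights and the example scores below are assumptions for
# illustration, not Futuremark's published constants or measured results.
def overall_score(graphics, physics, w_graphics=0.75, w_physics=0.25):
    """Weighted harmonic mean of the Graphics and Physics scores."""
    return (w_graphics + w_physics) / (w_graphics / graphics + w_physics / physics)

# A strong GPU paired with a weak CPU vs. the same GPU with a stronger CPU.
print(round(overall_score(graphics=20000, physics=5000)))   # ~11429, dragged well below the GPU score
print(round(overall_score(graphics=20000, physics=15000)))  # ~18462, much closer to the GPU score
```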

A look at the ASUS UX51VZ also provides some interesting food for thought: thanks to the much faster CPU, even a moderate GPU like the GT 650M can surpass the 3DMark results of the MSI GX60 in two of the overall scores. That’s probably a bit much, but there are titles (Skyrim for instance) where CPU performance is very important, and in those cases the 3DMark rankings of the UX51VZ and the GX60 are likely to match up; in most demanding games (or games at higher resolutions/settings), however, you can expect the GX60 to deliver a superior gaming experience that more closely resembles the Fire Strike results.

The Samsung Series 3 with Dual Graphics is another interesting story. In many of the individual tests, the second GPU goes almost wholly unused—note that I’d expect updated drivers to improve the situation, if/when they become available. The odd man out is the Cloud Gate Graphics test, which scales almost perfectly with Dual Graphics. Given how fraught CrossFire can be even on a desktop system, the fact that Dual Graphics works at all with asymmetrical hardware is almost surprising. Unfortunately, with Trinity generally being underpowered on the CPU side and with the added overhead of Dual Graphics (aka Asymmetrical CrossFire), there are many instances where you’re better off running with just the 7670M and leaving the 7660G idle. I’m still working on a full review of the Samsung, but while Dual Graphics is now at least better than what I experienced with the Llano prototype, it’s not perfect by any means.
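
To make the "sometimes you're better off with just the 7670M" point concrete, here is a tiny sketch that computes the scaling you get from enabling the second GPU in a given test; the scores in it are hypothetical placeholders rather than measured results, and anything near or below 1.0x means the second GPU isn't earning its keep.

```python
# Sketch: is asymmetric CrossFire (Dual Graphics) actually helping?
# All scores below are hypothetical placeholders, not measured results.
def dual_graphics_scaling(dgpu_only_score, dual_graphics_score):
    """Speedup from enabling the iGPU alongside the dGPU."""
    return dual_graphics_score / dgpu_only_score

tests = {
    "Cloud Gate Graphics": (5200, 9800),   # near-ideal scaling
    "Fire Strike Graphics": (1400, 1350),  # second GPU mostly idle, or adding overhead
}
for name, (single, dual) in tests.items():
    scale = dual_graphics_scaling(single, dual)
    verdict = "worth enabling" if scale > 1.05 else "leave the 7660G idle"
    print(f"{name}: {scale:.2f}x -> {verdict}")
```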

Wrapping things up, we have the HD 4000 in three flavors: i7-3720QM, i5-3360M, and i7-3517U. While in theory the iGPU is clocked similarly in all three, as I showed back in June, on a ULV platform the 17W TDP is often too little to allow the HD 4000 to reach its full potential. Under a full load, it looks like the HD 4000 in a ULV processor can consume roughly 10-12W, but the CPU side can also use up to 15W. Run a taxing game where both the CPU and iGPU are needed and something has to give; that something is usually iGPU clocks, but the CPU tends to throttle as well. Interestingly, 3DMark only really seems to show this limitation in the Ice Storm tests; the other two benchmarks give the dual-core i5-3360M and i7-3517U very close results. In actual games, however, I don’t expect that to be the case very often (meaning, Ice Storm is likely the best representation of how HD 4000 scales across various CPU and TDP configurations).
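
As a back-of-the-envelope illustration of that 17W squeeze: if the iGPU wants 10-12W and the CPU cores want up to 15W, the package has to clamp one or both. The sketch below is a toy power-budget split using the rough figures quoted above; it is not Intel's actual power-sharing algorithm, and the 60% iGPU priority is an arbitrary assumption.

```python
# Toy model of a shared package power budget (not Intel's real algorithm).
# Demands use the rough figures quoted above: ~12W iGPU, up to ~15W CPU.
def split_budget(tdp_w, cpu_demand_w, igpu_demand_w, igpu_priority=0.6):
    """Clamp CPU and iGPU power so their sum fits inside the package TDP."""
    if cpu_demand_w + igpu_demand_w <= tdp_w:
        return cpu_demand_w, igpu_demand_w  # everything fits, no throttling
    # Over budget: give the iGPU a fixed share first, CPU gets the remainder.
    igpu_w = min(igpu_demand_w, tdp_w * igpu_priority)
    return tdp_w - igpu_w, igpu_w

for tdp in (17, 35, 45):  # ULV, standard dual-core, standard quad-core
    cpu_w, igpu_w = split_budget(tdp, cpu_demand_w=15, igpu_demand_w=12)
    print(f"{tdp}W package: CPU {cpu_w:.1f}W, iGPU {igpu_w:.1f}W")
```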

HD 4000 also tends to place quite well relative to Trinity and some of the discrete GPUs, but in actual games that’s rarely the case. The GT 630M, for instance, was typically 50% to 100% (or slightly more) faster than HD 4000 in the ASUS N56V Ivy Bridge prototype, yet looking at the 3DMark results it almost looks like a tie. Don’t believe those relative scores for an instant; they’re simply not representative of real gaming experiences. That is one of the reasons we continue to treat 3DMark as merely a rough estimate of performance potential: it often gives reasonable rankings, but there are times (driver optimizations, perhaps) where it clearly doesn’t tell the whole story. I’m also curious to see what sort of results HD 4000 gets with some older Intel drivers, as my gut tells me there may be some serious tuning going on in the latest build.
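
One way to quantify the "don't believe those relative scores" warning is to compare the advantage a GPU shows in 3DMark against the advantage it shows in real games. A minimal sketch follows; the scores and frame rates in it are hypothetical placeholders chosen only to mirror the "roughly tied in 3DMark, 50-100% apart in games" situation described above, not our benchmark data.

```python
# Sketch: contrast the gap 3DMark implies with the gap games actually show.
# All numbers are hypothetical placeholders, not our benchmark data.
def relative_advantage(a, b):
    """How much faster A is than B, as a percentage."""
    return (a / b - 1) * 100

# Synthetic: GT 630M vs. HD 4000 graphics scores that look nearly tied.
print(f"3DMark Graphics: {relative_advantage(2850, 2700):+.0f}%")  # ~+6%

# Games: the same pairing measured in actual titles (placeholder fps values).
game_fps = {"Skyrim": (42, 24), "Batman: Arkham City": (38, 21)}
for title, (gt630m_fps, hd4000_fps) in game_fps.items():
    print(f"{title}: {relative_advantage(gt630m_fps, hd4000_fps):+.0f}%")  # in the 50-100% range
```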

69 Comments


  • mindstorm - Wednesday, February 6, 2013 - link

    Results for my Dell M6700 with a FirePro M6000 2GB. Tested with 13.2 beta 4.

    http://www.3dmark.com/3dm/63286?

    Ice Storm
    Overall: 92056
    Graphics Score 128702
    Physics Score 46107

    Cloud Gate
    Overall: 12160
    Graphics Score 15605
    Physics Score 6860

    Fire Strike
    Overall: 2241
    Graphics Score 2353
    Physics Score 9565
    Combined Score 896

    Better performance on the Ice Storm test than the other notebooks tested; Cloud Gate 2nd (of the notebooks) and Fire Strike 3rd of the notebooks.
    That all seems to be due to a low graphics score. I suppose this is because the M6000 is optimized for business apps, although I wonder how much drivers could be an issue, since all the versions I have tried so far haven't been very stable. 13.1 was a bit more stable than 13.2; overclocking is out of the question with 13.2 beta 4.
  • Notmyusualid - Wednesday, February 6, 2013 - link

    M18x R2.

    i7 3920XM, 16GB CAS10 RAM.

    Left number is your M17x R4's results, middle is my M18x R2 on ONE 7970M card, right is both 7970Ms in Crossfire (reduced CPU multipliers):

    Ice Storm:
    85072 / 146723 / 157182
    116561 / 273526 / 343640
    43727 / 55947 / 54218

    Cloud Gate:
    16729 / 18157 / 24279
    30624 / 32362 / 65783
    6491 / 7159 / 7568

    Fire Strike:
    4332 / 4467 / 7647
    4696 / 4807 / 9542
    8910 / 10467 / 10449
    Also: COMBINED score: 2699

    No GPU overclocking.

    Operating System is Win 8 (Bloatware) x64, and can't reverse install, and too lazy to restore backup.
    Graphics drivers are AMD_Catalyst_13.2_Beta5. (13.1 didn't install at all, 13.2 completed only once). No CAP file for this one yet I don't think.

    Crossfire is working, but Intel's XTU informs me I was 'throttling', which is (somewhat) new to me. I game all day with 4 cores @ x44, but this benchmark seems to push the system harder.
    XTU reports 3% throttling over the benchmark period @ 4 cores x44.
    I dropped to x44,43,42,41 on beta 5 to allow it to complete the benchmark.

    Oddly, Super Pi completes two 32M runs back to back without error on x44,44,44,44. (and it does it a whole 30s slower than on Win 7). Ambient room temp is ~29C. (It's 35C outside).

    Hope that is of interest to someone out there.
  • Spunjji - Thursday, February 7, 2013 - link

    Definitely of interest. It's odd how the Graphics scores increase dramatically with Crossfire, as you'd expect, but somehow the Overall scores don't. Seems they've either calibrated the engine or the score generator such that it leans towards overall system performance more than the GPU. Time will tell whether that gives us a reasonable picture of the system's performance, but past experience suggests it won't.
  • Notmyusualid - Friday, February 8, 2013 - link

    Yes, good point!

    The title of the product implies it is for testing 3D performance, so it is shocking not to see significant increases in GPU performance reflected in the overall scores.

    I took another look at some other scores online, and it appears to me that the 'Physics Score' is really just a CPU test.

    And I thought 3DMark06 was being deprecated for focusing on CPU performance too much (as well as for not being DX11 capable), rather than simply reflecting overall 3D performance, as we'd expect from such a title.

    But then, I guess, some games are more CPU-dependent than others, so maybe it would be a mistake to leave out this sort of test for the average user looking to benchmark his system's overall gaming performance? I can't say for sure.
  • Krysto - Saturday, February 9, 2013 - link

    I'm really skeptical about the new 3DMark outputting scores that have 1:1 parity between DirectX machines and OpenGL ES 2.0 machines. If it doesn't, then it would be pretty useless for Anandtech's benchmarks, because you're trying to compare GPUs and hardware, not graphics APIs.

    So if, say, Tegra 3 on the Nexus 7 gives a 1000 score and the Surface RT gives a 1500 score because the benchmark gives a higher score to certain DirectX features, then the benchmark is useless, because it was supposed to show the GPUs are equal, and it won't.

    That's just speculation for now, but Anand and the guys should really watch out for this.
  • Krysto - Saturday, February 9, 2013 - link

    To be clear, if the drivers are better on one device than the other, then the benchmark SHOULD reflect that, and I'm sure it will. Also it should reflect a higher score if the graphics look better on DirectX machines or anything like that (although that will probably come with a hit in battery life, but that's another issue).

    What I'm saying is that if everything else is EQUAL, DirectX shouldn't get a higher score just because it's a more complex API than OpenGL ES 2.0. That wouldn't be on its merits.

    Also, I'm very disappointed they are making such a big launch out of this and aren't even going to support OpenGL ES 3.0 out of the gate, even though it will probably be almost a year after OpenGL ES 3.0 launched last year before they release their OpenGL ES 2.0 benchmark.

    Clearly they didn't want to prioritize the OpenGL ES part much, even 2.0, let alone 3.0. We might not see 3.0 support until mid 2014 at the earliest from them. Hopefully GLBenchmark 3.0 will come out this year.
  • shuhan - Wednesday, February 13, 2013 - link

    Does anyone know what might be the reason for this:

    http://www.3dmark.com/3dm/201917

    ?

    Thanks
  • shuhan - Wednesday, February 13, 2013 - link

    Just saw that:
    "Why is my Ice Storm score so high with my M17x R4 with a 3720QM and 7970M?

    http://www.3dmark.com/3dm/18860

    129064 overall
    244590 graphics
    48646 physics"

    My result is exactly the same! My rig: Intel Core i5-3570K @ 4.3GHz, 2x GTX 670

    And then my Cloud Gate test scores lower.
  • failquail - Sunday, February 17, 2013 - link

    Certainly seems pretty :)

    I have CPU and GPU monitoring gadgets running on my second screen, and I noticed that whilst the first two tests seemed fairly balanced for CPU/GPU, the third Fire Strike test maxed out GPU usage the entire time with the CPU barely hitting 30% usage. A test for the SLI/CrossFire crowd I think :)

    Not sure if it's just a display bug, but it didn't detect my GPU driver at all (ATI 6950 modded with the 6970 shader count, but GPU/RAM clocks still a little lower than a stock 6970), and it detected my FX-8120 CPU (set to 3.4GHz, or 4.2GHz turbo) as running at 1.4GHz.

    Still, first go was this:
    http://www.3dmark.com/3dm/241866
    79420/14031/3423

    I need to rerun it though, as I had lots of background stuff running and I think I still had AA forced in the GPU driver!
