Diablo III Mobile Performance Compared

So far we’ve determined that Diablo III isn’t a particularly taxing game, especially early on—at least not for your GPU; your mouse buttons might be a different story!—and that AMD, Intel, and NVIDIA graphics solutions deliver comparable image quality. The only question that remains is how quickly they can deliver that result to your display. We’ve used quite a few different laptops to see what sort of performance you can expect with Diablo III. Here’s the quick rundown.

First up, from AMD we have a Llano prototype with an A8-3500M APU and integrated HD 6620G graphics. There are Llano APUs with faster CPU clocks, but by default all of the A8 GPUs run 400 Radeon Cores at 444MHz. Second is our Trinity prototype laptop with an A10-4600M (HD 7660G graphics), running 384 Radeon Cores at a substantially higher 686MHz. A third option from AMD is the discrete Radeon HD 6630M, and we tested three laptops with that GPU: the first pairs it with the Llano A8-3500M APU, the second is a Sony VAIO C with a faster Intel i5-2410M CPU, and the third is a Sony VAIO SE with an i7-2640M. This will at least give us some indication of whether or not CPU performance is a factor in Diablo III performance.

Unfortunately, we do have to make a note on the drivers for the HD 6630M laptops: none of the three can run the latest AMD reference drivers, as they all use some form of switchable graphics. The prototype Llano system (with drivers from June 2011) can be excused, as there's not much point in AMD investing a lot of time improving the drivers or end user experience on that laptop, but Sony's laptops continue to be a concern with their often-more-than-six-months-old drivers. The VAIO C uses a driver build that dates back to June 2011 (released by Sony in October), while the VAIO SE is luckier: it received a driver update from Sony earlier this month, though the build still appears to date back to December 2011. We didn't notice any rendering issues with any of the 6630M laptops, but bear in mind that performance may be lower due to the outdated drivers.

From the Intel camp, we tested three different laptops. On the low end of the spectrum is a Dell Vostro V131 with an i5-2410M CPU and HD 3000 graphics. We also tested with a quad-core i7-2820QM and HD 3000 graphics to see how much the slightly higher IGP clocks and significantly faster CPU matter with Diablo III. The third laptop is the ASUS N56VM Ivy Bridge prototype, with an i7-3720QM CPU and HD 4000 graphics. We do have a fourth Intel option on hand, an Intel Ultrabook with IVB ULV, but we can't report the CPU model yet, and I'm not sure we're cleared to discuss performance, so we'll hold off for a few more days. Anand did test an ASUS UX21A in Diablo III and you can read his comments, but he used a different test sequence and again we can't name the exact CPU he used, so stay tuned if you want to find out how dual-core (and potentially less expensive) Ivy Bridge matches up against Llano and Trinity.

Finally, from NVIDIA we've got the same ASUS N56VM with i7-3720QM, only this time with the GT 630M graphics enabled. We also ran some tests with an Acer AS3830TG that pairs an i5-2410M CPU with GT 540M graphics. The Acer is known to have issues with CPU throttling in some games, but it does have higher clocks on the GPU than the N56VM, so this will give us some indication of how much—or how little—CPU performance matters with Diablo III. Rounding out the list, we have a second Clevo W110ER in for review, this time from AVADirect, with an i7-3610QM and GT 650M graphics. Overkill for Diablo III? Most likely, but it's an awfully compact laptop for that much hardware!

Here are the benchmark results; again, keep in mind that the in-town comparisons use an identical FRAPS run, whereas the Old Ruins area is slightly randomized in terms of monster locations and quantity, and thus more prone to variance between runs. Note that we didn't bother running Sandy Bridge HD 3000 at our Enthusiast settings with the i7-2820QM; it was already struggling at our Mainstream settings, and the i5-2410M results will tell you everything you need to know about how well HD 3000 handles maxed out settings.

Update: As noted earlier, many are saying the later stages and higher difficulty levels can really start to drop frame rates. Take the following graphs as a reference point, and plan on dropping some detail settings and/or resolution later in the game on lower end hardware.

Diablo III - Value - New Tristram

Diablo III - Value - Old Ruins

Diablo III - Mainstream - New Tristram

Diablo III - Mainstream - Old Ruins

Diablo III - Enthusiast - New Tristram

Diablo III - Enthusiast - Old Ruins

There's plenty of data to cover, so let's start at the top with the discrete NVIDIA GPUs. Not surprisingly, the GT 650M powers through Diablo III without any issues; even at maximum detail and 1080p resolution, it's still pulling nearly 40 FPS. The second set of GPUs, the GT 630M in the N56VM and the GT 540M in the Acer AS3830TG, should in theory deliver roughly the same performance. However, we've seen in the past that the Acer sometimes has throttling issues, so potentially the GT 540M is running with a thermally constrained CPU in the AS3830TG. The charts above clearly show that the Acer can't keep up with the Ivy Bridge solution. Either Diablo III is very good at using multi-core CPUs (doubtful, given what we saw with Blizzard's StarCraft II, not to mention a quick look at Perfmon with Diablo III), or the Acer is once again not hitting higher clock speeds.

Update #2: It appears the ASUS N56VM is not running a lower clocked GPU; in fact, the opposite is true. NVIDIA's control panel reports 475MHz on the GPU core and 950MHz on the shaders, and several other utilities, including GPU-Z, reported 475MHz as well, which has left me confused about the performance since day one. However, I just ran GPU-Z with the sensor logging option enabled while doing a FRAPS run in Diablo III, and instead of 475/950MHz, the sensors tab reports 797.3/1594.7MHz. Mystery solved: the GT 630M in the N56VM is actually clocked almost 20% higher than the stock GT 540M. That would explain the differences seen above.
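As a quick sanity check on that figure, here's the arithmetic, assuming NVIDIA's published 672MHz reference core clock for the GT 540M:

```python
# Compare the observed GT 630M clock in the N56VM against the
# reference GT 540M core clock (672MHz per NVIDIA's published spec).
gt630m_observed_mhz = 797.3   # from GPU-Z sensor logging during a FRAPS run
gt540m_reference_mhz = 672.0  # stock GT 540M core clock

advantage = gt630m_observed_mhz / gt540m_reference_mhz - 1.0
print(f"GT 630M clock advantage: {advantage:.1%}")  # ~18.6%, i.e. "almost 20%"
```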

We did a quick check and found that CPU clocks for the i5-2410M during our test sessions typically ranged from 800MHz to 1.7GHz, as you can see in the above image. (Side note: we also tested with ThrottleStop active, which is what the above chart shows; it was set to a 21X multiplier, but clearly that didn't work as intended.) The average clock speeds of the two cores during our test sequence are a rather slow 1200MHz and 1085MHz, so the CPU clearly isn't delivering the sort of clocks we usually see from an i5-2410M. However, Diablo III doesn't appear to need a ton of CPU performance; given the new information we have on the GT 630M clocks (see update above), it appears that Diablo III simply doesn't push the Acer hard enough to activate higher CPU clocks most of the time.
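For those curious how we arrived at those averages, the calculation is just a mean over the logged samples; this sketch assumes a simple list of per-core clock readings in MHz (the values below are illustrative placeholders, not our actual log):

```python
# Average sampled CPU clocks from a monitoring log (e.g. a ThrottleStop or
# HWiNFO CSV export). The sample values below are hypothetical placeholders.
core0_samples = [800, 1700, 1300, 900, 1400]  # MHz, illustrative only
core1_samples = [800, 1600, 1100, 900, 1200]  # MHz, illustrative only

def average_mhz(samples):
    """Mean clock speed across all logged samples."""
    return sum(samples) / len(samples)

print(f"Core 0 average: {average_mhz(core0_samples):.0f}MHz")
print(f"Core 1 average: {average_mhz(core1_samples):.0f}MHz")
```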

The second grouping of scores is mostly in red/orange, representing the AMD GPUs/APUs. For the red bars, Trinity and Llano both provide acceptable performance at our Value settings, and they're still fast enough for the Mainstream settings—remember, as we mentioned in the intro, Diablo III is actually quite playable at anything above 20 FPS. Once we hit our Enthusiast settings, both drop quite a bit; Trinity remains tolerable, but Llano definitely can't keep up, and you'd need to drop the Shadow Quality to Low at the very least for 1080p. Another interesting discovery is that Trinity with its integrated GPU is still faster across the board than the HD 6630M (though there's a possibility the HD 6630M is being hurt by the outdated drivers). As for the three-way HD 6630M comparison, CPU performance does appear to help a bit—the i7-2640M is typically slightly faster than the i5-2410M and A8-3500M—but the largest spread is only 15% at our Value settings; at Mainstream the gap drops to 10-12%, and at Enthusiast it's under 10%. Given the frame rates, the extra 15% never really means the difference between unplayable and playable; all three laptops with the HD 6630M handle up to our Mainstream settings quite well.

The final three lines are the blue Intel IGP results. HD 4000 with quad-core Ivy Bridge trails Llano across all settings, though it’s often close enough. Performance at Mainstream is a bit questionable; sure, you can play Diablo III well enough in our experience at 20-25 FPS, but it’s not going to be the smoothest result. Llano may only be 3-4 FPS faster at Mainstream, but that 12% performance increase is just enough to make the result a bit smoother. Your best bet with HD 4000 is ultimately going to be turning the Shadow Quality down to Low/Off, and then running at 1600x900.

As for Sandy Bridge’s HD 3000 IGP, perhaps the less said the better. Even at our Value settings, it only qualifies as tolerable, and at Mainstream it’s quite choppy—you could still play Diablo III at 13-18 FPS in a pinch, but I wouldn’t recommend it, and I doubt it would work well in multiplayer. Once frame rates drop below 15 FPS, it appears the engine starts to slow down rather than just skipping animations. Our New Tristram run usually takes around 20 seconds to complete (even at 20.1 FPS on the HD 4000), but when frame rates are in the low teens the time for the town run increases to around 30 seconds. Single-player is still possible, but that’s as far as I’d go—and it will take longer for everything you do, thanks to the moderate slowdown. When the HD 3000 drops below 10 FPS, what was sluggish takes a major nosedive; the town run required just over 60 seconds to complete, and the Old Ruins run that usually requires about 100-110 seconds clocked in at 308 seconds. Yup, there’s a reason we didn’t try suffering through the Enthusiast benchmark a second time on HD 3000!
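To quantify that slowdown using the run times above (taking 105 seconds as the midpoint of the usual 100-110 second Old Ruins run):

```python
# Engine slowdown on HD 3000: actual run times vs. typical run times.
town_typical_s, town_slow_s = 20, 60      # New Tristram run, seconds
ruins_typical_s, ruins_slow_s = 105, 308  # Old Ruins run; 105s = midpoint of 100-110s

print(f"Town run slowdown:  {town_slow_s / town_typical_s:.1f}x")   # 3.0x
print(f"Ruins run slowdown: {ruins_slow_s / ruins_typical_s:.1f}x") # ~2.9x
```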

Other Performance Tests

We did a few other tests to round out our performance information, though we didn't repeat them multiple times or run them on all of the systems. For one test, we used our Enthusiast settings but with Shadows on Low/Off on the HD 4000; the resulting scores are slightly better than Trinity's scores with Shadows on High. With Low shadows at 1080p, New Tristram scored 20.1 FPS and Old Ruins 18.5 FPS; drop the shadows to Off and New Tristram runs at 27.1 FPS with Old Ruins at 24.8 FPS. In total, going from High to Low Shadow Quality improves performance by over 50%, and going from Low to Off adds another 35%. The other test was to run our maxed out settings at 1366x768, again on the HD 4000. The frame rates were 17.3/16.4, or around 35% faster than at 1080p.
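Working from the frame rates quoted above, the Low-to-Off shadow gain is easy to verify:

```python
# Percentage gain from dropping Shadow Quality from Low to Off on HD 4000
# at 1080p, using the frame rates quoted in the text.
runs = {
    "New Tristram": (20.1, 27.1),  # (FPS with Low shadows, FPS with Off shadows)
    "Old Ruins":    (18.5, 24.8),
}

for name, (low_fps, off_fps) in runs.items():
    gain = off_fps / low_fps - 1.0
    print(f"{name}: {gain:.1%} faster with shadows Off")  # ~35% in both runs
```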

Given those results, it appears that Shadow Quality is the single most demanding setting, trumping even resolution. On HD 4000, you can basically double your performance at 1080p by turning off the shadows. Without doing in-depth testing (remember, we're looking at about five minutes to set up and run each benchmark setting, so I've already spent around 10 hours just doing the basic set of results shown above, not to mention testing other settings!), I can't say for certain, but my general impression is that the results are similar with other IGPs/GPUs.


  • JarredWalton - Sunday, May 27, 2012 - link

    Yes, all of the higher than 1366x768 results were done on an external LCD where required (which was the case for Llano, Trinity, VAIO C, TimelineX, and Vostro; the other laptops had 1080p displays, except for quad-core SNB which has a 1600x900 LCD and I didn't run the 1080p tests).
  • PolarisOrbit - Saturday, May 26, 2012 - link

    Good review for what it is, but I think it could have been a little more complete with some additional information:

    1) Use Act 3 Bastion's Keep for the "intensive" case instead of Act 1 Old Town. I think this would be better representative of the game's peak demand. (probably just a run through of the signal fires quest since it's easy to get to)

    2) Include a brief section on how much of an impact additional players have on the game. I find it can actually be quite significant. This doesn't have to be a full-depth review, just a quick look.

    Overall, I'm using an A8-3500M + 6750M crossfire (overclocked to 2.2GHz) @1366x768 and my framerates during battles (ie. when it counts) average about 1/2 to 1/3 what the reviewer posts because the game gets much more intensive than Act 1, and having a party also slows it down significantly compared to solo.

    Just some ideas to expand the review if you want =)
  • drkrieger - Saturday, May 26, 2012 - link

    Hey folks, I've got an older Asus G71Gx with an NVIDIA GTX 260M; I can play on medium/low at about 40 fps @ 1920x1200.

    Hope this gives some idea of how older mobile graphics stack up.
  • waldojim42 - Saturday, May 26, 2012 - link

    I have been testing this out on my W520 for the sake of seeing what I can do to play diablo and maintain decent battery life.

    For what it is worth, turning off shadows, and playing @ 1366x768 on the HD 3000 results in roughly 28fps - more than enough to play the game through the first difficulty anyhow. I have been using this for some time now with 4 players in game. When running @ 1080P, it dips down into the low 20's, and occasionally is a problem in act 3 so I wouldn't suggest it.

    Point is though, that anyone that has a notebook with SB and no video card CAN still play this game, even if it isn't ideal.
  • Zoolookuk - Saturday, May 26, 2012 - link

    Given this is a cross platform game, it would have been interesting to provide Mac results with similar hardware. I play using a GT330m and i7 dual core, and it runs pretty well. I'd like to see how it stacks up to the latest AMD chips and HD3000 on a Mac.
  • egtx - Saturday, May 26, 2012 - link

    Yes I am interested in Mac results as well.
  • ananduser - Saturday, May 26, 2012 - link

    Provided the testing is done on a dual booting Apple machine, D3 under Windows will always run better.
  • JarredWalton - Saturday, May 26, 2012 - link

    Anecdotally, Brian and Anand have both commented that Diablo 3 on a MacBook Pro under OS X runs like crap. I'm not sure if they're running on latest generation MBP13 or something else, though, so that's about all I can pass along.
  • ananduser - Sunday, May 27, 2012 - link

    Was there any doubt? OSX is severely lacking in graphics driver support. Apple never gave a rat's rear about this crucial aspect of gaming support. They are always late with drivers and with the latest OpenGL spec.
  • Penti - Thursday, May 31, 2012 - link

    The recommendations / minimum requirements on Macs call for discrete graphics with good drivers though, i.e. no NVIDIA 7300 / 7600, ATI X1600 / X1900, etc. The starting point is the 8600 GT; obviously no integrated Intel graphics is enough there. OpenGL 3.2, or OpenGL 2.1 with extensions, should be fine for game developers, and the drivers handle it; NVIDIA and AMD can put in performance improvements if they get the feedback. They could even launch their own "game edition" card for the Mac Pro with their own drivers outside of Apple's distribution channel; NVIDIA releases drivers on their site from time to time. That said, both the game engine port and the drivers are a bit less optimized than their Windows and Direct3D counterparts. The drivers are quite robust and work well, but they might not be that fast. It's mainly a problem for the developers today though, as most Macs have somewhat decent graphics with maintained drivers and support pretty much all the features you need anyway.

    The OS is very dependent on OpenGL, so the support itself is decent and fairly up to date, even if it isn't OpenGL 4.2/3.3 yet. The latest OpenGL 4.2 isn't supported by much of the hardware Apple uses anyway; the R700, R600, GF 8M, and GF 9M (and their desktop versions) don't support more than OpenGL 3.3, which is itself a backport of as much as possible, so 3.2 is a decent level there. Apple always supports the whole API in the software renderer too, so they gain nothing by chasing the latest features, though vendors can use any extensions they wish to add those features, and all the supported GPUs support the API as well. For comparison, Intel drivers on Windows don't have OpenGL 4.1/4.2 support and still don't support OpenGL 3.2; in some regards Mac driver support is better than Intel graphics support on Linux, and even on Windows.
