Sandy Bridge Graphics: Extended Compatibility and Performance Results

It’s been quite a while since we last looked at gaming compatibility and performance across a large group of titles, so we figured the time was ripe with the Sandy Bridge launch. We selected fourteen additional games from the past several years; the intention is to see whether SNB can run the games properly, as well as what sort of performance it can provide.

For comparison, we selected four other notebooks that we had on hand, which we’ve already highlighted on the previous page. Dell’s Latitude E6410 represents the old guard Intel HD Graphics, and the Toshiba A660D (forced onto the integrated HD 4250 GPU) is AMD’s soon-to-be-replaced IGP. Both are slower than SNB by a large amount, as we’ve already established. On the higher performance side of the equation, we’ve again got the Acer 5551G with a Turion II P520 (2.3GHz dual-core) processor and HD 5650 GPU, and for NVIDIA we have the ASUS N53JF with i5-460M and GT 425M. We tested Low and Medium detail performance, again skipping the Dell and Toshiba systems for Medium.

Assassin's Creed

Batman: Arkham Asylum


Chronicles of Riddick: Dark Athena

Crysis: Warhead

Elder Scrolls IV: Oblivion

Empire: Total War

Fallout 3

Fallout: New Vegas

Far Cry 2

FEAR 2: Project Origin

H.A.W.X. 2

Mafia II

Metro 2033

Low Gaming Average - 20 Titles

Adding 14 titles to the mix exposes a few more areas where Intel’s HD Graphics 3000 needs some fine tuning, but again all titles managed to at least run (with a bit of elbow grease). The problem areas range from blacklisted titles to minor rendering flaws (sometimes major flaws on older Intel graphics), with one title running, but so poorly that it may as well have failed the test.

Going into details, first up is the now-infamous Fallout 3, which required a hacked D3D9.dll file to even run (just put the file in the game’s directory—thanks to the creators at OldBlivion). The hacked DLL identifies Intel graphics as a GeForce 7900 GS; without the DLL, the game crashes to the desktop with an error message as soon as you try to enter the actual game world. (Also note that the newer Fallout: New Vegas has no such problems, so it appears Bethesda was kind enough to stop blacklisting Intel’s IGPs.) There are almost certainly other titles where the Intel IGP is blacklisted, and more than a few games warned of an unknown GPU and potential rendering problems (HAWX 2, Mass Effect 2, and Metro 2033, for instance), but only FO3 required a hack to actually run.

Besides the above, there were some other issues. Assassin’s Creed and HAWX 2 had occasional flickering polygons, and Mafia II had some rendering issues with shadows; these are minor glitches that don’t render the games unplayable, though in the case of Mafia II performance is too low to be manageable anyway. Finally, the one title from our list that has clear problems with Intel’s current drivers is Chronicles of Riddick: Dark Athena. It’s interesting to note that this is the sole OpenGL title in our suite, and it checks in at a dismal <3FPS. The older Intel HD Graphics on Arrandale has the same issues as HD 3000, with the additional problem of seriously broken rendering in HAWX 2.

Outside of the above problems, performance is typically high enough to handle minimum to medium detail levels. The average frame rate on Sandy Bridge across the 20 test titles ends up at 41FPS. That works out to a 128% improvement over the previous Intel HD Graphics, and a 136% lead over AMD’s HD 4250. The HD 5650 paired with a slower CPU still leads by over 55%, and the GT 425M likewise maintains a comfortable 62% lead; that said, you can certainly make the case that mainstream gaming is easily achievable with Sandy Bridge. Finally, it’s worth noting that while AMD’s HD 4250 actually ends up slightly slower than the old Intel HD Graphics on average, we didn’t encounter a single noticeable rendering error with that GPU in our test suite.
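As a sanity check on those numbers, here’s a quick sketch (hypothetical Python, using only the averages quoted above rather than the full per-title data) of how the percentage leads and frame rates relate:

```python
def pct_lead(a_fps, b_fps):
    """Percent by which configuration a leads configuration b."""
    return (a_fps / b_fps - 1) * 100

snb = 41.0              # Sandy Bridge 20-title low-detail average (FPS)
old_intel = snb / 2.28  # "128% improvement" implies old HD Graphics averaged ~18 FPS
hd4250 = snb / 2.36     # "136% lead" implies AMD's HD 4250 averaged ~17.4 FPS
hd5650 = snb * 1.55     # HD 5650 leads SNB by 55% -> ~63.6 FPS
gt425m = snb * 1.62     # GT 425M leads SNB by 62% -> ~66.4 FPS

print(round(pct_lead(snb, old_intel)))   # 128
print(round(old_intel), round(hd4250))   # ~18 vs ~17: the 4250 trails slightly
```

The point of the multiplicative form is that a “128% improvement” means 2.28x the frame rate, not 1.28x, which is why the old IGPs land under 20FPS.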

There are three exceptions to “playability” in our list, counting Dark Athena: both Mafia II and Metro 2033 fail to get above 30FPS regardless of settings—though Mafia II comes close at 29FPS when set to 800x600. These two titles are a familiar refrain, and it’s worth noting that many discrete mobile GPUs also fail to reach playable performance; in fact, Dark Athena also tends to be a bit too much for anything below an HD 5650/GT 420M. They’re the modern equivalent of Crysis, except you can’t even turn the settings down enough (without hacking configuration files) to make them run acceptably.

Extended Compatibility and Performance Results – Medium Detail


Comments

  • mtoma - Monday, January 03, 2011 - link

    Something like Core i7 1357M could make Win 7 tablets temporarily viable. Remember that in the ultra portable space the big words are: multitasking, dual core processors (like Cortex A9). So, realistically, we need ULV dual-core Sandy Bridge.
  • JarredWalton - Monday, January 03, 2011 - link

    The i7-640M runs at 1.2GHz minimum and 2.26GHz maximum. The i7-2657M runs at 1.6GHz minimum and 2.7GHz maximum. (Actually, minimum on all the Core 2nd Gen is 800MHz when you aren't doing anything that needs more speed.) That would be 33% faster base speed and up to 19% higher max speed, just on clock speeds alone. However, you forgot to factor in around a 20-25% performance increase just from the Sandy Bridge architecture, so you're really looking at anywhere from 19% (bare minimum) to as much as 66% faster for normal usage, and things like Quick Sync would make certain tasks even faster.
  • DanNeely - Monday, January 03, 2011 - link

    You've got a limited range of TDPs that any given architecture will be good in. According to Intel (at the time of the Atom launch), things start getting rather ragged when the range reaches 10x. Until Core 2 this wasn't really an issue for Intel, because the P3 and prior top-end parts had sufficiently low TDPs that fitting the entire product line into a single architecture wasn't a problem. It didn't matter much in the P4 era either, because the Pentium M and Core 1 were separate architectures and could be tuned so their sweet spot was significantly lower than the desktop P4's. Beginning with Core 2, however, Intel had only a single architecture. The bottom tier of ULV chips suffered for it, and on the high end overclocking (especially voltage overclocking) scaled very poorly in terms of performance gained per watt of increased power consumption.

    Atom is weak as you approach 10W because it was designed not as a low-end laptop part (although Intel is more than willing to take your money for a netbook), but to invade ARM's stronghold in smartphones, tablets, and other low-power embedded systems. Doing that requires good performance at <1W TDP. By using a low-power process (instead of the performance process of every prior Intel-fabbed CPU), Moorestown should finally be able to do so. The catch is that it leaves Intel without anything well optimized for the 10-15W range. In theory AMD's Bobcat should be well placed for this market, but the much larger chunk of TDP given to graphics, combined with AMD's historic liability in idle power, makes it something of a dark horse. I wouldn't be surprised if the 17W Sandy Bridge ends up getting better battery life than the 10W Bobcat because of this.
  • Kenny_ - Monday, January 03, 2011 - link

    I have seen in the past that when Mac OS X and Win 7 are run on the same machine, Mac OS X can have significantly better battery life. Is there any chance we could see what Sandy Bridge does for battery life under Mac OS X?
  • QChronoD - Monday, January 03, 2011 - link

    This was a test machine that Intel cobbled together. Give it a few weeks or months after some retail machines come out, and then I'm sure that someone in the community will have shoehorned OS X onto one of them. (Although I don't know how well it would perform, since they'd probably have to write new drivers for the chipset and the graphics.)
  • cgeorgescu - Monday, January 03, 2011 - link

    I think that in the past we've seen MacOS and Win7 battery life compared while running on the same Mac, not on the same Acer/ASUS/whatever machine (because MacOS doesn't run on such hardware without hacks). And I suspect Apple manages better power management only because they have to support just a few hardware configurations (so they can optimize specifically for that hardware); it's a major advantage of their business model.
    It's like the performance of games on the Xbox and the like... The hardware isn't that impressive, but you write and compile only for that configuration and nothing else: you're sure every other machine is the same, not depending on AMD code paths, smaller or larger caches, slower or faster RAM, this or that video card, and so on...

    Power management in Macs aside, seeing what Sandy Bridge can do under MacOS would be frustrating... You know how long it takes until Jobs fits new stuff into those MBPs. Hell, he still sells the Core 2 Duo.
  • Penti - Monday, January 03, 2011 - link

    Having fewer configurations doesn't mean better-optimized graphics drivers; theirs are worse. Having only Intel CPUs doesn't mean GCC outputs better-optimized code. It's a compiler AMD contributes to, among others, and there's no such thing as an AMD code path; there are some minor differences in how SSE is managed, but that's it. Most of it is exactly the same, and the compiler just optimizes for x86, not for a brand. If a CPU supports the same features, the code is just as optimized; the machine code is the same. It's not like having a Cell processor in there.

    Power management is handled by the kernel/drivers. You can expect SB MacBooks around this summer; not too long off. And you might even see people accepting Flash on their Macs again, as Adobe is starting to move away from their archaic non-video-player workflow with 10.2 and onward. Battery/power management won't really work without Apple's firmware, though. But you're simply not going to optimize code on an OS X machine like a console; you're going to leave it in a worse state than the Windows counterpart. Apple will also keep using the C2D as long as Intel doesn't provide them with proper optimized drivers; it's a better fit for the smaller models as is.
  • mcdill the pig - Monday, January 03, 2011 - link

    Perhaps the issue is more with Compal's cooling system, but those max CPU temps (91 degrees Celsius) seem high. It may also be that the non-Extreme CPUs will have lower temps when stressed.

    My Envy 17 already has high temps - I was looking forward to SB notebooks having better thermal characteristics than the i7 QM chips (i.e. no more hot palmrests or ball-burning undersides)....
  • JarredWalton - Monday, January 03, 2011 - link

    This is a "works as designed" thing. Intel runs the CPU at the maximum speed allowed (3.1GHz on heavily threaded code in this case) until the CPU gets too warm. Actually, funny thing is that when the fan stopped working at one point (a cold reboot fixed it), CPU temps maxed out at 99C. Even with no fan running, the system remained fully stable; it just ran at 800MHz most of the time (particularly if you put a load on the CPU for more than 5 seconds), possibly with other throttling going on. Cinebench 11.5 for instance ran about 1/4 as fast as normal.
  • DanNeely - Monday, January 03, 2011 - link

    Throttling down to keep TDP at safe levels has been an Intel feature since the P4 era. Back in 2001(?), Tom's Hardware demoed this dramatically by running Quake on a P4 and removing the cooler entirely. Quake dropped into slideshow mode but remained stable, and recovered as soon as the heatsink was set back on top.

    The P3 they tested hard-crashed. The Athlon XP/MP chips reached several hundred degrees and self-destructed (taking the mobos with them). Later AMD CPUs added thermal protection circuitry to avoid this failure mode as well.
