Mobile Sandy Bridge Gaming Performance

Sandy Bridge is clearly a faster CPU than the preceding Arrandale and Clarksfield offerings; that's no surprise. The integrated graphics are also faster, but being faster than Intel's old HD Graphics isn't saying a whole lot. Then again, AMD's aging HD 4250 IGP isn't much better and is long overdue for a replacement. We'll talk about that in a moment, but first here are the standard gaming performance results at our “Low” defaults; “Medium” detail results are on the next page.

[Benchmark charts: Battlefield: Bad Company 2; DiRT 2; Left 4 Dead 2; Mass Effect 2; STALKER: Call of Pripyat; StarCraft II: Wings of Liberty; Average Gaming Performance]

Let's start by talking about compatibility problems with the Intel IGP: there were none! I know it's been fashionable over the years to bash Intel for horrible drivers that can't run games (and the previous IGPs certainly still fall into the “too slow” category), but both the Sandy Bridge and Arrandale IGPs loaded and ran all six of our standard test titles. That makes talk of performance meaningful, and that's the bigger story by far.

Again, Sandy Bridge delivers playable performance in every single title at 768p and “Low” detail settings. What's more, it actually surpasses the GeForce 320M in Apple's 13-inch MacBook Pro in five of the six games (the exception being STALKER). Similarly, it beats the entry-level Mobility Radeon HD 5470 in four of six games (STALKER and StarCraft II are the exceptions) and the GeForce G 310M in five of six (SC2 comes out ahead on the 310M).

All told, Intel's HD Graphics 3000 checks in at an average of 6% faster than the HD 5470, 8% faster than the GeForce 320M, 25% faster than the G 310M, and a whopping 130% faster than both the previous-generation HD Graphics and AMD's HD 4250 (which are essentially tied in overall performance across the selected titles). That's not to say it wins everywhere: even paired with a much slower Turion II P520 processor, the Mobility Radeon HD 5650 still leads the HD 3000 by 44%, and shifting the same GPU to an i7-640M to remove the CPU bottleneck extends that lead to 130%. NVIDIA's GeForce GT 425M likewise leads the HD 3000 by 46% on average, so discrete GPUs are by no means in danger of being replaced.
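
As a rough illustration of how an “X% faster on average” figure can be derived from per-title results, here's a minimal Python sketch; the frame rates in it are hypothetical placeholders, not numbers from this review:

    # Hypothetical average FPS per title for two GPUs (placeholder data,
    # not the review's results).
    hd3000 = {"BFBC2": 34, "DiRT 2": 41, "L4D2": 55,
              "ME2": 38, "STALKER": 22, "SC2": 30}
    hd5470 = {"BFBC2": 31, "DiRT 2": 39, "L4D2": 50,
              "ME2": 36, "STALKER": 25, "SC2": 33}

    # Per-title speedup of one GPU over the other, then the simple mean.
    ratios = [hd3000[t] / hd5470[t] for t in hd3000]
    average_advantage = sum(ratios) / len(ratios) - 1.0
    print(f"HD 3000 is {average_advantage:.0%} faster on average")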

We didn't have time for serious image quality comparisons, but subjectively a few games did appear to render at a lower level of detail on the Intel IGP; it's hard to say for certain when you're not running at the native LCD resolution. We'll get into problems in a moment, but beyond those we didn't have any serious complaints. After all, being able to run a game at all is the first consideration; making it look good is merely the icing on the cake.


Comments


  • mtoma - Monday, January 3, 2011 - link

    Something like a Core i7 1357M could make Win 7 tablets temporarily viable. Remember that in the ultraportable space the big words are multitasking and dual-core processors (like the Cortex A9). So, realistically, we need a ULV dual-core Sandy Bridge.
  • JarredWalton - Monday, January 3, 2011 - link

    The i7-640M runs at 1.2GHz minimum and 2.26GHz maximum. The i7-2657M runs at 1.6GHz minimum and 2.7GHz maximum. (Actually, the minimum on all 2nd Gen Core parts is 800MHz when you aren't doing anything that needs more speed.) That would be a 33% faster base speed and up to 19% higher max speed, on clock speeds alone. However, you forgot to factor in the roughly 20-25% performance increase from the Sandy Bridge architecture itself, so you're really looking at anywhere from 19% (bare minimum) to as much as 66% faster for normal usage, and things like Quick Sync would make certain tasks even faster.
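
    As a worked form of the arithmetic above (a sketch; the clock speeds come from the comment, and the 25% figure is the upper end of the architecture estimate):

        # Clock speeds quoted in the comment above (GHz).
        base_old, max_old = 1.2, 2.26   # i7-640M base/max
        base_new, max_new = 1.6, 2.70   # i7-2657M base/max

        clock_gain_base = base_new / base_old - 1   # ~33% higher base clock
        clock_gain_max = max_new / max_old - 1      # ~19% higher max clock

        low = clock_gain_max                        # clocks alone, no IPC credit
        high = (1 + clock_gain_base) * 1.25 - 1     # base clock gain plus ~25% IPC
        print(f"estimated gain: {low:.0%} to {high:.0%}")   # ~19% to ~67%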
  • DanNeely - Monday, January 3, 2011 - link

    You've got a limited range of TDPs that any given architecture will be good in. According to Intel (at the time of the Atom launch), things start getting rather ragged when the range reaches 10x. Until Core 2 this wasn't really an issue for Intel, because the P3 and earlier top-end parts had sufficiently low TDPs that fitting the entire product line into a single architecture wasn't a problem. It didn't matter much in the P4 era either, because the Pentium M and Core 1 were separate architectures and could be tuned so their sweet spot was significantly lower than the desktop P4's. Beginning with Core 2, however, Intel had only a single architecture. The bottom tier of ULV chips suffered as a result, and on the high end overclocking (especially voltage overclocking) delivered very little performance gain relative to the increase in power consumption.

    Atom is weak as you approach 10W because it was designed not as a low-end laptop part (although Intel is more than willing to take your money for a netbook) but to invade ARM's stronghold in smartphones, tablets, and other low-power embedded systems. Doing that requires good performance at <1W TDP. By using a low-power process (instead of the performance process of every prior Intel-fabbed CPU), Moorestown should finally be able to do so. The catch is that it leaves Intel without anything well optimized for the 10-15W range. In theory AMD's Bobcat should be well placed for this market, but the much larger chunk of TDP given to graphics, combined with AMD's historic liability in idle power, makes it something of a dark horse. I wouldn't be surprised if the 17W Sandy Bridge ends up getting better battery life than the 10W Bobcat because of this.
  • Kenny_ - Monday, January 3, 2011 - link

    I have seen in the past that when Mac OS X and Win 7 are run on the same machine, Mac OS X can have significantly better battery life. Is there any chance we could see what Sandy Bridge does for battery life under Mac OS X?
  • QChronoD - Monday, January 3, 2011 - link

    This was a test machine that Intel cobbled together. Give it a few weeks or months after retail machines come out, and I'm sure someone in the community will have shoehorned OS X onto one of them. (Although I don't know how well it would perform, since they'd probably have to write new drivers for the chipset and the graphics.)
  • cgeorgescu - Monday, January 3, 2011 - link

    I think that in the past we've seen Mac OS and Win7 battery life compared while running on the same Mac, not on the same Acer/Asus/any machine (because Mac OS doesn't run on those without hacks). And I suspect Apple manages better power management only because they have to support just a few hardware configurations (so they can optimize specifically for that hardware); it's a major advantage of their business model.
    It's like the performance of games on the Xbox and the like... the hardware isn't that impressive, but you write and compile for that one configuration and nothing else: you're sure every machine is the same, with no dependence on AMD code paths, smaller or larger caches, slower or faster RAM, this or that video card, and so on...

    Power management aside, seeing what Sandy Bridge can do under Mac OS would be frustrating... you know how long it takes until Jobs fits new stuff into those MBPs. Hell, he still sells Core 2 Duo.
  • Penti - Monday, January 3, 2011 - link

    Having fewer configurations doesn't mean better optimized graphics drivers; theirs are worse. And having only Intel hardware doesn't mean GCC outputs better optimized code. It's a compiler that AMD, among others, contributes to, and there's no such thing as AMD code paths; there are some minor differences in how it handles SSE, but that's it. Most of the code is exactly the same, and the compiler just optimizes for x86, not for a brand. If a chip supports the same features, the code is just as optimized; the machine code is the same. It's not like having a Cell processor in there.

    Power management is handled by the kernel/drivers. You can expect SB MacBooks around this summer; not too long off. And you might even see people accepting Flash on their Macs again, as Adobe is starting to move away from their archaic approach to video playback with 10.2 and onward. Battery/power management won't really work without Apple's firmware, though. But you're simply not going to optimize code on an OS X machine like a console; you're going to leave it in a worse state than the Windows counterpart. Apple will also keep using C2D as long as Intel doesn't provide proper optimized drivers. It's a better fit for the smaller models as is.
  • mcdill the pig - Monday, January 3, 2011 - link

    Perhaps the issue is more with Compal's cooling system, but those max CPU temps (91 degrees Celsius) seem high. It may also be that the non-Extreme CPUs will have lower temps when stressed.

    My Envy 17 already runs hot; I was looking forward to SB notebooks having better thermal characteristics than the i7 QM chips (i.e. no more hot palmrests or ball-burning undersides)....
  • JarredWalton - Monday, January 3, 2011 - link

    This is a "works as designed" thing. Intel runs the CPU at the maximum speed allowed (3.1GHz on heavily threaded code in this case) until the CPU gets too warm. Funny thing: when the fan stopped working at one point (a cold reboot fixed it), CPU temps maxed out at 99C. Even with no fan running, the system remained fully stable; it just ran at 800MHz most of the time (particularly if you put a load on the CPU for more than 5 seconds), possibly with other throttling going on. Cinebench 11.5, for instance, ran about 1/4 as fast as normal.
  • DanNeely - Monday, January 3, 2011 - link

    Throttling down to keep temperatures at safe levels has been an Intel feature since the P4 era. Back in 2001(?), Tom's Hardware demoed this dramatically by running Quake on a P4 and removing the cooler entirely: Quake dropped into slideshow mode but remained stable, and recovered as soon as the heatsink was set back on top.

    The P3 they tested hard crashed. The Athlon XP/MP chips reached several hundred degrees and self-destructed (taking the motherboards with them). Later AMD CPUs added thermal protection circuitry to avoid this failure mode as well.
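
    For readers curious what that protection looks like in principle, here's a minimal sketch of a thermal-throttling loop. The trip points are illustrative, and read_temp_c() and set_multiplier() are hypothetical stand-ins for what the CPU actually does in hardware/firmware:

        import time

        T_TRIP = 100                  # trip point in C (near the 99C noted above)
        MULT_MIN, MULT_MAX = 8, 31    # 800MHz to 3.1GHz at a 100MHz base clock

        def thermal_governor(read_temp_c, set_multiplier):
            mult = MULT_MAX
            while True:
                if read_temp_c() >= T_TRIP:
                    mult = MULT_MIN                  # hard throttle to 800MHz
                elif read_temp_c() < T_TRIP - 10:
                    mult = min(mult + 1, MULT_MAX)   # step back up as it cools
                set_multiplier(mult)
                time.sleep(0.01)                     # re-evaluate every 10ms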
