Sandy Bridge: Bridging the Mobile Gap

We’ve been anxiously awaiting Sandy Bridge for a while, as the old Clarksfield processor was good for mobile performance but awful when it came to battery life. Take a power-hungry CPU and pair it with a discrete GPU that typically requires at least 5W, and you get what we’ve lamented over the past year or so: battery life that usually maxed out at 2.5 hours doing nothing and plummeted to as little as 40 minutes under a moderate load.

Sandy Bridge fixes that problem, and it fixes it in a major way. Not only do we get 50 to 100% better performance than the previous generation of high-end Intel mobile chips, but we also get more than double the integrated graphics performance, and battery life in most situations should be similar to Arrandale, if not better. And that’s looking at the quad-core offerings!

When dual-core and LV/ULV Sandy Bridge processors start arriving next month, we’ll get all of the benefits of the Sandy Bridge architecture with the potential for even lower power requirements. It’s not too hard to imagine the ULV Sandy Bridge chips reaching Atom levels of battery life under moderate loads, with performance that will probably be almost an order of magnitude better than Atom. Sure, you’ll pay $700+ for SNB laptops versus $300 netbooks, but at least you’ll be able to do everything you could want from a modern PC. In summary, then, Sandy Bridge improves laptop and notebook performance to the point where a large number of users could easily forget about desktops altogether; besides, you can always plug your notebook into a keyboard, mouse, and display if needed. About the only thing desktops still do substantially better is gaming, and that’s largely due to the use of 300W GPUs.

All this raises a big question: what can AMD do to compete? The best we’ve seen from AMD has been in the ultraportable/netbook space, where their current Nile platform offers substantially better-than-Atom performance in a relatively small form factor, at a price that’s only slightly higher. The problem is that Intel already has parts that can easily compete in the same segment: ULV Arrandale and even standard Arrandale offer somewhat better graphics performance than the HD 4225 (barring driver compatibility issues), along with better battery life and substantially higher CPU performance; it’s not like most people play demanding games on such laptops anyway. It’s a triple threat that leaves AMD only one choice: lower prices. If Intel were to drop pricing on their ULV parts, they could remove any reason to consider AMD mobile CPUs right now, but so far Intel hasn’t shown an interest in doing so.

In the near future, we’ll see AMD’s Brazos platform arrive, and that should help on the low end. We expect better-than-Atom performance with substantially better graphics, but prices look to be about 50% higher than basic Atom netbooks/nettops, and you’ll still have substantially faster laptops available for just a bit more. I’m not sure DX11-capable graphics even matter until you get CPUs at least two or three times more powerful than Atom (and probably at least twice as fast as the netbook Brazos chips), but we’ll see where Intel chooses to compete soon enough. Most likely, they’ll continue to let AMD have a piece of the sub-$500 laptop market, as that’s not where they make money.

The lucrative laptops are going to be in the $750+ range, and Intel already has a stranglehold on that market. Arrandale provides faster performance than anything AMD is currently shipping, while also beating AMD in battery life. Pair Arrandale with an NVIDIA Optimus GPU and you also cover the graphics side of things, all while still keeping prices under $1000. Now it looks like Intel is ready to bump performance up by at least another 25% (estimating dual-core SNB performance), with power-saving features improving as well. AMD should have some new offerings in the next six months, e.g. Llano, but Llano is supposed to combine Fusion graphics with a current-generation CPU, with Fusion plus Bulldozer coming later.

We have no doubt that AMD can do graphics better than the current Intel IGP, but at some point you reach the stage where you need a faster CPU to keep the graphics fed. Sandy Bridge has now pushed CPU performance up to the point where we can use much faster GPUs, but most of those fast GPUs also tend to suck down power like a black hole. Optimus means we can get NVIDIA’s 400M (and future parts) and still maintain good battery life, but gaming and long battery life at the same time remains a pipe dream. Maybe AMD’s Fusion will be a bit more balanced towards overall computing.

I guess what I’m really curious to see is if AMD, Intel, NVIDIA, or anyone else can ever give us 10 hours of mobile gaming. Then we can start walking around jacked into the Matrix [Ed: that would be the William Gibson Matrix/Cyberspace, not the Keanu Reeves movies, though I suppose both ideas work] and forget about the real world! With Intel now using 32nm process technology on their IGP and 22nm coming in late 2011, we could actually begin seeing a doubling of IGP performance every ~18 months without increasing power requirements, and at some point we stop needing much more than that. Put another way: Intel’s HD Graphics 3000, with 114M transistors, now provides about the same level of performance as the PS3 and Xbox 360 consoles, and you pretty much get that “free” with any non-Atom CPU going forward. Maybe the next consoles won’t even need anything beyond AMD/Intel’s current integrated solutions?
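To put that cadence in perspective, here’s a rough back-of-envelope projection; it assumes the ~18-month doubling actually holds, which is pure speculation on our part rather than anything on a roadmap:

```python
# Rough extrapolation of the "IGP performance doubles every ~18 months"
# guess above; purely illustrative, not based on any announced roadmap.
years = 10
doublings = years / 1.5       # ~6.7 doublings in a decade
factor = 2 ** doublings       # ~100x today's IGP performance
print(f"{doublings:.1f} doublings -> ~{factor:.0f}x in {years} years")
```

Even if the real cadence is slower, that sort of compounding is why “console-class graphics for free” stops being remarkable surprisingly fast.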

However you want to look at things, 2011 is shaping up to be a big year for mobility. We bumped our laptop reviews up from about 25 articles in 2009 to a whopping 100 articles in 2010, not to mention adding smartphones into the mix. It’s little surprise that laptop sales have eclipsed desktops, and that trend will only continue. While the Sandy Bridge notebook is still a notebook, you start thinking ten years down the road and the possibilities are amazing. iPhone and Android devices are now doing original-Xbox visuals in your hand, and Xbox 360 isn’t far off. Ten years from now, we’ll probably see Sandy Bridge performance (or better) in a smartphone that sips milliwatts.

SNB marks the first salvo in the mobile wars of 2011, but there’s plenty more to come. Intel’s cards are now on the table; how will AMD and NVIDIA respond? Maybe there’s a wild card or two hiding up someone’s sleeve that we didn’t expect. Regardless, we’ll be waiting to see where actual notebooks go with the new hardware, and CES should provide a slew of new product announcements over the coming week. Stay tuned!

Comments

  • mtoma - Monday, January 3, 2011

    Something like a Core i7 1357M could make Win 7 tablets temporarily viable. Remember that in the ultraportable space the big words are multitasking and dual-core processors (like the Cortex A9). So, realistically, we need ULV dual-core Sandy Bridge.
  • JarredWalton - Monday, January 3, 2011

    The i7-640UM runs at 1.2GHz minimum and 2.26GHz maximum. The i7-2657M runs at 1.6GHz minimum and 2.7GHz maximum. (Actually, the minimum on all the 2nd Gen Core parts is 800MHz when you aren't doing anything that needs more speed.) That would be 33% faster base speed and up to 19% higher max speed, on clock speeds alone. However, you forgot to factor in a rough 20-25% performance increase just from the Sandy Bridge architecture, so you're really looking at anywhere from 19% (bare minimum) to as much as 66% faster for normal usage, and things like Quick Sync would make certain tasks even faster.
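    [Ed: the math here is easy to sanity-check. A quick sketch follows; the clock speeds are Intel's published specs, while the 25% IPC uplift is the estimate from the comment above, not a measured figure.]

    ```python
    # Back-of-envelope for the i7-640UM vs. i7-2657M comparison above.
    # Clocks are Intel's specs; the 25% IPC gain is an estimate.
    base_old, turbo_old = 1.2, 2.26   # i7-640UM (GHz): base, max turbo
    base_new, turbo_new = 1.6, 2.70   # i7-2657M (GHz): base, max turbo

    base_gain = base_new / base_old - 1     # ~0.33 -> 33% faster base
    turbo_gain = turbo_new / turbo_old - 1  # ~0.19 -> 19% faster turbo

    worst = turbo_gain                      # clocks alone, no IPC gain
    best = (1 + base_gain) * 1.25 - 1       # ~0.67 -> the "66%" above

    print(f"clocks only: +{base_gain:.0%} base, +{turbo_gain:.0%} turbo")
    print(f"overall estimate: +{worst:.0%} to +{best:.0%}")
    ```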
  • DanNeely - Monday, January 3, 2011

    You've got a limited range of TDPs that any given architecture will be good in. According to Intel (at the time of the Atom launch), things start getting rather ragged when the range reaches 10x. Until Core 2 this wasn't really an issue for Intel, because the P3 and its predecessors' top-end parts had sufficiently low TDPs that fitting the entire product line into a single architecture wasn't a problem. It didn't matter much in the P4 era either, because the Pentium M and Core 1 were separate architectures and could be tuned so their sweet spots were significantly lower than the desktop P4's. Beginning with Core 2, however, Intel had only a single architecture. The bottom tier of ULV chips suffered because of this, and on the high end overclocking (especially voltage overclocking) delivered very poor performance gains relative to the increase in power consumption.

    The Atom is weak as you approach 10W because it was designed not as a low-end laptop part (although Intel is more than willing to take your money for a netbook) but to invade ARM's stronghold in smartphones, tablets, and other low-power embedded systems. Doing that requires good performance at <1W TDP. By using a low-power process (instead of the performance process of every prior Intel-fabbed CPU), Moorestown should finally be able to do so. The catch is that it leaves Intel without anything well optimized for the 10-15W range. In theory AMD's Bobcat should be well placed for this market, but the much larger chunk of TDP given to graphics, combined with AMD's historic liability in idle power, makes it something of a dark horse. I wouldn't be surprised if the 17W Sandy Bridge ends up getting better battery life than the 10W Bobcat because of this.
  • Kenny_ - Monday, January 3, 2011

    I have seen in the past that when Mac OS X and Win 7 are run on the same machine, Mac OS X can have significantly better battery life. Is there any chance we could see what Sandy Bridge does for battery life under Mac OS X?
  • QChronoD - Monday, January 3, 2011

    This was a test machine that Intel cobbled together. Give it a few weeks or months after some retail machines come out, and I'm sure someone in the community will have shoehorned OS X onto one of them. (Although I don't know how well it would perform, since they'd probably have to write new drivers for the chipset and the graphics.)
  • cgeorgescu - Monday, January 3, 2011

    I think that in the past we've seen Mac OS and Win7 battery life comparisons while running on the same Mac, not on the same Acer/Asus/whatever machine (because Mac OS doesn't run on those without hacks). And I suspect Apple manages better power management only because they have to support just a few hardware configurations (so they can optimize specifically for that hardware); it's a major advantage of their business model.
    It's like the performance of games on the Xbox and the like... The hardware isn't that impressive, but you write and compile only for that configuration and nothing else: you're sure that every machine is the same, with no dependence on AMD code paths, smaller or larger caches, slower or faster RAM, this or that video card, and so on...

    Power management in Macs aside, seeing what Sandy Bridge can do under Mac OS would be frustrating... You know how long it takes until Jobs fits new stuff into those MBPs. Hell, he still sells the Core 2 Duo.
  • Penti - Monday, January 3, 2011

    Having fewer configurations doesn't mean better-optimized graphics drivers; theirs are worse. Having only Intel hardware doesn't mean GCC outputs specially optimized code. It's a compiler that AMD, among others, contributes to, and there's no such thing as AMD code paths; there are some minor differences in how SSE is handled, but that's it. Most of the output is exactly the same, and the compiler optimizes for x86, not for a brand. If a chip supports the same features, the code is just as optimized. The machine code is the same. It's not like there's a Cell processor in there.

    Power management is handled by the kernel/drivers. You can expect SB MacBooks around this summer; not too long off. And you might even see people accepting Flash on their Macs again, as Adobe is starting to move away from their archaic, non-accelerated video playback workflow with 10.2 and onward. Battery/power management won't really work without Apple's firmware, though. But you're simply not going to optimize code on an OS X machine like a console; you're going to leave it in a worse state than the Windows counterpart. Apple will also keep using C2D as long as Intel doesn't provide them with proper optimized drivers. It's a better fit for the smaller models as is.
  • mcdill the pig - Monday, January 3, 2011

    Perhaps the issue is more Compal's cooling system, but those max CPU temps (91 degrees Celsius) seem high. It may also be that the non-Extreme CPUs will have lower temps when stressed.

    My Envy 17 already has high temps - I was looking forward to SB notebooks having better thermal characteristics than the i7 QM chips (i.e. no more hot palmrests or ball-burning undersides)....
  • JarredWalton - Monday, January 3, 2011

    This is a "works as designed" thing. Intel runs the CPU at the maximum speed allowed (3.1GHz on heavily threaded code in this case) until the CPU gets too warm. Actually, the funny thing is that when the fan stopped working at one point (a cold reboot fixed it), CPU temps maxed out at 99C. Even with no fan running, the system remained fully stable; it just ran at 800MHz most of the time (particularly if you put a load on the CPU for more than 5 seconds), possibly with other throttling going on. Cinebench 11.5, for instance, ran about 1/4 as fast as normal.
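    [Ed: the behavior described amounts to a simple control loop. Below is a minimal illustrative sketch; the clocks and temperature come from our observations above, but the thresholds and decision logic are hypothetical simplifications, not Intel's actual Turbo Boost algorithm, which also weighs power, current, and time limits.]

    ```python
    # Illustrative throttling loop: run at max turbo until the die gets
    # too hot, then drop to the 800MHz floor. Numbers come from the
    # article's observations; the decision logic itself is hypothetical.
    TURBO_MHZ = 3100   # max sustained speed on heavily threaded code
    FLOOR_MHZ = 800    # clock floor observed once throttling kicks in
    T_LIMIT_C = 99     # temperature ceiling observed with the fan dead

    def next_clock_mhz(die_temp_c: float, current_mhz: int) -> int:
        """Pick the next CPU clock based on die temperature."""
        if die_temp_c >= T_LIMIT_C:
            return FLOOR_MHZ           # overheating: drop to the floor
        if die_temp_c <= T_LIMIT_C - 10:
            return TURBO_MHZ           # plenty of headroom: full turbo
        return current_mhz             # near the limit: hold steady
    ```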
  • DanNeely - Monday, January 3, 2011

    Throttling down to keep TDP at safe levels has been an Intel feature since the P4 era. Back in 2001(?), Tom's Hardware demoed this dramatically by running Quake on a P4 and removing the cooler entirely. Quake dropped into slideshow mode but remained stable, and it recovered as soon as the heatsink was set back on top.

    The P3 they tested hard-crashed. The Athlon XP/MP chips reached several hundred degrees and self-destructed (taking the motherboards with them). Later AMD CPUs added thermal protection circuitry to avoid this failure mode as well.
