Sandy Bridge: Bridging the Mobile Gap

We’ve been anxiously awaiting Sandy Bridge for a while, as the old Clarksfield processor was good for mobile performance but awful when it came to battery life. Take a power-hungry CPU and pair it with a discrete GPU that typically required at least 5W, and you get what we’ve lamented for the past year or so: battery life that usually maxed out at 2.5 hours sitting idle and plummeted to as little as 40 minutes under a moderate load.

Sandy Bridge fixes that problem, and it fixes it in a major way. Not only do we get 50 to 100% better performance than the previous generation of high-end Intel mobile chips, but we also get more than double the integrated graphics performance, and battery life in most situations should be similar to Arrandale, if not better. And that’s just looking at the quad-core offerings!

When dual-core and LV/ULV Sandy Bridge processors start arriving next month, we’ll get all of the benefits of the Sandy Bridge architecture with the potential for even lower power requirements. It’s not hard to imagine the ULV Sandy Bridge chips reaching Atom levels of battery life under moderate loads, with performance that will probably be close to an order of magnitude better than Atom. Sure, you’ll pay $700+ for SNB laptops versus $300 netbooks, but at least you’ll be able to do everything you could want from a modern PC. In summary, then, Sandy Bridge improves laptop and notebook performance to the point where a large number of users could easily forget about desktops altogether; besides, you can always plug your notebook into a keyboard, mouse, and display if needed. About the only thing desktops still do substantially better is gaming, and that’s largely due to the use of 300W GPUs.

All this raises a big question: what can AMD do to compete? The best we’ve seen from AMD has been in the ultraportable/netbook space, where their current Nile platform offers substantially better than Atom performance in a relatively small form factor, with a price that’s only slightly higher. The problem is that Intel already has parts that can easily compete in the same segment—ULV Arrandale and even standard Arrandale offer somewhat better graphics performance than HD 4225 (barring driver compatibility issues) with better battery life and substantially higher CPU performance—and it’s not like most people play demanding games on such laptops anyway. It’s a triple threat that leaves AMD only one choice: lower prices. If Intel were to drop pricing on their ULV parts, they could remove any reason to consider AMD mobile CPUs right now, but so far Intel hasn’t shown an interest in doing so.

In the near future, we’ll see AMD’s Brazos platform come out, and that should help on the low end. We expect better than Atom performance with substantially better graphics, but prices look to be about 50% higher than basic Atom netbooks/nettops, and you’ll still have substantially faster laptops available for just a bit more. I’m not sure DX11-capable graphics even matter until you get CPUs at least two or three times more powerful than Atom (and probably at least twice as fast as the netbook Brazos chips), but we’ll see where Intel chooses to compete soon enough. Most likely, they’ll continue to let AMD have a piece of the sub-$500 laptop market, as that’s not where they make money.

The lucrative laptops are going to be in the $750+ range, and Intel already has a stranglehold on that market. Arrandale provides faster performance than anything AMD is currently shipping, while also beating AMD in battery life. Pair Arrandale with an NVIDIA Optimus GPU and you also cover the graphics side of things, all while still keeping prices under $1000. Now it looks like Intel is ready to bump performance up by at least another 25% (estimating dual-core SNB performance), with power saving features improving as well. AMD should have some new offerings in the next six months, e.g. Llano, but Llano is supposed to pair Fusion graphics with a current-generation CPU, with Fusion plus Bulldozer coming later.

We have no doubt that AMD can do graphics better than the current Intel IGP, but at some point you reach the stage where you need a faster CPU to keep the graphics fed. Sandy Bridge has now pushed CPU performance up to the point where we can use much faster GPUs, but most of those fast GPUs also tend to suck down power like a black hole. Optimus means we can get NVIDIA’s 400M (and future parts) and still maintain good battery life, but gaming and long battery life at the same time remain a pipe dream. Maybe AMD’s Fusion will be a bit more balanced towards overall computing.

I guess what I’m really curious to see is whether AMD, Intel, NVIDIA, or anyone else can ever give us 10 hours of mobile gaming. Then we can start walking around jacked into the Matrix [Ed: that would be the William Gibson Matrix/Cyberspace, not the Keanu Reeves movies, though I suppose both ideas work] and forget about the real world! With Intel now using 32nm process technology on their IGP and 22nm coming in late 2011, we could actually begin seeing a doubling of IGP performance every ~18 months without increasing power requirements, and at some point we stop needing much more than that. Put another way: Intel’s HD Graphics 3000 with 114M transistors now provides about the same level of performance as the PS3 and Xbox 360 consoles, and you pretty much get that “free” with any non-Atom CPU going forward. Maybe the next consoles won’t even need to use anything beyond AMD/Intel’s current integrated solutions?
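
As a back-of-the-envelope sketch of that cadence (assuming IGP performance really does double every 18 months at flat power, with HD Graphics 3000 as the 1.0x baseline; the numbers are purely illustrative, not a roadmap), the progression looks like this:

```python
# Back-of-the-envelope projection: relative IGP performance if it doubles
# every ~18 months with flat power, using Intel HD Graphics 3000 (early 2011)
# as the 1.0x baseline. Purely illustrative; assumes the cadence holds.

BASELINE_YEAR = 2011.0          # Sandy Bridge / HD Graphics 3000
DOUBLING_PERIOD_YEARS = 1.5     # ~18 months

def relative_igp_performance(year: float) -> float:
    """Performance relative to HD Graphics 3000 under the assumed cadence."""
    return 2.0 ** ((year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS)

if __name__ == "__main__":
    for year in (2011.0, 2012.5, 2014.0, 2015.5, 2017.0):
        print(f"{year}: ~{relative_igp_performance(year):.0f}x HD Graphics 3000")
```

Even if the real cadence turns out slower, a couple of doublings on top of roughly console-class performance is the point where “good enough” integrated graphics starts to apply to a lot of users.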

However you want to look at things, 2011 is shaping up to be a big year for mobility. We bumped our laptop reviews up from about 25 articles in 2009 to a whopping 100 articles in 2010, not to mention adding smartphones into the mix. It’s little surprise that laptop sales have eclipsed desktop sales, and that trend will only continue. While the Sandy Bridge notebook is still a notebook, start thinking ten years down the road and the possibilities are amazing. iPhone and Android devices are now putting Xbox-class visuals in your hand, and Xbox 360-class isn’t far off. Ten years from now, we’ll probably see Sandy Bridge performance (or better) in a smartphone that sips milliwatts.

SNB marks the first salvo in the mobile wars of 2011, but there’s plenty more to come. Intel’s cards are now on the table; how will AMD and NVIDIA respond? Maybe there’s a wild card or two hiding up someone’s sleeve that we didn’t expect. Regardless, we’ll be waiting to see where the actual notebooks go with the new hardware, and CES should provide a slew of new product announcements over the coming week. Stay tuned!

Comments

  • skywalker9952 - Monday, January 3, 2011 - link

    For your CPU-specific benchmarks you annotate the CPU and GPU. I believe the HDD or SSD plays a much larger role in those benchmarks than the GPU. Would it not be more appropriate to annotate the storage device used? Were all of the CPUs in the comparison paired with SSDs? If they weren't, how much would that affect the benchmarks?
  • JarredWalton - Monday, January 3, 2011 - link

    The SSD is a huge benefit to PCMark, and since this is laptop testing I can't just use the same image on each system. Anand covers the desktop side of things, but I include PCMark mostly for the curious. I could try and put which SSD/HDD each notebook used, but then the text gets to be too long and the graph looks silly. Heh.

    For the record, the SNB notebook has a 160GB Intel G2 SSD. The desktop uses a 120GB Vertex 2 (SF-1200). W870CU is an 80GB Intel G1 SSD. The remaining laptops all use HDDs, mostly Seagate Momentus 7200.4 I think.
  • Macpod - Tuesday, January 4, 2011 - link

    The synthetic benchmarks are all run at Turbo frequencies. The score from the 2.3GHz 2820QM is almost the same as the 3.4GHz i7-2600K. This is because the 2820QM is running at 3.1GHz under Cinebench.

    No one knows how long this Turbo frequency lasts. Maybe just enough to finish Cinebench!

    This review should be redone.
  • Althernai - Tuesday, January 4, 2011 - link

    It probably lasts forever given decent cooling, so the review is accurate, but there is something funny going on here: the score for the 2820QM is 20393 while the score in the 2600K review is 22875. This would be consistent with a difference between CPUs running at 3.4GHz and 3.1GHz, but why doesn't the 2600K Turbo up to 3.8GHz? The claim is that it can be effortlessly overclocked to 4.4GHz, so we know the thermal headroom is there.
  • JarredWalton - Tuesday, January 4, 2011 - link

    If you do continual heavy-duty CPU stuff on the 2820QM, the overall score drops about 10% on later runs in Cinebench and x264 encoding. I mentioned this in the text: the CPU starts at 3.1GHz for about 10 seconds, then drops to 3.0GHz for another 20s or so, then 2.9 for a bit and eventually settles in at 2.7GHz after 55 seconds (give or take). If you're in a hotter testing environment, things would get worse; conversely, if you have a notebook with better cooling, it should run closer to the maximum Turbo speeds more often.

    Macpod, disabling Turbo is the last thing I would do for this sort of chip. What would be the point, other than to show that if you limit clock speeds, performance will go down (along with power use)? But you're right, the whole review should be redone because I didn't mention enough that heavy loads will eventually drop performance about 10%. (Or did you miss page 10: "Performance and Power Investigated"?)
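
    As a rough illustration of the step-down described above, a time-weighted average of those clock steps shows why a Cinebench-length run lands roughly 10% below the initial 3.1GHz burst. The durations and clocks below are approximations of the numbers quoted in this thread, and actual Turbo residency depends on cooling and ambient temperature:

    ```python
    # Rough time-weighted model of the i7-2820QM Turbo step-down described above.
    # The (seconds, GHz) steps approximate the observed behavior; real Turbo
    # residency depends on cooling, ambient temperature, and the workload.

    RAMP = [
        (10, 3.1),   # initial burst
        (20, 3.0),
        (25, 2.9),
    ]
    SUSTAINED_GHZ = 2.7  # settled clock after roughly 55 seconds

    def average_clock(run_seconds: float) -> float:
        """Time-weighted average clock speed over a run of the given length."""
        elapsed = 0.0
        ghz_seconds = 0.0
        for duration, ghz in RAMP:
            step = min(duration, max(run_seconds - elapsed, 0.0))
            ghz_seconds += step * ghz
            elapsed += step
        ghz_seconds += max(run_seconds - elapsed, 0.0) * SUSTAINED_GHZ
        return ghz_seconds / run_seconds

    if __name__ == "__main__":
        for seconds in (60, 180, 600):
            avg = average_clock(seconds)
            drop = (1.0 - avg / 3.1) * 100.0
            print(f"{seconds:>3}s run: average {avg:.2f}GHz, ~{drop:.0f}% below peak Turbo")
    ```

    For a run in the two-to-three minute range this works out to roughly a 10% deficit versus the initial burst, which lines up with the drop on repeated runs mentioned above; very short runs see almost no penalty, and very long ones approach the sustained 2.7GHz clock.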
  • lucinski - Tuesday, January 4, 2011 - link

    Just like with any other low-end GPU (integrated or otherwise), I believe most users would rely on the HD 3000 only for undemanding games, a category in which I would mention Civilization IV and V or FIFA / PES 11. That is to say, I would very much like to see how the new Intel graphics fares in these games, should they be available in the test lab of course.

    I am not necessarily worried about the raw performance; clearly the HD 3000 has the capacity to deliver. Instead, driver maturity may turn out to be an obstacle. Firstly, one has to consider that Intel traditionally has problems with GPU driver design (relative to their competitors). Secondly, even if Intel manages to fix (some of) the rendering issues mentioned in this article or elsewhere, notebook producers still take their sweet time before supplying users with new driver versions.

    In this context I am genuinely wary of the HD 3000's promise. The old GMA HD + Radeon 5470 combination still seems tempting. Strictly in terms of gaming, I honestly prefer reliability with a few FPS missing over the aforementioned risks.
  • NestoJR - Tuesday, January 4, 2011 - link

    So, when Apple starts putting these in Macbooks, I'd assume the battery life will easily eclipse 10 hours under light usage, maybe 6 hours under medium usage ??? I'm no fanboy but I'll be in line for that ! My Dell XPS M1530's 9-cell battery just died, I can wait a few months =]
  • JarredWalton - Tuesday, January 4, 2011 - link

    I'm definitely interested in seeing what Apple can do with Sandy Bridge! Of course, they might not use the quad-core chips in anything smaller than the MBP 17, if history holds true. And maybe the MBP 13 will finally make the jump to Arrandale? ;-)
  • heffeque - Wednesday, January 5, 2011 - link

    Yeah... Saying that the nVidia 320M is consistently slower than the HD3000 when comparing a CPU from 2008 and a CPU from 2011...

    Great job comparing GPUs! (sic)

    A more intelligent thing to say would have been: a 2008 CPU (P8600) with an nVidia 320M is consistently slightly slower than a 2011 CPU (i7-2820QM) with HD3000, don't you think?

    That would make more sense.
  • Wolfpup - Wednesday, January 5, 2011 - link

    That's the only thing I care about with these, and as far as I'm aware, the jump isn't anything special. It's FAR from the "tock" it supposedly is, going by earlier AnandTech data. (In fact the "tick/tock" thing seems to have broken down after just one set of products...)

    This sounds like a big advantage for me...but only because Intel refused to produce quad-core CPUs at 32nm, so these by default run quite a bit faster than the last-gen chips.

    Otherwise it sounds like they're wasting 114 million transistors that I want spent on the CPU, whether it's more cache, more functional units, another core (if that's possible in 114 million transistors), etc.

    I absolutely do NOT want Intel's garbage, incompatible graphics. I do NOT want the additional complexity, performance hit, and software issues of Optimus or the like. I want a real GPU, functioning as a real GPU, with Intel's garbage completely shut off at all times.

    I hope we'll see that in mid range and high end notebooks, or I'm going to be very disappointed.
