Late last week we pulled back the covers on Intel's next-generation Core architecture update: Sandy Bridge. Due out in Q1 2011, Sandy Bridge will be the first high-performance monolithic CPU/GPU from Intel, and we learned a lot about its performance in our preview: it was generally noticeably faster than the current generation of processors, on both the CPU and GPU side. If you haven't read the preview yet, I'd encourage you to do so.

One of the questions we got in response to the article was: what about Sandy Bridge for notebooks? While Sandy Bridge is pretty significant for mainstream quad-core desktops, it's even more tailored to the notebook space. I've put together some spec and roadmap information for those of you who might be looking for a new notebook early next year.

Mobile Sandy Bridge

Like the desktop offering, mobile Sandy Bridge will arrive sometime in Q1 of next year. If 2010 was any indication of what's to come, we'll see both mobile and desktop parts launch at the same time around CES.

The mobile Sandy Bridge parts are a little more straightforward in some areas but more confusing in others. The biggest problem is that both dual and quad-core parts share the same brand; in fact, the letter Q is the only indication that the Core i7 2720QM is a quad-core and the Core i7 2620M isn't. Given AMD's Bulldozer strategy, I'm sure Intel doesn't want folks worrying about how many cores they have - just that higher numbers mean better things.

Mobile Sandy Bridge CPU Comparison
| Model | Base Frequency | L3 Cache | Cores / Threads | Max Single Core Turbo | Memory Support | Intel Graphics EUs | Intel HD Graphics Frequency / Max Turbo | TDP |
|---|---|---|---|---|---|---|---|---|
| Core i7 2920XM | 2.5GHz | 8MB | 4 / 8 | 3.5GHz | DDR3-1600 | 12 | 650 / 1300MHz | 55W |
| Core i7 2820QM | 2.3GHz | 8MB | 4 / 8 | 3.4GHz | DDR3-1600 | 12 | 650 / 1300MHz | 45W |
| Core i7 2720QM | 2.2GHz | 6MB | 4 / 8 | 3.3GHz | DDR3-1600 | 12 | 650 / 1300MHz | 45W |
| Core i7 2620M | 2.7GHz | 4MB | 2 / 4 | 3.4GHz | DDR3-1600 | 12 | 650 / 1300MHz | 35W |
| Core i5 2540M | 2.6GHz | 3MB | 2 / 4 | 3.3GHz | DDR3-1333 | 12 | 650 / 1150MHz | 35W |
| Core i5 2520M | 2.5GHz | 3MB | 2 / 4 | 3.2GHz | DDR3-1333 | 12 | 650 / 1150MHz | 35W |

You'll notice a few changes compared to the desktop lineup. Clock speeds are understandably lower, and all launch parts have Hyper-Threading enabled. Mobile Sandy Bridge also officially supports up to DDR3-1600, while the desktop CPUs top out at DDR3-1333 (though running them at 1600 shouldn't be a problem assuming you have a P67 board).

The major difference between mobile Sandy Bridge and its desktop counterpart is that all mobile SB launch SKUs have two graphics cores (12 EUs), while only some desktop parts have 12 EUs (it looks like the high-end K SKUs will). The base GPU clock is lower, but it can turbo up to 1.3GHz, higher than most desktop Sandy Bridge CPUs. Note that the GPU we tested in Friday's preview had 6 EUs, so mobile Sandy Bridge should be noticeably quicker as long as we don't run into memory bandwidth issues. Update: Our preview article may have actually used a 12 EU part; we're still trying to confirm!

Even if we only get 50% more performance out of the 12 EU GPU, that'd be enough for me to say that there's no need for discrete graphics in a notebook - as long as you don't use it for high-end gaming.
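As a rough sanity check on that estimate, here's a naive peak-throughput comparison (a sketch only: it assumes performance scales linearly with EU count times clock, which bandwidth-limited games won't reach, and the 1100MHz desktop turbo figure is an assumption on our part, not a confirmed spec):

```python
# Naive peak shader throughput ratio: EUs x max turbo clock.
# Real workloads will fall short of this because of memory bandwidth
# and fixed-function limits.

def peak_ratio(eus_a, clock_a_mhz, eus_b, clock_b_mhz):
    """Return raw (EUs x clock) throughput of config B relative to A."""
    return (eus_b * clock_b_mhz) / (eus_a * clock_a_mhz)

# 6 EU desktop preview part (assumed ~1100MHz turbo) vs the
# 12 EU / 1300MHz mobile parts from the table above:
print(peak_ratio(6, 1100, 12, 1300))  # ~2.36x on paper
```

On paper the mobile GPU has well over twice the shader throughput of the 6 EU part we tested, which is why even a 50% real-world gain looks like a conservative outcome.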

While Arrandale boosted multithreaded performance significantly, Sandy Bridge is going to offer an across-the-board increase in CPU performance and a dramatic increase in GPU performance. And from what I've heard, NVIDIA's Optimus technology will work with the platform in case you want to do some serious gaming on your notebook.

The Roadmap

52 Comments


  • SteelCity1981 - Saturday, September 11, 2010 - link

    Um, it was a refresh to the current lineup. Sure it may have been the same core logic, but it was still an update; even Intel states that. idiot.
  • Mike1111 - Monday, August 30, 2010 - link

    I'm really interested to see what Apple will put into the MacBook Pro 13-inch. Because if Apple won't abandon Nvidia due to the missing OpenCL support in Sandy Bridge, are they gonna have to keep the years-old Core 2 Duo around??? There doesn't seem to be an obvious solution for Apple, apart from adding a power hungry dedicated graphics card like with the 15-inch.
  • Roland00 - Monday, August 30, 2010 - link

    Reason is Llano is a 2 chip solution.
    Chip 1 is Chipset
    Chip 2 is CPU+GPU

    SandyBridge with a discrete graphic card would be a 3 chip solution
    Chip 1 is Chipset
    Chip 2 is SandyBridge CPU
    Chip 3 is Nvidia or ATI GPU with apples version of Optimus

    Besides space and cost considerations, differentiating the 13-inch from the 15-inch may be an incentive for you to spend the extra $500 to $600 (the average selling price premium the 15-inch carries over the 13-inch MacBook Pro), since a Sandy Bridge CPU plus a separate graphics card will probably be faster in both CPU and GPU tasks (though not necessarily in battery life).
  • pcfxer - Monday, August 30, 2010 - link

    They are going with AMD/ATi. You just watch and see.
  • Roland00 - Monday, August 30, 2010 - link

    Though it is too far in the future to truly know (instead of relying on rumors) which models will use AMD and which will use Intel. For all we know they may use Bobcat for the Apple TV, Llano for the Mac Mini, and a combination of Llano and SB for the MacBook Pros. Who knows what they will use for the standard MacBook and iMac.
  • TEAMSWITCHER - Monday, August 30, 2010 - link

    When Apple was getting the G5 chip from IBM, there were huge supply problems. This and the performance/watt efficiency of Intel processors forced Apple to the x86 architecture.

    Apple is not about to go back to the supply problem days of the G5, and AMD has had lots of supply issues. AMD should be the discrete graphics provider, though.
  • erple2 - Monday, August 30, 2010 - link

    Honestly, the primary reason Apple went with Intel over the mobile G5 and G6 chips was supply: Intel was willing to put Apple in its "preferred" category, something IBM and company weren't willing to do with the G5+'s.

    IBM was in the process of developing low power G5 chips for use in mobile Apples, but that was NOT the focus of IBM. Intel already had low power CPUs available and was willing to treat Apple as a first-rate reseller. I think that's why Apple went with Intel.
  • MonkeyPaw - Monday, August 30, 2010 - link

    What supply issues? Apple can use CPUs from both companies, just like it uses GPUs from both. That means you are LESS likely to have supply issues, so long as Intel doesn't threaten Apple with limited product lines. I doubt that will happen after the AMD settlement.
  • iwodo - Monday, August 30, 2010 - link

    Again, if the dual-core GPU works like current SLI and Crossfire, then we will need constant driver updates, which is not a good thing.

    With this amount of GPU power, maybe we can finally move all of the desktop display rendering to the GPU.

    The mention of Nvidia is interesting. Apple, which wants every system to be OpenCL capable, would need another GPU. However, the systems which only have space for 2 chips would mean Apple either has to make an OpenCL driver for the Intel GPU or get an Nvidia chipset (via the PCI Express interface).

    Now the only thing missing from my next setup is a super fast and affordable SSD.
  • JarredWalton - Monday, August 30, 2010 - link

    I seriously doubt we're looking at anything like SLI/CF. This is merely a doubling of the number of GPU execution units (EUs), similar to the way NVIDIA has 16 and 32 CUDA core variants, or AMD has 40 and 80 stream processor variants. What we don't know is whether the number of shader processors is the only thing to double--i.e. is there an increase in the number of ROPs as well?

    Bandwidth is the one thing that almost certainly won't increase between the two variants, so there will be games that run into bandwidth limitations, but for mobile GPUs with DDR3-1600 we're still talking about far more bandwidth than most entry GPUs get. There, a 64-bit DDR2/DDR3-1600 interface is common, so we're looking at sharing roughly twice that much bandwidth with the CPU, if we're talking about the 310M and HD5470.
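    For reference, the bandwidth comparison Jarred makes works out roughly like this (a quick sketch; the 64-bit DDR3-1600 entry-GPU configuration is his example, not a measured figure, and peak numbers ignore overhead):

```python
# Peak DDR bandwidth = transfer rate (MT/s) x bus width (bytes) x channels.

def ddr_bandwidth_gbs(mts, bus_bits, channels=1):
    """Peak bandwidth in GB/s (decimal) for a DDR memory interface."""
    return mts * (bus_bits // 8) * channels / 1000

# Dual-channel DDR3-1600, shared between CPU and integrated GPU:
print(ddr_bandwidth_gbs(1600, 64, channels=2))  # 25.6 GB/s
# Entry discrete GPU with a 64-bit DDR3-1600 interface:
print(ddr_bandwidth_gbs(1600, 64))              # 12.8 GB/s
```

    So the integrated GPU shares about 25.6 GB/s with the CPU, versus roughly 12.8 GB/s dedicated to an entry discrete part: twice the pipe, but not all of it belongs to the GPU.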
