Update: Be sure to read our Sandy Bridge Architecture Exposed article for more details on the design behind Intel's next-generation microprocessor architecture.

The mainstream quad-core market has been neglected ever since we got Lynnfield in 2009. Both the high end and low end markets saw a move to 32nm, but if you wanted a mainstream quad-core desktop processor the best you could get was a 45nm Lynnfield from Intel. Even quad-core Xeons got the 32nm treatment.

That's all going to change starting next year. This time it's the masses that get the upgrade first. While Nehalem launched with expensive motherboards and expensive processors, the next tock in Intel's architecture cadence is aimed right at the middle of the market. This time the ultra high-end users will have to wait. If you want affordable quad-core, if you want the successor to Lynnfield, Sandy Bridge is it.

Sandy Bridge is the next major architecture from Intel, what Intel likes to call a tock. The first tock was Conroe, then Nehalem, and now Sandy Bridge. In between were the ticks: Penryn, Westmere, and after Sandy Bridge we'll have Ivy Bridge, a 22nm shrink of Sandy Bridge.

Did I mention we have one?

While Intel is still a few weeks away from releasing Sandy Bridge performance numbers at IDF, we managed to spend some time with a very healthy sample and run it through a few of our tests to get a sneak peek at what's coming in Q1 2011.

New Naming

The naming isn’t great. It’s an extension of what we have today. Intel is calling Sandy Bridge the 2nd generation Core i7, i5 and i3 processors. As a result, all of the model numbers have a 2 preceding them.

For example, today the fastest LGA-1156 processor is the Core i7 880. When Sandy Bridge launches early next year, the fastest LGA-1155 processor will be the Core i7 2600. The 2 indicates that it's a 2nd generation Core i7, and 600 is the model number.

Sandy Bridge CPU Comparison

| Processor | Base Frequency | L3 Cache | Cores/Threads | Max Single Core Turbo | Intel HD Graphics Frequency/Max Turbo | Unlocked | TDP |
|---|---|---|---|---|---|---|---|
| Intel Core i7 2600K | 3.4GHz | 8MB | 4 / 8 | 3.8GHz | 850 / 1350MHz | Y | 95W |
| Intel Core i7 2600 | 3.4GHz | 8MB | 4 / 8 | 3.8GHz | 850 / 1350MHz | N | 95W |
| Intel Core i5 2500K | 3.3GHz | 6MB | 4 / 4 | 3.7GHz | 850 / 1100MHz | Y | 95W |
| Intel Core i5 2500 | 3.3GHz | 6MB | 4 / 4 | 3.7GHz | 850 / 1100MHz | N | 95W |
| Intel Core i5 2400 | 3.1GHz | 6MB | 4 / 4 | 3.4GHz | 850 / 1100MHz | N | 95W |
| Intel Core i3 2120 | 3.3GHz | 3MB | 2 / 4 | N/A | 850 / 1100MHz | N | 65W |
| Intel Core i3 2100 | 3.1GHz | 3MB | 2 / 4 | N/A | 850 / 1100MHz | N | 65W |

The names can also have a letter after the four-digit model number. You're already familiar with one: K denotes an unlocked SKU (similar to what we have today). There are two more: S and T. The S processors are performance-optimized lifestyle SKUs, while the T parts are power optimized.

The S parts run at lower base frequencies than the non-S parts (e.g. a Core i7 2600 runs at 3.4GHz while a Core i7 2600S runs at 2.8GHz); however, the max turbo frequency is the same for both (3.8GHz). GPU clocks remain the same, but I'm not sure if they have the same number of execution units. All of the S parts run at 65W, while the non-S parts are spec'd at 95W.

Sandy Bridge CPU Comparison

| Processor | Base Frequency | L3 Cache | Cores/Threads | Max Single Core Turbo | Intel HD Graphics Frequency/Max Turbo | TDP |
|---|---|---|---|---|---|---|
| Intel Core i7 2600S | 2.8GHz | 8MB | 4 / 8 | 3.8GHz | 850 / 1100MHz | 65W |
| Intel Core i5 2500S | 2.7GHz | 6MB | 4 / 4 | 3.7GHz | 850 / 1100MHz | 65W |
| Intel Core i5 2500T | 2.3GHz | 6MB | 4 / 4 | 3.3GHz | 650 / 1250MHz | 45W |
| Intel Core i5 2400S | 2.5GHz | 6MB | 4 / 4 | 3.3GHz | 850 / 1100MHz | 65W |
| Intel Core i5 2390T | 2.7GHz | 3MB | 2 / 4 | 3.5GHz | 650 / 1100MHz | 35W |
| Intel Core i3 2100T | 2.5GHz | 3MB | 2 / 4 | N/A | 650 / 1100MHz | 35W |

The T parts run at even lower base frequencies and have lower max turbo frequencies. As a result, these parts have even lower TDPs (35W and 45W).

I suspect the S and T SKUs will be mostly used by OEMs to keep power down. Despite the confusion, I like the flexibility here. Presumably there will be a price premium for these lower wattage parts.
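To make the naming scheme concrete, here's a small illustrative sketch that splits a Sandy Bridge model name into its parts. The function name and suffix descriptions are my own shorthand for the scheme described above, not anything official from Intel:

```python
# Illustrative only: a rough decoder for the Sandy Bridge naming scheme.
# The suffix meanings follow the article; the structure is hypothetical.
import re

SUFFIX_MEANINGS = {
    "K": "unlocked multiplier",
    "S": "performance-optimized lifestyle SKU",
    "T": "power-optimized SKU",
    "": "standard part",
}

def decode_sku(name):
    """Split e.g. 'Core i7 2600K' into tier, generation, model, and suffix."""
    m = re.match(r"Core i(\d) (\d)(\d{3})([KST]?)$", name)
    if not m:
        raise ValueError("unrecognized SKU: " + name)
    tier, gen, model, suffix = m.groups()
    return {
        "tier": "Core i" + tier,
        "generation": int(gen),   # leading digit: 2 = 2nd generation Core
        "model": model,           # remaining three digits
        "suffix": SUFFIX_MEANINGS[suffix],
    }

print(decode_sku("Core i7 2600K"))
```

So "Core i7 2600K" decodes to a 2nd generation Core i7, model 600, with an unlocked multiplier, while the suffix-less "Core i5 2400" is a standard locked part.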

A New Architecture
Comments

  • seapeople - Sunday, August 29, 2010 - link

    So you're saying that integrated graphics should either be able to handle high resolution gaming using at least medium settings on the upper echelon of current games or they should not be included? That's fairly narrow minded. The bottom line is that most people will never need a better graphics card than SB provides, and the people who do are probably going to buy a $200+ graphics card anyway and replace it every summer, so are they really going to care if the integrated graphics drive the price of their $200 processor up by $10-20? Alternatively, this chip is begging for some sort of Optimus-like option, which will allow hardcore gamers to buy the graphics card they want, AND not have to chew up 100W of graphics power while browsing the web or watching a movie.

    Regardless, for people who aren't hard core gamers, the IGP on SB replaces the need to buy something like a Radeon HD 5450, ultimately saving them money. This seems like a positive step to me.
  • chizow - Sunday, August 29, 2010 - link

    No, I'm saying if this is being advertised as a suitable discrete GPU replacement, it should be compared to discrete GPUs at resolutions and settings you would expect a discrete GPU to handle and not IGPs that we already know are too slow to matter. 1024x768 and all lowest settings doesn't fit that criteria. Flash and web-based games don't either, since they don't even require a 3D accelerator in order to run (Intel's workaround Broadcom chip would be fine).

    Again, this card wouldn't even hold a candle to a mid-range $200 GPU from 3 years ago, the 8800GT would still do cartwheels all over it. You can buy these cards for much less than $100, even the GT240 or 4850 for example have been selling for less than $50 after MIR and would be a much more capable gaming card.

    Also, you're badly mistaken if you think this GPU is free by any means, as the cost to integrate a GPU onto SB's die comes at the expense of what could've been more actual CPU... so instead of better CPU performance this generation, you lose that for mediocre graphics performance. There is a price to pay for that relatively massive IGP whether you think so or not; you are paying for it.
  • wut - Sunday, August 29, 2010 - link

    You don't know what you're talking about. You pretend that you do, but you don't.

    The telling sign is your comment about L2/L3 cache.
  • chizow - Sunday, August 29, 2010 - link

    Actually it sounds like you don't know what you're talking about or you didn't read the article:

    "Only the Core i7 2600 has an 8MB L3 cache, the 2400, 2500 and 2600 have a 6MB L3 and the 2100 has a 3MB L3. The L3 size should matter more with Sandy Bridge due to the fact that it’s shared by the GPU in those cases where the integrated graphics is active. I am a bit puzzled why Intel strayed from the steadfast 2MB L3 per core Nehalem’s lead architect wanted to commit to. I guess I’ll find out more from him at IDF :)"

    You might've missed it very clearly stated in the tables also that only the 2600 has the same 8MB L3 or 2MB per core with previous 4C like Bloomfield/Lynnfield/Westmere/Clarkdale. The rest have 6MB or 3MB, which is less than 8MB or 4MB L3 used on the previous generation chips.

    This may change with the high-end/enthusiast platform, but again, the amount of L3 cache is actually going to be a downgrade on many of these Sandy Bridge SKUs for anyone who already owns a Nehalem/Westmere based CPU.
  • wut - Friday, September 10, 2010 - link

    You're parroting Anand and his purely number-based guess. Stop pretending.
  • mac2j - Saturday, August 28, 2010 - link

    990x is a Gulftown part on 1366 that's 130MHz faster than the 980x.... will cost $1000 and come out the same time as the 2600 (which will cost ~ 1/2 and deliver 90% of the performance) and at most a couple months before the i7-2800K which will cost less and trounce it performance-wise.

    You'd have to REALLY want those extra cores to buy a 990x on a lame-duck socket at that point!
  • wut - Sunday, August 29, 2010 - link

    Someone has to buy those chips to populate the uppermost echelons of the 3DMark scoreboards. It's an expensive hobby.
  • hybrid2d4x4 - Saturday, August 28, 2010 - link

    Anand, can you provide some more info on what the system configuration was when running the power tests? The test setup lists 2 vid cards and it's not clear which was used when deriving the power graphs. Also, what PSU was used?
    Just wondering since if it was a 1200W behemoth, then the 63W idle might really be 30W on a more reasonable PSU (assuming no vid cards)...
    As always, thanks for the article!
  • smilingcrow - Saturday, August 28, 2010 - link

    Was HT enabled for the power tests and what application was used to load the cores?
  • semo - Saturday, August 28, 2010 - link

    No USB3.0 support and a half baked SATA3 implementation. I could be a bit too harsh about the latter (can't say if SATA3 on a 6 series chipset will perform poorly or not) but why are they going with only 2 6Gb/s ports? I understand that most people are likely to be buying only 1 or so SSDs in the near future but what about in a few years when these things become mainstream? At least AMD took SATA3 seriously even if they couldn't quite make it work initially (we need a follow up on the 8 series chipsets' SATA performance!)

    Not only are Intel overlooking advances in technologies other than CPUs (which are important to most consumers, whether they are aware of it or not), but they are also shutting out other companies who might have more focus in those areas. I wonder if Nvidia or someone else will bother to release a chipset for Intel's latest and greatest.
