Final Words

If Intel's roadmap and pricing hold true, the Core i5 2400 should give you an average of 23% better performance than the Core i5 760 at a potentially lower price point. If we compare shipping configurations, the Core i5 2400 should actually perform like a Core i7 880 despite not having Hyper-Threading enabled. Clock for clock, however, Sandy Bridge seems to offer a 10% increase in performance. Keep in mind that this analysis was done without a functional turbo mode, so the shipping Sandy Bridge CPUs should be even quicker; I'd estimate you can add another 3 - 7% to these numbers for the final chips. That's not bad at all for what amounts to a free upgrade compared to what you'd buy today. Power consumption will also see an improvement: not only will Sandy Bridge be noticeably quicker than Lynnfield, it'll draw less power.
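As a back-of-the-envelope sanity check of those numbers (a sketch assuming the published base clocks of 2.80GHz for the Core i5 760 and 3.10GHz for the Core i5 2400, and ignoring turbo):

```python
# Rough estimate: overall speedup = IPC gain x clock ratio
ipc_gain = 1.10            # ~10% clock-for-clock improvement observed
clock_ratio = 3.10 / 2.80  # ~10.7% higher base clock (assumed spec clocks)

overall = ipc_gain * clock_ratio
print(f"estimated overall speedup: {(overall - 1) * 100:.0f}%")
```

That works out to roughly 22%, in line with the ~23% average measured across the benchmark suite.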

While Nehalem was an easy sell if you had highly threaded workloads, Sandy Bridge looks to improve performance across the board regardless of thread count. It's a key differentiator that should make Sandy Bridge an attractive upgrade to more people.

The overclocking prevention Intel is building into Sandy Bridge sounds pretty bad at first. However, if the roadmap and pricing stay their course, overclockers looking to spend as much as they did on the Core i5 750/760 won't be limited at all, thanks to the K SKUs in the mix. The real question is what happens at the low end. While I don't get the impression that the Core i3 2000 series will be completely locked, it's unclear how much rope Intel will give us.

Sandy Bridge's integrated graphics is good. It's fast enough to put all previous attempts at integrated graphics to shame and compete with entry level discrete GPUs. The fact that you can get Radeon HD 5450 performance for free with a Core i5 2400 is just awesome. As I mentioned before, you won't want to throw away your GTX 460, but if you were planning on spending $50 on a GPU, you may not need to with Sandy Bridge.

Assuming mobile Sandy Bridge performs at least as well as the desktop parts, we may finally be at the point where what you get with a mainstream notebook is good enough to actually play some games. I'm really curious to see how well the higher spec integrated graphics parts do once Sandy Bridge makes it a little closer to final (Update: it looks like we may have had a 12 EU part from the start). I should add that despite the GPU performance improvement, I don't believe this is enough. I would like to see another doubling in integrated GPU performance before I'm really happy, but now it's very clear that Intel is taking integrated graphics seriously.

Architecturally, I'm very curious to see what Intel has done with Sandy Bridge. Given the improvements in FP performance and what I've heard about general purpose performance, I'm thinking there's a lot more than we've seen here today. Then there are the features that we were unable to test: Sandy Bridge's improved turbo and its alleged on-die video transcode engine. If the latter is as capable as I've heard, you may be able to have better transcoding performance on your notebook than you do on your desktop today. Update: Check out our Sandy Bridge Architecture article for full details on the CPU's architecture.

With Sandy Bridge next year you'll get higher clock speeds, more performance per clock and reasonable integrated graphics, presumably at the same prices we're paying today. What's even more exciting is that what we're looking at is just mainstream performance. The high end Sandy Bridge parts, which add more cores and more memory bandwidth, don't arrive until the second half of 2011.

Comments

  • seapeople - Sunday, August 29, 2010 - link

    So you're saying that integrated graphics should either be able to handle high resolution gaming using at least medium settings on the upper echelon of current games, or they should not be included? That's fairly narrow-minded. The bottom line is that most people will never need a better graphics card than SB provides, and the people who do are probably going to buy a $200+ graphics card anyway and replace it every summer, so are they really going to care if the integrated graphics drive the price of their $200 processor up by $10-20? Alternatively, this chip is begging for some sort of Optimus-like option, which would allow hardcore gamers to buy the graphics card they want AND not have to chew up 100W of graphics power while browsing the web or watching a movie.

    Regardless, for people who aren't hardcore gamers, the IGP on SB replaces the need to buy something like a Radeon HD 5450, ultimately saving them money. This seems like a positive step to me.
  • chizow - Sunday, August 29, 2010 - link

    No, I'm saying if this is being advertised as a suitable discrete GPU replacement, it should be compared to discrete GPUs at resolutions and settings you would expect a discrete GPU to handle and not IGPs that we already know are too slow to matter. 1024x768 and all lowest settings doesn't fit that criteria. Flash and web-based games don't either, since they don't even require a 3D accelerator in order to run (Intel's workaround Broadcom chip would be fine).

    Again, this card wouldn't even hold a candle to a mid-range $200 GPU from 3 years ago, the 8800GT would still do cartwheels all over it. You can buy these cards for much less than $100, even the GT240 or 4850 for example have been selling for less than $50 after MIR and would be a much more capable gaming card.

    Also, you're badly mistaken if you think this GPU is free by any means, as the cost to integrate a GPU onto SB's die comes at the expense of what could've been more actual CPU, so instead of better CPU performance this generation, you lose that for mediocre graphics performance. There is a price to pay for that relatively massive IGP; whether you think so or not, you are paying for it.
  • wut - Sunday, August 29, 2010 - link

    You don't know what you're talking about. You pretend that you do, but you don't.

    The telling sign is your comment about L2/L3 cache.
  • chizow - Sunday, August 29, 2010 - link

    Actually it sounds like you don't know what you're talking about or you didn't read the article:

    "Only the Core i7 2600 has an 8MB L3 cache, the 2400, 2500 and 2600 have a 6MB L3 and the 2100 has a 3MB L3. The L3 size should matter more with Sandy Bridge due to the fact that it’s shared by the GPU in those cases where the integrated graphics is active. I am a bit puzzled why Intel strayed from the steadfast 2MB L3 per core Nehalem’s lead architect wanted to commit to. I guess I’ll find out more from him at IDF :)"

    You might've missed that it's also very clearly stated in the tables that only the 2600 has the same 8MB L3, or 2MB per core, as previous 4C parts like Bloomfield/Lynnfield/Westmere/Clarkdale. The rest have 6MB or 3MB, which is less than the 8MB or 4MB L3 used on the previous generation chips.

    This may change with the high-end/enthusiast platform, but again, the amount of L3 cache is actually going to be a downgrade on many of these Sandy Bridge SKUs for anyone who already owns a Nehalem/Westmere based CPU.
  • wut - Friday, September 10, 2010 - link

    You're parroting Anand and his purely number-based guess. Stop pretending.
  • mac2j - Saturday, August 28, 2010 - link

    The 990X is a Gulftown part on LGA 1366 that's 130MHz faster than the 980X. It will cost $1000 and come out around the same time as the 2600 (which will cost ~1/2 as much and deliver 90% of the performance), and at most a couple of months before the i7-2800K, which will cost less and trounce it performance-wise.

    You'd have to REALLY want those extra cores to buy a 990x on a lame-duck socket at that point!
  • wut - Sunday, August 29, 2010 - link

    Someone has to get those chips to populate the uppermost echelons of the 3DMark scoreboards. It's an expensive hobby.
  • hybrid2d4x4 - Saturday, August 28, 2010 - link

    Anand, can you provide some more info on what the system configuration was when running the power tests? The test setup lists 2 vid cards and it's not clear which was used when deriving the power graphs. Also, what PSU was used?
    Just wondering since if it was a 1200W behemoth, then the 63W idle might really be 30W on a more reasonable PSU (assuming no vid cards)...
    As always, thanks for the article!
  • smilingcrow - Saturday, August 28, 2010 - link

    Was HT enabled for the power tests and what application was used to load the cores?
  • semo - Saturday, August 28, 2010 - link

    No USB 3.0 support and a half-baked SATA3 implementation. I may be a bit too harsh about the latter (I can't say whether SATA3 on a 6-series chipset will perform poorly or not), but why are they going with only two 6Gb/s ports? I understand that most people are likely to buy only one SSD or so in the near future, but what about in a few years when these things become mainstream? At least AMD took SATA3 seriously even if they couldn't quite make it work initially (we need a follow-up on the 8-series chipsets' SATA performance!)

    Not only is Intel overlooking advances in technologies other than CPUs (which are important to most consumers, whether they're aware of it or not), but it's also shutting out other companies that might have more focus in those areas. I wonder if Nvidia or anyone else will bother to release a chipset for Intel's latest and greatest.
