Last week we published our preview of Intel's 2011 Core microarchitecture update, codenamed Sandy Bridge. In the preview we presented a conservative estimate of what shipping Sandy Bridge performance will look like in Q1 2011. I call it conservative because we were dealing with an early platform, with turbo disabled, compared to fairly well established competitors with their turbo modes enabled.

It shouldn't come as a surprise to you that this performance preview, ~5 months before launch, wasn't officially sanctioned or supported by Intel. All companies like to control the manner in which information about their products is released, regardless of whether the outcome is good or bad. We acquired the chip on our own, ran the benchmarks on our own, and published the article on our own.

As a result a number of questions remained unanswered. I measured significantly lower L3 cache latencies on SB vs. Westmere/Nehalem, but I just have no idea why they were lower. I suspect many of these questions will be answered at IDF, but the point is that we were flying blind on this one.
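For the curious, cache latency is typically measured with a pointer chase: build a random cycle through a buffer and time how long each dependent load takes as the working set grows past each cache level. Here's a minimal sketch of the technique (in Python purely for readability; interpreter overhead swamps the absolute numbers, so real latency benchmarks do this in C or assembly):

```python
import random
import time

def build_chain(n):
    # Build a single random cycle over n slots: chain[i] holds the next index
    # to visit. The random order defeats hardware prefetching, so every load
    # depends on the previous one and pays the full latency of whichever
    # cache level the working set fits in.
    order = list(range(n))
    random.shuffle(order)
    chain = [0] * n
    for a, b in zip(order, order[1:] + order[:1]):
        chain[a] = b
    return chain

def ns_per_access(chain, steps=200_000):
    # Walk the chain and report the average time per dependent access in ns.
    i = 0
    start = time.perf_counter()
    for _ in range(steps):
        i = chain[i]
    return (time.perf_counter() - start) / steps * 1e9

if __name__ == "__main__":
    # Roughly 8 KB, 512 KB and 32 MB of list storage: in a C version these
    # sizes would land in L1, L2 and L3/RAM territory on a 2010-era quad core.
    for n in (1 << 10, 1 << 16, 1 << 22):
        print(f"{n:>8}-element chain: {ns_per_access(build_chain(n)):6.1f} ns/access")
```

The key property is the dependent-load chain: the next address isn't known until the current load completes, so the measured time per step approximates raw access latency rather than bandwidth.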

A big unknown was the state of Sandy Bridge graphics. As I mentioned in the preview, there will be two integrated graphics configurations on Sandy Bridge parts: 1-core and 2-core. Intel refers to them as GT1 and GT2, respectively. GT1 parts have 6 execution units (EUs), while GT2 parts have 12.

While only some desktop parts will feature GT2, all notebook parts (at launch) will. Based on the information I had while running our tests, it looked like the Sandy Bridge sample was a GT1 part. With no official support from Intel and no way to count the sample's EUs directly, I had no way to confirm that. Since publication I've received more information that points to our sample being a GT2 part. It's not enough for me to confirm it 100%, but that's what it looks like at this point.

If it is indeed a GT2 part, the integrated graphics performance in our preview is indicative of the upper end of what you can expect from desktops and in the range of what we'd expect from SB notebooks (graphics turbo may move the numbers up a bit, but it's tough to tell at this point since our sample didn't have turbo enabled). As soon as I got this information I updated the articles to indicate our uncertainty.

I never like publishing something I'm not 100% sure of, and for that I owe you an apology. We trusted that our sources on the GT1/6 EU information were accurate, and in this case they may not have been. We all strive to be as accurate as possible at AnandTech, and when any of us fail to live up to that standard, regardless of the reasoning, it hurts. Thankfully the CPU and GPU performance data are both accurate; we're simply unsure whether the GPU performance will apply to the i5 2400 or not (it should be indicative of notebook SB GPU performance and some desktop SB GPU performance).

The desktop Sandy Bridge GPU rollout is less clear. I've heard that the enthusiast K-SKUs will have GT2 graphics while the more mainstream parts will have GT1. I'm not sure this makes sense, but we'll have to wait and see.

Many of you have been drawing comparisons to Llano and how it will do vs. Sandy Bridge. Llano is supposed to be based on a modified version of the current generation Phenom II architecture. Clock for clock, I'd expect that to be slower than Sandy Bridge. But clock for clock isn't what matters; performance per dollar and performance per watt are what count most. AMD has already made it clear that it can compete on the former, and it's too early to tell what Llano's perf per watt will be. On the CPU side it's probably safe to say that Intel will have the higher absolute performance, but AMD may be competitive at certain price points (similar to how it is today). Intel likes to maintain certain profit margins and AMD doesn't mind dropping below them to stay competitive; it's why competition is good.

Llano's GPU performance is arguably the more interesting comparison. While Intel had to do a lot of work to get Sandy Bridge to where it is today, AMD has an easier time on the graphics side (given ATI's experience). The assumption is that Llano's GPU will be more powerful than what Intel has in Sandy Bridge. If that's the case, then we're really going to have an awesome set of entry level desktops/notebooks next year. 

43 Comments

  • Taft12 - Thursday, September 2, 2010 - link

    Hallelujah! Someone give this man a +6!
  • anactoraaron - Wednesday, September 1, 2010 - link

    I have been curious about integrated graphics power consumption with Arrandale/SB in the notebook realm... I LOVE the notebooks from the Core 2 series with Optimus and the ~12 hr potential battery life (a.k.a. the ASUS UL series of yesterday). But it seems like Arrandale doesn't really get there on battery life, and I am wondering if there's some way for the integrated graphics to have the same "power gating" that the CPUs have (is this already the case?).
    What I would like to see is an improvement to Optimus (NVIDIA) and "switchable" (AMD) for on-die graphics as it relates to power gating. It seems to me the next logical step is to be able to completely turn off the integrated portion, especially now that SB shares resources with the CPU when using discrete graphics. Or be able to turn off one core when not doing anything 3D, etc.
    Is this already happening and I've missed it... or am I one of many who think this is needed?
  • Randomblame - Wednesday, September 1, 2010 - link

    I thought that performance was too good to be true for low-end Intel graphics. I was hoping we could see more performance on the mobile front; the possibility of up to 50% more oomph over what we saw in your benchmarks was awesome, but I guess we all just forgot these are Intel integrated graphics. You can't expect too much. The results in your benchmarks are still respectable considering the source, and if we want more performance we can still go the discrete route.

    I sure hope the next die shrink brings at least a doubling of execution units for the graphics, then I'll be happy.

    You know what would be very interesting? A method of accessing and using these extra GPU transistors for computing, CUDA style. A lot of people are going to be running discrete graphics...
  • Randomblame - Wednesday, September 1, 2010 - link

    I wonder if all of these chips are made with the extra execution units and half are disabled to improve yields when they don't work. They're still in the sampling stage right now; they still have to get it right before they can start making native GT1 parts to save silicon. Once they start GT1 parts, perhaps we will see a version with the graphics disabled altogether as another method of improving yields.
  • s44 - Wednesday, September 1, 2010 - link

    "the enthusiast K-SKUs will have GT2 graphics while the more mainstream parts will have GT1"

    This seems backwards, at least from the consumer perspective. Although there's a need for a minimally powerful HTPC part, won't any enthusiast desktop user who wants shader power just buy a GPU?
  • beginner99 - Thursday, September 2, 2010 - link

    Most enthusiasts won't buy this LGA1155 stuff anyway and will wait till Q3 2011 for the real thing. But in general I agree with you.

    But why Intel does this is simple: it would make life even more complicated for normal consumers if a slower CPU had a better GPU.
  • IntelUser2000 - Thursday, September 2, 2010 - link

    Oh yea, because 3/4-channel memory and 2/4 extra cores will help in games, which is the biggest reason for enthusiasts to buy.
  • shiznit - Thursday, September 2, 2010 - link

    +1

    I consider myself an enthusiast but I also like to save money, and I doubt I'm alone.

    A $150 Core i5 750 @ 4.0GHz with $70 of 2x2GB RAM and a $100 motherboard feeds my 5870 just as well as an i7 920 X58 platform that costs a lot more.

    enthusiast != big spender
  • richardginn - Thursday, September 2, 2010 - link

    If that was GT2 power we looked at in the preview, then you are looking at getting it in a $300-plus CPU.

    I say at that price point you will also be buying a $200 video card anyway, which defeats the purpose of owning GT2 graphics, right?
  • IntelUser2000 - Thursday, September 2, 2010 - link

    "The desktop Sandy Bridge GPU rollout is less clear. I've heard that the enthusiast K-SKUs will have GT2 graphics while the more mainstream parts will have GT1. I'm not sure this makes sense, but we'll have to wait and see."

    Wait, does that mean there's still a possibility the tested part is 1 core? The tested part was apparently an i5 2400, which doesn't have a K variant at all.

    @ those complaining about K parts with graphics

    It doesn't mean only the "K" variants have it; it probably means all the CPUs that have K options have it. The i5 2500 has a K version also.
