Sandy Bridge Graphics Update
by Anand Lal Shimpi on September 1, 2010 8:35 PM EST - Posted in
- CPUs
- AMD
- Intel
- Sandy Bridge
Last week we published our preview of Intel's 2011 Core microarchitecture update, codenamed Sandy Bridge. In the preview we presented a conservative estimate of what shipping Sandy Bridge performance will look like in Q1 2011. I call it conservative because we were dealing with an early platform, with turbo disabled, compared to fairly well-established competitors with their turbo modes enabled.
It shouldn't come as a surprise to you that this performance preview, ~5 months before launch, wasn't officially sanctioned or supported by Intel. All companies like to control the manner in which information about their products is released, regardless of whether the outcome is good or bad. We acquired the chip on our own, ran the benchmarks on our own, and published the article on our own.
As a result, a number of questions remained unanswered. I measured significantly lower L3 cache latencies on Sandy Bridge than on Westmere/Nehalem, but I have no idea why they were lower. I suspect many of these questions will be answered at IDF, but the point is that we were flying blind on this one.
A big unknown was the state of Sandy Bridge graphics. As I mentioned in the preview, there will be two types of integrated graphics on Sandy Bridge: 1-core and 2-core parts. Intel refers to them as GT1 and GT2, respectively. The GT1 parts have 6 execution units (EUs), while the GT2 parts have 12.
While some desktop parts will feature GT2, all notebook parts (at launch) will feature GT2. Based on the information I had while running our tests, it looked like the Sandy Bridge sample was a GT1 part. With no official support from Intel and no way to tell how many EUs the sample had, I had no way to confirm. Since publication I've received more information that points to our sample being a GT2 part. It's not enough for me to 100% confirm that it's GT2, but that's what it looks to be at this point.
If it is indeed a GT2 part, the integrated graphics performance in our preview is indicative of the upper end of what you can expect for desktops and in the range of what we'd expect from SB notebooks (graphics turbo may move numbers up a bit, but it's tough to tell at this point since our sample didn't have turbo enabled). As soon as I got this information I made updates to the articles indicating our uncertainty. I never like publishing something I'm not 100% sure of, and for that I owe you an apology. We trusted that our sources on the GT1/6 EU information were accurate, and in this case they may not have been. We all strive to be as accurate as possible on AnandTech, and when any of us fail to live up to that standard, regardless of reasoning, it hurts. Thankfully the CPU and GPU performance data are both accurate, although we're simply unsure if the GPU performance will apply to the i5 2400 or not (it should be indicative of notebook SB GPU performance and some desktop SB GPU performance).
The desktop Sandy Bridge GPU rollout is less clear. I've heard that the enthusiast K-SKUs will have GT2 graphics while the more mainstream parts will have GT1. I'm not sure this makes sense, but we'll have to wait and see.
Many of you have been drawing the comparison to Llano and how it will do vs. Sandy Bridge. Llano is supposed to be based on a modified version of the current generation Phenom II architecture. Clock for clock, I'd expect that to be slower than Sandy Bridge. But clock for clock isn't what matters; it's performance per dollar and performance per watt that are most important. AMD has already made it clear that it can compete on the former, and it's too early to tell what Llano's performance per watt will be. On the CPU side it's probably safe to say that Intel will have the higher absolute performance, but AMD may be competitive at certain price points (similar to how it is today). Intel likes to maintain certain profit margins, and AMD doesn't mind dropping below them to remain competitive; it's why competition is good.
Llano's GPU performance is arguably the more interesting comparison. While Intel had to do a lot of work to get Sandy Bridge to where it is today, AMD has an easier time on the graphics side (given ATI's experience). The assumption is that Llano's GPU will be more powerful than what Intel has in Sandy Bridge. If that's the case, then we're really going to have an awesome set of entry-level desktops/notebooks next year.
43 Comments
ClagMaster - Thursday, September 2, 2010
Apology is accepted. This is a Black Op test of a motherboard vendor sample. You have done this before with Lynnfield and Conroe sample CPUs and, despite small disparities between then and now, were accurate in your preliminary performance assessments. I am surprised your contact who provided you with this sample did not know whether it was a 1 or 2 core GPU processor.
As you pointed out in response to an earlier observation, this is a preliminary test to be supplemented by a more comprehensive (and final) test report when the hardware is released.
krumme - Thursday, September 2, 2010
Thank you Anand - respect
Thank you for posting it on the frontpage.
Doing this sort of preview is surely a delicate balance and dangerous in many ways. I don't think we have to know how you acquired the CPU. Nothing has to be said about "formal". Who knows what formal is here?
I never believed for a second that the preview was sanctioned in any formal way by Intel. But nonetheless, I was still left with a voice saying "favor" in my ear when I read the preview. It's okay, we get the preview, but this was over my personal limit. When I read the review, I got the clear impression that the message was that SB was OK for low- and mid-range gaming. It's clearly not, and it doesn't have to be. It looks to be a very fine CPU.
Again - thank you for the update on the frontpage.
Best regards
mcturkey - Thursday, September 2, 2010
Between the thoroughness of your reviews and your honesty when something you report winds up being incorrect, this is far and away the best site for hardware and technology reviews. Keep up the good work!
Stuka87 - Thursday, September 2, 2010
When I first read the preview I did find it interesting that Intel would provide chips so early on that were not yet complete. Now it all makes sense.
I too was surprised about the graphics performance if those were indeed GT1 chips. But even if they are GT2s, that is still a HUGE jump over what Intel currently offers. And it means the GT1 should still easily outperform what is offered currently as well.
And as others said, thanks for being so honest with us, Anand. It's one of the main reasons I visit this site every day. I do not feel I have to worry about anything being twisted or biased one way or the other. If you make a mistake, you correct it. And that means a lot to us readers.
justaviking - Thursday, September 2, 2010
Anand,
Adding my voice to the others, thank you for the front-page correction.
It is good that the error stings. That means you care. But don't apologize too much. I also believe you were quite clear that many things were uncertain and there was much speculation in the original article. For that matter, even your update here includes speculation.
It must be a difficult balance. Do you publish a speculative report, or do you delay giving us the information we crave? I vote for early reporting as long as it is clear that this is what we are getting.
I have always appreciated these things about AnandTech ("You" being all your authors):
- You present data and analytical results as much as possible
- You add your own subjective analysis and opinion
- You clearly distinguish between objective and subjective comments
- You give the "why" behind opinions
- You give us a look behind-the-scenes
- And, as far as I can tell, you strive to be honest and forthright in your articles, as proven today
Thanks again for the prompt update. This is why I recommend AnandTech.com on nearly every online forum I participate in.
Vamp9190 - Thursday, September 2, 2010
So what is the best path to upgrade from an older Q6600 Kentsfield CPU and MB? Wait for Sandy Bridge and go 1155? Wait until then and buy an i7 930 on 1366 for cheaper than today (but by how much)?
Wait until Q3-Q4 of next year for socket 2011?
Oh, and I want to be able to OC to 4.0GHz+ (on water if needed), but spend ~$300 for the CPU and ~$200-250 for the MB.
Thoughts?
Thanks.
tatertot - Thursday, September 2, 2010
If you are fairly sure this was a 12 EU sample, and certain the turbo was disabled, that does leave one other relevant question: What speed was the iGPU running at?
12 EU QC Mobile parts seem to mostly run at a 650 MHz base / 1300 MHz turbo, while 6 EU Desktop high-end parts seem to be 850 MHz base / 1350 MHz turbo.
So for your non-turboing part, was it 850? 650?
thx
ibudic1 - Thursday, September 2, 2010
Dear Anand,
I would respectfully disagree with your logic in saying that performance/(watt, cost) is what matters for the CPU.
It is misleading to say that the performance/(watt, cost) of CPUs is important.
You say: "But clock for clock isn't what matters, it's performance per dollar and performance per watt that are most important." This simply is not true for ALMOST any user, save supercomputing centers.
The cost of the CPU, especially in laptops, is a fraction of the cost of the whole system. Purchasing a laptop that may need to be updated sooner because of worse fundamental performance will in the end cost more than one with a better CPU.
To wit: buying a laptop now with a low-performing but inexpensive CPU might force me to buy a completely new machine in, say, one year. A laptop, as you know, has many more components than a CPU alone.
On the other hand, buying a laptop with a more powerful, as well as disproportionately more expensive, CPU is less expensive in the long run, since one would not need to replace the entire system for longer, say one and a half years.
This is, of course, much less important in the case of servers and supercomputing centers, where the cost of the CPUs is the main cost of the center. Replacing an LCD screen at a supercomputing center or on a server is a trivial cost, not worth mentioning.
Having to replace an entire laptop because of one component is NOT trivial.
It is a losing proposition to look at the price/performance of the CPU in a laptop and make your purchase accordingly.
However, if your goal is to help AMD out so that they can stay competitive by using your influence, I agree with you. Personally, I will look out for my own interest and buy whatever has high price/performance long term; at the moment, AMD is just not competitive, at least in the desktop/laptop market. For servers they can be a better choice.
So thank you, Anand, for giving an unfair advantage to AMD, which will put pressure on Intel to drive their prices down. :)
ClagMaster - Thursday, September 2, 2010
I would respectfully disagree: the performance/power ratio is an essential metric for a laptop. Double the performance for the same power is a real gain in performance.
I certainly use this metric for deciding when it's time to upgrade either a CPU or graphics card. I limit the CPU power to 95W and the discrete graphics card power to 65W. Unless I get double the performance for the same power level, the component is not worth upgrading.
However, a 2x increase in performance usually requires a complete overhaul of the system to realize the full performance potential.
7Enigma - Thursday, September 2, 2010
This is a pretty big disappointment since we have no clue about the difference between the 6 EU and 12 EU parts. The original review now doesn't have much validity.
With the incorrect assumption that this was a 6 EU GPU and the fact that turbo was disabled, we could be relatively sure that this was the BASE performance, with the combination of the extra 6 EUs and turbo giving an increase. Now, since we don't know the impact of having 6 vs. 12 EUs (a 20% increase? a 50% increase?), nor the actual effect of turbo on the GPU, there really is no way to guess at the performance level of the 6 EU part.
I thank you, Anand, for the update, but kind of wish I hadn't read the original preview... it was like being teased with a 'Vette and given a beater. :(