Sandy Bridge Integrated Graphics Performance

With Clarkdale/Arrandale, Intel improved integrated graphics by a large enough margin that I can honestly say we were impressed with what Intel had done. That being said, the performance of Intel's HD Graphics still wasn't enough. For years integrated graphics have been fast enough to run games like The Sims but not quick enough to play anything more taxing, at least not at reasonable quality settings. The 'dales made Intel competitive in the integrated graphics market, but they didn't change what we thought of integrated graphics.

Sandy Bridge could be different.

Architecturally, Sandy Bridge is a significant revision of what's internally referred to as Intel Gen graphics. While the past two generations of Intel integrated graphics were part of the Gen 5 series, Sandy Bridge brings the first Gen 6 graphics core to market. With a tremendous increase in IPC and a large L3 cache to tap into, Sandy Bridge's graphics is another significant step forward.

Is it enough to kill all discrete graphics? No. But it's good enough to really threaten the entry level discrete market. Take a look:

Batman: Arkham Asylum

It's unclear whether or not graphics turbo was working on the part I was testing. If it was, this is the best it'll be for the 6 EU parts. If it wasn't, things will be even faster. Comparisons to current integrated graphics solutions are almost worthless. Sandy Bridge's graphics perform like a low end discrete part, not an integrated GPU. In this case, we're about 10% faster than a Radeon HD 5450.

Assuming Sandy Bridge retains the same HTPC features that Clarkdale has, I'm not sure there's a reason for these low end discrete GPUs anymore. At least not unless they get significantly faster.

Note that despite the early nature of the drivers, I didn't notice any rendering artifacts or image quality issues while testing Sandy Bridge's integrated graphics.

Dragon Age Origins

The Sandy Bridge advantage actually grows under Dragon Age. At these frame rates you can either enjoy smoother gameplay or actually up the resolution/quality settings to bring it back down to ~30 fps.
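To illustrate the tradeoff above, here's a rough rule of thumb (my own sketch, not from the article): when a game is GPU-limited, frame rate scales roughly inversely with the number of pixels rendered, so you can estimate how much resolution headroom a given frame rate buys you. All numbers below are hypothetical.

```python
def estimated_fps(base_fps, base_res, target_res):
    """Estimate frame rate at target_res given base_fps at base_res.

    Resolutions are (width, height) tuples. Assumes the game is purely
    fill-rate bound, which is only approximately true in practice.
    """
    base_pixels = base_res[0] * base_res[1]
    target_pixels = target_res[0] * target_res[1]
    return base_fps * base_pixels / target_pixels

# Hypothetical example: 50 fps at 1024x768 works out to about
# 30 fps at 1280x1024 -- the kind of headroom described above.
print(round(estimated_fps(50, (1024, 768), (1280, 1024)), 1))
```

In practice CPU limits and memory bandwidth keep real scaling from being perfectly linear, so treat this as a ballpark estimate only.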

Dawn of War II

It's not always a clear victory for Sandy Bridge. In our Dawn of War II test the 5450 pulls ahead, although by only a small margin.

Call of Duty Modern Warfare 2

Sandy is once again on top of the 5450 in Modern Warfare 2. I'm not sure these frame rates are high enough to really up quality settings any more, but they are at least smooth - which is more than I can say for the first gen HD Graphics.

BioShock 2

Intel promised to deliver a 2x improvement in integrated graphics performance with Sandy Bridge. We're getting a bit more than that here in BioShock 2.

World of Warcraft

World of Warcraft is finally playable with Intel's Sandy Bridge graphics. The Radeon HD 5450 is 10% faster here.

HAWX

Sandy Bridge Graphics Performance Summary

This is still a very early look. Drivers and hardware both aren't final, but the initial results are very promising. Sandy Bridge puts all current integrated graphics solutions to shame, and even looks to nip at the heels of low end discrete GPUs. For HTPC users, Clarkdale did a good enough job - but for light gaming there wasn't enough horsepower under the hood. With Sandy Bridge you can actually play modern titles, albeit at low quality settings.

If this is the low end of what to expect, I'm not sure we'll need more than integrated graphics for notebooks that aren't aimed at gaming. Update: It looks like all notebook Sandy Bridge parts, at least initially, will use the 12 EU IGP. Our SB sample may also have been a 12 EU part; we're still awaiting confirmation.
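As a back-of-the-envelope sketch of why the EU count matters (my own assumption, not something Intel has confirmed): if the shader array is the bottleneck, graphics throughput should scale roughly with EU count at a given clock, minus some efficiency loss to bandwidth and fixed-function limits. The numbers below are hypothetical.

```python
def scaled_fps(fps, eus, target_eus, efficiency=1.0):
    """Naive estimate of shader-bound performance vs. EU count.

    'efficiency' (0..1) hedges for memory bandwidth and fixed-function
    bottlenecks that keep real-world scaling below linear.
    All inputs here are hypothetical, not measured results.
    """
    return fps * (target_eus / eus) * efficiency

# Hypothetical: a 6 EU part at 40 fps, doubled to 12 EUs with
# 80% scaling efficiency, lands around 64 fps.
print(scaled_fps(40, 6, 12, efficiency=0.8))
```

If the notebook parts really do ship with 12 EUs across the board, even imperfect scaling would widen the gap over low-end discrete cards shown in the charts above.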


  • DanNeely - Friday, August 27, 2010 - link

    Maybe, but IIRC Apple's biggest issue with the Clarkdale platform on smaller laptops was wanting to maintain CUDA support across their entire platform without adding a 3rd chip to the board, not general GPU performance. Unless the Intel/nVidia lawsuit concludes with nVidia getting a DMI license or Intel getting a CUDA license this isn't going to change.
  • Pinski - Saturday, August 28, 2010 - link

    I don't think it has anything to do with CUDA. I mean, they sell Mac Pros with AMD/ATI cards in them, and those don't support CUDA. It's more about OpenCL and high enough performance. However, just looking at these new performance numbers, I'm willing to say that it'll easily be the next chip for the MBP 13".
  • Pinski - Saturday, August 28, 2010 - link

    Well, wait never mind. Apparently it doesn't support OpenCL, which basically puts it out of the picture for Apple to use.
  • starfalcon - Saturday, August 28, 2010 - link

    Hmm, they really want all of the systems to have OpenCL?
    I don't have OpenCL and I don't care at all and I have CUDA but have only used it once.
    320M doesn't even have OpenCL does it?
    Seems like it would be ok for the less expensive ones to have Intel graphics and the higher end ones to have CUDA, OpenCL, and better gaming performance if someone cares about those.
    They'll keep on upgrading the performance and features of Intel graphics though, who knows.
  • Veerappan - Thursday, September 2, 2010 - link

    No, just ... no.

    Nvidia implements an OpenCL run-time by translating OpenCL API calls to CUDA calls. If your card supports CUDA, it supports OpenCL.

    The 320M supports OpenCL, and every Apple laptop/desktop that has shipped in the last few years has as well.

    A large portion of the motivation for OS X 10.6 (Snow Leopard) was introducing OpenCL support.. along with increasing general performance.

    There is a large amount of speculation that OS X 10.7 will take advantage of the OpenCL groundwork that OS X 10.6 has put in place.

    Also, in the case that you have a GPU that doesn't support OpenCL (older Intel Macs with Intel IGP graphics), Apple has written a CPU-based OpenCL run-time. It'll be slower than GPU, but the programs will still run. That being said, I highly doubt that Apple will be willing to accept such a performance deficit existing in a brand new machine compared to prior hardware.
  • Penti - Saturday, August 28, 2010 - link

    It has more to do with nVidia's VP3 PureVideo engine which they rely on for video acceleration. It's as simple as that.

    Which is why they only find their place in the notebooks. It's also a low-end GPU with enough performance to, say, run a Source game at low res. And they have more complete drivers for OS X.

    CUDA is a third party add on. OpenCL isn't.
  • burek - Friday, August 27, 2010 - link

    Will there be a "cheap"(~$300) 6-core LGA-2011 replacement for i7 920/930 or will Intel limit the 6/8 cores to the high-end/extreme price segment ($500+)?
  • DJMiggy - Friday, August 27, 2010 - link

    yea I doubt that will happen. It would be like trying to SLI/crossfire an nvidia to an ati discrete. You would need a special chip like the hyrda one.
  • DJMiggy - Friday, August 27, 2010 - link

    Hydra even. Hydra Lucid chip.
  • Touche - Friday, August 27, 2010 - link

    Questionable overclocking is bad enough, but together with...

    "There’s no nice way to put this: Sandy Bridge marks the third new socket Intel will have introduced since 2008."

    "The CPU and socket are not compatible with existing motherboards or CPUs. That’s right, if you want to buy Sandy Bridge you’ll need a new motherboard."

    "In the second half of 2011 Intel will replace LGA-1366 with LGA-2011."

    ...it is just terrible!

    I'll definitely buy AMD Bulldozer, even if it ends up a bit slower. At least they have some respect for their customers and an ability of forward thinking when designing sockets (actually, Intel probably has it too, but just likes to milk us on chipset purchases also). And I am no fanboy, 4 of my 7 PC's are Intel based (two of those 4 were my latest computer purchases).
