Sandy Bridge Integrated Graphics Performance

With Clarkdale/Arrandale, Intel improved integrated graphics by a large enough margin that I can honestly say we were impressed with what Intel had done. That being said, the performance of Intel's HD Graphics still wasn't enough. For years integrated graphics have been fast enough to run games like The Sims, but not quick enough to play anything more taxing, at least not at reasonable quality settings. The 'dales made Intel competitive in the integrated graphics market, but they didn't change what we thought of integrated graphics.

Sandy Bridge could be different.

Architecturally, Sandy Bridge is a significant revision of what's internally referred to as Intel's Gen graphics. While the past two generations of Intel integrated graphics were part of the Gen 5 family, Sandy Bridge brings the first Gen 6 graphics core to market. With a tremendous increase in IPC and a large L3 cache to draw from, Sandy Bridge's graphics is another significant step forward.

Is it enough to kill all discrete graphics? No. But it's good enough to really threaten the entry level discrete market. Take a look:

Batman: Arkham Asylum

It's unclear whether graphics turbo was working on the part I was testing. If it was, this is the best it'll get for the 6 EU parts; if it wasn't, final parts will be even faster. Comparisons to current integrated graphics solutions are almost meaningless: Sandy Bridge's graphics performs like a low end discrete part, not an integrated GPU. In this case, we're about 10% faster than a Radeon HD 5450.

Assuming Sandy Bridge retains the same HTPC features that Clarkdale has, I'm not sure there's a reason for these low end discrete GPUs anymore. At least not unless they get significantly faster.

Note that despite the early nature of the drivers, I didn't notice any rendering artifacts or image quality issues while testing Sandy Bridge's integrated graphics.

Dragon Age Origins

The Sandy Bridge advantage actually grows under Dragon Age. At these frame rates you can either enjoy smoother gameplay or actually up the resolution/quality settings to bring it back down to ~30 fps.

Dawn of War II

It's not always a clear victory for Sandy Bridge. In our Dawn of War II test the 5450 pulls ahead, although by only a small margin.

Call of Duty Modern Warfare 2

Sandy is once again on top of the 5450 in Modern Warfare 2. Although I'm not sure these frame rates are high enough to justify turning quality settings up any further, they are at least smooth - which is more than I can say for the first gen HD Graphics.

BioShock 2

Intel promised to deliver a 2x improvement in integrated graphics performance with Sandy Bridge. We're getting a bit more than that here in BioShock 2.

World of Warcraft

World of Warcraft is finally playable with Intel's Sandy Bridge graphics. The Radeon HD 5450 is 10% faster here.

HAWX

Sandy Bridge Graphics Performance Summary

This is still a very early look. Neither the drivers nor the hardware are final, but the initial results are very promising. Sandy Bridge puts all current integrated graphics solutions to shame, and even looks to nip at the heels of low end discrete GPUs. For HTPC users, Clarkdale did a good enough job - but for light gaming there wasn't enough horsepower under the hood. With Sandy Bridge you can actually play modern titles, albeit at low quality settings.

If this is the low end of what to expect, I'm not sure we'll need more than integrated graphics for notebooks that aren't aimed at gaming. Update: It looks like all notebook Sandy Bridge parts, at least initially, will use the 12 EU IGP. Our SB sample may also have been a 12 EU part; we're still awaiting confirmation.

Comments

  • overzealot - Saturday, August 28, 2010 - link

    Now, that's a name I've not heard in a long time. A long time.
  • mapesdhs - Saturday, August 28, 2010 - link

    Seems to me Intel is slowly locking up the overclocking scene because it has no
    competition. If so, and Intel continues in that direction, then it would be a great
    chance for AMD to win back overclocking fans with something that just isn't
    locked down in the same way.

    Looking at the performance numbers, I see nothing which suggests a product that
    would beat my current 4GHz i7 860, except for the expensive top-end unlocked
    option which I wouldn't consider anyway given the price.

    Oh well, perhaps my next system will be a 6-core AMD.

    Ian.
  • LuckyKnight - Saturday, August 28, 2010 - link

    Do we have something more precise about the release date? Q1 is what - Jan/Feb/March/April?

    Looking to upgrade a Core 2 Duo at the moment - not sure whether to wait.
  • mino - Saturday, August 28, 2010 - link

    Q1 (in this case) means trickle amounts in Jan/Feb, mainstream availability in Mar/April, and worth-buying mature mobos in the May/June timeframe.
  • tatertot - Saturday, August 28, 2010 - link

    Intel has already announced that shipments for revenue will occur in Q4 of this year. So, January launch.

    They've also commented that Sandy Bridge OEM demand is very strong, and they are adjusting the 32nm ramp up to increase supply. So January should be a decent launch.

    Not surprising - these parts have been in silicon since LAST summer.
  • chrsjav - Saturday, August 28, 2010 - link

    Do modern clock generators use a quartz resonator? How would that be put on-die?
  • iwodo - Saturday, August 28, 2010 - link

    Since you didn't get this chip directly from Intel, I suspect there were no review guidelines for you to follow - which tests to run, which not to run, etc.

    Therefore those game benchmarks were not the result of special optimizations in the drivers. Which is great, because drivers matter much more than hardware in GPUs. If these are only early indications of what Intel's new GPU can do, I expect there's more to extract from the drivers.

    You mention a 2-core GPU (12 EU) versus a 1-core GPU (6 EU). Any guess as to what "EU" stands for? It also seems like an SLI-like tech rather than actually having more EUs in one chip. The difference being that SLI or CrossFire doesn't get any advantage unless drivers and games work together, which greatly reduces the chances of it running at full performance.

    It also seems everyone fails to realize that one of the greatest performance gains will come from AVX (see the sketch after this comment). AVX will be like MMX was when we had the Pentium. I can't think of any other SSE extension as important to performance as AVX. Once software is specifically optimized for AVX we should get another major lift in performance.

    I've also heard rumors that 64-bit will work much better on Sandy Bridge, but I don't know if there's anything we could use to test this.

    The OpenCL situation sounds like an Intel management decision rather than a technical one. Maybe Intel will provide, or work with Apple to provide, OpenCL on these GPUs?

    You also mention that Intel somehow supports PCI Express 2.0 with 1.0 performance. I don't get that bit. Could you elaborate? 2.5GT/s for the G45 chipset??

    If Intel ever decides to finally work on their drivers, then their GPUs will be great for entry-level use.

    Is dual-channel DDR3-1333 enough for a quad-core CPU + GPU? Or even a dual-core CPU?
    Is the GPU memory bandwidth limited?

    Any update on the hardware decoder? And what about the transcoding part?

    Would there be a way to lock the GPU to run at its turbo clock all the time? Or does the GPU get higher priority in turbo, etc.?

    How big is the die?

    P.S. - (Any news on the Intel G3 SSD? I'm getting worried that the next-gen SandForce is too good for Intel.)
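
To illustrate the AVX point in the comment above: AVX widens the SIMD registers from 128 to 256 bits, so a single instruction can operate on eight single-precision floats instead of four. Below is a minimal, hypothetical C sketch using compiler intrinsics - not code from the article or from Intel:

```c
/* Minimal AVX illustration: one 256-bit instruction adds eight floats.
   Hypothetical example; compile with e.g. gcc -mavx avx_demo.c */
#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float c[8];

    __m256 va = _mm256_loadu_ps(a);    /* load 8 floats (unaligned) */
    __m256 vb = _mm256_loadu_ps(b);
    __m256 vc = _mm256_add_ps(va, vb); /* 8 additions in one instruction */
    _mm256_storeu_ps(c, vc);

    for (int i = 0; i < 8; i++)
        printf("%.0f ", c[i]);         /* prints: 9 9 9 9 9 9 9 9 */
    printf("\n");
    return 0;
}
```

The same work takes two 128-bit SSE adds, which is where the potential uplift from AVX-optimized software comes from.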
  • ssj4Gogeta - Saturday, August 28, 2010 - link

    I believe EU means execution units.
  • DanNeely - Sunday, August 29, 2010 - link

    "You also mention that Intel somehow support PCI -Express 2.0 with 1.0 performance. I dont get that bit there. Could you elaborate? 2.5GT/s for G45 Chipset??"

    PCIe 2.0 included other low-level protocol improvements in addition to the doubled clock speed. Intel only implemented the former, probably because the latter would have strangled the DMI bus.

    "Are Dual Channel DDR3 1333 enough for Quad Core CPU + GPU? or even Dual core CPU."

    Probably. The performance gain vs. the previous generation isn't that large, and it was enough for anything except pathological test cases (e.g. memory benchmarks). If it weren't, there'd be nothing stopping Intel from officially supporting DDR3-1600 in their locked chipsets to give a bit of extra bandwidth.
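
For context on the memory bandwidth question, the theoretical peak is easy to work out. A back-of-the-envelope sketch using textbook DDR3 numbers, not a measurement:

```c
/* Rough peak-bandwidth arithmetic for dual-channel DDR3-1333.
   Back-of-the-envelope sketch, not a measurement. */
#include <stdio.h>

int main(void)
{
    const double transfers_per_sec  = 1333e6; /* DDR3-1333: ~1333 MT/s */
    const double bytes_per_transfer = 8.0;    /* 64-bit channel = 8 bytes */
    const int    channels           = 2;      /* dual channel */

    double peak_gb_s = transfers_per_sec * bytes_per_transfer * channels / 1e9;
    printf("Theoretical peak: %.1f GB/s\n", peak_gb_s); /* ~21.3 GB/s */
    return 0;
}
```

Roughly 21.3GB/s at peak, shared between the CPU cores and the on-die GPU.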
  • chizow - Saturday, August 28, 2010 - link

    @Anand

    Could you clarify and expand on this comment, please? Is this true for all Intel chipsets that claim support for PCIe 2.0?

    [q]The other major (and welcome) change is the move to PCIe 2.0 lanes running at 5GT/s. Currently, Intel chipsets support PCIe 2.0 but they only run at 2.5GT/s, which limits them to a maximum of 250MB/s per direction per lane. This is a problem with high bandwidth USB 3.0 and 6Gbps SATA interfaces connected over PCIe x1 slots. With the move to 5GT/s, Intel is at feature parity with AMD’s chipsets and more importantly the bandwidth limits are a lot higher. A single PCIe x1 slot on a P67 motherboard can support up to 500MB/s of bandwidth in each direction (1GB/s bidirectional bandwidth).[/q]

    If this is true, current Intel chipsets don't actually support PCIe 2.0 speeds, since 2.5GT/s at 250MB/s is the same effective bandwidth as PCIe 1.1. How did you come across this information? I was looking for ways to measure PCIe bandwidth but only found obscure proprietary tools that aren't publicly available.

    If Intel chipsets are only running at PCIe 1.1 speeds regardless of what they claim externally, that would explain some of the complaints/concerns about bandwidth on older Intel chipsets.
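
The PCIe 1.1 equivalence chizow describes follows directly from the 8b/10b encoding used by both PCIe 1.x and 2.0: ten bits on the wire carry eight bits of data, so usable bandwidth per lane is the signaling rate divided by ten. A minimal sketch of the arithmetic:

```c
/* Per-lane PCIe bandwidth under 8b/10b encoding (PCIe 1.x and 2.0).
   Sketch of the arithmetic only. */
#include <stdio.h>

static double lane_mb_per_s(double gt_per_s)
{
    /* 10 wire bits carry 1 data byte, so bytes/s = rate / 10 */
    return gt_per_s * 1e9 / 10.0 / 1e6;
}

int main(void)
{
    printf("2.5 GT/s: %.0f MB/s per direction\n", lane_mb_per_s(2.5)); /* 250 */
    printf("5.0 GT/s: %.0f MB/s per direction\n", lane_mb_per_s(5.0)); /* 500 */
    return 0;
}
```

At 2.5GT/s a lane tops out at 250MB/s per direction - exactly the PCIe 1.1 figure - and only the move to 5GT/s signaling doubles that to 500MB/s.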
