Intel HD Graphics 2000/3000 Performance

I dusted off two low-end graphics cards for this comparison: a Radeon HD 5450 and a Radeon HD 5570. The 5450 is a DX11 part with 80 SPs and a 64-bit memory bus. The SPs run at 650MHz and the DDR3 memory interface has a 1600MHz data rate. That’s more compute power than the Intel HD Graphics 3000 but less memory bandwidth than a Sandy Bridge if you assume the CPU cores aren’t consuming more than half of the available memory bandwidth. The 5450 will set you back $45 at Newegg and is passively cooled.

The Radeon HD 5570 is a more formidable opponent. Priced at a whopping $70, this GPU comes with 400 SPs and a 128-bit memory bus. The core clock remains at 650MHz and the DDR3 memory interface has an 1800MHz data rate. This is more memory bandwidth and much more compute than the HD Graphics 3000 can offer.
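The bandwidth comparisons above fall straight out of bus width times data rate. A quick sketch of the arithmetic (the Sandy Bridge figure assumes a dual-channel DDR3-1333 configuration, which is my assumption, not a number stated above):

```python
def bandwidth_gbps(bus_width_bits, data_rate_mts):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * data_rate_mts / 1000

radeon_5450 = bandwidth_gbps(64, 1600)    # 12.8 GB/s
radeon_5570 = bandwidth_gbps(128, 1800)   # 28.8 GB/s
sandy_bridge = bandwidth_gbps(128, 1333)  # ~21.3 GB/s, shared with the CPU cores
```

If the CPU cores eat roughly half of Sandy Bridge's ~21.3 GB/s, the GPU's share drops below the 5450's dedicated 12.8 GB/s, which is the tradeoff described above.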

Based on what we saw in our preview I’d expect performance similar to the Radeon HD 5450 and significantly lower than the Radeon HD 5570. Both of these cards were paired with a Core i5-2500K to remove any potential CPU bottlenecks.

On the integrated side we have a few representatives. AMD’s 890GX is still the cream of the crop for AMD integrated graphics, and it will stay that way for at least a few more months. I paired it with a six-core Phenom II X6 1100T to keep the CPU from impacting things.


Representing Clarkdale I have a Core i5-661 and 660. Both chips run at 3.33GHz but the 661 has a 900MHz GPU while the 660 runs at 733MHz. These are the fastest representatives with last year’s Intel HD Graphics, but given the margin of improvement I didn’t feel the need to show anything slower.

And finally from Sandy Bridge we have three chips: the Core i7-2600K and Core i5-2500K, both with Intel HD Graphics 3000 (but different maximum graphics turbo frequencies), and the Core i3-2100 with HD Graphics 2000.

Nearly all of our test titles were run at the lowest quality settings available in game at 1024x768. We ran with the latest drivers available as of 12/30/2010. Note that all of the screenshots used below were taken on Intel's HD Graphics 3000. For a comparison of IQ between it and the Radeon HD 5450 I've zipped up originals of all of the images here.

Dragon Age: Origins

DAO has been a staple of our integrated graphics benchmark suite for some time now. The third-person RPG is well threaded and is influenced by both CPU and GPU performance.

We ran at 1024x768 with graphics and texture quality both set to low. Our benchmark is a FRAPS runthrough of our character through a castle.

Dragon Age: Origins

The new Intel HD Graphics 2000 is roughly the same performance level as the highest clock speed HD Graphics offered with Clarkdale. The Core i3-2100 and Core i5-661 deliver about the same level of performance here. Both are faster than AMD’s 890GX and all three of them are definitely playable in this test.

HD Graphics 3000 is a huge step forward. At 71.5 fps it’s 70% faster than Clarkdale’s integrated graphics, and fast enough that you can actually crank up some quality settings if you’d like. The higher end HD Graphics 3000 is also 26% faster than a Radeon HD 5450.

What Sandy Bridge integrated graphics can’t touch, however, is the Radeon HD 5570. At 112.5 fps, the 5570’s compute power gives it a 57% advantage over Intel’s HD Graphics 3000.
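The percentages quoted above come straight from the frame rates; a quick sketch of the arithmetic (the implied Clarkdale figure is back-calculated from the stated 70%, not a measured value):

```python
def pct_faster(a, b):
    """How much faster, in percent, result a is than result b."""
    return (a / b - 1) * 100

hd3000 = 71.5   # fps, HD Graphics 3000 result above
hd5570 = 112.5  # fps, Radeon HD 5570 result above

print(round(pct_faster(hd5570, hd3000)))  # 57 -> the 5570's advantage
print(round(hd3000 / 1.70, 1))            # ~42 fps implied for Clarkdale's IGP
```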

Dawn of War II

Dawn of War II is an RTS title that ships with a built-in performance test. I ran at the lowest quality settings at 1024x768.

Dawn of War II

Here the Core i7-2600K and 2500K fall behind the Radeon HD 5450. The 5450 manages a 25% lead over the HD Graphics 3000 on the 2600K. It's interesting to note the tangible performance difference enabled by the higher max graphics turbo frequency of the 2600K (1350MHz vs. 1100MHz). It would appear that Dawn of War II is largely compute bound on these low-end GPUs.
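If Dawn of War II really is compute bound on these GPUs, performance should scale roughly with the graphics turbo clock, and the clock delta alone predicts a gap in the right ballpark (a back-of-the-envelope sketch; real scaling will be sub-linear if either chip can't sustain its maximum turbo):

```python
max_turbo_2600k = 1350  # MHz, max graphics turbo
max_turbo_2500k = 1100  # MHz, max graphics turbo

clock_advantage = (max_turbo_2600k / max_turbo_2500k - 1) * 100
print(f"{clock_advantage:.1f}% higher graphics clock on the 2600K")  # ~22.7%
```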

Compared to last year's Intel HD Graphics, the performance improvement is huge. Even the HD Graphics 2000 is almost 30% faster than the fastest GPU Intel offered with Clarkdale. While I wouldn't view Clarkdale's graphics as genuinely useful for gaming, at the performance levels we're talking about now game developers should at least be paying attention to Intel's integrated graphics.

Call of Duty: Modern Warfare 2

Our Modern Warfare 2 benchmark is a quick FRAPS run through a multiplayer map. All settings were turned down/off and we ran at 1024x768.

Call of Duty: Modern Warfare 2

The Intel HD Graphics 3000 equipped chips are able to outpace the Radeon HD 5450 by at least 5%. The 2000 model isn't able to do as well, losing out to even the 890GX. On the notebook side this won't be an issue, but for desktops with integrated graphics it's a problem, as most desktop parts will ship with the lower end GPU.

The performance improvement over last year's Clarkdale IGP is at least 30%, and more if you compare to the more mainstream Clarkdale SKUs.

BioShock 2

Our test is a quick FRAPS runthrough in the first level of BioShock 2. All image quality settings are set to low, resolution is at 1024x768.

BioShock 2

Once again the HD Graphics 3000 GPUs are faster than the Radeon HD 5450; it's the 2000 model that's slower. In this case the Core i3-2100 is actually slightly slower than last year's Core i5-661.

World of Warcraft

Our WoW test is run at fair quality settings (with weather turned down all the way) on a lightly populated server in an area where no other players are present to produce repeatable results. We ran at 1024x768.

World of Warcraft

The high-end HD Graphics 3000 SKUs do very well vs. the Radeon HD 5450 once again. We're at more than playable frame rates in WoW with all of the Sandy Bridge parts, although the two K-series SKUs are obviously a bit smoother.

HAWX

Our HAWX performance tests were run with the game's built-in benchmark in DX10 mode. All detail settings were turned down/off and we ran at 1024x768.

HAWX—DX10

The Radeon HD 5570 continues to be completely untouchable. While Sandy Bridge can compete in the ~$40-$50 GPU space, anything above that is completely out of its reach. That isn't too bad considering Intel spent all of 114M transistors on the SNB GPU, but I do wonder if Intel will be able to push up any higher in the product stack in future GPUs.

Once again the HD Graphics 2000 GPU is a bit too slow for my tastes, just barely edging out the fastest Clarkdale GPU.

Starcraft II

We have two Starcraft II benchmarks: a GPU and a CPU test. The GPU test is mostly a navigate-around-the-map test, as scrolling and panning around tends to be the most GPU bound in the game. Our CPU test involves a massive battle of six armies in the center of the map, stressing the CPU more than the GPU. At these low quality settings, however, both benchmarks are influenced by CPU and GPU.

Starcraft II—AT GPU Test

Starcraft II is really a strong point of Sandy Bridge's graphics. It's more than fast enough to run one of the most popular PC games out today. You can easily crank up quality settings or resolution without turning the game into a slideshow. Of course, low quality SC2 looks pretty weak compared to medium quality, but it's better than nothing.

Our CPU test actually ends up being GPU bound with Intel's integrated graphics; AMD's 890GX is faster here:

Starcraft II—AT CPU Test

Call of Duty: Black Ops

Call of Duty: Black Ops is basically unplayable on Sandy Bridge integrated graphics. I'm guessing this is not a compute bound scenario but rather an optimization problem for Intel. You'll notice there's hardly any difference between the performance of the 2000 and 3000 GPUs, indicating a bottleneck elsewhere. It could be memory bandwidth. Despite the game's near-30fps frame rate, there's way too much stuttering and jerkiness during the game to make it enjoyable.

Call of Duty: Black Ops

Mafia II

Mafia II ships with a built-in benchmark, which we used for our comparison.

Mafia II

Frame rates are pretty low here, definitely not what I'd consider playable. That's true across the board though; you need to spend at least $70 on a GPU to get a playable experience here.

Civilization V

For our Civilization V test we're using the game's built-in lateGameView benchmark. The test was run in DX9 mode with everything turned down at 1024x768:

Civilization V—DX9

Performance here is pretty low. Even a Radeon HD 5450 isn't enough to get you smooth frame rates; a discrete GPU is just necessary for some games. Civ V does have the advantage of not depending on high frame rates, though; the mouse input is decoupled from rendering, so you can generally interact with the game even at low frame rates.

Metro 2033

We're using the Metro 2033 benchmark that comes with the patched game. Occasionally I noticed rendering issues at the Metro 2033 menu screen, but I couldn't reproduce the problem consistently on Intel's HD Graphics.

Metro 2033

Metro 2033 and many newer titles are just not playable at smooth frame rates on anything this low-end. Intel integrated graphics as well as low-end discrete GPUs are best paired with older games.

DiRT 2

Our DiRT 2 performance numbers come from the demo's built-in benchmark:

DiRT 2

DiRT 2 is another game that needs compute power, and the faster 2600K gets a decent boost from the higher clock speed. Frame rates are relatively consistent as well, though you'll get dips into the low 20s and teens at times, so at these settings the game is borderline playable. (Drop to Ultra Low if you need higher performance.)


  • Taft12 - Tuesday, January 4, 2011 - link

    You first.
  • ReaM - Tuesday, January 4, 2011 - link

    the six core 980x still owns them in all tests where all cores are used.

    I don't know, 22k in Cinebench is really not a reason to buy the new i7; I reach 24k on air with an i7 860 and my i5 runs at 20k on air.

    Short term performance is real good, but I don't care if I wait 7 seconds or 8 for a package to unpack; for long term work like rendering, there isn't a reason to upgrade either.

    I recommend you get the older 1156 off ebay and save a ton of money.

    I have the i5 on hackintosh, I am wondering if 1155 will be hackintoshable
  • Spivonious - Tuesday, January 4, 2011 - link

    I have to disagree with Anand; I feel the QuickSync image is the best of the four in all cases. Yes, there is some edge-softening going on, so you lose some of the finer detail that ATi and SNB give you, but when viewing on a small screen such as one on an iPhone/iPod, I'd rather have the smoothed-out shapes than pixel-perfect detail.
  • wutsurstyle - Tuesday, January 4, 2011 - link

    I started my computing days with Intel but I'm so put off by the way Intel is marketing their new toys. Get this but you can't have that...buy that, but your purchase must include other things. And even after I throw my wallet to Intel, I still would not have a OC'd Sandy Bridge with useful IGP and Quicksync. But wait, throw more money on a Z68 a little later. Oh...and there's a shiny new LGA2011 in the works. Anyone worried that they started naming sockets after the year it comes out? Yay for spending!

    AMD..please save us!
  • MrCrispy - Tuesday, January 4, 2011 - link

    Why the bloody hell don't the K parts support VT-d ?! I can only imagine it will be introduced at a price premium in a later part.
  • slick121 - Tuesday, January 4, 2011 - link

    Wow I just realized this. I really hate this type of market segmentation.
  • Navier - Tuesday, January 4, 2011 - link

    I'm a little confused why Quick Sync needs to have a monitor connected to the MB to work. I'm trying to understand why having a monitor connected is so important for video transcoding, vs. playback etc.

    Is this a software limitation? Either in the UEFI (BIOS) or drivers? Or something more systemic in the hardware.

    What happens on a P67 motherboard? Does the P67 board disable the on die GPU? Effectively disabling Quick Sync support? This seems a very unfortunate oversight for such a promising feature. Will a future driver/firmware update resolve this limitation?

    Thanks
  • NUSNA_moebius - Tuesday, January 4, 2011 - link

    Intel HD 3000 - ~115 Million transistors
    AMD Radeon HD 3450 - 181 Million transistors - 8 SIMDs
    AMD Radeon HD 4550 - 242 Million transistors - 16 SIMDs
    AMD Radeon HD 5450 - 292 Million transistors - 16 SIMDs
    AMD Xenos (Xbox 360 GPU) - 232 Million transistors + 105 Million (eDRAM daughter die) = 337 Million transistors - 48 SIMDs

    Xenos I think in the end is still a good two, two and a half times more powerful than the Radeon 5450. Xenos does not have to be OpenCL, Direct Compute, DX11 nor fully DX10 compliant (a 50 million jump from the 4550 going from DX10.1 to 11), nor contains hardware video decode, integrated HDMI output with 5.1 audio controller (even the old Radeon 3200 clocks in at 150 million + transistors). What I would like some clarification on is if the transistor count for the Xenos includes Northbridge functions..............

    Clearly PC GPUs have insane transistor counts in order to be highly compatible. It is commendable how well the Intel HD 3000 does with only 115 Million, but it's important to note that older products like the X1900 had 384 Million transistors, back when DX9.0c was the aim and in pure throughput, it should match or closely trail Xenos at 500 MHz. Going from the 3450 to 4550 GPUs, we go up another 60 million for 8 more SIMDs of a similar DX10.1 compatible nature, as well as the probable increases for hardware video decode, etc. So basically, to come into similar order as the Xenos in terms of SIMD counts (of which Xenos is 48 of its own type I must emphasize), we would need 60 million transistors per 8 SIMDs, which would put us at about 360 million transistors for a 48 SIMD (240 SP) AMD part that is DX 10.1 compatible and not equipped with anything unrelated to graphics processing.

    Yes, it's a most basic comparison (and probably fundamentally wrong in some regards), but I think it sheds some light on the idea that the Radeon HD 5450 really still pales in comparison to the Xenos. We have much better GPUs like Redwood that are twice as powerful with their higher clock speeds + 400 SPs (627 Million transistors total) and consume less energy than Xenos ever did. Of course, this isn't taking memory bandwidth or framebuffer size into account, nor the added benefits of console optimization.
  • frankanderson - Tuesday, January 4, 2011 - link

    I'm still rocking my Q6600 + Gigabyte X38 DS5 board, upgraded to a GTX580 and been waiting for Sandy, definitely looking forward to this once the dust settles..

    Thanks Anand...
  • Spivonious - Wednesday, January 5, 2011 - link

    I'm still on E6600 + P965 board. Honestly, I would upgrade my video card (HD3850) before doing a complete system upgrade, even with Sandy Bridge being so much faster than my old Conroe. I have yet to run a game that wasn't playable at full detail. Maybe my standards are just lower than others.
