Intel HD Graphics 2000/3000 Performance

I dusted off two low-end graphics cards for this comparison: a Radeon HD 5450 and a Radeon HD 5570. The 5450 is a DX11 part with 80 SPs and a 64-bit memory bus; the SPs run at 650MHz and the DDR3 memory interface has a 1600MHz data rate. That’s more compute power than the Intel HD Graphics 3000, but less memory bandwidth than a Sandy Bridge system has available, assuming the CPU cores aren’t consuming more than half of it. The 5450 will set you back $45 at Newegg and is passively cooled.

The Radeon HD 5570 is a more formidable opponent. Priced at a whopping $70, this GPU comes with 400 SPs and a 128-bit memory bus. The core clock remains at 650MHz and the DDR3 memory interface has an 1800MHz data rate. This is more memory bandwidth and much more compute than the HD Graphics 3000 can offer.
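The bandwidth comparison above is easy to verify with back-of-the-envelope math: peak bandwidth is just the data rate times the bus width. A quick sketch (the dual-channel DDR3-1333 figure for the Sandy Bridge platform is my assumption for a typical configuration, not something tested here):

```python
def peak_bandwidth_gbps(data_rate_mts, bus_width_bits):
    """Peak memory bandwidth in GB/s: data rate (MT/s) x bus width (bytes)."""
    return data_rate_mts * (bus_width_bits // 8) / 1000

hd5450 = peak_bandwidth_gbps(1600, 64)    # Radeon HD 5450: 12.8 GB/s
hd5570 = peak_bandwidth_gbps(1800, 128)   # Radeon HD 5570: 28.8 GB/s

# Assumed Sandy Bridge platform config: dual-channel DDR3-1333 (128 bits total)
snb = peak_bandwidth_gbps(1333, 128)      # ~21.3 GB/s, shared with the CPU cores
```

The 5450's 12.8 GB/s sits well below the ~21.3 GB/s a Sandy Bridge system shares between CPU and GPU, which is why the halving assumption in the text matters.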

Based on what we saw in our preview I’d expect performance similar to the Radeon HD 5450 and significantly lower than the Radeon HD 5570. Both of these cards were paired with a Core i5-2500K to remove any potential CPU bottlenecks.

On the integrated side we have a few representatives. AMD’s 890GX will remain the cream of the crop among AMD integrated chipsets for at least a few more months. I paired it with a six-core Phenom II X6 1100T to keep the CPU from impacting things.

Representing Clarkdale I have a Core i5-661 and 660. Both chips run at 3.33GHz but the 661 has a 900MHz GPU while the 660 runs at 733MHz. These are the fastest representatives with last year’s Intel HD Graphics, but given the margin of improvement I didn’t feel the need to show anything slower.

And finally from Sandy Bridge we have three chips: the Core i7-2600K and Core i5-2500K, both with Intel HD Graphics 3000 (but different max turbo frequencies), and the Core i3-2100 with HD Graphics 2000.

Nearly all of our test titles were run at the lowest quality settings available in game at 1024x768. We ran with the latest drivers available as of 12/30/2010. Note that all of the screenshots used below were taken on Intel's HD Graphics 3000. For a comparison of IQ between it and the Radeon HD 5450 I've zipped up originals of all of the images here.

Dragon Age: Origins

DAO has been a staple of our integrated graphics benchmark suite for some time now. The third-person RPG is well threaded and is influenced by both CPU and GPU performance.

We ran at 1024x768 with graphics and texture quality both set to low. Our benchmark is a FRAPS run of our character moving through a castle.

Dragon Age: Origins

The new Intel HD Graphics 2000 is roughly the same performance level as the highest clock speed HD Graphics offered with Clarkdale. The Core i3-2100 and Core i5-661 deliver about the same level of performance here. Both are faster than AMD’s 890GX and all three of them are definitely playable in this test.

HD Graphics 3000 is a huge step forward. At 71.5 fps it’s 70% faster than Clarkdale’s integrated graphics, and fast enough that you can actually crank up some quality settings if you’d like. The higher end HD Graphics 3000 is also 26% faster than a Radeon HD 5450.

What Sandy Bridge integrated graphics can’t touch however is the Radeon HD 5570. At 112.5 fps, the 5570’s compute power gives it a 57% advantage over Intel’s HD Graphics 3000.
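The relative numbers quoted in these results are simple ratios, and it's worth seeing the arithmetic once (the Clarkdale figure below is inferred from the quoted 70% gain, not a separately measured number):

```python
def pct_faster(a_fps, b_fps):
    """How much faster a is than b, in percent."""
    return (a_fps / b_fps - 1) * 100

hd3000 = 71.5    # fps, HD Graphics 3000 in Dragon Age: Origins
hd5570 = 112.5   # fps, Radeon HD 5570

print(round(pct_faster(hd5570, hd3000)))   # the 5570's ~57% advantage
print(round(hd3000 / 1.70, 1))             # ~42 fps implied for Clarkdale's IGP
```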

Dawn of War II

Dawn of War II is an RTS title that ships with a built in performance test. I ran at the lowest quality settings at 1024x768.

Dawn of War II

Here the Core i7-2600K and 2500K fall behind the Radeon HD 5450. The 5450 manages a 25% lead over the HD Graphics 3000 on the 2600K. It's interesting to note the tangible performance difference enabled by the higher max graphics turbo frequency of the 2600K (1350MHz vs. 1100MHz). It would appear that Dawn of War II is largely compute bound on these low-end GPUs.
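A quick sanity check on the turbo explanation: if the workload really is compute bound, the 2600K's clock advantage puts an upper bound on the gap we'd expect between the two HD Graphics 3000 parts. This is a rough model that ignores memory bandwidth entirely:

```python
turbo_2600k = 1350  # MHz, max graphics turbo, Core i7-2600K
turbo_2500k = 1100  # MHz, max graphics turbo, Core i5-2500K

# Perfect clock scaling would cap the 2600K's advantage at about 23%
ceiling = (turbo_2600k / turbo_2500k - 1) * 100
print(f"{ceiling:.0f}%")
```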

Compared to last year's Intel HD Graphics, the performance improvement is huge. Even the HD Graphics 2000 is almost 30% faster than the fastest IGP Intel offered with Clarkdale. While I wouldn't view Clarkdale's graphics as useful for gaming, at the performance levels we're talking about now game developers should at least be paying attention to Intel's integrated graphics.

Call of Duty: Modern Warfare 2

Our Modern Warfare 2 benchmark is a quick FRAPS run through a multiplayer map. All settings were turned down/off and we ran at 1024x768.

Call of Duty: Modern Warfare 2

The Intel HD Graphics 3000 enabled chips are able to outpace the Radeon HD 5450 by at least 5%. The 2000 model isn't able to do as well, losing out to even the 890GX. On the notebook side this won't be an issue, but it is a problem for desktops with integrated graphics, where most chips will carry the lower-end GPU.

The performance improvement over last year's Clarkdale IGP is at least 30%, and more if you compare to the more mainstream Clarkdale SKUs.

BioShock 2

Our test is a quick FRAPS runthrough in the first level of BioShock 2. All image quality settings are set to low, resolution is at 1024x768.

BioShock 2

Once again the HD Graphics 3000 GPUs are faster than the Radeon HD 5450; it's the 2000 model that's slower. In this case the Core i3-2100 is actually slightly slower than last year's Core i5-661.

World of Warcraft

Our WoW test is run at fair quality settings (with weather turned down all the way) on a lightly populated server in an area where no other players are present to produce repeatable results. We ran at 1024x768.

World of Warcraft

The high-end HD Graphics 3000 SKUs do very well vs. the Radeon HD 5450 once again. We're at more than playable frame rates in WoW with all of the Sandy Bridge parts, although the two K-series SKUs are obviously a bit smoother.

HAWX

Our HAWX performance tests were run with the game's built in benchmark in DX10 mode. All detail settings were turned down/off and we ran at 1024x768.

HAWX—DX10

The Radeon HD 5570 continues to be completely untouchable. While Sandy Bridge can compete in the ~$40-$50 GPU space, anything above that is completely out of its reach. That isn't too bad considering Intel spent all of 114M transistors on the SNB GPU, but I do wonder if Intel will be able to push up any higher in the product stack in future GPUs.

Once again the HD Graphics 2000 GPU is a bit too slow for my tastes, just barely edging out the fastest Clarkdale GPU.

Starcraft II

We have two Starcraft II benchmarks: a GPU and a CPU test. The GPU test is mostly a navigate-around-the-map test, as scrolling and panning around tends to be the most GPU bound in the game. Our CPU test involves a massive battle of six armies in the center of the map, stressing the CPU more than the GPU. At these low quality settings, however, both benchmarks are influenced by CPU and GPU.

Starcraft II—AT GPU Test

Starcraft II is really a strong point of Sandy Bridge's graphics. It's more than fast enough to run one of the most popular PC games out today. You can easily crank up quality settings or resolution without turning the game into a slideshow. Of course, low quality SC2 looks pretty weak compared to medium quality, but it's better than nothing.

Our CPU test actually ends up being GPU bound with Intel's integrated graphics; AMD's 890GX is faster here:

Starcraft II—AT CPU Test

Call of Duty: Black Ops

Call of Duty: Black Ops is basically unplayable on Sandy Bridge integrated graphics. I'm guessing this is not a compute bound scenario but rather an optimization problem for Intel: there's hardly any difference between the performance of the 2000 and 3000 GPUs, which points to a bottleneck elsewhere, possibly memory bandwidth. Despite the game's near-30fps frame rate, there's far too much stuttering and jerkiness during the game to make it enjoyable.

Call of Duty: Black Ops

Mafia II

Mafia II ships with a built in benchmark which we used for our comparison.

Mafia II

Frame rates are pretty low here, definitely not what I'd consider playable. That's true across the board though; you need to spend at least $70 on a GPU to get a playable experience in Mafia II.

Civilization V

For our Civilization V test we're using the game's built in lateGameView benchmark. The test was run in DX9 mode with everything turned down at 1024x768:

Civilization V—DX9

Performance here is pretty low. Even a Radeon HD 5450 isn't enough to get you smooth frame rates; a discrete GPU is just necessary for some games. Civ V does have the advantage of not depending on high frame rates, though; the mouse input is decoupled from rendering, so you can generally interact with the game even at low frame rates.

Metro 2033

We're using the Metro 2033 benchmark that comes with the patched game. Occasionally I noticed rendering issues at the Metro 2033 menu screen, but I couldn't reproduce the problem consistently on Intel's HD Graphics.

Metro 2033

Metro 2033 and many newer titles are just not playable at smooth frame rates on anything this low-end. Intel integrated graphics as well as low-end discrete GPUs are best paired with older games.

DiRT 2

Our DiRT 2 performance numbers come from the demo's built-in benchmark:

DiRT 2

DiRT 2 is another game that needs compute power, and the faster 2600K gets a decent boost from the higher clock speed. Frame rates are relatively consistent as well, though you'll get dips into the low 20s and teens at times, so at these settings the game is borderline playable. (Drop to Ultra Low if you need higher performance.)

282 Comments


  • RagingDragon - Monday, January 03, 2011 - link

    Or perhaps CPU overclocking on H67 is not *officially* supported by Intel, but the motherboard makers are supporting it anyway?
  • IanWorthington - Monday, January 03, 2011 - link

    Seems to sum it up. If you want both you have to wait until Q2.

    <face palm>
  • 8steve8 - Monday, January 03, 2011 - link

    so if im someone who wants the best igp, but doesn't want to pay for overclockability, i still have to buy the K cpu... weird.
  • beginner99 - Monday, January 03, 2011 - link

    yep. This is IMHO extremely stupid. Wanted to build a PC for someone that mainly needs CPU power (video editing). An overclocked 2600k would be ideal with QS but either wait another 3 month or go all compromise...in that case H67 probably but still paying for K part and not being able to use it.
    Intel does know how to get the most money from you...
  • Hrel - Monday, January 03, 2011 - link

    haha, yeah that is stupid. You'd think on the CPU's you can overclock "K" they use the lower end GPU or not even use one at all. Makes for an awkward HTPC choice.
  • AkumaX - Monday, January 03, 2011 - link

    omg omg omg wat do i do w/ my i7-875k... (p.s. how is this comment spam?)
  • AssBall - Monday, January 03, 2011 - link

    Maybe because you sound like a 12 year old girl with ADHD.
  • usernamehere - Monday, January 03, 2011 - link

    I'm surprised nobody cares there's no native USB 3.0 support coming from Intel until 2012. It's obvious they are abusing their position as the number 1 chip maker, trying to push Light Peak as a replacement to USB. The truth is, Light Peak needs USB for power, it can never live without it (unless you like to carry around a bunch of AC adapters).
    Intel wants light peak to succeed so badly, they are leaving USB 3.0 (it's competitor) by the wayside. Since Intel sits on the USB board, they have a lot of pull in the industry, and as long as Intel wont support the standard, no manufacturer will ever get behind it 100%. Sounds very anti-competitive to me.
    Considering AMD is coming out with USB 3.0 support in Llano later this year, I've already decided to jump ship and boycott Intel. Not because I'm upset with their lack of support for USB 3.0, but because their anti-competitive practices are inexcusable; holding back the market and innovation so their own proprietary format can get a headstart. I'm done with Intel.
  • IanWorthington - Monday, January 03, 2011 - link

    Not really: the board manufacturers seem to be adding usb3 chipsets w/o real problems. Good enough.
  • usernamehere - Monday, January 03, 2011 - link

    Sure, if you're building a desktop you can find plenty with USB 3.0 support (via NEC). But if you're looking for a laptop, most will still not have it. For the fact that manufacturers don't want to have to pay extra for features, when they usually get features via the chipsets already included. Asus is coming out with a handful of notebooks in 2011 with USB 3.0 (that I know of), but wide-spread adoption will not be here this year.
