Intel HD Graphics 2000/3000 Performance

I dusted off two low-end graphics cards for this comparison: a Radeon HD 5450 and a Radeon HD 5570. The 5450 is a DX11 part with 80 SPs and a 64-bit memory bus. The SPs run at 650MHz and the DDR3 memory interface has a 1600MHz data rate. That's more compute power than the Intel HD Graphics 3000 has, but less memory bandwidth than Sandy Bridge's GPU can call on, assuming the CPU cores aren't consuming more than half of the chip's available memory bandwidth. The 5450 will set you back $45 at Newegg and is passively cooled.

The Radeon HD 5570 is a more formidable opponent. Priced at a whopping $70, this GPU comes with 400 SPs and a 128-bit memory bus. The core clock remains at 650MHz and the DDR3 memory interface has an 1800MHz data rate. This is more memory bandwidth and much more compute than the HD Graphics 3000 can offer.
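For reference, here's a quick back-of-the-envelope sketch of those bandwidth figures in Python. The dual-channel DDR3-1333 system memory for Sandy Bridge is my assumption, not a figure from the spec sheets above:

```python
# Peak memory bandwidth in GB/s: bytes per transfer x transfers per second.
def bandwidth_gbps(bus_width_bits: int, data_rate_mtps: int) -> float:
    return bus_width_bits / 8 * data_rate_mtps / 1000

print(bandwidth_gbps(64, 1600))    # Radeon HD 5450: 12.8 GB/s
print(bandwidth_gbps(128, 1800))   # Radeon HD 5570: 28.8 GB/s

# Sandy Bridge's GPU shares a dual-channel (128-bit) DDR3 bus with the
# CPU cores; DDR3-1333 is assumed here.
snb_total = bandwidth_gbps(128, 1333)        # ~21.3 GB/s for the whole chip
# The 5450 only wins on bandwidth if the CPU cores are eating more than
# ~8.5 GB/s of that shared total.
print(snb_total - bandwidth_gbps(64, 1600))  # ~8.5 GB/s
```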

Based on what we saw in our preview, I'd expect Sandy Bridge's performance to be similar to the Radeon HD 5450's and significantly lower than the Radeon HD 5570's. Both discrete cards were paired with a Core i5-2500K to remove any potential CPU bottlenecks.

On the integrated side we have a few representatives. AMD's 890GX is still the cream of the crop for AMD integrated graphics, and will remain so for at least a few more months. I paired it with a six-core Phenom II X6 1100T to keep the CPU from holding the GPU back.

Representing Clarkdale I have a Core i5-661 and a Core i5-660. Both chips run at 3.33GHz, but the 661's GPU is clocked at 900MHz while the 660's runs at 733MHz. These are the fastest parts carrying last year's Intel HD Graphics; given the margin of improvement, I didn't feel the need to include anything slower.

And finally from Sandy Bridge we have three chips: the Core i7-2600K and Core i5-2500K, both with Intel HD Graphics 3000 (but different max graphics turbo frequencies), and the Core i3-2100 with HD Graphics 2000.

Nearly all of our test titles were run at the lowest quality settings available in-game at 1024x768, with the latest drivers available as of 12/30/2010. Note that all of the screenshots below were taken on Intel's HD Graphics 3000. For a comparison of image quality between it and the Radeon HD 5450, I've zipped up the originals of all of the images here.

Dragon Age: Origins

DAO has been a staple of our integrated graphics benchmark suite for some time now. The third/first-person RPG is well threaded and is influenced by both CPU and GPU performance.

We ran at 1024x768 with graphics and texture quality both set to low. Our benchmark is a FRAPS run of our character walking through a castle.
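The averages we report from these FRAPS runs are simply frames over elapsed time. Here's a minimal sketch of that reduction, assuming a per-frame log with one cumulative millisecond timestamp per row (an illustrative layout, not FRAPS' exact CSV format):

```python
import csv

def average_fps(path: str) -> float:
    # Expects one row per frame with a cumulative timestamp in ms as the
    # last column (illustrative; FRAPS' real frametimes log differs in
    # its details).
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                                 # skip the header row
        times_ms = [float(row[-1]) for row in reader]
    elapsed = (times_ms[-1] - times_ms[0]) / 1000.0  # run length in seconds
    return (len(times_ms) - 1) / elapsed             # frames per second
```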

Dragon Age: Origins

The new Intel HD Graphics 2000 performs at roughly the same level as the highest-clocked HD Graphics offered with Clarkdale: the Core i3-2100 and Core i5-661 deliver about the same performance here. Both are faster than AMD's 890GX, and all three are definitely playable in this test.

HD Graphics 3000 is a huge step forward. At 71.5 fps it's 70% faster than Clarkdale's integrated graphics, and fast enough that you can actually crank up some quality settings if you'd like. The higher-end HD Graphics 3000 is also 26% faster than the Radeon HD 5450.

What Sandy Bridge's integrated graphics can't touch, however, is the Radeon HD 5570. At 112.5 fps, the 5570's compute power gives it a 57% advantage over Intel's HD Graphics 3000.
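Working backwards from the quoted percentages gives the implied averages for the other parts. This is just a sanity check on the numbers above, not new data:

```python
snb_hd3000 = 71.5     # measured average, fps
radeon_5570 = 112.5   # measured average, fps

print(radeon_5570 / snb_hd3000 - 1)  # ~0.57 -> the 5570's 57% lead
print(snb_hd3000 / 1.70)             # ~42 fps implied for Clarkdale's IGP
print(snb_hd3000 / 1.26)             # ~57 fps implied for the Radeon HD 5450
```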

Dawn of War II

Dawn of War II is an RTS title that ships with a built-in performance test. I ran it at the lowest quality settings at 1024x768.

Dawn of War II

Here the Core i7-2600K and 2500K fall behind the Radeon HD 5450: the 5450 manages a 25% lead over the HD Graphics 3000 in the 2600K. It's interesting to note the tangible performance difference enabled by the 2600K's higher max graphics turbo frequency (1350MHz vs. 1100MHz). It would appear that Dawn of War II is largely compute bound on these low-end GPUs.
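If the title really is compute bound, performance should track GPU clock almost linearly, and the turbo gap alone puts an upper bound of roughly 23% on the clock-driven difference:

```python
i7_2600k_turbo = 1350  # MHz, max graphics turbo
i5_2500k_turbo = 1100  # MHz, max graphics turbo

# Upper bound on the clock-driven gap for a purely compute-bound workload;
# real-world scaling will land somewhat below this.
print(i7_2600k_turbo / i5_2500k_turbo - 1)  # ~0.227 -> up to ~23%
```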

Compared to last year's Intel HD Graphics, the performance improvement is huge: even the HD Graphics 2000 is almost 30% faster than the fastest GPU Intel offered with Clarkdale. While I wouldn't call Clarkdale's graphics genuinely useful for gaming, at the performance levels we're talking about now game developers should at least be paying attention to Intel's integrated graphics.

Call of Duty: Modern Warfare 2

Our Modern Warfare 2 benchmark is a quick FRAPS run through a multiplayer map. All settings were turned down/off and we ran at 1024x768.

Call of Duty: Modern Warfare 2

The Intel HD Graphics 3000-enabled chips are able to outpace the Radeon HD 5450 by at least 5%. The 2000 model isn't able to do as well, losing out even to the 890GX. On the notebook side this won't be an issue, but for desktops with integrated graphics it is a problem, as most SKUs will ship with the lower-end GPU.

The performance improvement over last year's Clarkdale IGP is at least 30%, and more if you compare to the more mainstream Clarkdale SKUs.

BioShock 2

Our test is a quick FRAPS runthrough in the first level of BioShock 2. All image quality settings are set to low and the resolution is 1024x768.

BioShock 2

Once again the HD Graphics 3000 GPUs are faster than the Radeon HD 5450; it's the 2000 model that's slower. In this case the Core i3-2100 is actually slightly slower than last year's Core i5-661.

World of Warcraft

Our WoW test is run at fair quality settings (with weather turned all the way down) on a lightly populated server, in an area where no other players are present, to produce repeatable results. We ran at 1024x768.

World of Warcraft

The high-end HD Graphics 3000 SKUs once again do very well against the Radeon HD 5450. We're at more than playable frame rates in WoW with all of the Sandy Bridge parts, although the two K-series SKUs are obviously a bit smoother.

HAWX

Our HAWX performance tests were run with the game's built-in benchmark in DX10 mode. All detail settings were turned down/off and we ran at 1024x768.

HAWX—DX10

The Radeon HD 5570 continues to be completely untouchable. While Sandy Bridge can compete in the ~$40-$50 GPU space, anything above that is out of its reach. That isn't too bad considering Intel spent all of 114M transistors on the SNB GPU, but I do wonder whether Intel will be able to push higher up the product stack with future GPUs.

Once again the HD Graphics 2000 GPU is a bit too slow for my tastes, just barely edging out the fastest Clarkdale GPU.

Starcraft II

We have two Starcraft II benchmarks: a GPU test and a CPU test. The GPU test is mostly a navigate-around-the-map exercise, as scrolling and panning tend to be the most GPU-bound activities in the game. Our CPU test involves a massive battle between six armies in the center of the map, which stresses the CPU more than the GPU. At these low quality settings, however, both benchmarks are influenced by CPU and GPU performance.

Starcraft II—AT GPU Test

Starcraft II is a real strong point for Sandy Bridge's graphics. It's more than fast enough to run one of the most popular PC games out today, and you can easily crank up quality settings or resolution without turning the game into a slideshow. Of course, low-quality SC2 looks pretty weak compared to medium quality, but it's better than nothing.

Our CPU test actually ends up being GPU bound with Intel's integrated graphics; AMD's 890GX is faster here:

Starcraft II—AT CPU Test

Call of Duty: Black Ops

Call of Duty: Black Ops is basically unplayable on Sandy Bridge integrated graphics. I'm guessing this is not a compute-bound scenario but rather an optimization problem for Intel. You'll notice there's hardly any difference between the performance of the 2000 and 3000 GPUs, indicating a bottleneck elsewhere; it could be memory bandwidth. Despite the game's near-30fps average frame rate, there's far too much stuttering and jerkiness during gameplay to make it enjoyable.
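The reasoning is straightforward: HD Graphics 3000 has twice the execution units of HD Graphics 2000 (12 vs. 6), so a compute-bound title should scale toward 2x between them, while a nearly flat result points at a shared resource instead. A sketch of that sanity check, with the measured frame rates left as inputs rather than hard-coded:

```python
hd2000_eus, hd3000_eus = 6, 12

ideal_scaling = hd3000_eus / hd2000_eus   # 2.0x if purely EU-limited

def observed_scaling(fps_hd3000: float, fps_hd2000: float) -> float:
    # Plug in the two measured averages from the chart. In Black Ops the
    # ratio comes out close to 1.0x, far below the 2.0x EU ceiling, which
    # is why a shared bottleneck like memory bandwidth is the likely culprit.
    return fps_hd3000 / fps_hd2000
```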

Call of Duty: Black Ops

Mafia II

Mafia II ships with a built-in benchmark, which we used for our comparison.

Mafia II

Frame rates are pretty low here, definitely not what I'd consider playable. That's true across the board, though; you need to spend at least $70 on a GPU to get a playable experience here.

Civilization V

For our Civilization V test we're using the game's built-in lateGameView benchmark. The test was run in DX9 mode with everything turned down, at 1024x768:

Civilization V—DX9

Performance here is pretty low. Even a Radeon HD 5450 isn't enough to get you smooth frame rates; for some games a discrete GPU is simply necessary. Civ V does have the advantage of not depending on high frame rates, though: mouse input is decoupled from rendering, so you can generally interact with the game even when frame rates are low.
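A toy illustration of that decoupling (not Civ V's actual implementation; the 120Hz polling rate and the stand-in draw/input calls are all hypothetical):

```python
import queue
import threading
import time

events: queue.Queue = queue.Queue()

def input_thread() -> None:
    # Poll for input on its own fast cadence, independent of frame rate.
    while True:
        events.put("click")   # stand-in for a real OS input poll
        time.sleep(1 / 120)   # ~120Hz polling

def render_loop(frames: int) -> None:
    # A deliberately slow renderer (~10fps); input still drains every frame,
    # so the UI keeps responding even when rendering crawls.
    for _ in range(frames):
        while not events.empty():
            events.get()      # handle queued input immediately
        time.sleep(0.1)       # stand-in for a 100ms draw call

threading.Thread(target=input_thread, daemon=True).start()
render_loop(frames=30)
```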

Metro 2033

We're using the Metro 2033 benchmark that comes with the patched game. Occasionally I noticed rendering issues at the Metro 2033 menu screen, but I couldn't reproduce the problem consistently on Intel's HD Graphics.

Metro 2033

Metro 2033 and many newer titles are just not playable at smooth frame rates on anything this low-end. Intel integrated graphics as well as low-end discrete GPUs are best paired with older games.

DiRT 2

Our DiRT 2 performance numbers come from the demo's built-in benchmark:

DiRT 2

DiRT 2 is another game that needs compute power, and the faster 2600K gets a decent boost from its higher graphics turbo clock. Frame rates are relatively consistent as well, though you'll see dips into the low 20s and teens at times, so at these settings the game is borderline playable. (Drop to Ultra Low if you need higher performance.)
