Intel’s Gen 6 Graphics

All 2nd generation Core series processors that fit into an LGA-1155 motherboard will have one of two GPUs integrated on-die: Intel’s HD Graphics 3000 or HD Graphics 2000. Intel’s upcoming Sandy Bridge E for LGA-2011 will not have an on-die GPU. All mobile 2nd generation Core series processors feature HD Graphics 3000.

The 3000 vs. 2000 comparison is pretty simple. The former has 12 cores, or EUs as Intel likes to call them, while the latter has only 6. Base clock speeds are the same, although the higher-end parts can turbo up to higher frequencies. Each EU is 128 bits wide, which makes a single EU sound a lot like a single Cayman SP.

Unlike Clarkdale, all versions of HD Graphics on Sandy Bridge support Turbo. Any TDP that is freed up by the CPU running at a lower frequency or having some of its cores shut off can be used by the GPU to turbo up. The default clock speed for both HD 2000 and 3000 on the desktop is 850MHz; however, the GPU can turbo up to 1100MHz in everything but the Core i7-2600/2600K. The top-end Sandy Bridge can run its GPU at up to 1350MHz.
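To make the budget sharing concrete, here’s a toy model in Python. It’s a minimal sketch: the bin granularity and watts-per-bin figures are invented for illustration, and only the 850MHz base and per-SKU turbo ceilings come from this article; the real power control unit is far more sophisticated.

```python
# Toy model of Sandy Bridge's shared CPU/GPU turbo budget (illustrative only).
# ASSUMPTIONS: the 50MHz turbo bins and 1.5W per bin are invented numbers;
# the 850MHz base clock and per-SKU turbo ceilings come from the article.

GPU_BASE_MHZ = 850   # desktop HD 2000/3000 base clock
GPU_BIN_MHZ = 50     # hypothetical turbo step size
WATTS_PER_BIN = 1.5  # hypothetical GPU power cost per step

def gpu_clock(package_tdp_w: float, cpu_draw_w: float, max_turbo_mhz: int) -> int:
    """Spend whatever TDP headroom the CPU leaves on GPU turbo bins."""
    headroom = max(0.0, package_tdp_w - cpu_draw_w)
    bins = int(headroom / WATTS_PER_BIN)
    return min(GPU_BASE_MHZ + bins * GPU_BIN_MHZ, max_turbo_mhz)

print(gpu_clock(95, 90, 1350))  # loaded CPU, little headroom -> 1000
print(gpu_clock(95, 45, 1350))  # cores power gated -> full 1350 turbo
```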

| Processor | Intel HD Graphics | EUs | Quick Sync | Graphics Clock | Graphics Max Turbo |
|---|---|---|---|---|---|
| Intel Core i7-2600K | 3000 | 12 | Y | 850MHz | 1350MHz |
| Intel Core i7-2600 | 2000 | 6 | Y | 850MHz | 1350MHz |
| Intel Core i5-2500K | 3000 | 12 | Y | 850MHz | 1100MHz |
| Intel Core i5-2500 | 2000 | 6 | Y | 850MHz | 1100MHz |
| Intel Core i5-2400 | 2000 | 6 | Y | 850MHz | 1100MHz |
| Intel Core i5-2300 | 2000 | 6 | Y | 850MHz | 1100MHz |
| Intel Core i3-2120 | 2000 | 6 | Y | 850MHz | 1100MHz |
| Intel Core i3-2100 | 2000 | 6 | Y | 850MHz | 1100MHz |
| Intel Pentium G850 | Intel HD Graphics | 6 | N | 850MHz | 1100MHz |
| Intel Pentium G840 | Intel HD Graphics | 6 | N | 850MHz | 1100MHz |
| Intel Pentium G620 | Intel HD Graphics | 6 | N | 850MHz | 1100MHz |
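To make the lineup concrete, here’s a hedged back-of-envelope sketch in Python that turns a few rows of the table into a lookup and estimates peak FP32 throughput. The four-lanes-per-EU figure is inferred from the 128-bit EU width mentioned above, not a confirmed Intel spec.

```python
# Rough peak-throughput estimates from the table above (subset of SKUs).
# ASSUMPTION: each 128-bit EU retires four 32-bit MADs (2 flops) per clock;
# that is inferred from the stated EU width, not an Intel spec.

SKUS = {
    # name: (GPU, EUs, has Quick Sync, max turbo in MHz)
    "Core i7-2600K": ("HD 3000", 12, True, 1350),
    "Core i7-2600":  ("HD 2000", 6,  True, 1350),
    "Core i5-2500K": ("HD 3000", 12, True, 1100),
    "Core i3-2100":  ("HD 2000", 6,  True, 1100),
    "Pentium G620":  ("HD Graphics", 6, False, 1100),
}

LANES_PER_EU = 4    # 128-bit EU / 32-bit floats (assumption)
FLOPS_PER_LANE = 2  # one multiply + one add

for name, (gpu, eus, quick_sync, mhz) in SKUS.items():
    gflops = eus * LANES_PER_EU * FLOPS_PER_LANE * mhz / 1000.0
    qs = "Quick Sync" if quick_sync else "no Quick Sync"
    print(f"{name} ({gpu}, {qs}): ~{gflops:.0f} GFLOPS at max turbo")
```

By this estimate the i7-2600K’s GPU peaks at roughly 130 GFLOPS, about 2.5x what the 6 EU parts manage at 1100MHz.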

Mobile is a bit different. The base GPU clock in all mobile SNB chips is 650MHz but the max turbo is higher at 1300MHz. The LV/ULV parts also have different max clocks, which we cover in the mobile article.

As I mentioned before, all mobile 2nd gen Core processors get the 12 EU version, Intel HD Graphics 3000. The desktop side is a bit more confusing: the unlocked K-series SKUs get the 3000 GPU, while everything else gets the 2000 GPU. That’s right: the SKUs most likely to be paired with discrete graphics are given the most powerful integrated graphics. Of course those users don’t pay any penalty for the beefier on-die GPU; when not in use, the GPU is fully power gated.

Despite the odd perk for the K-series SKUs, Intel’s reasoning behind the GPU split does make sense. The HD Graphics 2000 GPU is faster than any desktop integrated GPU on the market today, and it’s easy to add discrete graphics to a desktop system if the integrated GPU is insufficient. The 3000 is simply another feature to justify the small price adder for K-series buyers.

On the mobile side, going entirely with the 3000 is a response to the state of integrated and low-end graphics in notebooks. You can’t easily add a discrete card to a notebook, so Intel has to put its best foot forward to appease OEMs like Apple. I suspect the top-to-bottom use of HD Graphics 3000 in mobile is directly responsible for Apple using Sandy Bridge without a discrete GPU in its entry-level notebooks in early 2011.

I’ve been careful to mention the use of HD Graphics 2000/3000 only in 2nd generation Core series CPUs, as Intel will eventually bring Sandy Bridge down to the Pentium brand with the G800 and G600 series processors. These chips will feature a version of HD Graphics 2000 that Intel will simply call HD Graphics. Performance will be similar to that of HD Graphics 2000; however, it won’t feature Quick Sync.

Image Quality and Experience

Perhaps the best way to start this section is with a list. Between Jarred and me, these are the games we’ve tested with Intel’s on-die HD 3000 GPU:

Assassin’s Creed
Batman: Arkham Asylum
Borderlands
Battlefield: Bad Company 2
BioShock 2
Call of Duty: Black Ops
Call of Duty: Modern Warfare 2
Chronicles of Riddick: Dark Athena
Civilization V
Crysis: Warhead
Dawn of War II
DiRT 2
Dragon Age Origins
Elder Scrolls IV: Oblivion
Empire: Total War
Far Cry 2
Fallout 3
Fallout: New Vegas
FEAR 2: Project Origin
HAWX
HAWX 2
Left 4 Dead 2
Mafia II
Mass Effect 2
Metro 2033
STALKER: Call of Pripyat
Starcraft II
World of Warcraft

That’s over two dozen titles, both old and new, that for the most part worked on Intel’s integrated graphics. For a GPU maker this would be nothing to be proud of, but given Intel’s track record with game compatibility it is a huge step forward.

We did of course run into some issues. Fallout 3 (but not New Vegas) requires a DLL hack to even run on Intel integrated graphics, and we saw some shadow rendering issues in Mafia II, but the rest of the list ran without major problems.


Modern Warfare 2 in High Quality

Now the bad news. Despite huge performance gains and much improved compatibility, even the Intel HD Graphics 3000 requires that you run at fairly low detail settings to get playable frame rates in most of these games. There are a couple of exceptions but for the most part the rule of integrated graphics hasn’t changed: turn everything down before you start playing.


Modern Warfare 2 the way you have to run it on Intel HD Graphics 3000

This reality isn’t unique to Intel integrated graphics, however. IGPs from AMD and NVIDIA have had the same limitations, as have the lowest-end discrete cards on the market. The only advantage those solutions had over Intel in the past was performance.

Realistically we need at least another doubling of graphics performance before we can even begin to talk about playing games smoothly at higher quality settings. Interestingly enough, I’ve heard the performance of Intel’s HD Graphics 3000 is roughly equal to the GPU in the Xbox 360 at this point. It only took six years for Intel to get there. If Intel wants to contribute positively to PC gaming, we need to see continued doubling of processor graphics performance for at least the next couple generations. Unfortunately I’m worried that Ivy Bridge won’t bring another doubling as it only adds 4 EUs to the array.
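For what it’s worth, the arithmetic behind that worry is straightforward (a sketch assuming Ivy Bridge keeps GPU clocks roughly where Sandy Bridge’s are):

```python
# If Ivy Bridge only grows the EU array from 12 to 16 at a similar clock,
# raw shader throughput scales ~1.33x, well short of a 2x doubling.
snb = 12 * 1350  # HD 3000: EUs x max turbo (MHz)
ivb = 16 * 1350  # ASSUMED Ivy Bridge config: +4 EUs, same clock
print(ivb / snb)  # -> 1.333...
```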

Comments

  • auhgnist - Monday, January 17, 2011 - link

    For example, between i3-2100 and i7-2600?
  • timminata - Wednesday, January 19, 2011 - link

    I was wondering: does the integrated GPU provide any benefit if you're using it with a dedicated graphics card anyway (GTX 470), or would it just be idle?
  • James5mith - Friday, January 21, 2011 - link

    Just thought I would comment with my experience. I am unable to get Blu-ray playback, or even CableCARD TV playback, with the Intel integrated graphics on my new i5-2500K with an Asus motherboard. Why, you ask? The same problem Intel has always had: it doesn't handle EDIDs correctly when there is a receiver in the path between it and the display.

    To be fair, I have an older Westinghouse monitor and an Onkyo TX-SR606. But the fact that all I had to do was reinstall my HD 5450 (which I wanted to get rid of when I upgraded to Sandy Bridge) and all my problems were gone points to the fact that Intel still hasn't gotten it right when it comes to EDIDs, HDCP handshakes, etc.

    So sad too, because otherwise I love the upgraded platform for my HTPC. I just wish I didn't have to add in the discrete graphics.
  • palenholik - Wednesday, January 26, 2011 - link

    As I understood from the article, you used just this one piece of software for all of these tests, and I understand why. But is that enough to conclude that CUDA causes low picture quality?

    I'm very interested in H.264 and x264 encoding and decoding performance, especially on GPUs, and research it. I have tested Xilisoft Video Converter 6, which supports CUDA, and I didn't have problems with low picture quality when using CUDA. I ran these tests on an NVIDIA 8600 GT for the TV station I work for, while researching a solution for compressing video to send over the internet with little or no quality loss.

    So could it be that ArcSoft Media Converter simply cooperates badly with CUDA?

    I must also note how well the AMD Phenom II X6 performs, comparable to the NVIDIA GTX 460. This means one could buy a motherboard with integrated graphics and a Phenom II X6 and get very good encoding performance in terms of speed and quality. Intel is the winner here, no doubt, but jumping from socket to socket and changing the whole platform troubles me.

    Nice and very useful article.
  • ellarpc - Wednesday, January 26, 2011 - link

    I'm curious why Bad Company 2 gets left out of Anand's CPU benchmarks. It seems to be a CPU-dependent game: when I play it, all four cores are nearly maxed out while my GPU barely reaches 60% usage, where most other games seem to be the opposite.
  • Kidster3001 - Friday, January 28, 2011 - link

    Nice article. It cleared up a lot of the questions I had about the new chips.

    A suggestion: I have worked in the chip making business. Perhaps you could run an article on how bin splits and features are affected by yields and defects. Many here seem to believe that all features work on all chips (and that the company simply chooses to disable them), which is not true. Some features, such as virtualization, are excluded from SKUs for business reasons. These are indeed disabled by the manufacturer in certain chips (they usually use chips where that feature is defective anyway, but can disable working chips if the market is large enough to sell more). Other features, such as cache size or clock speed, are missing from some SKUs because those chips have a defect which causes that feature to not work, or not to run as fast. Rather than throwing those chips away, companies can sell them at a cheaper price, e.g. a Celeron where half the cache doesn't work right, so it's disabled.

    It works both ways though. Some of the low-end chips must come from better chips that have been down-binned, otherwise there wouldn't be enough low-end chips to go around.
  • katleo123 - Tuesday, February 1, 2011 - link

    It is not expected to compete with Core i7 processors or take their place.
    Sandy Bridge uses fixed-function processing to produce better graphics at the same power consumption as the Core i series.
    visit http://www.techreign.com/2010/12/intels-sandy-brid...
  • jmascarenhas - Friday, February 4, 2011 - link

    The problem is we need to choose between using the integrated GPU, which requires an H67 board, or doing some overclocking with a P67. I wonder why we have to make this choice... it means that if we don't game and the 3000 is fine, we have to go with the H67 and therefore can't OC the processor.
  • jmascarenhas - Monday, February 7, 2011 - link

    And what about those who want to OC and don't need a dedicated graphics board? I understand Intel wanting to get money out of early adopters, but don't count on me.
  • fackamato - Sunday, February 13, 2011 - link

    Get the K version anyway? The internal GPU gets disabled when you use an external GPU AFAIK.
