Intel’s Gen 6 Graphics

All 2nd generation Core series processors that fit into an LGA-1155 motherboard will have one of two GPUs integrated on-die: Intel’s HD Graphics 3000 or HD Graphics 2000. Intel’s upcoming Sandy Bridge E for LGA-2011 will not have an on-die GPU. All mobile 2nd generation Core series processors feature HD Graphics 3000.

The 3000 vs. 2000 comparison is pretty simple. The former has 12 cores, or EUs as Intel likes to call them, while the latter has only 6. Base clock speeds are the same, although the higher-end parts can turbo up to higher frequencies. Each EU is 128 bits wide, which makes a single EU sound a lot like a single Cayman SP.

Unlike Clarkdale, all versions of HD Graphics on Sandy Bridge support Turbo. Any TDP that is freed up by the CPU running at a lower frequency or having some of its cores shut off can be used by the GPU to turbo up. The default clock speed for both HD 2000 and 3000 on the desktop is 850MHz; however, the GPU can turbo up to 1100MHz in everything but the Core i7-2600/2600K. The top-end Sandy Bridge can run its GPU at up to 1350MHz.
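The sharing works roughly like a power budget: whatever portion of the package TDP the CPU cores aren't using becomes headroom for the GPU. A toy model makes the idea concrete; the power figures and the linear power-to-frequency assumption below are ours for illustration, not Intel's actual power management policy:

```python
# Toy TDP-sharing model: the GPU turbos up when the CPU leaves power on the table.
# All numbers here are illustrative, not Intel's real policy.
PACKAGE_TDP_W = 95.0
GPU_BASE_MHZ, GPU_MAX_TURBO_MHZ = 850, 1100
GPU_BASE_POWER_W = 12.0  # hypothetical GPU power draw at base clock

def gpu_turbo_clock(cpu_power_w):
    # Headroom = package budget minus CPU draw minus GPU's base draw.
    headroom = PACKAGE_TDP_W - cpu_power_w - GPU_BASE_POWER_W
    if headroom <= 0:
        return GPU_BASE_MHZ
    # Assume clock scales linearly with spare power (hypothetical 25MHz/W).
    extra_mhz = headroom * 25
    return min(GPU_BASE_MHZ + extra_mhz, GPU_MAX_TURBO_MHZ)

print(gpu_turbo_clock(cpu_power_w=80.0))  # CPU heavily loaded: GPU stays near base
print(gpu_turbo_clock(cpu_power_w=40.0))  # CPU lightly loaded: GPU hits max turbo
```

The point of the sketch is simply that GPU turbo residency depends on what the CPU cores are doing at the same time, which is why lightly threaded games can see the GPU run at its maximum bin more often.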

| Processor | Intel HD Graphics | EUs | Quick Sync | Graphics Clock | Graphics Max Turbo |
|---|---|---|---|---|---|
| Intel Core i7-2600K | 3000 | 12 | Y | 850MHz | 1350MHz |
| Intel Core i7-2600 | 2000 | 6 | Y | 850MHz | 1350MHz |
| Intel Core i5-2500K | 3000 | 12 | Y | 850MHz | 1100MHz |
| Intel Core i5-2500 | 2000 | 6 | Y | 850MHz | 1100MHz |
| Intel Core i5-2400 | 2000 | 6 | Y | 850MHz | 1100MHz |
| Intel Core i5-2300 | 2000 | 6 | Y | 850MHz | 1100MHz |
| Intel Core i3-2120 | 2000 | 6 | Y | 850MHz | 1100MHz |
| Intel Core i3-2100 | 2000 | 6 | Y | 850MHz | 1100MHz |
| Intel Pentium G850 | Intel HD Graphics | 6 | N | 850MHz | 1100MHz |
| Intel Pentium G840 | Intel HD Graphics | 6 | N | 850MHz | 1100MHz |
| Intel Pentium G620 | Intel HD Graphics | 6 | N | 850MHz | 1100MHz |
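To put the EU counts and clocks in perspective, a back-of-the-envelope throughput comparison can be sketched from the table above. This assumes throughput scales linearly with EU count and clock, and ignores memory bandwidth and driver differences, so treat the ratios as paper numbers only:

```python
# Rough relative GPU throughput: EU count x max turbo clock (MHz).
# Linear scaling with EUs and frequency is a simplifying assumption.
def relative_throughput(eus, turbo_mhz):
    return eus * turbo_mhz

hd3000_2600k = relative_throughput(12, 1350)  # Core i7-2600K
hd3000_2500k = relative_throughput(12, 1100)  # Core i5-2500K
hd2000       = relative_throughput(6, 1100)   # Core i5-2500 and below

print(hd3000_2600k / hd2000)  # ~2.45x the HD 2000, on paper
print(hd3000_2500k / hd2000)  # exactly 2.0x
```

In other words, the i7-2600K's HD 3000 has up to roughly 2.5x the paper shader throughput of an HD 2000 part, thanks to both the doubled EU count and the higher 1350MHz turbo bin.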

Mobile is a bit different. The base GPU clock in all mobile SNB chips is 650MHz but the max turbo is higher at 1300MHz. The LV/ULV parts also have different max clocks, which we cover in the mobile article.

As I mentioned before, all mobile 2nd gen Core processors get the 12 EU version—Intel HD Graphics 3000. The desktop side is a bit more confusing. On the desktop, the unlocked K-series SKUs get the 3000 GPU while everything else gets the 2000 GPU. That’s right: the SKUs most likely to be paired with discrete graphics are given the most powerful integrated graphics. Of course those users don’t pay any penalty for the beefier on-die GPU; when not in use the GPU is fully power gated.

Despite the odd perk for the K-series SKUs, Intel’s reasoning behind the GPU split does make sense. The HD Graphics 2000 GPU is faster than any desktop integrated GPU on the market today, and it’s easy to add discrete graphics to a desktop system if the integrated GPU is insufficient. The 3000 is simply another feature to justify the small price adder for K-series buyers.

On the mobile side, going entirely with the 3000 comes down to the state of integrated and low-end graphics in notebooks. You can’t easily add a discrete card, so Intel has to put its best foot forward to appease OEMs like Apple. I suspect the top-to-bottom use of HD Graphics 3000 in mobile is directly responsible for Apple using Sandy Bridge without a discrete GPU in its entry-level notebooks in early 2011.

I’ve been careful to mention the use of HD Graphics 2000/3000 in 2nd generation Core series CPUs, as Intel will eventually bring Sandy Bridge down to the Pentium brand with the G800 and G600 series processors. These chips will feature a version of HD Graphics 2000 that Intel will simply call HD Graphics. Performance will be similar to the HD Graphics 2000 GPU; however, it won’t feature Quick Sync.

Image Quality and Experience

Perhaps the best way to start this section is with a list. Between Jarred and me, these are the games we’ve tested with Intel’s on-die HD 3000 GPU:

Assassin’s Creed
Batman: Arkham Asylum
Borderlands
Battlefield: Bad Company 2
BioShock 2
Call of Duty: Black Ops
Call of Duty: Modern Warfare 2
The Chronicles of Riddick: Assault on Dark Athena
Civilization V
Crysis: Warhead
Dawn of War II
DiRT 2
Dragon Age Origins
Elder Scrolls IV: Oblivion
Empire: Total War
Far Cry 2
Fallout 3
Fallout: New Vegas
FEAR 2: Project Origin
HAWX
HAWX 2
Left 4 Dead 2
Mafia II
Mass Effect 2
Metro 2033
STALKER: Call of Pripyat
StarCraft II
World of Warcraft

This is over two dozen titles, both old and new, that for the most part worked on Intel’s integrated graphics. For a GPU maker this is nothing to be proud of, but given Intel’s track record with game compatibility it’s a huge step forward.

We did of course run into some issues. Fallout 3 (but not New Vegas) requires a DLL hack to even run on Intel integrated graphics, and we saw some shadow rendering issues in Mafia II, but for the most part the titles—both old and new—worked.


Modern Warfare 2 in High Quality

Now the bad news. Despite huge performance gains and much improved compatibility, even the Intel HD Graphics 3000 requires that you run at fairly low detail settings to get playable frame rates in most of these games. There are a couple of exceptions but for the most part the rule of integrated graphics hasn’t changed: turn everything down before you start playing.


Modern Warfare 2 the way you have to run it on Intel HD Graphics 3000

This reality holds for more than just Intel integrated graphics, however. IGPs from AMD and NVIDIA, as well as the lowest-end discrete cards on the market, have had the same limitations. The only advantage those solutions had over Intel in the past was performance.

Realistically we need at least another doubling of graphics performance before we can even begin to talk about playing games smoothly at higher quality settings. Interestingly enough, I’ve heard the performance of Intel’s HD Graphics 3000 is roughly equal to the GPU in the Xbox 360 at this point. It only took six years for Intel to get there. If Intel wants to contribute positively to PC gaming, we need to see continued doubling of processor graphics performance for at least the next couple generations. Unfortunately I’m worried that Ivy Bridge won’t bring another doubling as it only adds 4 EUs to the array.
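The worry about Ivy Bridge is simple arithmetic: going from 12 EUs to 16 is at best a 1.33x increase, well short of a doubling. (This deliberately ignores any per-EU or clock speed improvements Ivy Bridge may also bring.)

```python
# EU-count scaling alone, ignoring architectural and clock changes.
snb_eus, ivb_eus = 12, 16
print(ivb_eus / snb_eus)  # ~1.33x -- not the 2x generational doubling we'd like
```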


283 Comments


  • DanNeely - Monday, January 3, 2011 - link

    The increased power efficiency might allow Apple to squeeze a GPU onto their smaller laptop boards without losing runtime due to the smaller battery.
  • yuhong - Monday, January 3, 2011 - link

    "Unlike P55, you can set your SATA controller to compatible/legacy IDE mode. This is something you could do on X58 but not on P55. It’s useful for running HDDERASE to secure erase your SSD for example"
    Or running old OSes.
  • DominionSeraph - Monday, January 3, 2011 - link

    "taking the original Casino Royale Blu-ray, stripping it of its DRM"

    Whoa, that's illegal.
  • RussianSensation - Monday, January 3, 2011 - link

    It would have been nice to include 1st generation Core i7 processors such as 860/870/920-975 in Starcraft 2 bench as it seems to be very CPU intensive.

    Also, perhaps a section with overclocking which shows us how far 2500k/2600k can go on air cooling with safe voltage limits (say 1.35V) would have been much appreciated.
  • Hrel - Monday, January 3, 2011 - link

    Sounds like this is SO high end it should be the server market. I mean, why make yet ANOTHER socket for servers that use basically the same CPU's? Everything's converging and I'd just really like to see server mobo's converge into "High End Desktop" mobo's. I mean seriously, my E8400 OC'd with a GTX460 is more power than I need. A quad would help with the video editing I do in HD but it works fine now, and with GPU accelerated rendering the rendering times are totally reasonable. I just can't imagine anyone NEEDING a home computer more powerful than the LGA-1155 socket can provide. Hell, 80-90% of people are probably fine with the power Sandy Bridge gives in laptops now.
  • mtoma - Monday, January 3, 2011 - link

    Perhaps it is like you say, however it's always good for buyers to decide if they want server-like features in a PC. I don't like manufacturers to dictate to me only one way to do it (like Intel does now with the odd combination of HD3000 graphics - Intel H67 chipset). Let us not forget that for a long time, all we had were 4 slots for RAM and 4-6 SATA connections (like you probably have). Intel X58 changed all that: suddenly we had the option of having 6 slots for RAM, 6-8 SATA connections and enough PCI-Express lanes.
    I only hope that LGA 2011 brings back those features, because like you said: it's not only the performance we need, but also the features.
    And, remember that the software doesn't stand still; it usually requires multiple processor cores (video transcoding, antivirus scanning, HDD defragmenting, modern OSes, and so on...).
    All this aside, the main issue remains: Intel must be persuaded to stop looting users' money and implement only one socket at a time. I usually support Intel, but in this regard AMD deserves congratulations!
  • DanNeely - Monday, January 3, 2011 - link

    LGA 2011 is a high end desktop/server convergence socket. Intel started doing this in 2008, with all but the highest end server parts sharing LGA 1366 with top end desktop systems. The exceptions were quad/octo socket CPUs and those supporting enormous amounts of RAM, which used LGA 1567.

    The main reason why LGA 1155 isn't suitable for really high end machines is that it doesn't have the memory bandwidth to feed hex and octo core CPUs. It's also limited to 16 PCIe 2.0 lanes on the CPU vs 36 PCIe 3.0 lanes on LGA 2011. For most consumer systems that won't matter, but 3/4 GPU card systems will start losing a bit of performance when running in a 4x slot (only a few percent, but people who spend $1000-2000 on GPUs want every last frame they can get), and high end servers with multiple 10Gb Ethernet cards and PCIe SSD devices also begin running into bottlenecks.

    Not spending an extra dollar or five per system for the QPI connections only used in multi-socket systems in 1155 also adds up to major savings across the hundreds of millions of systems Intel is planning to sell.
  • Hrel - Monday, January 3, 2011 - link

    I'm confused by the upset over playing video at 23.976Hz. "It makes movies look like, well, movies instead of TV shows"? What? Wouldn't recording at a lower frame rate just mean there's missed detail, especially in fast action scenes? Isn't that why HD runs at 60fps instead of 30fps? Isn't more FPS good as long as it's played back at the appropriate speed, i.e. whatever it's filmed at? I don't understand the complaint.

    On a related note, Hollywood and the world need to just agree that everything gets recorded and played back at 60fps at 1920x1080. No variation AT ALL! That way everything would just work. Or better yet 120fps, with the ability to turn 3D on and off as you see fit. Whatever FPS is best. I've always been told higher is better.
  • chokran - Monday, January 3, 2011 - link

    You are right about having more detail when filming with higher FPS, but this isn't about it being good or bad; it's more a matter of tradition and visual style.
    The look movies have these days, the one we've grown accustomed to, is mainly achieved by filming in 24p, or 23.976 to be precise. The look you get when filming at higher FPS just doesn't look like cinema anymore but TV. At least to me. A good article on this:
    http://www.videopia.org/index.php/read/shorts-main...
    The problem with movies looking like TV can be tested at home if you've got a TV with some kind of motion interpolation, e.g. what Sony calls MotionFlow or Panasonic calls Intelligent Frame Creation. When it's turned on, you can see the soap opera effect from the added frames. There are people who don't see it, and some who do and like it, but I have to turn it off since it doesn't look "natural" to me.
  • CyberAngel - Thursday, January 6, 2011 - link

    http://en.wikipedia.org/wiki/Showscan
