Overclocking Intel's HD Graphics

The base clock of both Intel's HD Graphics 2000 and 3000 on desktop SKUs is 850MHz. Thankfully, Intel's 32nm process leaves plenty of overclocking headroom in both the CPU and the GPU. There are no clock locks or K-series restrictions to worry about when it comes to GPU overclocking; everything is unlocked. I started by seeing how far I could push the Core i3-2100's HD Graphics 2000.

While I could get into Windows and run games at up to 1.6GHz, I needed to back down to 1.4GHz to maintain stability across all of our tests. That's a 64.7% overclock.

In some cases (Civilization V, WoW, Dawn of War II), the overclocked HD Graphics 2000 was enough to bring the 6 EU part close to the performance of the 3000 model. For the most part, however, the overclock simply moved the Core i3-2100 to roughly halfway between its stock performance and that of the Core i5-2500K.
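For reference, the overclock percentages quoted in this section fall straight out of the 850MHz base clock. A minimal sketch of that arithmetic, using the clock speeds given in the text (this is just the math behind the quoted figures, not part of the original test setup):

```python
# Overclock percentages relative to the 850MHz stock GPU clock.
BASE_MHZ = 850

def overclock_pct(target_mhz: int) -> float:
    """Percentage increase over the stock GPU clock."""
    return (target_mhz - BASE_MHZ) / BASE_MHZ * 100

print(f"HD Graphics 2000 at 1400MHz: +{overclock_pct(1400):.1f}%")  # +64.7%
print(f"HD Graphics 3000 at 1550MHz: +{overclock_pct(1550):.1f}%")  # +82.4%
```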

I tried the same experiment with the Core i5-2500K. While there's no chance it could catch up to a Radeon HD 5570, I managed to overclock my 2500K to 1.55GHz (the GPU clock can be adjusted in 50MHz increments):

Intel HD Graphics 3000 Overclocking: 1550MHz

The 82.4% increase in clock speed resulted in anywhere from a 0.6% to 33.7% increase in performance. While that's not terrible, it's also not that great. It looks like we're fairly memory bandwidth constrained here.
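To put the bandwidth comment in perspective, here's a rough sketch: the integrated GPU shares main memory bandwidth with the CPU, and raising the GPU clock does nothing to increase it. The dual-channel DDR3-1333 figure below is an assumption based on Sandy Bridge's officially supported memory speed, not a number measured for this review.

```python
# Rough illustration of why GPU clock scaling flattens out: overclocking adds
# compute, but the shared system memory bandwidth stays fixed.
# Assumes dual-channel DDR3-1333 (Sandy Bridge's officially supported speed);
# this is an assumption for illustration, not a figure from the review.

gpu_clock_gain = (1550 - 850) / 850          # ~82.4% more GPU clock
observed_gain_range = (0.006, 0.337)         # 0.6% to 33.7% measured uplift

channels = 2                                 # dual-channel memory controller
bus_width_bytes = 8                          # 64 bits per channel
transfer_rate = 1333e6                       # DDR3-1333 transfers per second
peak_bandwidth_gbs = channels * bus_width_bytes * transfer_rate / 1e9

print(f"GPU clock increase: {gpu_clock_gain:.1%}")
print(f"Observed performance gain: {observed_gain_range[0]:.1%} to {observed_gain_range[1]:.1%}")
print(f"Shared DRAM bandwidth (unchanged by the GPU overclock): ~{peak_bandwidth_gbs:.1f} GB/s")
```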

282 Comments

  • dacipher - Monday, January 03, 2011 - link

    The Core i5-2500K was just what I was looking for. Performance/price is where it needs to be, and overclocking should be a breeze.
  • vol7ron - Monday, January 03, 2011 - link

    I agree.

    "As an added bonus, both K-series SKUs get Intel’s HD Graphics 3000, while the non-K series SKUs are left with the lower HD Graphics 2000 GPU."

    Doesn't it seem like Intel has this backwards? For me, I'd think to put the 3000 on the lesser performing CPUs. Users will probably have their own graphics to use with the unlocked procs, whereas the limit-locked ones will more likely be used in HTPC-like machines.
  • DanNeely - Monday, January 03, 2011 - link

    This seems odd to me unless they're having yield problems with the GPU portion of their desktop chips. That doesn't seem too likely, though, because you'd expect the mobile version to have the same problem, but they're all 12 EU parts. Perhaps they're binning more aggressively on TDP, and only had enough chips that met target with all 12 EUs to offer them at the top of the chart.
  • dananski - Monday, January 03, 2011 - link

    I agree with both of you. This should be the ultimate upgrade for my E8400, but I can't help thinking they could've made it even better if they'd used the die space for more CPU and less graphics and video decode. The Quick Sync feature would be awesome if it could work while you're using a discrete card, but for most people who have discrete graphics, this and the HD Graphics 3000 are a complete waste of transistors. I suppose they're power gated off so the thermal headroom could maybe be used for overclocking.
  • JE_Delta - Monday, January 03, 2011 - link

    WOW........

    Great review guys!
  • vol7ron - Monday, January 03, 2011 - link

    Great review, but does anyone know how often only one active core is used? I know this is somewhat subjective, but if you're running an anti-virus and have a bunch of standard services running in the background, are you likely to use only one core when idling?

    What should I advise people, as consumers, to really pay attention to? I know that when playing games such as Counter-Strike or Battlefield: Bad Company 2, my C2D maxes out at 100%; I assume both cores are being used to reach that utilization. I'd imagine that in this age there's hardly ever a time when just one core is in use; probably two cores even at idle.

    I would think that the 3-core figures are where the real noticeable impact is, especially in turbo, when gaming/browsing. Does anyone have any more perceived input on this?
  • dualsmp - Monday, January 03, 2011 - link

    What resolution is tested under Gaming Performance on pg. 20?
  • johnlewis - Monday, January 03, 2011 - link

    According to Bench, it looks like he used 1680×1050 for L4D, Fallout 3, Far Cry 2, Crysis Warhead, Dragon Age Origins, and Dawn of War 2, and 1024×768 for StarCraft 2. I couldn't find the tested resolution for World of Warcraft or Civilization V. I don't know why he didn't list the resolutions anywhere in the article or the graphs themselves, however.
  • karlostomy - Thursday, January 06, 2011 - link

    what the hell is the point of posting gaming scores at resolutions that no one will be playing at?

    If I am not mistaken, the graphics cards in the test are:
    eVGA GeForce GTX 280 (Vista 64)
    ATI Radeon HD 5870 (Windows 7)
    MSI GeForce GTX 580 (Windows 7)

    So then, with a Sandy Bridge processor, these resolutions are irrelevant.
    1080p or above should be the standard resolution for modern setup reviews.

    Why, Anand, have you posted irrelevant resolutions for the hardware tested?
  • dananski - Thursday, January 06, 2011 - link

    Games are usually limited in fps by the level of graphics, so processor speed doesn't make much of a difference unless you turn the graphics detail right down and use an overkill graphics card. As the point of this page was to review the CPU power, it's more representative to use low resolutions so that the CPU is the limiting factor.

    If you did this set of charts for gaming at 2560x1600 with full AA & max quality, all the processors would be stuck at about the same rate because the graphics card is the limiting factor.

    I expect Civ 5 would be an exception to this because it has really counter-intuitive performance.
