Overclocking Intel’s HD Graphics - It Works...Very Well

The coolest part of my job is being able to work with some ridiculously smart people. One such person gave me the idea to try overclocking the Intel HD graphics core on Clarkdale a few weeks ago. I didn’t get time to do it with the Core i5 661, but today is a different day.

Clarkdale offers three different GPU clocks depending on the model:

Processor              Intel HD Graphics Clock
Intel Core i5-670      733MHz
Intel Core i5-661      900MHz
Intel Core i5-660      733MHz
Intel Core i5-650      733MHz
Intel Core i3-540      733MHz
Intel Core i3-530      733MHz
Intel Pentium G6950    533MHz

The Core i5 661 runs it at the highest speed - 900MHz. The rest of the Core i5 and i3 processors pick 733MHz. And the Pentium G6950 has a 533MHz graphics clock.

Remember that the Intel HD Graphics die is physically separate from the CPU die on Clarkdale. It's a separate 45nm die sitting on the same package, and I'm guessing it's not all that difficult to make. If AMD can reliably ship GPUs with hundreds of shader processors, Intel can probably make a chip with 12 execution units without much complaining.

So the theory is that these graphics cores are easily overclockable. I fired up our testbed and adjusted the GPU clock. It's a single BIOS option, and without any changes to voltage or cooling I managed to get our Core i3 530's GPU running at 1200MHz. That's a 64% overclock!

I could push the core as high as 1400MHz and still get into Windows, but the system stopped being able to render any 3D games at that point.

I benchmarked World of Warcraft with the Core i3 running at three different GPU clocks to show the potential for improvement:

CPU (Graphics Clock)               World of Warcraft
Intel Core i5 661 (900MHz gfx)     14.8 fps
Intel Core i3 530 (733MHz gfx)     12.5 fps
Intel Core i3 530 (900MHz gfx)     14.2 fps
Intel Core i3 530 (1200MHz gfx)    19.0 fps

A 64% overclock resulted in a 52% increase in performance. If Intel wanted to, it could easily make its on-package GPU a lot faster than it is today. I wonder if this is what we’ll see with Sandy Bridge and graphics turbo on the desktop.
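
For the curious, the percentages quoted above fall out of simple arithmetic. Here's a minimal Python sketch that reproduces them from the clocks and frame rates in the tables; the numbers are from our testing, the script itself is just illustrative math:

    # Scaling arithmetic using the Core i3 530 figures from the tables above.
    base_clock, oc_clock = 733, 1200   # GPU clock in MHz
    base_fps, oc_fps = 12.5, 19.0      # World of Warcraft frame rates

    overclock = oc_clock / base_clock - 1.0   # ~0.64 -> the 64% overclock
    gain = oc_fps / base_fps - 1.0            # ~0.52 -> the 52% perf gain
    efficiency = gain / overclock             # ~0.82 of perfect clock scaling

    print(f"overclock: {overclock:.0%}, perf gain: {gain:.0%}, "
          f"scaling efficiency: {efficiency:.0%}")

Roughly 82% of the extra clock shows up as extra frames, which suggests performance is still largely bound by the GPU core itself, rather than memory bandwidth, at these clock speeds.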

107 Comments

  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    No, you're pretty much right on the money. If you have a good system already, you're always better off waiting until the next big thing. In my eyes Penryn wasn't a very big deal if you already had Conroe. Clarkdale's real advantages are in power consumption and threaded performance. If you already have a quad-core CPU, chances are you'll want to look at Lynnfield/Phenom II X4 or wait for Sandy Bridge if you can.

    Take care,
    Anand
  • tno - Friday, January 22, 2010 - link

    Cool! A reply from the man himself! Thanks, Anand! My leap was from a 2.4GHz Celeron to a PD805 to Penryn, so Penryn seemed like a revelation: highly efficient, easy to cool, fast, and quad-core. Now, if you happen to have any loose systems that you're not using and want to send my way so I can experience the Lynnfield difference myself, I won't object.

    tno
  • kwrzesien - Friday, January 22, 2010 - link

    I had an AMD 1.2 GHz single-core with a GeForce 2 MX. It was a HUGE upgrade!
  • lopri - Friday, January 22, 2010 - link

    [QUOTE]We are still running into an issue with MPC-HC and video corruption with DXVA enabled on the 790GX, but haven't been able to fix it yet. Have any of you had issues with video corruption with AMD graphics and the latest stable build of MPC-HC for 64-bit Windows? Or should we chalk it up to being just another day in the AnandTech labs.[/QUOTE]

    Instead of such fleeting one-liners, how about telling us the title, format, and codec in question so that we can verify it? This is the finest example of yellow journalism.

    I'm still waiting for an answer whether 2560x1600 and dual-displays work with these CPUs. Considering the silence, however, I think I know the answer.
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    It's a Dark Knight rip we use. Take the original Blu-ray, use AnyDVD HD to strip out the DRM, re-encode to reduce the file size, and toss it into an mkv container. The problem appears on all H.264 content played through MPC-HC, though.

    As far as resolution support goes, Intel lists 2560 x 1600 as the maximum resolution available over DisplayPort. For DVI/HDMI you're limited to 1920 x 1200. VGA will get you up to 2048 x 1536.

    There are four independent display ports, so in theory you should be able to run one 2560 x 1600 panel and one 1920 x 1200 (or two 25x16 panels if you had a board with dual DisplayPort outputs).

    Take care,
    Anand
  • lopri - Friday, January 22, 2010 - link

    Thank you for the explanation, but unfortunately I couldn't replicate the 'problem' (what exactly?) you've experienced. I don't have The Dark Knight, so I tried Children of Men on a neighbor's 785G system I built for him. That title was chosen because its original content on the disc was encoded in VC-1, just like The Dark Knight. MediaInfo gave the following information:

    Video
    ID : 1
    Format : AVC
    Format/Info : Advanced Video Codec
    Format profile : High@L4.1
    Format settings, CABAC : Yes
    Format settings, ReFrames : 4 frames
    Muxing mode : Container profile=Unknown@4.1
    Codec ID : V_MPEG4/ISO/AVC
    Duration : 1h 49mn
    Bit rate : 14.2 Mbps
    Nominal bit rate : 14.5 Mbps
    Width : 1 920 pixels
    Height : 1 040 pixels
    Display aspect ratio : 16:9
    Frame rate : 23.976 fps
    Resolution : 24 bits
    Colorimetry : 4:2:0
    Scan type : Progressive
    Bits/(Pixel*Frame) : 0.296
    Stream size : 10.8 GiB (88%)
    Title : Video @ 14489 kbps
    Writing library : x264 core 67 r1165M 6841c5e

    Flawless playback both in windowed mode as well as full-screen mode, on a 30" LCD. Just to be sure, I tested The Dark Knight trailer, which is a VC-1 clip, and various H.264 content in .mkv, .mp4, and .m2ts, using MPC-HC svn 1.3.3347 32-bit AND 64-bit binaries. The system had a WHQL driver dated 8/17/2009, installed via Windows Update. The only codecs installed are Matroska Splitter and AC3Filter.

    So there. Now, what exactly is the problem that I don't see but you do?

    WRT resolutions - Intel listed 2560x1600 on G45 as well. I even got an ADD2 card (interesting choice of name, btw) off eBay hoping it'd work, but that was simply a waste of money. I am as skeptical as can be about GMA after my bitter experiences with G35/G45, and it is puzzling why you can't verify this in your lab instead of being a messenger. ("Intel says so")

    Would you feel bad at all if I said I purchased G35/G45 based on your reviews, only to be greatly disappointed? I couldn't even give away a G35-based system to a junior-high kid, because the kid is someone I see on and off and I feared potential embarrassment and unexpected calls for support.

    Your reviews are full of contradictions, one after another, and I am concerned that you've lost your connection to the real world.
  • Shadowmaster625 - Friday, January 22, 2010 - link

    Given the level of integration, what is making these motherboards so expensive? When are we going to see $35 motherboards? What would keep the prices from coming down that low?
  • strikeback03 - Friday, January 22, 2010 - link

    IIRC the chipset itself currently costs $40.
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    Correct. Despite moving much of the "chipset" on-package, the actual H5x chipsets are no cheaper than their predecessors. Remember that as AMD and Intel integrate more onto the CPU they still want to preserve or increase profit margins. It's just fortunate for all of us that in the process of integration we actually get a performance benefit.

    Take care,
    Anand
  • Taft12 - Friday, January 22, 2010 - link

    Sounds like we are very much in need of competition in the third-party chipset market, like the good old days!

    Things are going in the wrong direction, with NVIDIA exiting the market and VIA and SiS long gone...
