Overclocking the i3 - 4GHz with the Stock Cooler

I’ve become a fan of stock voltage overclocking over the past few years. As power consumption and efficiency have become more important, and manufacturing processes have improved, pushing a CPU as far as it will go without increasing its core voltage has emerged as the most efficient way to overclock. You minimize any increase in power consumption while maximizing performance. You also find out whether you’ve been sold a chip that was artificially binned lower than it could have been.

With Bloomfield, Intel hit a new peak for how far you can expect to push a CPU without increasing voltage. AMD followed with the Phenom II, but Lynnfield took a step back. Thanks to its on-die PCIe controller, Lynnfield needed some amount of additional voltage to overclock well. Clarkdale is somewhere in between. It lacks the crippling on-die PCIe controller, but it’s also a much higher volume part which by definition shouldn’t be as overclockable.

The Core i3 530 runs at 2.93GHz by default, with no available turbo boost. Without swapping coolers or feeding the chip any additional voltage, the most I got out of it was 3.3GHz (150MHz BCLK x 22). Hardly impressive.

I added another 0.16V to the CPU’s core voltage. That’s just under 14%. And here’s what I was able to do:

That’s 4GHz, stable, using the stock heatsink/fan. Part of the trick to overclocking this chip was lowering the clock multiplier. Even with the QPI and memory frequencies always kept in spec, a lower multiplier improved stability significantly and let me reach much higher frequencies.
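Since the core clock is simply BCLK × multiplier, dropping the multiplier forces the BCLK up to hit the same target frequency. A quick sketch of the trade-off (the multiplier values below are illustrative; the article only confirms the stock x22):

```python
# Core clock = BCLK x multiplier, so a lower multiplier needs a higher
# BCLK for the same target. The multiplier range here is illustrative,
# not taken from the article.
target_mhz = 4000
for mult in (22, 21, 20, 19):
    bclk = target_mhz / mult
    print(f"x{mult}: {bclk:.0f} MHz BCLK -> {bclk * mult:.0f} MHz core")
```

At the stock x22 that's roughly a 182MHz BCLK; at x20 it's a nice round 200MHz.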

I could have pushed beyond 4GHz, but that would have required more voltage and potentially better cooling. With a stable 4GHz overclock, I was happy.
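As a sanity check on that "just under 14%" figure: the article doesn't state the chip's stock core voltage, so the ~1.15V VID in this sketch is my assumption.

```python
# Rough check of the voltage bump. The ~1.15 V stock VID is an assumption
# (Clarkdale VIDs vary chip to chip); the 0.16 V increase is the figure
# from the article.
stock_vcore = 1.15  # volts, assumed
bump = 0.16         # volts, from the article

increase_pct = bump / stock_vcore * 100
print(f"+{bump} V on {stock_vcore} V stock = {increase_pct:.1f}% increase")
```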

If you’ll remember from my review of the processor, my Phenom II X2 550 BE managed 3.7GHz using the stock cooler and a pound of voltage. Unfortunately, that’s not enough to challenge the overclocked 530.

| CPU | x264 HD 3.03 - 2nd pass | 7-zip (KB/s) | Batman: AA | Dawn of War II | Dragon Age: Origins | World of Warcraft |
|-----|------|------|------|------|------|------|
| Intel Core i3 530 @ 4GHz | 18.4 fps | 2822 | 192 fps | 62.7 fps | 115 fps | 92 fps |
| AMD Phenom II X2 550 @ 3.7GHz | 10.4 fps | 2681 | 170 fps | 50.9 fps | 63 fps | 60.8 fps |
| AMD Phenom II X4 965 (3.4GHz) | 22.2 fps | 3143 | 196 fps | 54.3 fps | 109 fps | 74.1 fps |
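To put the gap in perspective, the ratios between the two overclocked chips can be recomputed directly from the table (a sketch using only the numbers above; higher is better in every test):

```python
# Per-test ratio of the 4GHz Core i3 530 to the 3.7GHz Phenom II X2 550,
# using the results from the table above.
i3 = {"x264 2nd pass": 18.4, "7-zip": 2822, "Batman: AA": 192,
      "Dawn of War II": 62.7, "Dragon Age": 115, "WoW": 92}
x2 = {"x264 2nd pass": 10.4, "7-zip": 2681, "Batman: AA": 170,
      "Dawn of War II": 50.9, "Dragon Age": 63, "WoW": 60.8}

for test, score in i3.items():
    print(f"{test}: {score / x2[test]:.2f}x")
```

The i3's lead ranges from a few percent in 7-zip to well over 1.5x in the video encode and Dragon Age tests.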


Even an overclocked Athlon II X4 630 isn’t going to dramatically change things. It’ll still be faster in multithreaded applications, but it will remain the slower gaming CPU overall.

If the Core i3 530 is right for you, overclocking is just going to make it more right.


107 Comments

  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    No, you're pretty much right on the money. If you have a good system already, you're always better off waiting until the next big thing. In my eyes Penryn wasn't a very big deal if you already had Conroe. Clarkdale's real advantages are in power consumption and threaded performance. If you already have a quad-core CPU, chances are that you'll want to be looking at Lynnfield/Phenom II X4 or wait for Sandy Bridge if you can.

    Take care,
    Anand
  • tno - Friday, January 22, 2010 - link

    Cool! A reply from the man himself! Thanks, Anand! My leap was from a 2.4GHz Celeron to a PD805 to Penryn, so Penryn seemed like a revelation, highly efficient, easy to cool, fast and quadcore. Now, if you happen to have any loose systems that you're not using and want to send my way so I can experience the Lynnfield difference myself, I won't object.

    tno
  • kwrzesien - Friday, January 22, 2010 - link

    I had an AMD 1.2 GHz single-core with a GeForce 2MX. It was a HUGE upgrade!
  • lopri - Friday, January 22, 2010 - link

    [QUOTE]We are still running into an issue with MPC-HC and video corruption with DXVA enabled on the 790GX, but haven't been able to fix it yet. Have any of you had issues with video corruption with AMD graphics and the latest stable build of MPC-HC for 64-bit Windows? Or should we chalk it up to being just another day in the AnandTech labs.[/QUOTE]

    Instead of such fleeting one-liners, how about telling us the title, format, and codec in question so that we can verify it? This is the finest example of yellow journalism.

    I'm still waiting for an answer whether 2560x1600 and dual-displays work with these CPUs. Considering the silence, however, I think I know the answer.
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    It's a Dark Knight rip we use. Take the original Blu-ray, use AnyDVD HD to strip out the DRM, re-encode to reduce file size and toss it into an mkv container. The problem appears on all H.264 content played through MPC-HC, though.

    As far as resolution support goes, Intel lists 2560 x 1600 as the maximum resolution available over DisplayPort. For DVI/HDMI you're limited to 1920 x 1200. VGA will get you up to 2048 x 1536.

    There are four independent display ports, so in theory you should be able to run one 2560 x 1600 panel and one 1920 x 1200 (or two 25x16 panels if you had a board with dual DisplayPort outputs).

    Take care,
    Anand
  • lopri - Friday, January 22, 2010 - link

    Thank you for the explanation, but unfortunately I couldn't replicate the 'problem' (what exactly?) you've experienced. I don't have The Dark Knight, so I tried Children of Men on a neighbor's 785G system I built for him. That title was chosen because its original content on the disc was encoded VC1 just like The Dark Knight. MediaInfo gave the following information:

    Video
    ID : 1
    Format : AVC
    Format/Info : Advanced Video Codec
    Format profile : High@L4.1
    Format settings, CABAC : Yes
    Format settings, ReFrames : 4 frames
    Muxing mode : Container profile=Unknown@4.1
    Codec ID : V_MPEG4/ISO/AVC
    Duration : 1h 49mn
    Bit rate : 14.2 Mbps
    Nominal bit rate : 14.5 Mbps
    Width : 1 920 pixels
    Height : 1 040 pixels
    Display aspect ratio : 16:9
    Frame rate : 23.976 fps
    Resolution : 24 bits
    Colorimetry : 4:2:0
    Scan type : Progressive
    Bits/(Pixel*Frame) : 0.296
    Stream size : 10.8 GiB (88%)
    Title : Video @ 14489 kbps
    Writing library : x264 core 67 r1165M 6841c5e

    Flawless playback both in windowed mode as well as full-screen mode, on a 30" LCD. Just to be sure, I tested The Dark Knight trailer which is a VC1 clip, and various H.264 content in .mkv, .mp4, and .m2ts. Using MPC-HC svn 1.3.3347 32-bit AND 64-bit binaries. The system had a WHQL driver dated 8/17/2009, installed via Windows Update. The only codecs installed are the Matroska Splitter and AC3filter.

    So there. Now, what exactly is the problem that I don't see but you do?

    WRT resolutions - Intel listed 2560x1600 on G45 as well. I even got an ADD2 (interesting choice of name, btw) card off eBay hoping it'd work, but that was simply a waste of money. I am as skeptical as can be about GMA after my bitter experiences with G35/G45, and it is puzzling why you can't verify that in your lab instead of being a messenger. ("Intel says so")

    Would you feel bad at all if I said I purchased G35/G45 based on your reviews, only to be greatly disappointed? I couldn't even give away a G35-based system to a junior-high kid, because the kid is someone I see on and off and I feared a potential embarrassment and unexpected calls for support.

    Your reviews are full of contradictions one after another, and I am concerned whether you've lost the sense and connection to the real world.
  • Shadowmaster625 - Friday, January 22, 2010 - link

    Given the level of integration, what is making these motherboards so expensive? When are we going to see $35 motherboards? What would keep the prices from coming down that low?
  • strikeback03 - Friday, January 22, 2010 - link

    IIRC the chipset itself currently costs $40.
  • Anand Lal Shimpi - Friday, January 22, 2010 - link

    Correct. Despite moving much of the "chipset" on-package, the actual H5x chipsets are no cheaper than their predecessors. Remember that as AMD and Intel integrate more onto the CPU they still want to preserve or increase profit margins. It's just fortunate for all of us that in the process of integration we actually get a performance benefit.

    Take care,
    Anand
  • Taft12 - Friday, January 22, 2010 - link

    Sounds like we are very much in need of competition in the 3rd party chipset market like the good old days!

    Things are going in the wrong direction with NVIDIA exiting the market, Via and SiS long gone...
