Power Consumption: A Cooler SLACR?

We know from Intel's own documentation that G0 cores draw less power at idle than their B3 predecessors: in the C1E power state, the G0 Q6600 is rated to dissipate 24W, compared to 50W for the B3 part. On paper the savings are dramatic, but keep in mind that a processor doesn't spend all of its time in C1E - so how does G0 stack up in the real world?

In order to find out, we looked at total system power consumption in two situations: at idle and while running our Windows Media Encoder 9 test, a fairly CPU-intensive benchmark. We measured average power consumption over the course of the test.
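
As a rough illustration of the averaging step, here is a minimal Python sketch; the wattage readings and one-second logging interval below are placeholders, not our measured data.

    # Average system power from wall-meter samples.
    # These readings are made-up placeholders, not measured data.
    samples_w = [178.2, 181.5, 224.0, 231.3, 229.8, 186.4]  # watts, logged once per second

    average_w = sum(samples_w) / len(samples_w)
    print(f"Average system power: {average_w:.1f}W")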

[Chart: Power Consumption (Idle)]

At idle, the G0 system draws 5.6W less, a reduction of just over 3%. Nothing terribly impressive, but let's look at the results under load:

[Chart: Power Consumption (Load)]

The G0 advantage grows to 10.5W under load, a reduction of just under 5%. This alone isn't reason to upgrade, but lower power consumption is far from a bad thing. Does the new G0 stepping's lower power consumption translate into better overclocking potential?
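
To make the percentages concrete, here is the arithmetic as a small Python sketch; only the 5.6W and 10.5W deltas come from our results, while the absolute B3 system totals are assumed for illustration.

    # Percentage reduction in total system power, B3 vs. G0.
    # B3 totals are assumed for illustration; only the deltas are measured.
    b3_idle_w = 170.0             # assumed B3 total system power at idle
    g0_idle_w = b3_idle_w - 5.6   # G0 draws 5.6W less at idle
    b3_load_w = 220.0             # assumed B3 total system power under load
    g0_load_w = b3_load_w - 10.5  # G0 draws 10.5W less under load

    idle_pct = (b3_idle_w - g0_idle_w) / b3_idle_w * 100
    load_pct = (b3_load_w - g0_load_w) / b3_load_w * 100
    print(f"Idle reduction: {idle_pct:.1f}%")  # ~3.3% with these assumed totals
    print(f"Load reduction: {load_pct:.1f}%")  # ~4.8% with these assumed totals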

Comments

  • comc100 - Monday, December 31, 2007

    Hi, I have a Q6600 G0 and a GeForce 6600 GT. Do I have to upgrade my PSU? Mine is 305 watts.
  • Drew Martell - Tuesday, January 19, 2010

    I have a Q6600 and a 300 watt PSU. I overclocked to 3GHz and it's still running fine with stock cooling and the 300 watts :D
  • k8fox - Friday, August 6, 2010

    Are you still using your Q6600? I would like to overclock mine but have no idea where to start as I have never done this. Suggestions?
  • sheeple - Monday, October 19, 2015

    The Q6600 stands as Intel's greatest CPU EVER, due to it being FIRST; it's the Garagov of CPUs!
  • tjaisv - Thursday, August 23, 2007

    So what are the temp differences? Thx
  • lemonadesoda - Sunday, August 19, 2007

    The system power draw is a very interesting statistic.

    But for CPU power comparisons I suggest some other analysis. Since so much of the system power is drawn by ancillary components, the % improvement calculations show overall system improvement BUT NOT the CPU improvement. The CPU improvement is really the more interesting figure.

    Can you isolate the system power draw excluding the CPU? Perhaps the best way to do this would be to put a ULV low-clock CPU into the socket and use that as the "base line" for the system draw. Alternatively, write a utility to put the CPU into "deep sleep HALT" and check the power in this condition. Use this as the baseline.

    You will probably see the baseline around 100W, so the difference between, say, 150W and 160W would be calculated as (160-100)/(150-100) - 100% = 20%, and not the very small figures the article currently shows (see the worked sketch after the comments).
  • iamezza - Tuesday, August 21, 2007

    I agree.

    Considering this article was all about a core stepping that claimed better power consumption, I think they totally underplayed the power improvements. A reduction in total system power of 5-7% solely from a CPU stepping is very impressive. This works out to around 20% for the CPU alone. And it only gets better when it's overclocked.

    Also, it's wrong to write off an entire stepping for overclocking potential when you have only tested one CPU, and only used stock cooling at that.
  • cdrsft - Sunday, August 19, 2007

    In the article, the author asks,
    quote:

    For $266 you now have a tough decision to make: do you buy two 3.0GHz cores or four 2.40GHz cores? In our last review we found that if you're doing any amount of 3D rendering or media encoding, the Core 2 Quad Q6600 at $266 ends up being the better value. Of course, if you want the best of both worlds you could always overclock the 2.40GHz Q6600, giving you four, much faster cores.


    I'm not sure I fully understand the answer. If you are doing some media encoding, the Q6600 is a better value. What about regular desktop usage? For the average person, which of these is the better value?
  • VapoChill - Saturday, August 18, 2007

    Why did you not use the latest Intel driver?

    *********************************************
    * Product: Intel(R) Chipset Device Software
    * Release: Production Version
    * Version: 8.3.0.1013
    * Target Chipset#: Q33/G33/G31/P35
    * Date: March 05 2007
    *********************************************
  • Blacklash - Friday, August 17, 2007

    Go back and try working from an 8x multiplier. A lot of overclockers seem to be doing well with that approach.
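
As a worked version of the baseline-subtraction idea lemonadesoda suggests above, here is a minimal Python sketch using his illustrative 100W/150W/160W figures; in practice the baseline would have to be measured, e.g. with a ULV CPU in the socket or a deep-sleep HALT utility.

    # Baseline subtraction: attribute only above-baseline power to the CPU.
    # All figures are lemonadesoda's illustrative numbers, not measurements.
    baseline_w = 100.0  # system draw with the CPU contribution minimized
    total_a_w = 150.0   # total system power, configuration A
    total_b_w = 160.0   # total system power, configuration B

    cpu_a_w = total_a_w - baseline_w  # 50W attributed to the CPU
    cpu_b_w = total_b_w - baseline_w  # 60W attributed to the CPU

    delta_pct = (cpu_b_w / cpu_a_w - 1) * 100
    print(f"CPU-only difference: {delta_pct:.0f}%")  # 20%, vs. ~7% on total system power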
