Idle Tests

We enabled each chipset's power management capabilities in the BIOS and left our voltage selections on auto, except for memory. The boards default memory to 1.80V or 1.85V, but we set it to 1.90V, which we found necessary for absolute stability in our configurations at the rated 4-4-4-12 timings.

On the two AMD boards, this resulted in almost identical settings, the only exception being chipset voltages, and even those were within a fraction of a volt of each other. Overall, each of the CPUs hovered around 1.250V, and all power management options functioned perfectly on these particular boards. We then set Vista to use either the Performance or Balanced profile, depending on test requirements.

We typically run our machines with the Balanced profile. Switching to the Power Savings setting decreased consumption by 1W to 5W, depending on the CPU and application tested. At idle, the Balanced and Power Savings profiles both set the minimum processor power management value to 5%, while the maximum is 100% for Balanced and 50% for Power Savings. The Performance setting pins both values at 100%, which is why power requirements increase under that profile even with power management enabled in the BIOS.
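For those who want to switch profiles without opening the Control Panel, Vista's built-in powercfg utility can do it from a command prompt. This is just a sketch: the GUIDs below are the well-known identifiers Windows assigns to the built-in schemes, but you should confirm them on your own system with the list command first.

```shell
:: List the power schemes installed on this system (names and GUIDs)
powercfg -list

:: Activate the built-in Balanced scheme
powercfg -setactive 381b4222-f694-41f0-9685-ff5bb260df2e

:: Activate Power saver (what we call the Power Savings profile above)
powercfg -setactive a1841308-3541-4fab-bc81-f71556f20b4a

:: Activate High performance
powercfg -setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c
```

Note that these commands change only the Windows-side power policy; the BIOS power management settings discussed above still need to be enabled separately.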

Consumption - Minimum Spec CPU

Consumption - Dual Core

Consumption - Quad Core

The results surprised us - more like floored us. The same company that brings you global-warming-friendly chipsets like the 680i/780i has suddenly turned over a new leaf, or at least saved the tree it came from. In our minimum spec configuration using the Power Savings profile, the GeForce 8200 bests the AMD 780G and Intel G35 platforms by 3W and 15W respectively. To be fair to Intel, we are comparing a single-core AMD processor to a dual-core processor; however, these are the minimum CPUs we would utilize. (4/19/08 Update - Minimum Spec chart is correct now)

Frankly, the AMD LE1600 is on the verge of being unacceptable for HD playback. It passed all of our tests, but menu operation was sluggish when choosing our movie options, and CPU utilization hit the upper-90% range on some of the more demanding titles, even with the 780G or GF8200 providing hardware offload capabilities.

The pattern changes slightly with the dual-core setup, where the GF8200 holds a 7W and 25W advantage over the 780G and G35 platforms respectively. Our quad-core results are almost even, with the GeForce 8200 board from Biostar showing only a 1W difference against the Gigabyte 780G setup. The GeForce 8200/Phenom 9550 combo comes in with a 13W advantage over the Q9300 on the ASUS G35 board.
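To put these idle deltas in perspective, here is a rough sketch of what a sustained wattage difference adds up to over a year. The 13W figure is the quad-core idle gap measured above; the $0.10/kWh electricity rate is an illustrative assumption, not a figure from our testing.

```python
# Rough annual energy and cost difference for a sustained wattage delta.
HOURS_PER_YEAR = 24 * 365  # 8760 hours

def annual_kwh(watts: float) -> float:
    """Energy consumed over a year at a constant draw, in kilowatt-hours."""
    return watts * HOURS_PER_YEAR / 1000

def annual_cost(watts: float, price_per_kwh: float = 0.10) -> float:
    """Yearly electricity cost in dollars for a constant draw."""
    return annual_kwh(watts) * price_per_kwh

# 13W quad-core advantage, running 24/7
delta_kwh = annual_kwh(13)    # ~113.9 kWh per year
delta_cost = annual_cost(13)  # ~$11.39 per year at $0.10/kWh
print(f"{delta_kwh:.1f} kWh, ${delta_cost:.2f} per year")
```

Of course, a machine that idles most of the day but is not on around the clock saves proportionally less, so adjust the hours to match your usage.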

Comments

  • spinportal - Friday, April 18, 2008 - link

    Hey Gary, shouldn't the last paragraph title be "Final Thoughts" instead of "First Thoughts"? Or did I read the article backwards? :)
  • Visual - Friday, April 18, 2008 - link

    They do this very often - I understand it as "the product is very new, just launching, and a lot more testing is expected; so far, this is what we think", but I've been confused by it too.
  • Visual - Friday, April 18, 2008 - link

I am way out of the info loop now.
Are there no current NVIDIA IGP chipsets for Intel CPUs?
  • smn198 - Friday, April 18, 2008 - link

Thanks for the article. I found it interesting, and I'm glad that a better-performing IGP doesn't have to mean worse power efficiency.

I'd like to see the performance-per-watt stats you've done before. It would also be good to get an indication of the difference in running costs between the platforms over a year, given some assumptions about typical usage.

As you mentioned, you focused on power, which is important, but there are many more considerations, such as the materials and processes involved in making these components and the impact at end of life (EOL).

    Hope to see more like this!
