Our test setup remains the same as in the idle power tests, with the Vista power profile set to Balanced. Our feature title is Crank, with a video bitrate of up to 37Mbps and either an uncompressed PCM soundtrack rated at 6144Kbps or a Dolby Digital Surround 5.1 EX track at 640Kbps.

Consumption - Minimum Spec CPU

Consumption - Dual Core

Consumption - Quad Core

Much like our idle tests, the advantage goes to the Biostar TF8200 A2+ board equipped with the GeForce 8200 chipset. The results are an 18W and 25W advantage with the low-end CPUs, 8W and 15W with the dual cores, and an outstanding 10W and 24W difference with the quad cores.

Our initial CPU utilization testing shows the GeForce 8200 and 780G chipsets to be nearly identical, so we can rule out any meaningful differences in this area. We are still analyzing the new image quality results, but they appear very close so far. We will temper our excitement for now, as new drivers that implement several missing features should be available shortly. We will revise our numbers if there are any significant changes.


  • Darth Farter - Saturday, April 19, 2008 - link

    awesome, over here where it's US$0.30/kWh you can understand that it will start to make a difference. The only thing I would like to see is undervolting, though that, like overclocking, depends on your mileage. I'm running a G1 Brisbane at 2GHz with 0.975 vcore on a 690G as a 24/7 download/internet box. I wonder what it costs me per month.
  • JarredWalton - Saturday, April 19, 2008 - link

    Given our earlier calculations of $0.10/kWh, tripling the cost of energy means you're looking at savings of up to $30 per year for 24/7 use and a difference of 10W. If you're running a 100W PC 24/7 for a whole year, that PC would cost $262.80 at $0.30/kWh or $87.60 at $0.10/kWh.
  • royalcrown - Saturday, April 19, 2008 - link

    Also, what is going on with the fried MOSFETs? We never did get that weekend update ;)
  • royalcrown - Saturday, April 19, 2008 - link

    Why don't you have Anand buy you guys some meters and, on EVERY GFX card or PROCESSOR review, list the actual wattage used by the systems? This NEEDING of at LEAST a 550W PSU is BS for those of us who will never use dual cards.

    I just calculated that my new system at FULL load should draw about 280 watts with an 8800 GT, so a 400 watt supply with 450 peak is fine for me. I read that NVIDIA claims 125 watts on their page, and the real draw is a lot less when they use the meter.

    I for one am sick of these companies pushing monster PSUs when they AREN'T needed in every case, and sites like AnandTech should give us the scoop instead of plastering ads for 1200W PSUs and not telling readers that we may not even need 550.
  • Zaranthos - Saturday, April 19, 2008 - link

    That's a fact. I'm so sick of seeing insanely large power supplies shoved down people's throats. I keep upgrading my computer, and my 300W power supply keeps running it just fine. You'd think that wasn't even possible judging by most of the reviews/ads/propaganda. I'd like to see tests showing what the minimum power supply requirements are.
  • JarredWalton - Saturday, April 19, 2008 - link

    You mean like our PSU reviews where we repeatedly state that the only way you can even come near the point where a 1000W PSU is required is if you're heavily (i.e. water- or phase-cooling) overclocking your quad-core CPU and running 3-way or 4-way GPUs?

    Most PSUs are at maximum efficiency around the 50% load mark, but even at 30% load the good PSUs are above 83% efficiency. Couple that with the fact that a 600W PSU is generally quieter delivering 150W than a 300W PSU delivering the same wattage, and there are reasons to buy higher-spec PSUs. The biggest reason to buy a higher-spec PSU, of course, is that it's very difficult to find good-quality PSUs rated under 400W. (Seasonic and the Seasonic-built PSUs are about the only options.)

    All that totally overlooks the fact that *testing* with a highly rated 520W PSU is not the same as saying the PSU is required. What's important is consistency, and here we are using the same PSU for all tests. It should deliver 80-85% efficiency across the tested power range, which keeps the results well within the margin of error. If we dropped to a 300W Seasonic, power draw might change slightly, but proportionately the results should be nearly identical to what we see in this article.

    Perhaps Gary can chime in here with some comments; I know that he sent me an initial configuration table for this article on Thursday and then changed the PSU and case later that night. The original PSU was a Seasonic unit, so perhaps he ran into some difficulties. Again, not that it really makes a difference.
  • Wirmish - Saturday, April 19, 2008 - link

    Flight Simulator X Test:
    NVIDIA vs. AMD -> 0W to 3W, or ~2%.
    OK... NVIDIA wins by 2%.

    And "watt" about the FPS during these benchmarks?
    Did the NVIDIA 8200 have -2% FPS vs. the AMD 780G?

    And if the 780G is faster, can you underclock it, or overclock the 8200?
    Try it... just to compare consumption at the same performance level.
  • Esben - Saturday, April 19, 2008 - link

    Thanks for shedding light on the current IGP situation. It's great to see NVIDIA is still competitive in the IGP business, consumption-wise. Now we eagerly await the performance numbers.

    Please keep writing about IGPs and power consumption. I'd find it very interesting if you wrote an article about maximizing performance per watt, and how far you can push IGP performance.

    An IGP system fits most people's needs, so the interest is definitely there.
  • jacito - Friday, April 18, 2008 - link

    The article is very well written, and this is going to sound rather stupid, but what does IGP stand for?
  • JarredWalton - Friday, April 18, 2008 - link

    IGP = Integrated Graphics Processor
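The running-cost and PSU-draw arithmetic discussed in the comments above can be sketched as follows. This is a minimal illustration of the math, not measured data; the wattages, tariffs, and efficiency figure are the illustrative numbers quoted in the thread.

```python
# Sketch of the electricity-cost arithmetic from the comments above.
# Assumes constant draw and 24/7 operation; all inputs are illustrative.

HOURS_PER_YEAR = 24 * 365  # 8760 hours

def annual_cost(watts: float, price_per_kwh: float) -> float:
    """Yearly running cost in dollars for a constant draw of `watts`."""
    kwh = watts * HOURS_PER_YEAR / 1000  # watt-hours -> kilowatt-hours
    return kwh * price_per_kwh

def wall_draw(dc_watts: float, efficiency: float) -> float:
    """AC power drawn at the wall for a given DC load and PSU efficiency."""
    return dc_watts / efficiency

# A 100W PC running all year:
print(f"${annual_cost(100, 0.30):.2f} at $0.30/kWh")   # $262.80
print(f"${annual_cost(100, 0.10):.2f} at $0.10/kWh")   # $87.60

# Savings from a 10W reduction at $0.30/kWh:
print(f"${annual_cost(10, 0.30):.2f} saved per year")  # $26.28

# A 150W DC load through an 83%-efficient PSU:
print(f"{wall_draw(150, 0.83):.0f}W at the wall")      # 181W
```

The same functions reproduce the figures quoted in the thread: tripling the tariff from $0.10 to $0.30/kWh triples the annual cost, and a ~10W platform difference works out to roughly $26-$30 per year of 24/7 use.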
