Power Consumption

A single Radeon HD 3870 is quite power efficient, but putting two GPUs on one board inevitably draws more current. It's no surprise that the X2 is not only the hottest card of the bunch but also the loudest; there's simply a lot of heat to expel from an admittedly well-designed card.

If cool and quiet are your goals, the 3870 X2 is not a good answer.

Power Consumption at Idle

Power Consumption under Load (Bioshock 2560 x 1600)

Comments

  • HilbertSpace - Monday, January 28, 2008 - link

    When giving the power consumption numbers, what is included in that? I.e., how many fans, DVD drives, hard drives, etc.?
  • m0mentary - Monday, January 28, 2008 - link

    I didn't see an actual noise chart in that review, but from what I understood, the 3870 X2 is louder than an 8800 SLI setup? I wonder if anyone will step in with a decent aftermarket cooler solution. Personally I don't enjoy playing with headphones, so GPU fan noise concerns me.
  • cmdrdredd - Monday, January 28, 2008 - link

    then turn up your speakers
  • drebo - Monday, January 28, 2008 - link

    I don't know. It would have been nice to see power consumption for the 8800GT SLI setup as well as noise for all of them.

    I don't know that I buy that power consumption would scale linearly, so it'd be interesting to see the difference between the 3870 X2 and the 8800GT SLI setup.
  • Comdrpopnfresh - Monday, January 28, 2008 - link

    I'm impressed. Looking at the power consumption figures, and the gains compared to a single 3870, this is pretty good. They got some big performance gains without breaking the bank on power. How would one of these cards overclock though?
  • yehuda - Monday, January 28, 2008 - link

    No, I'm not impressed. You guys should check the isolated power consumption of a single-core 3870 card:

    http://www.xbitlabs.com/articles/video/display/rad...

    At idle, a single-core card draws just 18.7W (or 23W if you look at it through an 82% efficient power supply). How is it that adding a second core increases idle power draw by 41W?

    It would seem as if PowerPlay is broken.
  • erikejw - Tuesday, January 29, 2008 - link

    ATI smokes Nvidia when it comes to idle power draw.
  • Spoelie - Monday, January 28, 2008 - link

    GDDR4 consumes less power than GDDR3, given that the speed difference is not that great.
  • FITCamaro - Monday, January 28, 2008 - link

    Also, you have to figure in the extra hardware on the card itself that links the two GPUs.
  • yehuda - Tuesday, January 29, 2008 - link

    Yes, it could be that. Tech Report said the bridge chip eats 10-12 watts.
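
As a quick sanity check of the arithmetic in yehuda's comment, here is a minimal Python sketch. The 18.7W idle figure and the 82% PSU efficiency are the numbers quoted from Xbit Labs in that comment, and the ~41W idle delta and 10-12W bridge chip figure come from the thread itself; nothing here is a new measurement.

```python
# Sketch of the PSU-efficiency arithmetic from the comment thread.
# All figures are quoted from the comments above; they are illustrative only.

def wall_draw(dc_watts: float, psu_efficiency: float) -> float:
    """Power pulled from the wall for a given DC load behind a PSU."""
    return dc_watts / psu_efficiency

single_3870_idle_dc = 18.7   # W, isolated idle draw of one HD 3870 (Xbit Labs)
psu_efficiency = 0.82        # assumed 82% efficient power supply

print(f"Single HD 3870 idle at the wall: "
      f"{wall_draw(single_3870_idle_dc, psu_efficiency):.1f} W")
# -> about 22.8 W, i.e. the ~23 W figure quoted in the comment

# The commenter's point: a second GPU plus the bridge chip (reportedly
# 10-12 W) should not account for a ~41 W increase in idle draw at the wall
# if PowerPlay is throttling the idle cores properly.
```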
