Power Consumption

NVIDIA's idle power optimizations do a great job of reining in its very power hungry parts when sitting in 2D mode. Many people leave their computers on all day, and nobody plays games 24 hours a day, so idle power matters, especially as energy costs rise; taking steps to ensure that less power is drawn when less power is needed is a great direction to move in. AMD's 4870 hardware is less power friendly at idle, but the 4850 is pretty well balanced.

Moving on to load power.

These numbers are the peak power draw we saw over multiple runs of 3DMark Vantage's third feature test (Pixel Shaders). This test heavily loads the GPU while leaving the rest of the system mostly idle, giving us as clear a picture of relative GPU power draw as possible. Playing actual games will push system-level power draw much higher, as the CPU, memory, drives, and other hardware may hit their own peak power draw at the same time. Both 4850 and 4870 CrossFire require large, stable PSUs for real-world gaming.

Clearly the 4870 is a power junkie, posting the second highest peak power draw of any card here (behind only NVIDIA's GTX 280). And while a single 4870 draws more power than the 9800 GX2, quad SLI does peak higher than 4870 CrossFire. The 4850's power draw is on par with its competitors', and 4850 CrossFire does hold an advantage over the 9800 GTX+.

Heat and Noise

These cards get way too hot. I keep burning my hands when I try to swap them out, and Anand seems to enjoy using recently tested 4800 series cards as space heaters. We didn't collect heat data for this article, but our 4850 tests showed that things get toasty, and the 4870 runs hotter still.

The fans are fairly quiet most of the time, but trading some added noise for less system heat might be worthwhile. Quiet operation that makes the rest of the system incredibly hot isn't really the right way to go, as other fans will need to work harder and/or components might start to fail.

The noise level of the 4850's fan is alright, but when the 4870's spins up I tend to glance out the window to make sure a jet isn't about to fly into the building. It is hugely loud at load, but it neither gets there fast nor stays there long. It seems AMD favored cooling things down quickly and then returning to quiet running.

Comments (215)

  • jALLAD - Wednesday, July 9, 2008 - link

    well I am looking forward to a single card setup. SLI or CF is beyond the reach of my pockets. :P

  • Grantman - Friday, July 4, 2008 - link

    Thank you very much for including the 8800gt sli figures in your benchmarks. I created an account especially so I could thank Anand Lal Shimpi & Derek Wilson as I have found no other review site including 8800gt sli info. It is very interesting to see the much cheaper 8800gt sli solution beating the gtx 280 on several occasions.
  • Grantman - Friday, July 4, 2008 - link

    When I mentioned "no other review site including 8800gt sli info" I naturally meant in comparison with the gtx280, gx2 4850 crossfire etc etc.

    Thanks again.
  • ohodownload - Wednesday, July 2, 2008 - link

computer-hardware-zone.blogspot.com/2008/07/ati-radeon-hd4870-x2-specification.html
  • DucBertus - Wednesday, July 2, 2008 - link

    Hi,

    Nice article. Could you please add the amount of graphics memory on the cards to the "The Test" page of the article. The amount of memory matters for the performance and (not unimportant) the price of the cards...

    Cheers, DucBertus.
  • hybrid2d4x4 - Sunday, June 29, 2008 - link

    Hello!
    Long-time reader here that finally decided to make an account. First off, thanks for the great review Anand and Derek, and hats off to you guys for following up to the comments on here.
    One thing that I was hoping to see mentioned in the power consumption section is if AMD has by any chance implemented their PowerXpress feature into this generation (where the discrete card can be turned off when not needed in favor of the more efficient on-board video- ie: HD3200)? I recall reading that the 780G was supposed to support this kind of functionality, but I guess it got overlooked. Have you guys heard if AMD intends to bring it back (maybe in their 780GX or other upcoming chipsets)? It'd be a shame if they didn't, seeing as how they were probably the first to bring it up and integrate it into their mobile solutions, and now even nVidia has their own version of it (Hybrid Power, as part of HybridSLI) on the desktop...
  • AcornArmy - Sunday, June 29, 2008 - link

    I honestly don't understand what Nvidia was thinking with the GTX 200 series, at least at their current prices. Several of Nvidia's own cards are better buys. Right now, you can find a 9800 GX2 at Pricewatch for almost $180 less than a GTX 280, and it'll perform as well as the 280 in almost all cases and occasionally beat the hell out of it. You can SLI two 8800 GTs for less than half the price and come close in performance.

    There really doesn't seem to be any point in even shipping the 280 or 260 at their current prices. The only people who'll buy them are those who don't do any research before they buy a video card, and if someone's that foolish they deserve to get screwed.
  • CJBTech - Sunday, June 29, 2008 - link

Hey iamap, with the current release of HD 4870 cards, all of the manufacturers are using the reference ATI design, so they should all be pretty much identical. It boils down to the individual manufacturer's warranty and support. Sapphire, VisionTek, and PowerColor have all been great for me over the years, and VisionTek is offering a lifetime warranty on these cards. I've had poor experiences with HIS and Diamond, but I probably wouldn't hesitate to get this particular card (or the HD 4850) from either of those manufacturers, because they are the same card, ATI reference.
  • Paladin1211 - Saturday, June 28, 2008 - link

Now that the large, monolithic, underperforming chip is out, leaving AMD free to grab market share, I'm so excited to see what happens next. As nVidia's strategy goes, they're now scaling down the chip. But pardon me, cut the GTX 280 in half and then price it at $324.99? That sounds crazy!

Anyone remember the shock treatment AMD delivered with the codename "Thunder"? DAAMIT has just opened "a can of whoop ass" on nVidia!
  • helldrell666 - Friday, June 27, 2008 - link

AnandTech, why didn't you use an AMD 790FX board to bench the Radeon cards instead of using an nVidia board for both the nVidia and ATI cards? It would be more accurate to bench those cards on compatible boards.
    I think those cards would have worked better on an AMD board based on the Radeon Express 790FX chipset.
