Power, Temperature, & Noise

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the Radeon HD 6900 series. This is an area where AMD has traditionally had an advantage, as their small-die strategy leads to less power-hungry and cooler-running products than their direct NVIDIA counterparts. However, NVIDIA has made some real progress lately with the GTX 570, while Cayman is no longer a small die.

AMD continues to use a single reference voltage for their cards, so the voltages we see here represent what we’ll see for all reference 6900 series cards. In this case voltage also plays a big part, as PowerTune’s TDP profile is calibrated around a specific voltage.

Radeon HD 6900 Series Voltages
Ref 6970 (load):      1.175v
Ref 6950 (load):      1.100v
6970 & 6950 (idle):   0.900v

As we discussed at the start of our look at these cards, AMD has been tweaking their designs to take advantage of TSMC’s more mature 40nm process. As a result they’ve been able to bring down idle power usage slightly, even though Cayman is a larger chip than Cypress. For this reason the 6970 and 6950 both can be found at the top of our charts, running into the efficiency limits of our 1200W PSU.

Under Crysis PowerTune is not a significant factor, as Crysis does not generate enough of a load to trigger it. Accordingly our results are rather straightforward, with the larger, more power-hungry 6970 drawing around 30W more than the 5870. The 6950 meanwhile is rated 50W lower and draws almost exactly 50W less. At 292W it's 15W more than the 5850, or effectively tied with the GTX 460 1GB.

Between Cayman’s larger die and NVIDIA’s own improvements in power consumption, the 6970 doesn’t end up being very impressive here. True, it does draw 20W less, but with the 5000 series AMD’s higher power efficiency was much more pronounced.

It’s under FurMark that we finally see the complete ramifications of AMD’s PowerTune technology. The 6970, despite a TDP more than 60W above the 5870’s, still ends up drawing less power than the 5870 due to PowerTune throttling. This puts our FurMark results at odds with our Crysis results, which showed an increase in power usage, but as we’ve already covered, PowerTune tightly clamps power usage to AMD’s TDP, keeping the 6900 series’ worst-case power consumption far below the 5870’s. While we could increase the TDP to 300W we have no practical reason to, as even with PowerTune FurMark still accurately represents the worst-case scenario for a 6900 series GPU.
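The clamping behavior described above can be illustrated with a short sketch. This is purely our own hypothetical model, not AMD's actual PowerTune control loop: the function name, the update step, and the simplifying assumption that power scales linearly with core clock are all ours.

```python
# Illustrative sketch of a PowerTune-style TDP clamp (NOT AMD's actual
# algorithm). Each control interval, estimated board power is compared
# against the TDP; if the cap is exceeded, the core clock is throttled
# just enough to bring power back under the limit, rather than letting
# a worst-case load like FurMark blow past the TDP.

def powertune_step(estimated_power_w, tdp_w, current_clock_mhz,
                   base_clock_mhz, min_clock_mhz=250):
    """Return the core clock to use for the next control interval."""
    if estimated_power_w <= tdp_w:
        # Under the cap: ramp back toward the full base clock.
        return min(base_clock_mhz, current_clock_mhz + 10)
    # Over the cap: scale the clock down in proportion to the overage,
    # assuming (simplistically) that power scales linearly with clock.
    target = current_clock_mhz * (tdp_w / estimated_power_w)
    return max(min_clock_mhz, int(target))

# Example: a FurMark-like load pushing 30W past a 250W TDP forces the
# clock below its base value; a lighter load lets it ramp back up.
print(powertune_step(280.0, 250.0, 880, 880))
print(powertune_step(200.0, 250.0, 850, 880))
```

This is also why raising the cap to 300W would simply let the clamp engage later: the mechanism bounds worst-case draw at whatever TDP it is given.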

Meanwhile at 320W the 6950 ends up drawing more power than its counterpart the 5850, but not by much. Its CrossFire variant meanwhile draws 509W, only 19W over a single GTX 580, driving home the point that PowerTune significantly reduces power usage for high-load programs such as FurMark.

At idle the 6900 series is in good company with a number of other low-power, well-built GPUs. 37-38C is typical for these cards solo; meanwhile our CrossFire numbers conveniently point out that the 6900 series doesn’t do particularly well when its cards are stacked right next to each other.

When it comes to Crysis our 6900 series cards end up performing very similarly to our 5800 series cards, a tradeoff between the better vapor chamber cooler and the higher average power consumption when gaming. Ultimately it’s going to be noise that ties all of this together, but there’s certainly nothing objectionable about temperatures in the mid-to-upper 70s. Meanwhile our 6900 series CF cards approach the upper 80s, significantly worse than our 5800 series CF cards.

Faced once more with FurMark, we see the ramifications of PowerTune in action. For the 6970 this means a temperature of 83C, a few degrees better than the 5870 and 5C better than the GTX 570. Meanwhile the 6950 is at 82C in spite of the fact that it uses a similar cooler in a lower powered configuration; it’s not as amazing as the 5850, but it’s still quite reasonable.

The CF cards on the other hand are up to 91C and 92C despite the fact that PowerTune is active. This is within the cards’ thermal range, but we’re ready to blame the cards’ boxy design for the poor CF cooling performance. You really, really want to separate these cards if you can.

At idle both the 6970 and 6950 are on the verge of running into our noise floor. With today’s idle power techniques there’s no reason for a card to have high idle power usage, or the louder fan that it often leads to.

Last but not least we have our look at load noise. Both cards end up doing quite well here, once more thanks to PowerTune. As with power consumption, we’re looking at a true worst-case scenario for noise, and both cards do very well. At 50.5dB and 54.6dB neither card is whisper quiet, but for the gaming performance they provide it’s a very good tradeoff, and quieter than a number of slower cards. As for our CrossFire cards, the poor ventilation carries over into our noise tests. Once more, if you can separate your cards you should do so for greatly improved temperature and noise performance.


  • mac2j - Wednesday, December 15, 2010 - link

    Um - if you have the money for a 580 ... pick up another $80-100 and get 2 x 6950 - you'll get nearly the best possible performance on the market at a similar cost.

    Also I agree that Nvidia will push the 580 price down as much as possible... the problem is that if you believe all of the admittedly "unofficial" breakdowns ... it costs Nvidia 1.5-2x as much to make a 580 as it costs AMD to make a 6970.

    So it's hard to be sure how far Nvidia can push down the price on the 580 before it ceases to be profitable - my guess is they'll focus on making a 565-type card which has almost 570 performance but at a manufacturing cost closer to what a 460 runs them.
  • fausto412 - Wednesday, December 15, 2010 - link

    yeah. AMD let us down with this product. We see what the GTX 580 is and what the 6970 is... I would say if you're planning to spend $500, the GTX 580 is worth it.
  • truepurple - Wednesday, December 15, 2010 - link

    "support for color correction in linear space"

    What does that mean?
  • Ryan Smith - Wednesday, December 15, 2010 - link

    There are two common ways to represent color, linear and gamma.

    Linear: Used for rendering an image. More generally linear has a simple, fixed relationship between X and Y, such that if you drew the relationship it would be a straight line. A linear system is easy to work with because of the simple relationship.

    Gamma: Used for final display purposes. It's a non-linear colorspace that was originally used because CRTs are inherently non-linear devices. If you drew out the relationship, it would be a curved line. The 5000 series is unable to apply color correction in linear space and has to apply it in gamma space, which for the purposes of color correction is not as accurate.
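Ryan's distinction can be made concrete with a short sketch (ours, not from the article) using the standard sRGB transfer functions: applying a correction after decoding to linear space gives a different, physically meaningful result compared with doing the same arithmetic directly on gamma-encoded values.

```python
# Minimal sketch of linear- vs gamma-space color correction, using the
# standard sRGB (gamma) transfer function and its inverse. The example
# correction (doubling brightness) is illustrative.

def srgb_to_linear(c):
    """Decode a gamma-encoded sRGB value (0..1) to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode a linear-light value (0..1) back to sRGB."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

pixel = 0.5                      # a gamma-encoded (sRGB) pixel value
# Linear-space correction: decode, apply the correction, re-encode.
good = linear_to_srgb(min(1.0, srgb_to_linear(pixel) * 2.0))
# Gamma-space-only correction: scale the encoded value directly,
# which clips to white here and is generally less accurate.
bad = min(1.0, pixel * 2.0)
print(round(good, 3), round(bad, 3))
```

The gap between the two results is the inaccuracy Ryan describes: hardware that can only correct in gamma space (like the 5000 series) is stuck with the second path.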
  • IceDread - Wednesday, December 15, 2010 - link

    Yet again we do not get to see hd 5970 in crossfire despite it being a single card! Is this an nvidia site?

    Anyway, for those of you who do want to see those results, here is a link to a professional Swedish site!

    http://www.sweclockers.com/recension/13175-amd-rad...

    Maybe there is a Google translation available if you want to understand more than the charts show.
  • medi01 - Wednesday, December 15, 2010 - link

    Wow, 5970 in crossfire consumes less than 580 in SLI.
    http://www.sweclockers.com/recension/13175-amd-rad...
  • ggathagan - Wednesday, December 15, 2010 - link

    Absolutely!!!
    There's no way on God's green earth that Anandtech doesn't currently have a pair of 5970's on hand, so that MUST be the reason.
    I'll go talk to Anand and Ryan right now!!!!
    Oh, wait, they're on a conference call with Jen-Hsun Huang.....

    I'd like to note that I do not believe Anandtech ever did a test of two 5970's, so it's somewhat difficult to supply non-existent results for any review.
    Ryan did a single card test in November 2009. That is the only review I've found of any 5970's on the site.
  • vectorm12 - Wednesday, December 15, 2010 - link

    I was not aware of the fact that the 32nm process had been canned completely and was still expecting the 6970 to blow the 580 out of the water.

    Although we can't possibly know, and are unlikely to ever find out, how Cayman at 32nm would have performed, I suspect AMD had to give up a good chunk of performance to fit it on the 389mm^2 40nm die.

    This really makes my choice easy as I'll pickup another cheap 5870 and run my system in CF.
    I think I'll be able to live with the performance until the refreshed cayman/next gen GPUs are ready for prime time.

    Ryan: I'd really like to see what ighashgpu can do with the new 6970 cards though. Although you produce a few GPGPU charts I feel like none of them really represent the real "number-crunching" performance of the 6970/6950.

    Ivan has already posted his analysis on his blog and it seems like the change from VLIW5 to VLIW4 made at most a negligible impact. However I'd really love to see ighashgpu included in future GPU tests to test new GPUs and architectures.

    Thanks for the site and keep up the work guys!
  • slagar - Wednesday, December 15, 2010 - link

    Gaming seems to be in the process of bursting its own bubble. Game graphics aren't keeping up with the hardware (unless you count gaming on 6 monitors) because most developers are still targeting consoles with much older technology.
    Consoles won't upgrade for a few more years, and even then, I'm wondering how far we are from "the final console generation". Visual improvements in graphics are becoming quite incremental, so it's harder to "wow" consumers into buying your product, and the costs for developers are increasing, so it's becoming harder for developers to meet these standards. Tools will always improve and make things easier and more streamlined over time I suppose, but still... it's going to be an interesting decade ahead of us :)
  • darckhart - Wednesday, December 15, 2010 - link

    that's not entirely true. the hardware now allows not only insanely high resolutions, but it also lets those of us with more stringent IQ requirements (large custom texture mods, SSAA modes, etc) to run at acceptable framerates at high res in intense action spots.
