Power, Temperature, & Noise

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the Radeon HD 6900 series. This is an area where AMD has traditionally had an advantage, as their small-die strategy leads to less power-hungry and cooler-running products than their direct NVIDIA counterparts. However, NVIDIA has made real progress lately with the GTX 570, while Cayman is no longer a small die.

AMD continues to use a single reference voltage for their cards, so the voltages we see here represent what we’ll see for all reference 6900 series cards. In this case voltage also plays a big part, as PowerTune’s TDP profile is calibrated around a specific voltage.

Radeon HD 6900 Series Load Voltage
  Ref 6970 load:     1.175v
  Ref 6950 load:     1.100v
  6970 & 6950 idle:  0.900v
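Why does the reference voltage matter so much to the TDP calibration? Dynamic power in CMOS logic scales roughly with frequency times voltage squared, so small voltage differences have an outsized effect on power draw. The sketch below illustrates that relationship using the reference clocks and the load voltages from the table above; the scaling constant is arbitrary and the output is a rough estimate, not AMD's actual calibration.

```python
# Illustrative only: dynamic power in CMOS scales roughly as P = C * f * V^2,
# which is why a card's TDP profile is calibrated around a specific voltage.
# The constant C is arbitrary; only the relative comparison is meaningful.

def dynamic_power(freq_mhz, voltage, c=0.25):
    """Rough relative dynamic power: P ~ C * f * V^2."""
    return c * freq_mhz * voltage ** 2

ref_6970 = dynamic_power(880, 1.175)  # 6970 reference clock and load voltage
ref_6950 = dynamic_power(800, 1.100)  # 6950 reference clock and load voltage

# The 6950's lower clock and lower voltage compound: roughly 20% less
# dynamic power than the 6970 by this simple estimate.
ratio = ref_6950 / ref_6970
```

The square term is the key: the 6950's 75mv voltage drop buys proportionally more power savings than its 80MHz clock deficit alone would suggest.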

As we discussed at the start of our look at these cards, AMD has been tweaking their designs to take advantage of TSMC’s more mature 40nm process. As a result they’ve been able to bring idle power usage down slightly, even though Cayman is a larger chip than Cypress. For this reason the 6970 and 6950 can both be found at the top of our charts, with their low idle draw running into the efficiency limits of our 1200W PSU.

Under Crysis PowerTune is not a significant factor, as Crysis does not generate enough of a load to trigger it. Accordingly our results are rather straightforward, with the larger, more power-hungry 6970 drawing around 30W more than the 5870. The 6950, meanwhile, is rated 50W lower and draws almost exactly that much less in practice. At 292W it’s 15W more than the 5850, or effectively tied with the GTX 460 1GB.

Between Cayman’s larger die and NVIDIA’s own improvements in power consumption, the 6970 doesn’t end up being very impressive here. True, it does draw 20W less, but with the 5000 series AMD’s power efficiency advantage was much more pronounced.

It’s under FurMark that we finally see the complete ramifications of AMD’s PowerTune technology. Even with a TDP more than 60W above the 5870’s, the 6970 still ends up drawing less power than the 5870 thanks to PowerTune throttling. This puts our FurMark results at odds with our Crysis results, which showed an increase in power usage; but as we’ve already covered, PowerTune tightly clamps power usage to AMD’s TDP, keeping the 6900 series’ worst case power consumption far below the 5870’s. While we could increase the TDP to 300W, we have no practical reason to, as even with PowerTune FurMark still accurately represents the worst case scenario for a 6900 series GPU.
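The clamping behavior described above can be sketched as a simple feedback loop: estimate board power each control interval, and pull the core clock down whenever the estimate exceeds the TDP. This is only a toy model; the clocks, power figures, step size, and floor are invented for illustration and are not AMD’s actual control algorithm or calibration.

```python
# Toy model of PowerTune-style throttling: firmware estimates board power
# each control interval and trims the core clock whenever the estimate
# exceeds the configured TDP. All numbers here are illustrative, not AMD's.

def powertune_step(clock_mhz, est_power_w, tdp_w, step_mhz=10):
    """Return the core clock for the next control interval."""
    if est_power_w > tdp_w:
        # Over budget: throttle down, with an assumed 500MHz floor.
        return max(clock_mhz - step_mhz, 500)
    # Under budget: ramp back toward the 6970's 880MHz reference clock.
    return min(clock_mhz + step_mhz, 880)

# FurMark-like load: assume power scales linearly with clock, 250W at full speed.
clock = 880
for _ in range(50):
    est_power = 250.0 * (clock / 880.0)
    clock = powertune_step(clock, est_power, tdp_w=190.0)

# The clock settles around 660-670MHz, holding the estimate near the 190W cap.
```

This also explains why FurMark and Crysis diverge: a game that never pushes the estimate over the TDP never trips the throttle branch, so only pathological loads see reduced clocks.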

Meanwhile at 320W the 6950 ends up drawing more power than its counterpart the 5850, but not by much. Its CrossFire variant draws 509W, only 19W more than a single GTX 580, driving home the point that PowerTune significantly reduces power usage for high-load programs such as FurMark.

At idle the 6900 series is in good company with a number of other low-power, well-built GPUs. 37-38C is typical for these cards solo; meanwhile our CrossFire numbers conveniently illustrate that the 6900 series doesn’t do particularly well when its cards are stacked right next to each other.

When it comes to Crysis our 6900 series cards end up performing very similarly to our 5800 series cards, a tradeoff between the better vapor chamber cooler and the higher average power consumption when gaming. Ultimately it’s going to be noise that ties all of this together, but there’s certainly nothing objectionable about temperatures in the mid-to-upper 70s. Meanwhile our 6900 series CF cards approach the upper 80s, significantly worse than our 5800 series CF cards.

Faced once more with FurMark, we see the ramifications of PowerTune in action. For the 6970 this means a temperature of 83C, a few degrees better than the 5870 and 5C better than the GTX 570. Meanwhile the 6950 is at 82C in spite of the fact that it uses a similar cooler in a lower powered configuration; it’s not as amazing as the 5850, but it’s still quite reasonable.

The CF cards on the other hand are up to 91C and 92C despite the fact that PowerTune is active. This is within the cards’ thermal range, but we’re ready to blame the cards’ boxy design for the poor CF cooling performance. You really, really want to separate these cards if you can.

At idle both the 6970 and 6950 are on the verge of running into our noise floor. With today’s idle power techniques there’s no reason for a card to have high idle power usage, or the louder fan that it often leads to.

Last but not least we have our look at load noise. Both cards end up doing quite well here, once more thanks to PowerTune. As is the case with power consumption, we’re looking at a true worst case scenario for noise, and both cards do very well. At 50.5dB and 54.6dB neither card is whisper quiet, but for the gaming performance they provide it’s a very good tradeoff, and quieter than a number of slower cards. As for our CrossFire cards, the poor ventilation carries over into our noise tests. Once more, if you can separate your cards you should do so for greatly improved temperature and noise performance.


168 Comments


  • 529th - Sunday, December 19, 2010

    Great job on this review. Excellent writing and easy to read.

    Thanks
  • marc1000 - Sunday, December 19, 2010

    yes, that's for sure. we will have to wait a little to see improvements from VLIW4. but my point is the "VLIW processors" count, they went up by 20%. with all other improvements, I was expecting a little more performance, just that.

    but in the other hand, I was reading the graphs, and decided that 6950 will be my next card. it has double the performance of 5770 in almost all cases. that's good enough for me.
  • Iketh - Friday, December 24, 2010

    This is how they've always reviewed new products? And perhaps the biggest reason AT stands apart from the rest? You must be new to AT??
  • WhatsTheDifference - Sunday, December 26, 2010

    the 4890? I see every nvidia config, never a card overlooked there, ever, but the ATI's (then) top card is conspicuously absent. long as you include the 285, there's really no excuse for the omission. honestly, what's the problem?
  • PeteRoy - Friday, December 31, 2010

    All games released today are in the graphic level of the year 2006, how many games do you know that can bring the most out of this card? Crysis from 2007?
  • Hrel - Tuesday, January 11, 2011

    So when are all these tests going to be re-run at 1920x1080 cause quite frankly that's what I'm waiting for. I don't care about any resolution that doesn't work on my HDTV. I want 1920x1080, 1600x900 and 1280x720. If you must include uber resolutions for people with uber money then whatever; but those people know to just buy the fastest card out there anyway so they don't really need performance numbers to make up their mind. Money is no object so just buy nvidia's most expensive card and ur off.
  • AKP1973 - Thursday, October 13, 2011

    Have you guys noticed the "load GPU temp" of the 6870 in XFIRE?... It produced so very low heat than any enthusiast card in a multi-GPU setup. That's one of the best XFIRE card in our time today if you value price, performance, cool temp, and silence.!
  • Travisryno - Wednesday, April 26, 2017

    It's dishonest referring to enhanced 8x as 32x. There are industry standards for this, which AMD, NEC, 3DFX, SGI, SEGA AM2, etc..(everybody) always follow(ed), then nVidia just makes their own...
    Just look how convoluted it is..
