Power, Temperature, & Noise

Last but not least, as always, is our look at the power consumption, temperatures, and acoustics of the Radeon HD 6900 series. This is an area where AMD has traditionally held an advantage, as their small-die strategy leads to less power-hungry and cooler-running products than their direct NVIDIA counterparts. However, NVIDIA has made some real progress lately with the GTX 570, while Cayman is no longer a small die.

AMD continues to use a single reference voltage for their cards, so the voltages we see here represent what we’ll see for all reference 6900 series cards. In this case voltage also plays a big part, as PowerTune’s TDP profile is calibrated around a specific voltage.

Radeon HD 6900 Series Load Voltage
Ref 6970: 1.175v
Ref 6950: 1.100v
6970 & 6950 Idle: 0.900v

As we discussed at the start of our look at these cards, AMD has been tweaking their designs to take advantage of TSMC’s more mature 40nm process. As a result they’ve been able to bring down idle power usage slightly, even though Cayman is a larger chip than Cypress. For this reason the 6970 and 6950 both can be found at the top of our charts, running into the efficiency limits of our 1200W PSU.
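As a side note on the PSU remark: wall-socket measurements include the power supply's conversion losses, and a 1200W unit sits at the bottom of its efficiency curve at idle-level loads. The sketch below uses made-up, illustrative numbers rather than measurements from our testbed, simply to show how much low-load efficiency can inflate a wall reading.

```python
# Hypothetical illustration only: how PSU efficiency at light load inflates
# a wall-socket power reading. The DC load and efficiency figures are made up.
dc_load_w = 160                      # assumed DC draw of an idle test system
for efficiency in (0.75, 0.85):      # plausible low-load PSU efficiencies
    wall_w = dc_load_w / efficiency  # what the power meter at the wall sees
    print(f"{efficiency:.0%} efficient -> {wall_w:.0f}W at the wall")
```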

Under Crysis PowerTune is not a significant factor, as Crysis does not generate a heavy enough load to trigger it. Accordingly our results are rather straightforward, with the larger, more power-hungry 6970 drawing around 30W more than the 5870. The 6950 meanwhile is rated 50W lower and draws almost exactly 50W less. At 292W it's 15W more than the 5850, or effectively tied with the GTX 460 1GB.

Between Cayman's larger die and NVIDIA's own improvements in power consumption, the 6970 doesn't end up being very impressive here. True, it does draw 20W less, but with the 5000 series AMD's efficiency advantage was much more pronounced.

It's under FurMark that we finally see the full ramifications of AMD's PowerTune technology. The 6970, even with a TDP more than 60W above the 5870's, still ends up drawing less power than the 5870 thanks to PowerTune throttling. This puts our FurMark results at odds with our Crysis results, which showed an increase in power usage, but as we've already covered, PowerTune tightly clamps power usage to AMD's TDP, keeping the 6900 series' worst-case power consumption well below the 5870's. While we could raise the TDP limit to 300W, we have no practical reason to, as even with PowerTune active FurMark still accurately represents the worst-case scenario for a 6900 series GPU.
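For readers curious what this clamping looks like mechanically, here is a minimal sketch of a TDP-governed clock cap. It is an illustration under assumed numbers, not AMD's actual PowerTune algorithm: the control step, clock floor, and power estimates are hypothetical, and only the 880MHz reference clock and 250W PowerTune limit come from the 6970's published specifications.

```python
# Minimal sketch of TDP-based clock throttling in the spirit of PowerTune.
# The step size, clock floor, and power estimates are hypothetical; only the
# 880MHz reference clock and 250W cap match the 6970's published specs.

def powertune_step(estimated_power_w, tdp_w, core_clock_mhz,
                   min_clock_mhz=500, max_clock_mhz=880, step_mhz=10):
    """Return the core clock to use for the next control interval."""
    if estimated_power_w > tdp_w:
        # Over the cap: step the clock down to pull power back under the TDP.
        return max(min_clock_mhz, core_clock_mhz - step_mhz)
    # Under the cap: step back up toward the card's rated maximum clock.
    return min(max_clock_mhz, core_clock_mhz + step_mhz)


# Example: a FurMark-like load keeps the estimated board power above the cap,
# so the clock ratchets down until the estimate falls back under 250W.
clock = 880
for estimated_power in (265, 262, 258, 252, 248, 244):
    clock = powertune_step(estimated_power, tdp_w=250, core_clock_mhz=clock)
    print(f"{clock} MHz")
```

The key property, and the reason our FurMark numbers land where they do, is that clocks only fall as far as needed to stay at the cap, so a game like Crysis that never reaches the TDP is left untouched.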

Meanwhile at 320W the 6950 ends up drawing more power than its counterpart the 5850, but not by much. Its CrossFire variant meanwhile draws 509W, only 19W more than a single GTX 580, driving home the point that PowerTune significantly reduces power usage for high-load programs such as FurMark.

At idle the 6900 series is in good company with a number of other low-power, well-built GPUs. 37-38C is typical for these cards on their own; meanwhile our CrossFire numbers conveniently point out that the 6900 series doesn't do particularly well when its cards are stacked right next to each other.

When it comes to Crysis our 6900 series cards end up performing very similarly to our 5800 series cards, a tradeoff between the better vapor chamber cooler and the higher average power consumption when gaming. Ultimately it’s going to be noise that ties all of this together, but there’s certainly nothing objectionable about temperatures in the mid-to-upper 70s. Meanwhile our 6900 series CF cards approach the upper 80s, significantly worse than our 5800 series CF cards.

Faced once more with FurMark, we see the ramifications of PowerTune in action. For the 6970 this means a temperature of 83C, a few degrees better than the 5870 and 5C better than the GTX 570. Meanwhile the 6950 is at 82C in spite of the fact that it uses a similar cooler in a lower powered configuration; it’s not as amazing as the 5850, but it’s still quite reasonable.

The CF cards on the other hand are up to 91C and 92C despite the fact that PowerTune is active. This is within the cards’ thermal range, but we’re ready to blame the cards’ boxy design for the poor CF cooling performance. You really, really want to separate these cards if you can.

At idle both the 6970 and 6950 are on the verge of running into our noise floor. With today's idle power management techniques there's no reason a card needs to have high idle power usage, or the louder fan that often comes with it.

Last but not least we have our look at load noise. Both cards end up doing quite well here, once more thanks to PowerTune. As is the case with power consumption, we're looking at a true worst-case scenario for noise, and both cards handle it well. At 50.5dB and 54.6dB neither card is whisper quiet, but for the gaming performance they provide it's a very good tradeoff, and they're quieter than a number of slower cards. As for our CrossFire cards, the poor ventilation carries over into our noise tests. Once more, if you can separate your cards you should do so for greatly improved temperature and noise performance.

168 Comments

  • Remon - Wednesday, December 15, 2010 - link

    Seriously, are you using 10.10? It's not like 10.11 has been out for a while. Oh, wait...

    They've been out for almost a month now. I'm not expecting you to use the 10.12, as those were released just 2 days ago, but there's no excuse for not using month-old drivers. Testing overclocked Nvidia cards against newly released cards, and now using older drivers. This site gets more biased with each release.
  • cyrusfox - Wednesday, December 15, 2010 - link

    I could be wrong, but 10.11 didn't work with the 6800 series, so I would imagine 10.11 wasn't meant for the 6900 either. If that's the case, it makes total sense why they used 10.10 (because it was the most up-to-date driver available when they reviewed).

    I am still using 10.10e and thinking about updating to 10.12, but why bother when things are working great at the moment. I'll probably wait for 11.1 or 11.2.
  • Remon - Wednesday, December 15, 2010 - link

    Never mind, that's what you get when you read reviews early in the morning. The 10.10e was for the older AMD cards. Still, I can't understand the difference between this review and HardOCP's.
  • flyck - Wednesday, December 15, 2010 - link

    It doesn't. Anand has the same result for 25.. resolutions with max details, AA, and FSAA.

    The presentation on Anand however is more focused on 16x..10.. resolutions (last graph). If you look at the first graph you'll notice the 6970/6950 perform like they do at HardOCP, e.g. the higher the quality, the smaller the gap becomes between the 6950 and the 570, and between the 6970 and the 580; the lower the quality, the more the 580 runs away and the 6970/6950 trail the 570.
  • Gonemad - Wednesday, December 15, 2010 - link

    Oookay, new card from the red competitor. Welcome aboard.

    But, all this time, I've had to ask: why is Crysis so punitive on graphics cards? I mean, it was released eons ago, and it still can't be run with everything cranked up on a single card, if you want 60fps...

    Is it sloppy coding? Does the game *really* look better with all the eye candy? Or did they build an “FPS bug” on purpose, some method of coding that was sure to torture any hardware built in the 18 months after release?

    I will get slammed for this, but for instance, the water effects in Half Life 2 look great even on lower-spec cards once you turn all the eye candy on, and the FPS doesn't drop that much. The same goes for some subtle HDR effects.

    I guess I should see this game by myself and shut up about things I don't know. Yes, I enjoy some smooth gaming, but I wouldn't like to wait 2 years after release to run a game smoothly with everything cranked up.

    Another one is Dirt 2. I played it with all the eye candy turned up to the top, and my 5870 dropped to 50-ish FPS (as per benchmarks); it was noticeable at times. I turned one or two things off, checked that I didn't miss them after another run, and the in-game FPS meter jumped to 70. Yay.
  • BrightCandle - Wednesday, December 15, 2010 - link

    Crysis really does have some fabulous graphics. The amount of foliage in the forests is very high. Crysis kills cards because it really does push current hardware.

    I've got Dirt 2 and it's not close in the level of detail. It's a decent looking game at times but it's not a scratch on Crysis for the amount of stuff on screen. Half Life 2 is also not bad looking but it still doesn't have the same amount of detail. The water might look good, but it's not as good as a PC game can look.

    You should buy Crysis, it's £9.99 on Steam. It's not a good game IMO but it sure is pretty.
  • fausto412 - Wednesday, December 15, 2010 - link

    yes...it's not much of a fun game but damn it is pretty
  • AnnihilatorX - Wednesday, December 15, 2010 - link

    Well, the original Crysis did push things too far and could have used more optimization. Crysis Warhead is much better optimized while giving nearly identical visuals.
  • fausto412 - Wednesday, December 15, 2010 - link

    "I guess I should see this game by myself and shut up about things I don't know. Yes, I enjoy some smooth gaming, but I wouldn't like to wait 2 years after release to run a game smoothly with everything cranked up."

    that's probably a good idea. Crysis was made with future hardware in mind. It's like a freaking tech demo. Ahead of its time and beaaaaaautiful. Check it out on max settings... then come back and tell us what you think.
  • TimoKyyro - Wednesday, December 15, 2010 - link

    Thank you for the SmallLuxGPU test. That really made me decide to get this card. I make 3D animations with Blender in Ubuntu, so the only thing holding me back is the driver support. Do these cards work in Ubuntu? Is it possible for you to test whether the Linux drivers work at this time?
