Power, Temperature, and Noise: How Loud Can One Card Get?

Last but not least, as always, is our look at the power consumption, temperatures, and acoustics of the Radeon HD 6990. This is an area where AMD has traditionally had an advantage, as their small-die strategy leads to products that are less power-hungry and run cooler than their direct NVIDIA counterparts. Dual-GPU cards like the 6990 tend to amplify the benefits of lower power consumption, but heat and noise are always a wildcard.

AMD continues to use a single reference voltage for their cards, so the voltages we see here represent what we’ll see for all reference 6900 series cards. In this case voltage also plays a big part, as PowerTune’s TDP profile is calibrated around a specific voltage.

Radeon HD 6900 Series Voltage

6900 Series Idle    6970 Load    6990 Load
0.9v                1.175v       1.12v

The 6990 idles at the same 0.9v as the rest of the 6900 series. At load under default clocks it runs at 1.12v thanks to AMD’s chip binning, which is a big part of why the card uses as little power as it does for its performance. Overclocked to 880MHz, however, the core voltage rises to 1.175v, the same as the 6970. Power consumption and heat generation shoot up accordingly, exacerbated by the fact that PowerTune is not in use here.

The 6990’s idle power is consistent with the rest of the 6900 series. At 171W it’s at parity with the 6970CF, while the 6990’s lower idle TDP shows up as a 9W advantage over the 5970.

With the 6990, load power under Crysis gives us our first indication that TDP alone can’t be used to predict total power consumption. With a 375W TDP the 6990 should consume less power than the 2x200W 6950CF, but in practice the 6950CF setup consumes 21W less. Part of this comes down to the greater CPU load the 6990 can create by allowing for higher framerates, but that doesn’t completely explain the disparity. Compared to the 5970 the 6990 is also much higher than TDP alone would indicate; the gap of 113W exceeds the 75W TDP difference. Clearly the 6990 is a more power hungry card than the 5970.
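The mismatch between the rated numbers and what we measure at the wall is simple arithmetic. A quick sketch, using the TDP figures from the specs and the measured deltas quoted above, shows why TDP alone is a poor predictor here:

```python
# Comparing rated TDP deltas against measured wall-power deltas under Crysis.
# TDPs are the rated board figures; measured deltas are from our charts above.

tdp = {"6990": 375, "6950CF": 2 * 200, "5970": 300}  # watts

# Measured differences at the wall (from the text above):
measured_delta_6950cf = 21   # the 6950CF setup draws 21W LESS than the 6990
measured_delta_5970 = 113    # the 6990 draws 113W MORE than the 5970

# What TDP alone would predict:
tdp_delta_6950cf = tdp["6950CF"] - tdp["6990"]  # +25W: 6950CF should draw MORE
tdp_delta_5970 = tdp["6990"] - tdp["5970"]      # a 75W gap over the 5970

print(f"6950CF: predicted +{tdp_delta_6950cf}W, measured -{measured_delta_6950cf}W")
print(f"5970 gap: predicted {tdp_delta_5970}W, measured {measured_delta_5970}W")
```

The sign even flips in the 6950CF comparison: TDP predicts the CF setup should draw 25W more, while it actually draws 21W less.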

Meanwhile overclocking does send the power consumption further up, this time to 544W. This is better than the 6970CF at the cost of some performance. Do keep in mind though that at this point we’re dissipating 400W+ off of a single card, which will have repercussions.

Under FurMark PowerTune limits become the defining factor for the 6900 series. Even with PT triggering on all three 6900 cards, the numbers have the 375W 6990 drawing more than the 2x200W 6950CF, this time by 41W, with the 6970CF in turn drawing 51W more. All things considered the 6990’s power consumption is in line with its performance relative to the other 6900 series cards.

As for our 6990OC, overclocked and without PowerTune we see what the 6990 is really capable of in terms of power consumption and heat. 684W is well above the 6970CF (which has PT intact), and is approaching the 570/580 in SLI. We don’t have the ability to measure the power consumption of solely the video card, but based on our data we’re confident the 6990 is pulling at least 500W – and this is one card with one fan dissipating all of that heat. Front and rear case ventilation starts looking really good at this point.
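That 500W figure is an inference from total system draw, not a direct measurement. A back-of-the-envelope sketch shows the shape of the estimate; the platform-draw and PSU-efficiency numbers here are illustrative assumptions, not measured values:

```python
# Rough estimate of card-only draw from total wall power.
# ASSUMPTIONS (illustrative, not measured): the rest of the system draws
# ~80W under a GPU-bound load, and the PSU is ~85% efficient at this load.

wall_power = 684        # measured total system draw under FurMark, OC (watts)
platform_draw = 80      # assumed CPU/motherboard/drive draw (watts)
psu_efficiency = 0.85   # assumed PSU efficiency at this load

dc_power = wall_power * psu_efficiency  # power actually delivered by the PSU
card_draw = dc_power - platform_draw    # what's left is the video card

print(f"Estimated card draw: ~{card_draw:.0f}W")
```

Under these assumptions the card lands just over 500W; even with a more generous platform-draw estimate it stays comfortably above 450W, which is why we're confident in the figure.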

Along with the 6900 series’ improved idle TDP, AMD’s dual-exhaust cooler makes its mark on idle temperatures versus the 5970. At 46C the 6990 is warmer than our average card but not excessively so, and in the meantime it’s 7C cooler than the 5970 which has to contend with GPU2 being cooled with already heated air. A pair of 6900 cards in CF though is still going to beat the dual-exhaust cooler.

When the 5970 came out it was warmer than the 5870CF; the 6990 reverses this trend. At stock clocks the 6990 is a small but measurable 2C cooler than the 6970CF, which as a reminder we run in a “bad” CF configuration by having the cards directly next to each other. There is a noise tradeoff to discuss, but as far as temperatures are concerned these are perfectly reasonable. Even the 6990OC is only 2C warmer.

At stock clocks FurMark does not significantly change the picture. If anything it slightly improves things, as PowerTune helps to keep the 6990 in the middle of the pack. Overclock it, however, and the story changes. Without PowerTune to keep power consumption in check, that 684W power consumption catches up to us in the form of 94C core temperatures. It’s only a 5C difference, but it’s as hot as we’re willing to let the 6990 get. Further overclocking on our test bed is out of the question.

Finally there’s the matter of noise to contend with. At idle nothing is particularly surprising; the 6990 is an iota louder than the average card, presumably due to the dual-exhaust cooler.

And here’s where it all catches up to us. The Radeon HD 5970 was a loud card, the GTX 580 SLI was even louder, but nothing tops the 6990. The laws of physics are a cruel master, and at some point all the smart engineering in the world won’t completely compensate for the fact that you need a lot of airflow to dissipate 375W of heat. There’s no way around the fact that the 6990 is an extremely loud card; and while games aren’t as bad as FurMark here, it’s still noticeably louder than everything else on a relative basis. Ideally the 6990 requires good airflow and good noise isolation, but the former makes the latter difficult to achieve. Water cooled 6990s will be worth their weight in gold.

130 Comments

  • Figaro56 - Tuesday, March 08, 2011 - link

    I have 2 XFX HD 5870 cards for sale. I have a double lifetime warranty on these, so you get the use of the second lifetime warranty. Interested? They are great performers, I can vouch for that. I'm used to upgrading my GPU on an annual basis, so I am upgrading to 2 HD 6970s. $230 each.
  • Thanny - Tuesday, March 08, 2011 - link

    Ignoring the inappropriateness of advertising here, I submit:

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    Why would someone pay you $230 for a used product that can be obtained new at $190?
  • fausto412 - Tuesday, March 08, 2011 - link

    I kinda wanted to see a chart with the most common gaming resolution... and can we benchmark with a Q9550 just for comparison? I would love to know if I'm holding back a video card by not going i5 or i7, and by how much.
  • jabber - Tuesday, March 08, 2011 - link

    If you can afford a 6990, why would you bother using it with a Q9550 at 1680x1050? Hence why it isn't part of this review.

    This review is to show how it works for the intended market/customer.

    As I said before, this card isn't for folks like you (or me for that matter). Sorry.
  • 7Enigma - Tuesday, March 08, 2011 - link

    The most common gaming resolution for this card is the one Ryan tested. It is pointless to test at a lower resolution other than possibly true 24" (1920x1200). And even at that res this card is really not needed.
  • Figaro56 - Tuesday, March 08, 2011 - link

    BOYA to both of those resolutions. You should be playing your games at 2560x1600. Now that's what I'm talkin about! You'd be saying hell ya.
  • Jorgisven - Tuesday, March 08, 2011 - link

    It seems we're getting into the Pentium IV trap, a bit. Big, hot, power-hungry, noisy chips... personally, I'm going to pass on this generation of GPUs. I'm waiting for a revolution in either manufacturing or coding. It's all well and good to have a fast computer for getting what you need done in minimal time, but at the risk of the box taking flight because the fans are now of jet engine proportions in speed and power, I'd rather not hear my fans over my headphones... or risk my cat getting sucked into the intake.
  • jabber - Tuesday, March 08, 2011 - link

    Well we've kinda got what we asked for. We've all gamely been buying more and more powerful graphics cards with regards to brute force rendering power.

    We've shown we love buying 750w+ power supplies with multiple GPU connectors, buying SLI and Xfire setups galore.

    So the GPU corps think we love nothing more than just piling on more and more power and wattage to solve the situation.

    It works both ways.

    What we should have been doing was challenging AMD and Nvidia to develop smarter rendering techniques. Had either of them developed PowerVR to the state we are in today we would be in a far better place. Chances are the most power hungry card we'd have today would be 5770 level.

    We need something more efficient like PowerVR to take us to the next level.

    Less brute force and more finesse.
  • therealnickdanger - Tuesday, March 08, 2011 - link

    Are you waiting to update your test system until the SATA port issue is corrected? Seems to me that anyone wanting to buy this card would also be using an overclocked 2600K... According to the Bench numbers, the 2600K offers roughly 30% more frames than the 920, depending on the game. That indicates to me that your test system is insufficient to properly test this card.

    Granted, since the vast majority of displays are fixed at 60Hz, fps counts beyond that don't really matter, but I have to wonder what impact this would have on folks with 120Hz-native LCDs. That extra 30% could make the difference.

    ... just sayin'. :)
  • Ryan Smith - Tuesday, March 08, 2011 - link

    At this point we're waiting on SNB-E. SNB is very nice, but for a GPU testbed the lack of PCIe bandwidth is an issue.
