Power, Temperature, and Noise: How Loud Can One Card Get?

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the Radeon HD 6990. This is an area where AMD has traditionally had an advantage, as their small-die strategy leads to less power-hungry and cooler-running products than their direct NVIDIA counterparts. Dual-GPU cards like the 6990 tend to amplify the benefits of lower power consumption, but heat and noise are always a wildcard.

AMD continues to use a single reference voltage for their cards, so the voltages we see here represent what we’ll see for all reference 6900 series cards. In this case voltage also plays a big part, as PowerTune’s TDP profile is calibrated around a specific voltage.

Radeon HD 6900 Series Voltage
  6900 Series Idle: 0.9v
  6970 Load: 1.175v
  6990 Load: 1.12v

The 6990 idles at the same 0.9v as the rest of the 6900 series. Under load at default clocks it runs at 1.12v thanks to AMD’s chip binning, which is a big part of why the card uses as little power as it does for its performance. Overclocked to 880MHz, however, the core voltage rises to 1.175v, the same as the 6970. Power consumption and heat generation shoot up accordingly, exacerbated by the fact that PowerTune is not in use here.
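
As a rough illustration of what that voltage bump means, dynamic switching power scales approximately with frequency times voltage squared. The sketch below is a back-of-the-envelope estimate, not a measurement; it assumes the 6990’s 830MHz default core clock and ignores leakage and board losses.

```python
# Rough estimate of how the 6990's 880MHz/1.175v overclock scales dynamic power
# relative to the default clocks. Illustrative only: assumes the stock 830MHz
# core clock and P ~ f * V^2, ignoring leakage and VRM/board losses.

def relative_dynamic_power(f0_mhz, v0, f1_mhz, v1):
    """Approximate power ratio going from (f0, v0) to (f1, v1)."""
    return (f1_mhz / f0_mhz) * (v1 / v0) ** 2

ratio = relative_dynamic_power(830, 1.12, 880, 1.175)
print(f"Estimated dynamic power increase per GPU: {ratio:.2f}x (~{(ratio - 1) * 100:.0f}%)")
# Roughly a 17% increase per GPU before PowerTune (or the lack thereof) enters the picture
```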

The 6990’s idle power is consistent with the rest of the 6900 series. At 171W it’s at parity with the 6970CF, while the 6990’s lower idle TDP shows up as a 9W advantage over the 5970.

With the 6990, load power under Crysis gives us our first indication that TDP alone can’t be used to predict total power consumption. With a 375W TDP the 6990 should consume less power than the 2x200W 6950CF, but in practice the 6950CF setup consumes 21W less. Part of this comes down to the greater CPU load the 6990 can create by allowing for higher framerates, but that doesn’t completely explain the disparity. Compared to the 5970 the 6990 is also much higher than the TDP alone would indicate; the gap of 113W exceeds the 75W TDP difference. Clearly the 6990 truly is a more power-hungry card than the 5970.
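
To illustrate how CPU load and PSU losses muddy the wall-socket picture, here is a back-of-the-envelope sketch; the PSU efficiency and extra CPU power figures below are assumptions for illustration, not values we measured.

```python
# Illustrative only: why total system power at the wall doesn't map cleanly to card TDPs.
# Assumed numbers: ~87% PSU efficiency at this load, and ~25W of additional CPU
# package power when the faster card drives higher framerates.

psu_efficiency = 0.87        # assumed; varies with load and the specific PSU
extra_cpu_power_dc_w = 25    # assumed extra CPU draw on the DC side

extra_wall_power_w = extra_cpu_power_dc_w / psu_efficiency
print(f"~{extra_wall_power_w:.0f}W of the wall-power gap can come from the CPU alone")
# The remainder reflects the cards running closer to their PowerTune limits than
# the TDP labels by themselves would suggest.
```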

Meanwhile overclocking sends power consumption further up, this time to 544W. This is better than the 6970CF, at the cost of some performance. Do keep in mind, though, that at this point we’re dissipating 400W+ off of a single card, which will have repercussions.

Under FurMark, PowerTune limits become the defining factor for the 6900 series. Even with PowerTune triggering on all three 6900 cards, the numbers have the 375W 6990 drawing more than the 2x200W 6950CF, this time by 41W, with the 6970CF in turn drawing another 51W more. All things considered, the 6990’s power consumption is in line with its performance relative to the other 6900 series cards.
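
PowerTune itself is AMD’s hardware mechanism, estimating board power from on-chip activity and adjusting clocks on the fly, so the sketch below is only a conceptual illustration of a power-capping feedback loop; the 830MHz target clock and the 420W uncapped load figure are assumptions, not AMD’s numbers.

```python
# Conceptual sketch of a PowerTune-style power cap: step the core clock down when
# estimated board power exceeds the TDP limit, and step it back up when under it.
# Not AMD's implementation; clock/power figures are illustrative assumptions.

TDP_LIMIT_W = 375
TARGET_CLOCK_MHZ = 830    # assumed default core clock
MIN_CLOCK_MHZ = 500
STEP_MHZ = 10

def next_clock(current_clock_mhz, estimated_power_w):
    """Move the clock one step toward keeping power under the cap."""
    if estimated_power_w > TDP_LIMIT_W:
        return max(MIN_CLOCK_MHZ, current_clock_mhz - STEP_MHZ)
    return min(TARGET_CLOCK_MHZ, current_clock_mhz + STEP_MHZ)

# Simulate a FurMark-like load that would draw ~420W uncapped.
clock, power = TARGET_CLOCK_MHZ, 420.0
for _ in range(20):
    clock = next_clock(clock, power)
    power = 420.0 * (clock / TARGET_CLOCK_MHZ)  # crude linear power-vs-clock model
print(f"Clock hovers around {clock}MHz with ~{power:.0f}W estimated board power")
```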

As for our 6990OC, overclocked and without PowerTune we see what the 6990 is really capable of in terms of power consumption and heat. 684W is well above the 6970CF (which has PowerTune intact) and is approaching the 570/580 in SLI. We don’t have the ability to measure the power consumption of the video card alone, but based on our data we’re confident the 6990 is pulling at least 500W – and this is one card with one fan dissipating all of that heat. Front and rear case ventilation starts looking really good at this point.
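
That 500W figure can be sanity-checked with some rough arithmetic; the PSU efficiency and rest-of-system draw below are assumptions for illustration, not measurements from our test bed.

```python
# Back-of-the-envelope estimate of the overclocked 6990's own draw from the 684W
# wall reading. PSU efficiency and rest-of-system power are illustrative assumptions.

wall_power_w = 684
psu_efficiency = 0.87       # assumed efficiency at this load
rest_of_system_w = 90       # assumed CPU/motherboard/drives draw on the DC side

card_estimate_w = wall_power_w * psu_efficiency - rest_of_system_w
print(f"Estimated card draw: ~{card_estimate_w:.0f}W")
# ~505W under these assumptions, consistent with the "at least 500W" estimate above
```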

Along with the 6900 series’ improved idle TDP, AMD’s dual-exhaust cooler makes its mark on idle temperatures versus the 5970. At 46C the 6990 is warmer than our average card but not excessively so, and meanwhile it’s 7C cooler than the 5970, which has to contend with GPU2 being cooled with already-heated air. A pair of 6900 cards in CF, though, is still going to beat the dual-exhaust cooler.

When the 5970 came out it was warmer than the 5870CF; the 6990 reverses this trend. At stock clocks the 6990 is a small but measurable 2C cooler than the 6970CF, which as a reminder we run in a “bad” CF configuration by having the cards directly next to each other. There is a noise tradeoff to discuss, but as far as temperatures are concerned these are perfectly reasonable. Even the 6990OC is only 2C warmer.

At stock clocks FurMark does not significantly change the picture. If anything it slightly improves things, as PowerTune helps to keep the 6990 in the middle of the pack. Overclock, however, and the story changes. Without PowerTune to keep power consumption in check, that 684W power draw catches up to us in the form of 94C core temperatures. It’s only a 5C difference, but it’s as hot as we’re willing to let the 6990 get. Further overclocking on our test bed is out of the question.

Finally there’s the matter of noise to contend with. At idle nothing is particularly surprising; the 6990 is an iota louder than the average card, presumably due to the dual-exhaust cooler.

And here’s where it all catches up to us. The Radeon HD 5970 was a loud card, the GTX 580 SLI was even louder, but nothing tops the 6990. The laws of physics are a cruel master, and at some point all the smart engineering in the world won’t completely compensate for the fact that you need a lot of airflow to dissipate 375W of heat. There’s no way around the fact that the 6990 is an extremely loud card, and while games aren’t as bad as FurMark here, it’s still noticeably louder than everything else on a relative basis. Ideally the 6990 requires good airflow and good noise isolation, but the former makes the latter difficult to achieve. Water-cooled 6990s will be worth their weight in gold.

Comments

  • iamezza - Tuesday, March 8, 2011 - link

    This could make for an extremely valuable article for gamers on a budget. When does lack of PCIe bandwidth become an issue for running SLI/crossfire?

    Testing 580SLI at 2 x 8 and 2 x 16 modes would be a good place to start....
  • therealnickdanger - Tuesday, March 8, 2011 - link

    It will be interesting to see what impact the bandwidth has... then again, even with the restriction, the current Sandy Bridge systems still dominate the previous chips.

    In reality, 16/16 or 8/8 really doesn't have much impact. The difference even at 2560x1600 with all the fixins in even the most demanding games is <1%. Unless AT's new test system will feature six displays and 4K+ resolutions, I'm not sure SNB-E is worth waiting so long for (yes, that could be perceived as a challenge!).

    In any case, I'm looking forward to it! Thanks for the article!
  • shaggart5446 - Tuesday, March 8, 2011 - link

    I hope you say the same thing when your friend NVIDIA releases their 590 card. I also hope you use the exact same words, that the 590 doesn't make any sense since a pair of 560s or 570s can give you the same performance as the 590. I can't wait to see your article on the 590; I'll be waiting for Anand on this, because we all know the 590 is going to be downclocked.
  • ClownPuncher - Tuesday, March 8, 2011 - link

    With cards designed specifically with multi-monitor gaming in mind, you may want to include those resolutions. Buying this card for 1920x1200 would make zero sense.
  • 7Enigma - Wednesday, March 9, 2011 - link

    I think it was good to have both. Most people buying this card will likely have 30" displays, but I'm sure some (competitive FPS players, for example) will want an extremely fluid display even in busy scenes, as will the person that doesn't yet have the cash to upgrade to a big screen but plans to in the near future.

    I would also argue that there are likely vastly more people playing on large single-screen displays than Eyefinity folks, so this does make more sense. And honestly, when some of the games are averaging in the sub 80-100 fps range, those minimum framerates approach questionable playability depending on the type of game.

    So basically as crazy as it is to say this, the graphical power isn't quite there yet to use Eyefinity at high detail settings in more recent and demanding games.
  • Nentor - Tuesday, March 8, 2011 - link

    "With but a trio of exceptions, the 6990 doesn’t make sense compared to a pair of cards in Crossfire."

    This product is not meant to make any sense from a financial, performance or even practical standpoint.

    It IS the fastest videocard and that is that.

    I was watching a video last night on youtube of a chainsaw powered by a Buick's V8 engine (hG5sTLY0-V8). It went through a tree trunk in the blink of an eye, but it had to be lifted by TWO men.

    Sure is cool though.
  • Squuiid - Sunday, March 13, 2011 - link

    It makes complete sense if you want SLI in a small form factor, mATX and such (as I do).
    PCIe slots are at a premium, and so is space on a mATX board/case.

    However, I think I'm going to wait and see what the 590 looks like...
  • Fhistleb - Tuesday, March 8, 2011 - link

    I didn't even think that was possible. Though with what this is pushing out, it's a little expected I suppose.
  • stangflyer - Tuesday, March 8, 2011 - link

    I would like to see the 6990 and 5970 compared in Crysis and Metro at Eyefinity and single-monitor resolutions, but with the 5970 at default clocks and close to 5870 clocks. When I am playing these games I have my 5970 at 850 core and 1150 memory and it runs all day without any throttling.

    The 5970 is handicapped at its default speeds, as everyone can run at or really close to 5870 speeds. The core is easy at 850, but you may need to back the memory down to 1150 or 1175.

    Would love to see the true difference in the 5970 and 6990 this way.

    The framebuffer will be the big difference at Eyefinity resolutions with any AA applied.
  • stangflyer - Tuesday, March 8, 2011 - link

    One thing I do like about the dual-GPU AMD cards is that I play a few games that use PhysX (I have a 5970). I have a 250 GTS in the second PCIe slot; both my slots are x16. This way I have a powerful GPU and PhysX! I play my games at 5040x1050 and a single card just doesn't cut it. I did use NVIDIA Surround for 2 months but like my Eyefinity setup better. To go CrossFire and then have PhysX you need a motherboard that doesn't knock your PCIe slot down to x8 with 3 cards, which are few and expensive, and also a case that has space for that 3rd card, like a CoolerMaster HAF 932X. I have a HAF 932 (not the X) and I could not go 3 cards unless the 3rd card is single-slot.

    On a side note, the reason I am sticking with my 5970 until the 28nm cards show up is that I like the way the cooler is set up. With the fan on the end, I have my 250 GTS below it with about 3/8 inch of clearance. BUT the 250 GTS is only about 7.5-8 inches long and does not cover the fan at all, because the fan is at the end. I have a 120mm fan at the bottom of my HAF 932 case that blows straight up into the 5970's fan.

    If I used a 6990, the 250 GTS would cover the 6990's fan.

    My choice then would be to sell the 250 GTS and get a single-slot card (a 450 GTS, probably).

    I think I am just going to stay with what I have for now.

    Maybe! LOL!
