Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, with noise in particular often being the deciding factor. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

AMD for their part is clearly focusing on all of this with the 7990, both compared to the 6990 before it and to the unofficial 7990s that are already on the market. On the power front their binning has enabled them to get a dual-GPU Tahiti card out at 375W. Meanwhile on the temperature front, the style of open air cooler used on the 7990 typically affords good-to-great thermal performance with very little noise relative to that performance. AMD has measured their 7990 as being even quieter than GTX Titan, so we’ll see just how that pans out in our tests.

Radeon HD 7990 Voltages

7990 Max Boost    7990 Base    7990 Idle
1.2v              1.17v        0.85v

With AMD binning chips to put together the 7990, it comes as no surprise that voltages are lower than the 7970GE’s. The load voltage in the 1GHz boost state is 1.2v, and it drops to 1.17v for the 950MHz base state. This, coupled with the lower leakage of AMD’s selected GPUs, is where the bulk of the hard work is done in keeping the 7990’s power consumption down.
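As a rough illustration of why binning for lower voltage pays off, dynamic power in a CMOS part scales approximately with V²·f. The sketch below uses this first-order approximation together with the voltage and clock figures above; it is not AMD’s actual power model, and static (leakage) power is ignored entirely:

```python
def relative_dynamic_power(v: float, f_mhz: float,
                           v_ref: float = 1.2, f_ref_mhz: float = 1000.0) -> float:
    """First-order CMOS estimate (P ~ C * V^2 * f), expressed relative
    to a reference state -- here the 7990's 1GHz / 1.2v boost state."""
    return (v / v_ref) ** 2 * (f_mhz / f_ref_mhz)

# Dropping from the boost state (1GHz @ 1.2v) to the base state
# (950MHz @ 1.17v) cuts dynamic power by roughly 10% in this model.
print(f"{relative_dynamic_power(1.17, 950):.3f}")  # ~0.903
```

Even a 0.03v drop compounds with the clockspeed reduction, which is why binning for voltage is such effective leverage on a 375W card.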

Normally this is the part where we look at voltage versus clockspeed in-depth, but as Tahiti is an older PowerTune 1.0 part, it’s still based around a handful of DPM states with a number of inferred states in between. As a result we don’t have a good idea of what the 7990 is running at for clockspeeds at any given moment; only that it frequently jumps between the boost state and the high state in most games. Looking at our performance relative to the 7970GE CF, the performance gap in most games is clearly larger than the clockspeed gap, so the 7990 is likely spending much of its time at 950MHz or lower.

This is also a loss for AMD of course, since they have to maintain their 1.2v/1.17v voltages even at throttled clockspeeds. ZeroCore already keeps idle power on the 7990 in check, and in the next generation of AMD GPUs PowerTune 2.0 should have a similarly noticeable impact on load clockspeeds and power consumption.

On a final note before jumping into our results: although our PowerColor 7990 was a loaner for the Titan launch and has since been returned – meaning we can’t update its performance results – its power/temp/noise results are still valid. So in our forthcoming results we’ll also be looking at how the official 7990 compares to its officially unofficial predecessor.

Idle Power Consumption

Idle power consumption is a bit of a head-scratcher at first. Power consumption is clearly down versus the 6990, and even the GTX 690 draws 5W more at the wall. On the other hand it’s 12W higher than the 7970GE CF. To be clear here the 2nd GPU is definitely powering down on the 7990 – there’s actually a small green LED on the back of the card to indicate when ZeroCore is active – so what we’re seeing is the power consumption of the first GPU, the trickle of power the second GPU pulls, and then everything else.

Among other things the 7990 doesn’t get to power its fans down like the second card in a CrossFire configuration does, so that’s part of the difference. Furthermore there’s the PLX bridge on the 7990 that has to be accounted for, and any differences due to the use of Volterra VRMs. Finally there’s the dumb luck component: some GPUs and cards are just better than others, and this goes for both load and idle.

In any case, despite our initial surprise the 7990 is still the clear winner here as far as dual-GPU cards go. It’s not as low as we were initially expecting, but it’s a clear improvement over both the 6990 and PowerColor’s 7990.

Load Power Consumption - Battlefield 3

Moving on to load power, the 7990 draws right about what we expected it to. The 7990 is a 375W card, and indeed it’s drawing about 75W more than its closest competition, the 300W GTX 690. For AMD this is a mixed blessing: it means they have 75W more thermal headroom to play with to outperform the competition, but it also means they need to actually deliver that performance to justify the power difference. In BF3 that’s not what we saw at 2560 and 5760, but of course the leader and the performance gap between the GTX 690 and 7990 change on a per-game basis.

In any case, PowerTune and AMD’s binning are making their presence felt here. For its slightly lower performance the 7990 draws almost 100W less than the 7970GE CF at the wall, and compared to the PowerColor 7990 it’s still 21W less.

Load Power Consumption - FurMark

FurMark ended up being harder on the 7990 than we expected. Both NVIDIA and AMD throttle it of course, but we’ve never seen it throttled as hard as on the 7990. The 7990 completely stalled at times and momentarily dropped to its medium power state (500MHz) while running FurMark. The results are still enlightening since we’re clearly hitting its peak power limits regardless, but FurMark isn’t a very good sustained load in this case.

Nevertheless our power measurements were roughly as expected. Discounting the difference between AMD and NVIDIA in throttling, the 7990 saves 160W-180W compared to the 7970GE CF and the PowerColor 7990. For the slight performance loss this is a very good tradeoff, and a clear example of why it’s so important that AMD officially got into the dual-GPU game on the 7000 series, as this kind of result can only be achieved through careful binning by the GPU manufacturer.

Idle GPU Temperature

When it comes to idle temperatures the 7990 is decent, but it’s nothing particularly fantastic. 39C is warmer than what we typically see with open air cards, and for that matter it’s warmer than GTX 690 despite the latter’s higher power draw. On the other hand it’s another clear improvement over the 6990, which idles at 45C.

Load GPU Temperature - Battlefield 3

As for load temperatures, open air coolers have proven to be quite capable, but of course a lot depends on how the card has been tuned. For multi-GPU cards 80C+ is simply a given to maximize performance and minimize noise, and indeed that’s exactly what we’re seeing here. 82C puts the 7990 in the company of both the GTX Titan and GTX 690, which is good company to be in. 82C is well below the limit for Tahiti, and as an added bonus for overclockers this means there’s at least some thermal headroom to play with.

Load GPU Temperature - FurMark

FurMark provides us with similar temperatures. It’s actually a bit cooler here than under BF3 due to the inconsistent load presented by AMD’s heavy throttling, but it’s not too far off the mark. It is a good reminder, however, of another one of the benefits of a single card over multiple cards in CrossFire in a tight case: nothing is getting suffocated by a second card.

Idle Noise Levels

Finally we come to our look at noise levels, one of the areas AMD has heavily focused on and is rather proud of. The 6990 was a beast at both idle and load, so there’s a lot of room – and a lot of need – for improvement here. When it comes to idle, AMD has clearly met their mark; 39.2dB is essentially whisper quiet, and is actually marginally quieter than the open air cooled 7970GE we use. It’s even a bit quieter than the GTX 690, if only by 0.8dB. Compared to the 45.5dB 6990 it’s a massive difference; the 6.3dB drop works out to nearly ¼ the noise in terms of acoustic power.
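For reference, the “¼ the noise” figure falls out of the logarithmic decibel scale: a difference in sound pressure level converts to an acoustic power ratio as 10^(ΔdB/10). A quick sketch using the measurements above:

```python
def db_to_power_ratio(db_a: float, db_b: float) -> float:
    """Acoustic power of source A relative to source B,
    given their sound levels in dB."""
    return 10 ** ((db_a - db_b) / 10)

# 6990 idle (45.5dB) vs. 7990 idle (39.2dB): a 6.3dB gap
ratio = db_to_power_ratio(45.5, 39.2)
print(f"{ratio:.2f}")  # ~4.27, i.e. the 7990 emits roughly 1/4 the power
```

Note that perceived loudness scales differently – roughly every 10dB reads as “twice as loud” to the ear – so the acoustic power ratio overstates the subjective difference.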

Load Noise Levels - Battlefield 3

Moving on to load noise, however, things become more of a mixed bag than a straight-up victory. Once more AMD has greatly improved over the 6990; though we can’t plot it on this chart, in our old gaming noise test the 6990 hit 66dB in both game and pathological testing. So 54.8dB is downright quiet in comparison. Overall this is louder than our high-end single-GPU cards such as the 7970 and GTX 680, but it’s otherwise an improvement over the likes of the GTX 680 SLI and the PowerColor 7990.

The downside here for AMD is that their goal was to beat Titan on noise; by our reckoning they’re well off the mark. 3.8dB isn’t an extreme difference, but it’s a very real difference. 7990 isn’t quieter than Titan, and in fact it’s essentially tied with 690. Now 690 was a quiet card for a dual-GPU card and 7990 joins it in that club, so this is still a good result for the 7990. It’s just not what they were shooting for.

Load Noise Levels - FurMark

Noise measurements under FurMark end up being a wash due to the inconsistent load presented by AMD’s throttling on the 7990. 55.3dB is about right for the GTX 690 at its peak; 52.5dB just isn’t right for the 7990, since we hit 54.8dB with BF3. However, pulling in this data does draw out something else: the 7990 at its loudest is quieter than the reference 7970, by nearly 4dB. We’re ultimately comparing a full blower to an open air cooler, but the difference is still staggering.

Ultimately AMD hasn’t been able to best Titan in our noise measurements, but the 7990 is nonetheless among the quietest multi-GPU cards we’ve ever tested. We’ll break down the messy issue of power versus performance in our conclusion, but for now it’s clear that AMD has delivered on most of their noise goals, making the 7990 an incredible improvement over both the 6990 and the PowerColor 7990.



Comments

  • HisDivineOrder - Wednesday, April 24, 2013 - link

    They bring a fantastic cooler that prioritizes silence, and the convenience of having SLI in a system that doesn't have two PCIe slots available. Plus, you always had the option of quad-SLI, which is a little harder to do with four 680s.

    That said, I think anyone buying a 690 over a Titan now is pretty stupid. It's not about the speed difference. It's that if you're in the market for a $1k GPU, go for the one that won't be running out of memory with next year's PS4/next Xbox ports.
  • extremesheep - Wednesday, April 24, 2013 - link

    Table typo...should the first be "7990"?
  • extremesheep - Wednesday, April 24, 2013 - link

    Err...should page 1, table 1, column 1 be "7990" instead of "7970"?
  • Ryan Smith - Wednesday, April 24, 2013 - link

    You may be seeing an old, cached copy. That was fixed about 25 minutes ago.
  • code65536 - Wednesday, April 24, 2013 - link

    Any chance we could get Tomb Raider in future benchmark tests?
  • Ryan Smith - Wednesday, April 24, 2013 - link

    In the desktop tests? No. We keep the tests capped at 10 so that it's a manageable load when we need to redo everything, such as with the 7990 launch. At this point the desktop benchmark suite is set for at least the immediate future.
  • VulgarDisplay - Wednesday, April 24, 2013 - link

    4th paragraph: Incorrectly stated that Tahiti has 48 ROPs.
  • Flamencor - Wednesday, April 24, 2013 - link

    What a mediocre review! In your conclusions, you mention nothing about how AMD absolutely spanked NVIDIA in compute performance and synthetics! It is 75 watts more power hungry, and in exchange you get substantially more memory and a total win on compute and synthetics! I know synthetics aren't actual gaming numbers, but they're indicative of how the card will stand up to future games. The fact that the card has far better synthetics says a lot about its longevity. The card looks like a great card (although quite late)! I'm no fanboy, but why can't people just write a legitimately upbeat and positive review about an amazing part?
  • Warren21 - Wednesday, April 24, 2013 - link

    Ryan typically has a slight undertone of NVIDIA bias; it can be found in most of his articles. That being said, the GTX 600 series are some amazing cards. I'd love to have a GK104-based card to replace my aged 6870 1GB.
  • CiccioB - Wednesday, April 24, 2013 - link

    This kind of compute benchmark based on OpenCL is quite useless. No professional applications use OpenCL, and NVIDIA doesn't really put all its effort into optimizing their OpenCL drivers.
    You may be surprised to know that REAL applications that really need GPU-assisted computation use CUDA. And thus you have the option to use NVIDIA GPU computation or nothing else.
    So much for how good OpenCL is. It may be open, and it may be something AMD needs to show good (useless) graphs, but in the real world no one is going to use it for serious stuff.
    3D renderers are a meaningful example: apart from the useless SmallLuxMark benchmark, professional engines use CUDA. AMD is not there with whatever "devastating" computational solution you may believe they have. That's why NVIDIA holds more than 80% of the professional market and is the only one with GPU solutions for HPC, while AMD just struggles to sell consumer products.

    By the way, good review, though a double Titan setup may have made it more interesting (especially for power consumption) :)
