Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

For the 290X we’re going to be seeing several factors influence the power, temperature, and noise characteristics of the resulting card. At the lowest level are those items beholden to the laws of physics: mainly, the fact that AMD has increased their die size by 20% while retaining the same manufacturing process, the same basic architecture, and the same boost clockspeeds. As a result there is nowhere for power consumption to go but up, even with leakage having been clamped down on versus 280X/Tahiti. The questions, of course, are by how much, and whether it’s worth the performance increase.

Meanwhile the 290X also introduces the latest iteration of PowerTune, which significantly alters AMD’s power management strategy. Not only does AMD gain the ability to make fine-grained clockspeed/voltage steps, thereby improving their efficiency versus Tahiti, but alongside those improvements is the new PowerTune temperature and fan speed throttling model. AMD will need every bit of that, as we’ll see: they have equipped the 290X with a cooling solution almost identical to that of the 7970 despite the fact that TDP has been increased by roughly 50W, putting an even greater workload on the cooler to move all the heat Hawaii can produce.
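
To make the new throttling model concrete, below is a deliberately simplified sketch of how a PowerTune-style controller might arbitrate between its limits. This is our illustration of the concept only, not AMD’s actual algorithm; the temperature target, fan caps, power limit, and base clock are taken from the behavior described in this review, while the fine-grained step size is an assumption.

```python
# Simplified sketch of a PowerTune-style control loop (our illustration, not
# AMD's actual algorithm). The controller ramps the fan toward a temperature
# target until it hits the mode's cap (40% quiet, 55% uber), after which
# shedding clockspeed/voltage states is the only lever left.
BOOST_MHZ, BASE_MHZ = 1000, 727   # 290X boost clock and inferred base clock
STEP_MHZ = 10                     # fine-grained DPM step (assumed value)

def control_step(clock, temp_c, fan_pct, power_w,
                 temp_target=95, fan_cap=40, power_limit=300):
    """One control interval: adjust the fan first, then the clockspeed."""
    if temp_c > temp_target or power_w > power_limit:
        if temp_c > temp_target and fan_pct < fan_cap:
            fan_pct += 1                             # more airflow while allowed
        else:
            clock = max(BASE_MHZ, clock - STEP_MHZ)  # then shed clockspeed
    elif clock < BOOST_MHZ:
        clock = min(BOOST_MHZ, clock + STEP_MHZ)     # headroom: boost back up
    return clock, fan_pct
```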

Seeing as how we don’t have accurate voltage/VID readings at this time, we’ll jump right into clockspeeds. As we stated in our high level overview of the new PowerTune and the 290X, the 290X has two modes, quiet and uber. Both operate at the same clockspeeds and under the same power restrictions, but quiet mode utilizes a maximum fan speed of 40% while uber mode goes to 55%. That 15 point difference conceals a roughly 1000rpm difference in fan speed, so there are certainly good reasons for AMD to offer both; uber mode can get very loud, as we’ll see. At the same time, however, while quiet mode keeps noise in check, it comes up short on letting the 290X run at its full potential. In quiet mode throttling is inevitable; there’s simply not enough airflow to allow the 290X to sustain 1000MHz, as our clockspeed table below indicates.
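
Before we get to that table, a quick back-of-the-envelope note on those fan speeds. If fan duty cycle maps roughly linearly onto RPM – an assumption on our part, not a published spec – the quoted figures put the two modes at approximately:

```python
# Back-of-the-envelope only: a 15-point duty gap corresponding to roughly
# 1000rpm, with duty assumed to map linearly onto RPM (not a published spec).
rpm_per_point = 1000 / (55 - 40)                       # ~67 rpm per duty point
print(f"quiet (40%): ~{40 * rpm_per_point:.0f} rpm")   # ~2700 rpm
print(f"uber  (55%): ~{55 * rpm_per_point:.0f} rpm")   # ~3700 rpm
```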

Radeon R9 290X Average Clockspeeds

                   Quiet (Default)    Uber
Boost Clock        1000MHz            1000MHz
Metro: LL          923MHz             1000MHz
CoH2               970MHz             990MHz
Bioshock           985MHz             1000MHz
Battlefield 3      980MHz             1000MHz
Crysis 3           925MHz             1000MHz
Crysis: Warhead    910MHz             1000MHz
TW: Rome 2         907MHz             1000MHz
Hitman             990MHz             1000MHz
GRID 2             930MHz             1000MHz
FurMark            727MHz             870MHz

As we noted in our testing methodology section, these aren’t the lowest clockspeeds we’ve seen in those games but rather the average clockspeeds we hit in the final loop of our standard looped benchmark procedures. As such, sustained performance can dip even lower, though by how much will of course depend on ambient temperatures and the cooling capabilities of the chassis itself. We believe our looping benchmarks run long enough to generally reach sustained performance numbers, but in all likelihood some of our numbers on the shortest benchmarks will skew high.
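
For those curious how averages like these are derived in practice, here’s a minimal sketch of the idea, assuming a logging utility that samples the GPU clock over time. The file and column names are hypothetical; this is a reconstruction of the method, not our actual test harness.

```python
# Minimal sketch: average the GPU clock samples recorded during the final
# benchmark loop. File and column names are hypothetical placeholders.
import csv

def average_clock(log_path, loop_start_s, loop_end_s):
    """Average the clock samples that fall inside the final loop's window."""
    samples = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            t = float(row["time_s"])
            if loop_start_s <= t <= loop_end_s:
                samples.append(float(row["gpu_clock_mhz"]))
    return sum(samples) / len(samples)

# e.g. average_clock("290x_crysis3.csv", 240.0, 300.0) -> ~925 (MHz, quiet mode)
```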

Anyhow, as we can see, in everything, even the shortest benchmark, the sustained clockspeeds are below 1000MHz. Out of all of our games Rome 2 fares the worst in this regard, dropping to 907MHz, while other games like Metro and Crysis aren’t far behind at 910MHz-930MHz. FurMark drops lower still, to 727MHz, which we believe to be the 290X’s unlisted base clockspeed, indicating that the card has to drop out of its boost range entirely to bring heat generation in check with quiet mode’s cooling. The 290X simply cannot sustain its peak boost clocks in quiet mode; there’s not enough cooling to handle the estimated 300W of heat the 290X produces at those performance levels.

Which is why AMD has uber mode. In uber mode the fan speeds are high enough (if only just) to provide the cooling necessary to keep up with the 290X in virtually every gaming workload. Only Company of Heroes 2 fails to sustain 1000MHz, and while AMD’s utilities don’t provide all of the diagnostic data we’d like, we strongly suspect we’re TDP limited in CoH2 for a portion of the benchmark run, which is why we can’t sustain 1000MHz there. In any case, for most workloads uber mode should be enough to sustain the 290X’s best performance, though not without a significant noise cost.

This is why we’re so dissatisfied with how AMD is publishing the specifications for the 290X. The lack of a meaningful TDP specification is bad enough, but given the video card’s out of the box (quiet mode) performance, it’s disingenuous at best for the only published clockspeed number to be the boost clock. The 290X simply cannot sustain 1000MHz in quiet mode under full load.

NVIDIA, when implementing GPU Boost, had the sense to advertise not only the base clockspeed, but also an “average” boost clock that in our experience underestimates the real clockspeeds their cards sustain. AMD, on the other hand, is advertising clockspeeds that by default cannot be sustained. Even Intel, by comparison, makes sure to advertise both their base and boost GPU clockspeeds in ARK and their other specification sources, even with the vast gulf between the two numbers on some SKUs.

Given this, we find AMD’s current labeling practices troubling. Although seasoned buyers are going to turn to reviews like ours, where the value of a card will be clearly spelled out with respect to both performance and price, to list only the boost clock is deceitful at best. AMD needs to list the base clockspeed, and they’d be strongly advised to further list an average clockspeed similar to NVIDIA’s boost clock. Even those numbers won’t be perfect, but they would at least be a reasonable compromise over listing an “up to” number that is currently, for all intents and purposes, unreachable.

In any case, let’s finally get to the power, temperature, and noise data.

Idle power is not in AMD’s favor, and next to the Crossfire issues we were seeing in our gaming tests, this appears to be another bug in their drivers. From what we know about GCN and Hawaii, 88W at the wall is too high even after compensating for the additional memory and the larger GPU die. Furthermore, if we plug in a 7970 on the Catalyst 13.11 beta v5 drivers and run the same power test, we find that power consumption rises about 6-8W at the wall versus Catalyst 13.11 beta v1. For reasons that we cannot fully determine, the v5 drivers are causing GCN setups to consume additional power at idle. This is not reflected as a workload on the GPU or the CPU, so it’s not clear where the power leak is occurring (though temperature data points us to the GPU), but somewhere, somehow, AMD has started unnecessarily burning power at idle.

We fully expect that at some point AMD will get this bug fixed, at which point idle power consumption (at the wall) for the 290X should be in the low 80W range. But for the moment 88W is an accurate portrayal of the 290X’s power consumption, making it several watts worse than the GTX 780 at this time.

As a reminder, starting with the 290X we’ve switched from Metro: Last Light to Crysis 3 for our gaming power/temp/noise results, as Metro exhibits poor scaling on multi-GPU setups, leading to GPU utilization dropping well below 100%.

For this review Crysis 3 actually ends up working out very well as a gaming workload, due to the fact that the 290X and the GTX 780 (its closest competitor) achieve virtually identical framerates at around 52fps. As a result the power consumption from the rest of the system should be very similar, and the difference between the two in wall power should be almost entirely due to the video cards (after taking into account the usual 90% efficiency curve).
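
The arithmetic behind that isolation is simple enough to show directly; a quick sketch under the ~90% PSU efficiency assumption above:

```python
# With both systems rendering ~52fps, the wall-power delta converts almost
# entirely into a DC-side delta between the two cards.
PSU_EFFICIENCY = 0.90  # the usual ~90% efficiency curve assumed in the text

def card_delta_w(wall_watts_a, wall_watts_b, eff=PSU_EFFICIENCY):
    """Approximate DC-side power difference implied by a wall-power difference."""
    return (wall_watts_a - wall_watts_b) * eff

print(card_delta_w(375, 327))  # 290X vs GTX 780 systems: ~43W at the cards
```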

With that in mind, as we can see, there’s no getting around the fact that compared to both the 280X and the GTX 780, power consumption has gone up. At 375W at the wall the 290X setup draws 48W more than the GTX 780, 29W more than the GTX Titan, and even 32W more than the most power demanding Tahiti card, the 7970GE. NVIDIA has demonstrated superior power efficiency throughout this generation, and the 290X, though an improvement in its own right, won’t be matching NVIDIA on this metric.

Overall our concern with power on high end cards has more to do with the ramifications of trying to remove/cool that additional heat than with the power consumption itself – though summer does present its own problems – but it’s clear that AMD’s 9% average performance advantage over the GTX 780 is going to come at the cost of more than a 9% increase in power consumption. Versus the GTX Titan, which the 290X generally ties, the 290X is still drawing more power. The fact that AMD is delivering better performance than the GTX 780 should not be overlooked, but neither should the fact that they consume more power while doing so.
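
Put into rough numbers using our wall measurements (and note that system-level wall power dilutes, and therefore understates, the card-level efficiency gap):

```python
# The tradeoff in rough numbers, using our measured wall figures.
perf_ratio = 1.09                  # 290X's ~9% average lead over the GTX 780
power_ratio = 375 / (375 - 48)     # ~1.15x the GTX 780 system's wall power
print(f"relative perf-per-watt: {perf_ratio / power_ratio:.2f}x")  # ~0.95x
```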

FurMark, our pathological case, confirms what we were just seeing with Crysis. Here the 290X’s power consumption has fallen below the GTX 780’s, but only because we know the 290X has had to significantly downclock itself to get there. The GTX 780 throttles here too, and for the same reason, but not as much as the 290X does. Consequently this puts the worst case power scenario for the GTX 780 at worse than the quiet mode 290X, but between this and Crysis the data suggests that the 290X is operating far closer to its limit than the GTX 780 (or GTX Titan) is.

Meanwhile we haven’t paid much attention to the uber mode 290X until now, so now is a good time to do so. The 290X in uber mode still has to downclock for power reasons, but unlike the 290X in quiet mode it stays within its boost range. Based on this we believe the 290X in uber mode is drawing near its peak power consumption in both FurMark and Crysis 3, which, besides neatly illustrating the real world difference between quiet and uber modes in terms of how much heat they can move, means that we should be able to look at uber mode to get a good idea of what the 290X’s maximum power consumption is. To that end, based on this data we believe the PowerTune/TDP limit for the 290X is 300W, 50W higher than the “average gaming scenario power” AMD quotes. That is also 50W higher than the official 250W PowerTune limit of the Tahiti based 7970 and its ilk, in line with the roughly 50W TDP increase we noted earlier.

Ultimately 300W single-GPU cards have been a rarity, and seemingly for good reason. That much heat is not easy to dissipate coming off of a single heat source (GPU), and the only other 300W cards we’ve seen are not cards with impressive acoustics. Given where AMD was with Tahiti we’re in no way surprised that power consumption has gone up with the larger GPU, but there are consequences to be had for having this much power going through a single card. Not the least of which is the fact that AMD’s reference cooler can’t actually move 300W of heat at reasonable noise levels, hence the use of quiet and uber modes.

Given the earlier idle power consumption numbers, it’s not unexpected to see the 290X’s idle temperatures run high. 43C isn’t a problem in and of itself, but it is further indication that the idle power leak is coming from the GPU rather than from a CPU load generated by the drivers.

Given what we know about the new PowerTune and AMD’s design goals for 290X, the load temperatures are pretty much a given at this point. In quiet mode the 290X will hit 94C/95C and will eventually throttle under any game. We won’t completely go over the technical rationale for this (if you’ve missed our PowerTune page, please check that out first), but in short the temperatures we’re seeing, though surprising at first, are accounted for in AMD’s design. The Hawaii GPU should meet the necessary longevity targets even at 95C sustained, and static leakage should be low enough that it’s not causing a significant power consumption problem. It’s certainly a different way of thinking, but with a mature 28nm process and the very fast switching of PowerTune it’s also a completely practical design.

It’s still going to take some getting used to, though.

Moving on to our noise testing, since the 290X reference cooler is based on the 7970’s reference cooler, there’s little surprise here. 41dB is by no means bad, but the 7970 never did particularly well here, and neither does the 290X. This level of idle noise will not impress anyone concerned about the matter, especially when a pair of GTX 780s in SLI is still quieter by 1.5dB. It’s enough that the 290X will be at least marginally audible at idle.

Having previously seen power consumption and temperatures under gaming, we finally get to what in most cases should be the most important factor: noise. In reusing the 7970’s reference cooler – a design that has previously proven mediocre as far as noise is concerned – AMD has put themselves in a tough situation with the 290X. At 53.3dB the 290X is running at its 40% default fan speed limit, meaning we’re seeing both the worst case scenario for noise and one that’s going to occur in every game. To that end it’s marginally quieter than the reference 7970 itself, but louder than everything else we’ve tested, including SLI setups.

At this point the 290X is 1.6dB louder than GTX 780 SLI, 3.1dB louder than GTX Titan, and a very significant 5.8dB louder than GTX 780. GTX 780 may border on overbuilt as far as cooling goes, but the payoff is in situations like this where the difference in noise under load is going to be very significant.
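
For a sense of scale, decibel differences map onto sound pressure exponentially. This is standard acoustics arithmetic rather than an additional measurement:

```python
# Sound pressure scales as 10^(dB/20), roughly doubling every 6dB.
def pressure_ratio(delta_db):
    return 10 ** (delta_db / 20)

for delta_db in (1.6, 3.1, 5.8):
    print(f"+{delta_db}dB -> {pressure_ratio(delta_db):.2f}x sound pressure")
# +1.6dB -> 1.20x, +3.1dB -> 1.43x, +5.8dB -> 1.95x
```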

As an aside, for anyone wondering why the 290X in quiet mode and the 7970 have such similar noise levels under gaming workloads, there’s a good reason for that. The 290X quiet mode’s 40% maximum fan speed was specifically chosen to match the noise characteristics of the original reference 7970, leading to this exact outcome of the 290X being no louder than the 7970. Meanwhile uber mode’s 55% maximum fan speed was chosen to match the noise characteristics of the reference 7970GE, which was never released to the public and was absurdly loud.

Finally, with FurMark, the 290X has already reached its 40% fan speed limit, so we’re merely seeing every other card catch up. What little good news there is for the 290X here is that the gap between the GTX 780/Titan and the 290X closes to a hair over 1dB – a nearly insignificant difference – but that doesn’t change the fact that our gaming workload is a better representation of a typical workload, and as a result a better representation of how much noisier the 290X is than the GTX 780 and its ilk.

In the end it’s clear that AMD needed to make tradeoffs to get the 290X out at its performance levels, and to do so at $550. That compromise has come in the 290X’s power consumption, and more directly in the amount of noise the 290X generates. This is not to say that the power and noise situation fully negates what AMD has brought to the table in terms of price and performance – though it goes without saying we would have liked to see a better cooler – but it does mean buyers will need to weigh those tradeoffs.

For a high end card the power consumption is not particularly concerning right now, but the noise issue will be a problem for some buyers. Everyone has their own cutoff of course, but in our book 53.3dB is at the upper range of reasonable noise levels, if not right at the edge. The 290X is no worse (and no better) than the 7970 in this regard, which means we’re looking at an acceptable noise level that will work for some buyers and chassis and won’t work for others. For buyers specifically seeking out an ultra-quiet blower there is no alternative to the GTX 780; otherwise, in the face of what the 290X can do, it would be very hard to justify a card $100 more expensive and roughly 10% slower over these noise results. AMD still holds the edge overall, even if it’s not a clean sweep.

Up next, let’s talk about uber mode for a moment. We’ve focused on quiet mode for the bulk of our writeup not only because it’s the default mode, but because it’s the only mode that makes sense to us. Uber mode makes the 290X’s performance look even better, particularly in our most thermally stressful games, but ultimately the performance difference is never more than 5%, and 5% is simply not worth the additional noise. It’s unfortunate that AMD has to hold back the 290X’s performance like this to keep noise in check, but we simply can’t justify running the 290X that loud for a bit more performance.

It’s also for that reason that 290X CF is in the tightest spot of them all, as AMD’s suggestion is that 290X CF users run in uber mode. 290X CF’s performance is great, but a pair of cards just compounds the noise problem. Short of wearing closed headphones, 290X CF in uber mode is just too much. 290X CF in quiet mode should be significantly better, just as the single card configuration is, but that’s something we’ll have to look into at another time, as we didn’t have time to run that set of benchmarks for this article.

With all of the above in mind, we expect it will be interesting to see what AMD’s partners cook up once we see semi-custom and fully-custom designs hit the market. Open air coolers should handily outperform the AMD blower as far as noise is concerned – at the usual tradeoff of dumping that 300W of heat into the chassis – but we’d like to see one of AMD’s partners take a crack at a better blower. We’ve seen what kind of results NVIDIA can pull off with their high end blower; even if AMD won’t make such a high quality cooler the reference cooler, it would be to AMD’s benefit to have at least one partner offering something that can compete with the GTX 780 on the noise front while retaining the blower design. Whether we’ll see such a card however is another matter entirely.

Comments

  • Antiflash - Thursday, October 24, 2013 - link

    I've usually preferred Nvidia cards, but they had it well deserved when they decided to price GK110 in the stratosphere just "because they can" while they had no competition. That's a poor way to treat your customers, taking advantage of fanboys. Full implementations of Tesla and Fermi were always priced around $500; pricing Kepler GK110 at $650+ was stupid. It's silicon after all, you should get more performance for the same price each year, not more performance at a premium price as Nvidia tried to do this generation. AMD is not doing anything extraordinary here, they are just not following Nvidia's price gouging practices, and $550 keeps their flagship GPU at historical market prices. We would not be having this discussion if Nvidia had done the same with GK110.
  • blitzninja - Saturday, October 26, 2013 - link

    OMG, why won't you people get it? The Titan is a COMPUTE-GAMING HYBRID card; it's for professionals who run PRO apps (i.e. the Adobe media product line, 3D modeling, CAD, etc.) but are also gamers and don't want to have SLI setups for gaming + compute, or can't afford to do so.

    A Quadro card is $2500; this card has 1 less SMX unit and no PRO customer driver support, but is $1000 and does both gaming AND compute. As far as low-level professionals are concerned this thing is the very definition of a steal. Heck, you SLI two of these things and you're still $500 ahead of a K6000.

    What usually happens is the company they work at will have Quadro workstations and at home the employee has a Titan. Sure it's not as good but it gets the job done until you get back to work.

    Please check your shit. Everyone saying the R9 290X destroys the Titan--and yes, I agree it's got some real good price/performance for gaming--is ignorant and needs to do some good long research into:
    A. How well the Titan sold
    B. The size of the compute market and MISSING PRICE POINTS in said market.
    C. The amount of people doing compute who are also avid gamers.
  • chimaxi83 - Thursday, October 24, 2013 - link

    Impressive. This card beats Nvidia on EVERY level! Price, performance, features, power..... every level. Nvidia paid the price for gouging its customers; they are going to lose a ton of marketshare. I doubt they have anything to match this for at least a year.
  • Berzerker7 - Thursday, October 24, 2013 - link

    Sounds like a bot. The card is worse than a Titan on every point except high resolution (read: 4K), including power, temperature and noise.
  • testbug00 - Thursday, October 24, 2013 - link

    Er, the Titan beats it on being higher priced, looking nicer, having a better cooler and using less power.

    Even at 1080p a 290X approximately ties the Titan (slightly ahead, by 4%, according to TechPowerUp).

    Well, that's a $550 card tying a $1000 card at a resolution that cards this fast really shouldn't be bought for (seriously, if you are playing at 1200p or less there is no reason to buy any GPU over $400 unless you plan to upgrade screens soon).
  • Sancus - Thursday, October 24, 2013 - link

    The Titan was a $1000 card when it was released.... 8 months ago. So for 8 months nvidia has had the fastest card and been able to sell it at a ridiculous price premium(even at $1000, supply of Titans was quite limited, so it's not like they would have somehow benefited from setting the price lower... in fact Titan would probably have made more money for Nvidia at an even HIGHER price).

    The fact that ATI is just barely matching Nvidia at regular resolutions and slightly beating them at 4k, 8 months later, is a baseline EXPECTATION. It's hardly an achievement. If they had released anything less than the 290X they would have completely embarrassed themselves.

    And I should point out that they're heavily marketing 4k resolution for this card and yet frame pacing in Crossfire even with their 'fixes' is still pretty terrible, and if you are seriously planning to game at 4k you need Crossfire to be actually usable, which it has never really been.
  • anubis44 - Thursday, October 24, 2013 - link

    The margin of victory for the R9 290X over the Titan at 4K resolutions is not 'slight', it's substantial. HardOCP says it's 10-15% faster on average. That's a $550 card that's 10-15% faster than a $1000 card.

    What was that about AMD being embarrassed?
  • Sancus - Thursday, October 24, 2013 - link

    By the time more than 1% of the people buying this card even have 4k monitors, 20nm cards will have been on sale for months. Not only that, but you would basically go deaf next to a Crossfire 290X setup, which is what you need for 4k. And anyway, the 290X is faster only because it's been monstrously overclocked beyond the ability of its heatsink to cool it properly. The 780/Titan are still far more viable 2/3/4 GPU cards because of their superior noise and power consumption.

    All 780s overclock to considerably faster than this card at ALL resolutions, so the GTX 780 Ti is probably just an OCed 780, and it will outperform the 290X while still being 10dB quieter.
  • DMCalloway - Thursday, October 24, 2013 - link

    You mention monstrously OC'ing the 290X, yet have no problem OC'ing the 780 in order to create a 780 Ti. Everyone knows that aftermarket coolers will keep the noise and temps in check when released. Let's deal with the here and now, not speculate on future cards. Face it: AMD at least matches or beats a card costing $100 more, which will force Nvidia to launch the 780 Ti at less than current 780 prices.
  • Sancus - Thursday, October 24, 2013 - link

    You don't understand how pricing works. AMD is 8 months late to the game. They've released a card that is basically the GTX Titan, except it uses more than 50W more power and has a bargain basement heatsink. That's why it's $100 cheaper: AMD is the one who is far behind, and the only way for them to compete is on price. They demonstrably can't compete purely on performance; if the 290X were WAY better than the GTX Titan, AMD would have priced it higher, because guess what, AMD needs to make a profit too -- and they have consistently lost money for years now.

    The company that completely owned the market to the point that they could charge $1000 for a video card is the winner here, not the one that arrived out of breath at the finish line 8 months later.

    I would love for AMD to be competitive *at a competitive time* so that we didn't have to pay $650 for a GTX 780, but the fact of the matter is that they're simply not.
