Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

As we alluded to in our look at the 290’s build quality and AMD’s last minute specification change, while the 290 has great performance under the complete range of our gaming benchmarks, it’s with power, temperature, and noise that it has to pay the piper for that performance. There’s no getting around the fact that a 47% fan speed on the 290 series reference cooler is going to be loud, and in this section we’ll break down those numbers and attempt to explain why that is.

First, let’s start with voltages. Ideally we’d use VIDs here, but due to the fact that none of our regular tools can read AMD’s VIDs for the 290 series, we’re left with what works. And what works is GPU-Z, which can read the VDDC coming off of the IR 3567B controller. These aren’t “perfect” numbers as we don’t have the ability to factor out fluctuations due to temperature or the impact of vDroop, but they’ll work in a pinch.

Radeon R9 290 Series Voltages (VDDC/GPU-Z)
Ref. 290X Boost Voltage: 1.11v
Ref. 290 Boost Voltage: 1.18v
Ref. 290 Base Voltage: 1.14v

To that end you can immediately see that the 290 starts off in a weakened position relative to the 290X. Second tier products are a mixed bag in this regard as sometimes they’ll be composed solely of chips with damaged functional units that can be shut off and then downclocked to operate at a lower voltage, while in other cases they’ll also include chips that have worse leakage and power consumption characteristics. In the case of the 290 we have the latter.

As such the 290 is operating at a higher voltage than the 290X at both the base GPU clockspeed of 662MHz and the boost GPU clockspeed of 947MHz. This means that at any given clockspeed the GPU on the 290 is going to be drawing more power – likely more than enough to offset the reduction from the disabled CUs – and furthermore we’re seeing that the voltage reduction from operating at lower clockspeeds is not very significant. If these results are reasonably accurate, then this means the power cost of ramping up the clockspeed is relatively cheap, but the power savings from throttling down are relatively sparse. The GTX Titan, by comparison, sees a full 100mV decrease going from 940MHz to 836MHz.
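As a rough sanity check on those readings, the classic first-order CMOS dynamic power relation (P ∝ f·V²) can be applied to our GPU-Z numbers. To be clear, this is a generic textbook approximation, not AMD’s actual power model, and it deliberately ignores leakage, which Hawaii’s binning makes significant:

```python
# First-order estimate of dynamic power scaling: P ~ f * V^2.
# Generic CMOS rule of thumb only; it ignores static/leakage power,
# which is precisely what the 290's worse binning adds on top.

def relative_dynamic_power(freq_mhz: float, volts: float,
                           ref_freq_mhz: float, ref_volts: float) -> float:
    """Dynamic power relative to a reference operating point."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# 290 base point (662MHz @ 1.14v) relative to its boost point (947MHz @ 1.18v),
# using our GPU-Z VDDC readings from the table above
base_vs_boost = relative_dynamic_power(662, 1.14, 947, 1.18)
print(f"290 base point draws ~{base_vs_boost:.0%} of boost dynamic power")
```

The small 40mV gap between base and boost is why throttling down buys so little: the clockspeed drop does most of the work, while the V² term barely moves.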

Having established that, we can see why AMD’s 7% fan speed increase had such a large impact on performance. Even a bit more cooling allows the card to jump to far higher clockspeeds, which significantly improves performance. With a fan speed of 47% the 290 has enough cooling to sustain 947MHz across everything except the TDP limited FurMark and Company of Heroes 2.

Radeon R9 290 Average Clockspeeds (47% Fan, New Default)
Boost Clock - 947MHz
Metro: LL - 947MHz
CoH2 - 930MHz
Bioshock - 947MHz
Battlefield 3 - 947MHz
Crysis 3 - 947MHz
Crysis: Warhead - 947MHz
TW: Rome 2 - 947MHz
Hitman - 947MHz
GRID 2 - 947MHz

With that out of the way, let’s dive into power, temperatures, and noise.

Idle power is essentially unchanged from the 290X. The 16 GDDR5 memory chips aren’t doing AMD any favors, but more significantly they appear to still have a power leak in their drivers at idle. Until they fix that, the 290 series will draw several watts more than any other modern single-GPU card.

Moving on to power consumption under Crysis 3, we can see that AMD’s TDP hasn’t changed compared to the 290X. In fact because these cards are effectively tied in performance in this game, we can even see at least some of the impact of the 290’s higher voltages. By operating at higher voltages in general, and furthermore at higher clockspeeds (requiring higher voltages), the 290 draws just a wee bit more power than the 290X under our gaming workload. Power efficiency wasn’t AMD’s strongest hand to begin with on the 290X, and the 290 makes it just a bit worse.

This also means that the 290 isn’t competitive with the GTX 780 on the matter of power consumption and power efficiency in general. A 54W difference at the wall for identical performance in Crysis 3 – or, extrapolated over our complete benchmark suite, for a performance advantage of just 6% – is very difficult to swallow. As with everything else to come for power, temp, and noise, the GTX 780 has a very real advantage here.

Moving on to FurMark, despite the fact that we should be TDP limited the 290 actually draws more power than the 290X. To be frank we’re at a bit of a loss on this one; 290 bottoms out at 662MHz here, so it may be that we’re seeing one of the things the card can do to try to maintain its base clockspeed. Alternatively this may be the voltage effect amplified. Regardless of the reason though it’s a very repeatable scenario, and it’s a scenario that has 290 drawing 34W more at the wall than 290X.

Given the fact that the 290 and 290X are built on identical boards, the idle temperatures are consistent, if a bit more spread out than usual. Until AMD gets their power leak under control, Hawaii isn’t going to come down below 40C at idle with the reference cooler.

Due to the mechanisms of PowerTune on the 290 series, the sustained load temperatures for the 290 and 290X are a very consistent 94C. As we laid out in our review of the 290X these temperatures are not a problem so long as AMD properly accounts for them in their power consumption projections and longevity projections. But coming from earlier cards it does take some getting used to.

At last we have our look at noise. Starting with idle noise, we can see that the 290 actually outperforms the 290X to a meaningful degree, squeaking under the 40dB mark. The fact that these cards utilize the same cooler operating at the same fan speed means that these results caught us off guard at first, but our 290 sample for whatever reason seems to be slightly better built than our 290X sample. These results match what our ears say, which is that the 290X has a bit of a grind to it that’s not present on the 290, and consequently the 290 is that much quieter.

Our Crysis 3 noise chart is something that words almost don’t do justice for. It’s something that needs to be looked at and allowed to sink in for a moment.

With the 290 AMD has thrown out any kind of reasonable noise parameters, instead choosing to chase performance and price above everything else. As a result at 57.2dB the 290 is the loudest single-GPU card in our current collection of results. It’s louder than 290X (a card that was already at the limit for reasonable), it’s louder than the 7970 Boost (an odd card that operated at too high a voltage and thankfully never saw a retail reference release), and it’s 2.5dB louder than the GTX 480, the benchmark for loud cards. Even GTX 700 series SLI setups aren’t this loud, and that’s a pair of cards.

At the end of the day the 290 is 9.7dB louder than its intended competition, the GTX 780. With a 10dB difference representing a two-fold increase in noise on a human perceptual basis, the 290 is essentially twice as loud as the GTX 780. It’s $100 cheaper and 6% faster, but all of that comes at the very high price of twice the noise.
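The 10dB rule used above can be expressed directly. This is the standard psychoacoustic rule of thumb (perceived loudness roughly doubles for every 10dB increase), not a measured quantity:

```python
# Rule of thumb: perceived loudness roughly doubles per 10dB increase,
# i.e. loudness ratio ~= 2 ** (delta_dB / 10). An approximation of human
# perception, not a physical sound-power ratio.

def perceived_loudness_ratio(delta_db: float) -> float:
    """Approximate perceived loudness multiplier for a given dB delta."""
    return 2 ** (delta_db / 10)

# 290 vs GTX 780: the 9.7dB gap measured under Crysis 3
print(f"290 is ~{perceived_loudness_ratio(9.7):.2f}x as loud as the GTX 780")
```

Running the 9.7dB delta through this gives a factor of just under 2x, which is where the “essentially twice as loud” characterization comes from.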

Everyone’s cutoff for a reasonable noise level for a single-GPU card is going to be different. Ours happens to be the 7970, which on our latest testbed measures in at 53.5dB. To that end the 290 is 3.7dB louder, putting it well past what we’d consider to be reasonable. It’s a powerful card, but these noise levels are unreasonable and unacceptable for any scenario that involves being able to hear the card while it’s working.

Finally we’ll look at noise under FurMark. As loud as the 290 is under Crysis 3, the 290 was only pushed to 45% fan speed under that workload. Under FurMark the 290 ratchets up to 47% and to its peak noise level of 58.5dB. Now to the credit of the 290 this does end up being better than the 5870 and GTX 480, but as neither of those cards implements modern power throttling technology it’s at best an unfair fight. Compared to any card with power throttling, the 290 ends up once more being the worst.

Wrapping things up, the power/temp/noise situation for the 290 is rather straightforward, but unfortunately for AMD it’s not going to be in their favor. 290 is at least marginally more power hungry and quite a bit louder than 290X, never mind GTX 780. As we’ve seen in previous pages the performance is quite good, but it comes at what will in most cases be a very high cost.

Finally, since we had a bit more time to prepare for the 290 review than we did the 290X review, we used part of our time to investigate something we didn’t get to do with the 290X: what would performance look like if we equalized for noise? Earlier in this article we took a brief look at performance if we equalized for noise against the 290X – the performance hit is about 12% – but how would the 290 fare if it were equalized to the GTX 780?

The answer is not well, likely for the voltage matters we discovered earlier in this article. To get a 290 down to ~48dB requires reducing the maximum fan speed to 34%, which is something AMD’s software allows. The problem is that at 34% the effective cooling level on the 290 is so poor that even after dropping to the base GPU clockspeed of 662MHz it still generates too much heat, requiring it to ramp up the fan speed to compensate. In other words it’s simply impossible to get the 290 to match the GTX 780’s noise levels under load. Based on our data the 290 requires a minimum fan speed of 38% to maintain its base clockspeed under sustained load, which pushes noise levels out from a GTX 780-like 48dB to a GTX Titan-like 50.9dB.

With that in mind, we went ahead and ran a selection of our benchmarks with the 34% maximum fan speed. The performance hit, as you’d expect, is significant.

Radeon R9 290 Average Clockspeeds
                  47% Fan / 40% Fan / 34% Fan
Boost Clock     - 947MHz / 947MHz / 947MHz
Metro: LL       - 947MHz / 830MHz / 662MHz
Battlefield 3   - 947MHz / 870MHz / 662MHz
Crysis 3        - 947MHz / 720MHz / 662MHz
Crysis: Warhead - 947MHz / 662MHz / 662MHz
TW: Rome 2      - 947MHz / 761MHz / 662MHz
GRID 2          - 947MHz / 825MHz / 700MHz

290/780 Noise Equalization: 290 Relative Performance

Radeon R9 290 Relative Performance
290: 47% Fan Speed (Default) - 100%
290: 40% Fan Speed - 88%
290: ~34% Fan Speed - 78%

To get down to the 34%-38% fan speed range, the 290 has to shed an average of 22% of its performance, peaking under a few titles at 25%. To be sure this makes the card much quieter – though not as quiet as a GTX 780 – but it also sacrifices much of the 290’s performance advantage in the process. At this point we’ve essentially reduced it to a 280X.

290/780 Noise Equalization: Load Noise Levels - Crysis 3

Looking at the resulting noise levels, you can see the full outcome of our tweaks. If we could sustain 34% we’d have a noise level consistently close to that of the GTX 780, but instead under Crysis 3 and a couple other games fan speeds level out at 38%, pushing noise levels to 50.9dB and placing them a bit higher than GTX Titan.

Based on this data it’s safe to say that the performance cost of using the fan control function to reduce the fan noise on the 290 will be moderate to severe. You can’t match the GTX 780 at all, and merely getting down to GTX Titan-like noise levels reduces performance to that of a 280X. 40% on the other hand is more viable, but keep in mind we’re now at 290X noise levels for roughly 85% of the 290X’s performance, which isn’t a great outcome either.


  • TheJian - Tuesday, November 5, 2013 - link

    Simple economics...NV doesn't make as much as they did in 2007. They are not gouging anyone and should be charging more (so should AMD) and neither side should be handing out free games. Do you want them to be able to afford engineers and good drivers or NOT? AMD currently can't afford them due to your price love, so you get crap drivers that still are not fixed. It's sad people don't understand the reason you have crap drivers is they have lost $6Billion in 10yrs! R&D isn't FREE and the king of the hill gets to charge more than the putz. Why do you think their current card is 10db’s higher in noise, 50-70 watts higher and far hotter? NO R&D money.

    NV made ~550mil last 12 months (made $850 in 2007). Intel made ~10Billion (made under 7B 2007, so profits WAY UP, NV way down). Also INtel had 54B in assets 2007, now has 84billion! Who's raping you? The Nvidia hate is hilarious. I like good drivers, always improving products, and new perf/features. That means they need to PROFIT or we'll get crappy drivers from NV also.

    Microsoft 2007=14B, this year $21B (again UP HUGE!)
    Assets 2007=64B, 2013=146Billion HOLY SHITE.

    Who's raping you...IT isn't Nvidia...They are not doing nearly as well as 2007. So if they were even raping you then, now they're just asking you to show them your boobs...ROFL. MSFT/INtel on the other hand are asking you to bend over and take it like a man, oh and give me your wallet when I'm done, hey and that car too, heck sign over your house please...

    APPLE 2007=~3Bil profits 2013=41Billion (holy 13.5x the raping).
    Assets 2007=25B, wait for it...2013=176Billion!
    bend over and take it like a man, oh and give me your wallet when I'm done, hey and that car too, heck sign over your house please...Did you mention you're planning on having kids?...Name them Apple and I want them as slaves too...LOL

    Are we clear people. NV makes less now than 2007 and hasn't made near that 850mil since. Why? Because market forces are keeping them down which is only hurting them, and their R&D (that force is AMD, who by the way make ZERO). AMD is killing themselves and fools posting crap like this is why (OK, it's managements fault for charging stupidly low prices and giving out free games). You can thank the price of your card for your crappy AMD drivers

    Doesn't anyone want AMD to make money? Ask for HIGHER PRICES! Not lower, and quit demonizing NV (or AMD) who doesn't make NEAR what they did in 2007! Intel killed their chipset business and cost them a few hundred million each year. See how that works. If profits for these two companies don't start going up we're all going to get slower product releases (witness what just happened, no new cards for 2yrs if you can even call AMD's new as it just catches OLD NV cards and runs hot doing it), and we can all expect CRAP DRIVERS with those slower released cards.

    AMD finally made a profit in the last year 1/2 this Q (only 48mil and less depending on if you look gaap, non-gaap). AMD has lost over 6Billion in 10yrs. They are not charging you enough. I expect them to lose money again as they owe $200mil to GF Dec 31st which will wipe out last Q profit and kill this Q also. See the point? They need to make money. How is it possible for either side to be ripping us off if neither has made as much as they did in 2007 for 6 years with AMD losing their collective ARSE?

    I could give you the Hard Drive makers profits etc and would show the same (or worse) as MS, Apple, Google. The flood allowed them all triple profits (quad in Seagates case). YES, this is ripping us off.
  • Spunjji - Tuesday, November 5, 2013 - link

    TL;DR
  • Senti - Tuesday, November 5, 2013 - link

    Good drivers from NV? Ha-ha-ha! You've never run anything besides few overhyped games I guess. They are awful the moment you start programming them:

    You use some obscure feature of standard that is not used in games? Here, have a bluescreen! Even AMD video drivers weren't so bad recently.

    File in bug report? Ignored, who cares about you, you are not making one of top tier games.

    Want OpenCL 1.2 like the rest of the world has (even integrated Intel videocards)? No, it's not important – here you are ten new versions of CUDA!

    And how about crippled OpenGL for non-Quadro cards? Sure I know what I'm talking about since I have Quadro too.
  • dragonsqrrl - Monday, November 11, 2013 - link

    Your comment is nothing but an exercise in ambiguity. So basically what you're saying is that because Nvidia's drivers don't work perfectly for every user, AMD's drivers are superior. Really?

    The one sentence that's actually applicable to gaming is just about as vague as you can get, maybe because you don't actually know what you're talking about? Bluescreen, really? Okay... And from an end user perspective wtf does it even mean to program a driver? Are you suggesting you're something more than an end user? Are you suggesting you work for Nvidia? lol...

    Whatever driver related bsod problems you were having is unlikely to be systemic, and may not even be driver related at all. Almost all of the Nvidia driver issues I've read about in recent memory (past couple of years) have been isolated incidents that may result in instability for certain users, none of which I've experienced myself (GTX480). Optimizations compared to the competition have been spot on, and title support has been fantastic. The only exception I can think of was the fan controller issue a couple years ago. Compare that to AMD's recent driver issues (xfire, surround, 4k, frame pacing, etc), all of which are systemic and widespread. I think most AMD fanboys choose to either ignore these problems, or accept them as the norm (same with fan noise on stock coolers), which is stupid and self defeating in my opinion. That sort of attitude doesn't drive AMD or Nvidia to improve.
  • Senti - Wednesday, November 13, 2013 - link

    I'm talking from developer's point, can you even read? You probably have no clue about programming if those sounded ambiguous and you even suggested that I work for NV, lol.

    I don't say that AMD's drivers are superior, but I can surely say that "NV drivers are superior" is one big lie. They are both bad, really.

    I can tell you some fun things about AMD's drivers too: like, uploading textures of certain sizes crashes them, or how their OpenCL compiler crashes on certain C99 features. But those are just program-level crashes (not system-level) and I see texture crash only in recent beta versions, not the last stable version; haven't really investigated OpenCL ones. On the other hand I can reliably send NV drivers into bluescreen, but as already said, it's quite obscure feature and simple users have very low chance to stumble upon it.

    Conclusion: we do love AMD for their performance/price ratio while there is nothing to love NV for except being mindless fanboy. My personal view of current situation, feel free to disagree.
  • nsiboro - Wednesday, November 6, 2013 - link

    TL;DR

    Nvidia isn't doing only GPU. Do not forget Tegra - acquisition of comm IP, etc. do require $$$.

    The raping is true.

    GPU revenue dump into ARM/mobile for future survival.
  • Drumsticks - Tuesday, November 5, 2013 - link

    I like how they actually explicitly recommend against the card and the first 5 comments are praising it XD.

    That's a LOT of noise... but when we get custom coolers this will be really, really exciting.
  • RussianSensation - Tuesday, November 5, 2013 - link

    Yup, customs coolers will fix both the noise levels and temperature issues. After that, this card will be a must buy. 2 Windforce 3x (dual slot) and $699 780Ti is irrelevant. In fact, after-market versions of R9 290 will make 780/R9 290X and 780Ti very overpriced. They can't get here soon enough.
  • dragonsqrrl - Tuesday, November 5, 2013 - link

    Well, sometimes you just can't fix fanboism.
  • Tetracycloide - Tuesday, November 5, 2013 - link

    How puerile.
