Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

Starting with voltages, for the time being we have nothing to report on the R9 Fury X. AMD is either not exposing voltages in their drivers, or our existing tools (e.g. MSI Afterburner) do not know how to read the data, and as a result we cannot see any of the voltage information at this time.

Radeon R9 Fury X Average Clockspeeds
Game                   R9 Fury X
Max Boost Clock        1050MHz
Battlefield 4          1050MHz
Crysis 3               1050MHz
Civilization: BE       1050MHz
Dragon Age             1050MHz
Talos Principle        1050MHz
Far Cry 4              1050MHz
Total War: Attila      1050MHz
GRID Autosport         1050MHz
Grand Theft Auto V     1050MHz

Jumping straight to average clockspeeds then, with an oversized cooler and a great deal of power headroom, the R9 Fury X has no trouble hitting and sustaining its 1050MHz boost clockspeed throughout every second of our benchmark runs. The card was designed to be the pinnacle of Fiji cards, and ensuring it always runs at a high clockspeed is one of the elements in doing so. The lack of throttling means there’s really little to talk about here, but it sure gets results.

Idle Power Consumption

Idle power does not start things off especially well for the R9 Fury X, though it's not too poor either. The 82W at the wall is a distinct increase over NVIDIA's latest cards, and even the R9 290X. On the other hand, the R9 Fury X has to run a CLLC rather than simple fans. Further complicating matters, the card idles at 300MHz for the core, but the memory doesn't idle at all. HBM is meant to have rather low power consumption under load versus GDDR5, but one wonders just how that compares at idle.

Load Power Consumption - Crysis 3

Switching to load power consumption, we go first with Crysis 3, our gaming load test. Earlier in this article we discussed all of the steps AMD took to rein in power consumption, and the payoff is seen here. Equipped with an R9 Fury X, our system pulls 408W at the wall, a significant amount of power, but only 20W at the wall more than the same system with a GTX 980 Ti. Given that the R9 Fury X's framerates trail the GTX 980 Ti here, this puts AMD's overall energy efficiency in a less-than-ideal spot, but it's not poor either, especially compared to the R9 290X. Power consumption has essentially stayed put while performance has gone up 35%+.
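That efficiency claim is easy to quantify. Treating the two systems' wall draw as equal (an assumption on our part, based on power having "essentially stayed put") and applying the roughly 35% performance uplift, the perf-per-watt gain falls straight out:

```python
# Perf-per-watt vs. the R9 290X. The 408W figure and the ~35% uplift come
# from this article; equal wall power for the 290X system is an assumption.
wall_power_w = 408
perf_290x = 1.00     # normalized performance
perf_fury_x = 1.35   # ~35% faster on average

efficiency_gain = (perf_fury_x / wall_power_w) / (perf_290x / wall_power_w) - 1
print(f"Perf-per-watt improvement: ~{efficiency_gain:.0%}")
```

With power held constant the efficiency gain simply tracks the performance gain, which is the point: Fiji delivers its extra performance at essentially no extra power cost over Hawaii.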

On a side note, as we mentioned in our architectural breakdown, the amount of power this card draws will depend on its temperature. 408W at the wall at 65C is only 388W at the wall at 40C, as current leakage scales with GPU temperature. Ultimately the R9 Fury X will trend towards 65C, but it means that early readings can be a bit misleading.
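To put the leakage effect in concrete terms, here is a quick back-of-the-envelope sketch using the two wall readings above. The linear per-degree scaling is purely an illustrative assumption; real leakage current scales non-linearly with temperature:

```python
# Wall power measured at two GPU temperatures during our Crysis 3 run.
power_at_40c = 388  # watts at the wall, early in the run
power_at_65c = 408  # watts at the wall, at thermal equilibrium

# Extra draw attributable to leakage as the GPU warms from 40C to 65C.
leakage_delta = power_at_65c - power_at_40c  # 20 W

# Assuming (purely for illustration) a roughly linear increase over this
# range, each additional degree of GPU temperature costs about:
watts_per_degree = leakage_delta / (65 - 40)

print(f"~{leakage_delta} W extra at the wall, ~{watts_per_degree:.1f} W per degree C")
```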

Load Power Consumption - FurMark

As for FurMark, what we find is that power consumption at the wall is much higher, which, far from being a problem for AMD, proves that the R9 Fury X has much greater thermal and electrical limits than the R9 290X, or the NVIDIA competition for that matter. AMD ultimately does throttle the R9 Fury X here, to around 985MHz, but the card easily draws quite a bit of power and dissipates quite a bit of heat in the process. If the card had a gaming scenario that called for greater power consumption – say BIOS-modded overclocking – then these results paint a favorable picture.

Idle GPU Temperature

Moving on to temperatures, the R9 Fury X starts off looking very good. Even at minimum speeds the pump and radiator lead to the Fiji GPU idling at just 27C, cooler than anything else on this chart. More impressive still is the R9 290X comparison, where the R9 Fury X is some 15C cooler, and this is just at idle.

Load GPU Temperature - Crysis 3

Loading up a game, after a good 10 minutes or so the R9 Fury X finally reaches its equilibrium temperature of 65C. Though the default target GPU temperature is 75C, it’s at 65C that the card finally begins to ramp up the fan in order to increase cooling performance. The end result is that the card reaches equilibrium at this point and in our experience should not exceed this temperature.

Compared to the NVIDIA cards, this is an 18C advantage in AMD’s favor. GPU temperatures are not everything – ultimately it’s fan speed and noise we’re more interested in – but for AMD GPU temperatures are an important component of controlling GPU power consumption. By keeping the Fiji GPU at 65C AMD is able to keep leakage power down, and therefore energy efficiency up. R9 Fury X would undoubtedly fare worse in this respect if it got much warmer.

Finally, it’s once again remarkable to compare the R9 Fury X to the R9 290X. With the former AMD has gone cool to keep power down, whereas with the latter AMD went hot to improve cooling efficiency. As a result the R9 Fury X is some 29C cooler than the R9 290X. One can only imagine what that has done for leakage.

Load GPU Temperature - FurMark

The situation is more or less the same under FurMark. The NVIDIA cards are set to cap at 83C, and the R9 Fury X is set to cap at 65C. This is regardless of whether it’s a game or a power virus like FurMark.

Idle Noise Levels

Last but not least, we have noise. Starting with idle noise, as we mentioned in our look at the build quality of the R9 Fury X, the card’s cooler is effective under load, but a bit of a liability at idle. The use of a pump brings with it pump noise, and this drives up idle noise levels by around 4dB. 41.5dB is not too terrible for a closed case, and it’s not an insufferable noise, but HTPC users will want to be wary. This if anything makes a good argument for looking forward to the R9 Nano.

Load Noise Levels - Crysis 3

Because the R9 Fury X starts out a bit loud due to pump noise, the actual noise increase under load is absolutely minuscule. The card tops out at 19% fan speed, 4% (or about 100 RPM) over its default fan speed of 15%. As a result we measure an amazing 43dB under load – a remarkable figure for any high performance video card, let alone one within spitting distance of NVIDIA’s flagship and some of the best air cooled video cards of all time.
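As a quick sanity check on those fan numbers: if a 4 percentage point change in duty cycle corresponds to roughly 100 RPM, and RPM scales roughly linearly with duty cycle (an assumption on our part, not something AMD documents), the implied full-speed figure works out to around 2500 RPM:

```python
# Fan speed rises from 15% to 19% duty under load, a change of about 100 RPM.
idle_duty = 0.15
load_duty = 0.19
rpm_delta = 100

# Assuming RPM scales linearly with duty cycle (illustrative only).
full_speed_rpm = rpm_delta / (load_duty - idle_duty)

print(f"Implied full fan speed: ~{full_speed_rpm:.0f} RPM")
```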

These results admittedly were not unexpected – one need only look at the R9 295X2 to get an idea of what a CLLC could do for noise – but they are nonetheless extremely impressive. Most midrange cards are louder than this despite offering a fraction of the R9 Fury X’s gaming performance, which puts the R9 Fury X at a whole new level for load noise from a high performance video card.

Load Noise Levels - FurMark

The trend continues under FurMark. The fan speed ramps up quite a bit further here thanks to the immense load from FurMark, but the R9 Fury X still perseveres. 46.7 dB(A) is once again better than a number of mid-range video cards, never mind the other high-end cards in this roundup. The R9 Fury X is dissipating 330W of heat and yet it’s quieter than the GTX 980 at half that heat, and around 6 dB(A) quieter than the 250W GM200 cards.
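For readers who want to translate those dB(A) gaps into something physical: a difference of D decibels corresponds to a sound-pressure ratio of 10^(D/20), so the roughly 6 dB(A) gap versus the GM200 cards works out to about double the sound pressure. A minimal sketch:

```python
def pressure_ratio(db_difference):
    """Sound-pressure ratio implied by a difference in decibels."""
    return 10 ** (db_difference / 20)

# The R9 Fury X measures about 6 dB(A) quieter than the 250W GM200 cards.
ratio = pressure_ratio(6)
print(f"6 dB(A) ~= {ratio:.2f}x the sound pressure")
```

Note that perceived loudness is a separate matter again (a common rule of thumb puts a doubling of perceived loudness at around 10 dB), but the pressure ratio is the directly measurable quantity.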

There really aren’t enough nice things I can say about the R9 Fury X’s cooler. AMD took the complaints about the R9 290 series to heart, and produced something that wasn’t just better than their previous attempt, but a complete inverse of their earlier strategy. The end result is that the R9 Fury X is very nearly whisper quiet under gaming, and only a bit louder under even the worst case scenario. This is a remarkable change, and one that ears everywhere will appreciate.

That said, the mediocre idle noise showing will undoubtedly dog the R9 Fury X in some situations. For most cases it will not be an issue, but it does close some doors on ultra-quiet setups. The R9 Fury X in that respect is merely very, very quiet.

Comments

  • anandreader106 - Thursday, July 2, 2015 - link

    @Wreckage Not quite. Cash reserves play a role in evaluating a company's net worth. When AMD acquired ATI, they spent considerable money to do so and plunged themselves into debt. The resulting valuation of AMD was not simply the combined valuations of AMD and ATI pre-acquisition. Far from it.

    AMD is the undisputed underdog in 2015, and has been for many years before that. That is why Ryan gave so much praise to AMD in the article. For them to even be competitive at the high end, given their resources and competition, is nothing short of impressive.

    If you cannot at least acknowledge that, then your view on this product and the GPU market is completely warped. As consumers we are all better off with a Fury X in the market.
  • Yojimbo - Thursday, July 2, 2015 - link

    Yes, NVIDIA was definitely the underdog at the time of the AMD purchase of ATI. Many people were leaving NVIDIA for dead. NVIDIA had recently lost its ability to make chipsets for Intel processors, and after AMD bought ATI it was presumed (rightly so) that NVIDIA would no longer be able to make chipsets for AMD processors. It was thought that the discrete GPU market might dry up with fusion CPU/GPU chips taking over the market.
  • chizow - Thursday, July 2, 2015 - link

    Yep, I remember after the merger happened most AMD/ATI fans were rejoicing as they felt it would spell the end of both Nvidia and Intel, Future is Fusion and all that promise lol. Many like myself were pointing out that AMD overpaid for ATI and that they would collapse under the weight of all that debt, given ATI's revenue and profits didn't come close to justifying the purchase price.

    My how things have played out completely differently! It's like the incredible shrinking company. At this point it really is in AMD and their fan's best interest if they are just bought out and broken up for scraps, at least someone with deep pockets might be able to revive some of their core products and turn things around.
  • Ranger101 - Friday, July 3, 2015 - link

    Well done Mr Smith. I would go so far as to say THE best Fury X review on the internet bar none. The most important ingredient is BALANCE. Something that other reviews sorely lack.

    In particular the PCPer and HardOCP articles read like they were written by the green goblin himself and consequently suffer a MASSIVE credibility failure.

    Yes Nvidia has a better performing card in the 980TI but it was refreshing to see credit given to AMD where it was due. Only dolts and fanatical AMD haters (I'm not quite sure what category chizow falls into, probably both and a third "Nvidia shill") would deny that we need AMD AND Nvidia for the consumer to win.

    Thanks Anandtech.
  • Michael Bay - Friday, July 3, 2015 - link

    Except chizow never stated he wishes to see AMD dead.
    I guess it's your butthurt talking.
  • chizow - Friday, July 3, 2015 - link

    Yep, just AMD fanboys ;)

    "What's Left of AMD" can keep making SoCs and console APUs or whatever other widgets under the umbrella of some monster conglomerate like Samsung, Qualcomm or Microsoft and I'm perfectly OK with that. Maybe I'll even buy an AMD product again.
  • medi03 - Sunday, July 5, 2015 - link

    "AMD going away won't matter to anyone but their few remaining devout fanboys"
    So kind (paid?) nVidia troll chizow is.
  • chizow - Monday, July 6, 2015 - link

    @medi03 no worries I look forward to the day (unpaid?) AMD fantroll's like you can free yourselves from the mediocrity that is AMD.
  • chizow - Friday, July 3, 2015 - link

    Yet, still 3rd rate. The overwhelming majority of the market has gone on just fine without AMD being relevant in the CPU market, and recently, the same has happened in the GPU market. AMD going away won't matter to anyone but their few remaining devout fanboys like Ranger101.
  • piiman - Friday, July 3, 2015 - link

    "AMD going away won't matter to anyone but their few remaining devout fanboys"

    Hmmm you'll think different when GPU prices go up up up. Competition is good for consumers and without it you will pay more, literally.
