Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

Starting with voltages, at least for the time being we have nothing to report for the R9 Fury X. AMD is either not exposing voltages in their drivers, or our existing tools (e.g. MSI Afterburner) do not know how to read the data, and as a result we cannot see any of the voltage information at this time.

Radeon R9 Fury X Average Clockspeeds
Game                     R9 Fury X
Max Boost Clock          1050MHz
Battlefield 4            1050MHz
Crysis 3                 1050MHz
Mordor                   1050MHz
Civilization: BE         1050MHz
Dragon Age               1050MHz
Talos Principle          1050MHz
Far Cry 4                1050MHz
Total War: Attila        1050MHz
GRID Autosport           1050MHz
Grand Theft Auto V       1050MHz
FurMark                  985MHz

Jumping straight to average clockspeeds then, with an oversized cooler and a great deal of power headroom, the R9 Fury X has no trouble hitting and sustaining its 1050MHz boost clockspeed throughout every second of our benchmark runs. The card was designed to be the pinnacle of Fiji cards, and ensuring it always runs at a high clockspeed is one of the elements in doing so. The lack of throttling means there’s really little to talk about here, but it sure gets results.
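For those curious how such an average shakes out, the figures above are just the mean of the core clockspeed sampled over each benchmark run. Below is a minimal sketch of that reduction in Python, assuming a hypothetical CSV log with one clockspeed sample per second; the file name and column names are placeholders rather than any specific tool's output.

```python
import csv

def average_clock(log_path):
    """Average the sampled core clock over a benchmark run.

    Assumes a hypothetical CSV log with a 'core_mhz' column,
    e.g. one sample per second from a GPU monitoring tool.
    """
    clocks = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            clocks.append(float(row["core_mhz"]))
    return sum(clocks) / len(clocks) if clocks else 0.0

# Example: a run that never throttles averages out to its full boost clock.
# average_clock("crysis3_run.csv")  -> 1050.0
# average_clock("furmark_run.csv")  -> ~985.0 when throttling kicks in
```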

Idle Power Consumption

Idle power does not start things off especially well for the R9 Fury X, though it’s not too poor either. The 82W at the wall is a distinct increase over NVIDIA’s latest cards, and even the R9 290X. On the other hand the R9 Fury X has to run a CLLC rather than simple fans. Further complicating matters, the core idles at 300MHz, but the memory doesn’t idle at all. HBM is meant to have rather low power consumption under load versus GDDR5, but one wonders just how that compares at idle.

Load Power Consumption - Crysis 3

Switching to load power consumption, we go first with Crysis 3, our gaming load test. Earlier in this article we discussed the steps AMD took to rein in power consumption, and the payoff is seen here. Equipped with an R9 Fury X, our system pulls 408W at the wall, a significant amount of power, but only 20W more at the wall than the same system with a GTX 980 Ti. Given that the R9 Fury X’s framerates trail the GTX 980 Ti here, this puts AMD’s overall energy efficiency in a less-than-ideal spot, but it’s not poor either, especially compared to the R9 290X. Power consumption has essentially stayed put while performance has gone up 35%+.

On a side note, as we mentioned in our architectural breakdown, the amount of power this card draws will depend on its temperature. 408W at the wall at 65C is only 388W at the wall at 40C, as current leakage scales with GPU temperature. Ultimately the R9 Fury X will trend towards 65C, but it means that early readings can be a bit misleading.
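To put rough numbers on those relationships, here is a quick back-of-the-envelope calculation using the wall figures quoted above; the 1.35 multiplier is simply the "35%+" performance gain over the R9 290X taken at its lower bound, used illustratively.

```python
# Back-of-the-envelope numbers from the measurements above.
fury_x_wall_65c = 408  # W at the wall, Crysis 3, GPU at its 65C equilibrium
fury_x_wall_40c = 388  # W at the wall, same load, GPU still at 40C
gtx_980ti_wall  = 388  # W at the wall, same system with a GTX 980 Ti

# Leakage-driven increase as the GPU warms from 40C to 65C
leakage_growth = fury_x_wall_65c - fury_x_wall_40c   # 20 W

# Gap to the GTX 980 Ti under the same gaming load
delta_vs_980ti = fury_x_wall_65c - gtx_980ti_wall    # 20 W

# Perf-per-watt versus the R9 290X: power has essentially stayed put while
# performance rose 35%+, so efficiency improves by roughly the same factor.
relative_perf_per_watt_vs_290x = 1.35

print(leakage_growth, delta_vs_980ti, relative_perf_per_watt_vs_290x)
```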

Load Power Consumption - FurMark

As for FurMark, what we find is that power consumption at the wall is much higher, which, far from being a problem for AMD, proves that the R9 Fury X has much greater thermal and electrical limits than the R9 290X, or the NVIDIA competition for that matter. AMD ultimately does throttle the R9 Fury X here, to around 985MHz, but the card easily draws quite a bit of power and dissipates quite a bit of heat in the process. If the card had a gaming scenario that called for greater power consumption – say BIOS-modded overclocking – then these results paint a favorable picture.

Idle GPU Temperature

Moving on to temperatures, the R9 Fury X starts off looking very good. Even at minimum speeds the pump and radiator lead to the Fiji GPU idling at just 27C, cooler than anything else on this chart. More impressive still is the R9 290X comparison, where the R9 Fury X is some 15C cooler, and this is just at idle.

Load GPU Temperature - Crysis 3

Loading up a game, after a good 10 minutes or so the R9 Fury X finally reaches its equilibrium temperature of 65C. Though the default target GPU temperature is 75C, it’s at 65C that the card finally begins to ramp up the fan in order to increase cooling performance. The end result is that the card reaches equilibrium at this point and in our experience should not exceed this temperature.

Compared to the NVIDIA cards, this is an 18C advantage in AMD’s favor. GPU temperatures are not everything – ultimately it’s fan speed and noise we’re more interested in – but for AMD GPU temperatures are an important component of controlling GPU power consumption. By keeping the Fiji GPU at 65C AMD is able to keep leakage power down, and therefore energy efficiency up. R9 Fury X would undoubtedly fare worse in this respect if it got much warmer.

Finally, it’s once again remarkable to compare the R9 Fury X to the R9 290X. With the former AMD has gone cool to keep power down, whereas with the latter AMD went hot to improve cooling efficiency. As a result the R9 Fury X is some 29C cooler than the R9 290X. One can only imagine what that has done for leakage.

Load GPU Temperature - FurMark

The situation is more or less the same under FurMark. The NVIDIA cards are set to cap at 83C, and the R9 Fury X is set to cap at 65C. This is regardless of whether it’s a game or a power virus like FurMark.

Idle Noise Levels

Last but not least, we have noise. Starting with idle noise, as we mentioned in our look at the build quality of the R9 Fury X, the card’s cooler is effective under load, but a bit of a liability at idle. The use of a pump brings with it pump noise, and this drives up idle noise levels by around 4dB. 41.5dB is not too terrible for a closed case, and it’s not an insufferable noise, but HTPC users will want to be wary. If anything, this makes a good argument for looking forward to the R9 Nano.

Load Noise Levels - Crysis 3

Because the R9 Fury X starts out a bit loud due to pump noise, the actual noise increase under load is absolutely minuscule. The card tops out at 19% fan speed, 4% (or about 100 RPM) over its default fan speed of 15%. As a result we measure an amazing 43dB under load for a high performance video card, one within spitting distance of NVIDIA’s flagship, itself one of the best air cooled video cards of all time.
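As a rough sanity check on those fan figures, 4 percentage points working out to roughly 100 RPM implies a full-scale fan speed in the neighborhood of 2,500 RPM; the numbers below are inferred from that relationship rather than any published specification.

```python
# 4 percentage points of fan speed ~= 100 RPM, per the measurements above.
rpm_per_percent = 100 / 4                 # ~25 RPM per percentage point (inferred)
full_scale_rpm  = rpm_per_percent * 100   # ~2500 RPM at 100% (rough estimate)

idle_rpm = 15 * rpm_per_percent           # ~375 RPM at the 15% default
load_rpm = 19 * rpm_per_percent           # ~475 RPM at the 19% gaming load

print(full_scale_rpm, idle_rpm, load_rpm)
```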

These results admittedly were not unexpected – one need only look at the R9 295X2 to get an idea of what a CLLC could do for noise – but they are nonetheless extremely impressive. Most midrange cards are louder than this despite offering a fraction of the R9 Fury X’s gaming performance, which puts the R9 Fury X at a whole new level for load noise from a high performance video card.

Load Noise Levels - FurMark

The trend continues under FurMark. The fan speed ramps up quite a bit further here thanks to the immense load from FurMark, but the R9 Fury X still perseveres. 46.7 dB(A) is once again better than a number of mid-range video cards, never mind the other high-end cards in this roundup. The R9 Fury X is dissipating 330W of heat and yet it’s quieter than the GTX 980 at half that heat, and around 6 dB(A) quieter than the 250W GM200 cards.
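For a sense of scale on those decibel gaps, relative sound power follows 10^(ΔdB/10), while the usual rule of thumb for perceived loudness is closer to 2^(ΔdB/10), with 10 dB sounding roughly half as loud. A quick illustration:

```python
def sound_power_ratio(delta_db):
    """Relative sound power for a level difference of delta_db decibels."""
    return 10 ** (delta_db / 10)

def loudness_ratio(delta_db):
    """Rough perceived-loudness ratio (10 dB ~= 'half as loud' rule of thumb)."""
    return 2 ** (delta_db / 10)

# The R9 Fury X measures ~6 dB(A) below the 250W GM200 cards under FurMark.
print(sound_power_ratio(-6))  # ~0.25: about a quarter of the sound power
print(loudness_ratio(-6))     # ~0.66: roughly two-thirds as loud subjectively
```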

There really aren’t enough nice things I can say about the R9 Fury X’s cooler. AMD took the complaints about the R9 290 series to heart, and produced something that wasn’t just better than their previous attempt, but a complete inverse of their earlier strategy. The end result is that the R9 Fury X is very nearly whisper quiet while gaming, and only a bit louder under even the worst case scenario. This is a remarkable change, and one that ears everywhere will appreciate.

That said, the mediocre idle noise showing will undoubtedly dog the R9 Fury X in some situations. For most cases it will not be an issue, but it does close some doors on ultra-quiet setups. The R9 Fury X in that respect is merely very, very quiet.
