Overclocking

Finally, no review of a high-end video card would be complete without a look at overclocking performance.

To get right to the point: overclockers looking for out-of-the-box overclocking headroom are going to come away disappointed. While cooling and power delivery are overbuilt, in other respects the R9 Fury X is very locked down when it comes to overclocking. There is no voltage control at this time (even unofficially), there is no official HBM clockspeed control, and the card’s voltage profile has been finely tuned to avoid supplying the card with more voltage than is necessary. As a result the card has relatively little overclocking potential without voltage adjustments.

So what do we get for overclocking?

Radeon R9 Fury X Overclocking
              Stock               Overclocked
Boost Clock   1050MHz             1125MHz
Memory Clock  1Gbps (500MHz DDR)  1Gbps (500MHz DDR)
Max Voltage   N/A                 N/A

Our efforts net us 75MHz, which is actually 25MHz less than the 100MHz AMD published in their reviewer’s guide. At 100MHz we encountered artifacting in some games, requiring that we step down to 75MHz for a safe and sustainable overclock.

The end result is that the overclocked R9 Fury X runs at 1125MHz core and 1Gbps memory, a 75MHz (7%) increase in the GPU clockspeed and 0% increase in the memory clockspeed. This puts a very narrow window on expected performance gains, as we shouldn’t exceed a 7% gain in any game, and will almost certainly come in below 7% in most games.
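As a sanity check, the 7% figure quoted above follows directly from the clock numbers; a quick sketch:

```python
# Back-of-the-envelope check on the overclock figures quoted above.
stock_core_mhz = 1050
oc_core_mhz = 1125

core_gain_pct = (oc_core_mhz - stock_core_mhz) / stock_core_mhz * 100
print(f"Core: +{oc_core_mhz - stock_core_mhz}MHz ({core_gain_pct:.0f}%)")  # Core: +75MHz (7%)

# The memory clock stays at 1Gbps, so the memory-side gain is 0%.
```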

OC: Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA

Our gaming benchmarks find just that: a few percent improvement here, a 5% improvement there. Overall we wouldn’t go so far as to say there’s no reason to overclock, but with such limited gains it’s hardly worth the trouble right now.

True overclocking is going to have to involve BIOS modding, a riskier and warranty-voiding strategy, but one that should be far more rewarding. With more voltage I have little doubt that R9 Fury X could clock higher, though it’s impossible to guess by how much at this time. In any case the card is certainly built for it, as the oversized cooler, high power delivery capabilities, and dual BIOS switch provide all the components necessary for such an overclocking attempt.

Meanwhile HBM is a completely different matter, and while unofficial overclocking is looking promising, as a new technology it will take some time to get a good feel for it and understand just what kind of performance improvements it can deliver. The R9 Fury X is starting out with quite a bit of memory bandwidth right off the bat (512GB/sec), so it may not be bandwidth starved as often as other cards such as the R9 290X were.
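For context, the 512GB/sec figure falls straight out of HBM’s very wide interface. A quick sketch, assuming the Fury X’s 4096-bit aggregate bus (four 1024-bit HBM stacks) together with the 1Gbps per-pin data rate from the table above:

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate.
bus_width_bits = 4096     # four HBM stacks at 1024 bits each (assumed from the card's specs)
data_rate_gbps = 1.0      # 500MHz DDR = 1Gbps per pin, per the table above

bandwidth_gb_per_s = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gb_per_s:.0f}GB/sec")  # 512GB/sec
```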


  • looncraz - Friday, July 3, 2015 - link

    We don't yet know how the Fury X will overclock with unlocked voltages.

    SLI is almost as unreliable as CF, ever peruse the forums? That, and quite often you can get profiles from the wild wired web well before the companies release their support - especially on AMD's side.
  • chizow - Friday, July 3, 2015 - link

    @looncraz

    We do know Fury X is an exceptionally poor overclocker at stock and already uses more power than the competition. Whose fault is it that we don't have proper overclocking capabilities when AMD was the one who publicly claimed this card was an "Overclocker's Dream?" Maybe they meant you could Overclock it, in your Dreams?

    SLI is not as unreliable as CF, Nvidia actually offers timely updates on Day 1 and works with the developers to implement SLI support. In cases where there isn't a Day 1 profile, SLI has always provided more granular control over SLI profile bits vs. AMD's black box approach of a loadable binary, or wholesale game profile copies (which can break other things, like AA compatibility bits).
  • silverblue - Friday, July 3, 2015 - link

    No, he did actually mention the 980Ti's excellent overclocking ability. Conversely, at no point did he mention Fury X's overclocking ability, presumably because there isn't any.
  • Refuge - Friday, July 3, 2015 - link

    He does mention it, and does say that it isn't really possible until modified BIOSes with unlocked voltages become available.
  • e36Jeff - Thursday, July 2, 2015 - link

    First off, it's 81W, not 120W (467W - 386W). Second, unless you are running FurMark as your screensaver, it's pretty irrelevant. It merely serves to demonstrate the maximum amount of power the GPU is allowed to use (and given that the 980 Ti's is 1W less than in gaming, it indicates it is being artificially limited because it knows it's running FurMark).

    The important power number is the in game power usage, where the gap is 20W.
  • Ryan Smith - Thursday, July 2, 2015 - link

    There is no "artificial" limiting on the GTX 980 Ti in FurMark. The card has a 250W limit, and it tends to hit it in both games and FurMark. Unlike the R9 Fury X, NVIDIA did not build in a bunch of thermal/electrical headroom in to the reference design.
  • kn00tcn - Thursday, July 2, 2015 - link

    because furmark is normal usage right!? hbm magically lowers the gpu core's power right!? wtf is wrong with you
  • nandnandnand - Thursday, July 2, 2015 - link

    AMD's Fury X has failed. 980 Ti is simply better.

    In 2016 NVIDIA will ship GPUs with HBM version 2.0, which will have greater bandwidth and capacity than these HBM cards. AMD will be truly dead.
  • looncraz - Friday, July 3, 2015 - link

    You do realize HBM was designed by AMD with Hynix, right? That is why AMD got first dibs.

    Want to see that kind of innovation again in the future? You best hope AMD sticks around, because they're the only ones innovating at all.

    nVidia is like Apple, they're good at making pretty looking products and throwing the best of what others created into making it work well, then they throw their software into the mix and call it a premium product.

    Intel hasn't innovated on the CPU front since the advent of the Pentium 4. Core * CPUs are derived from the Pentium M, which was derived from the Pentium Pro.
  • Kutark - Friday, July 3, 2015 - link

    Man you are pegging the hipster meter BIG TIME. Get serious. "Intel hasn't innovated on the CPU front since the advent of the Pentium 4..." That has to be THE dumbest shit I've read in a long time.

    Say what you will about nvidia, but maxwell is a pristinely engineered chip.

    While i agree with you that AMD sticking around is good, you can't be pissed at nvidia if they become a monopoly because AMD just can't resist buying tickets on the fail train...
