Final Words

Bringing this review to a close, AMD has certainly thrown a great deal at us with the Radeon R9 Fury X. After the company’s stumble with their last single-GPU flagship, the Radeon R9 290X, they have reevaluated what they want to do, how they want to build their cards, and what kind of performance they want to aim for. As a result the R9 Fury X is an immensely interesting card.

From a technical perspective AMD has done a ton right here. The build quality is excellent, the load acoustic performance is unrivaled, and the performance is great. Meanwhile, although the ultimate value of High Bandwidth Memory to the buyer is only as great as the card’s performance, from a hobbyist perspective I am excited for what it means for future cards. The massive bandwidth improvements, the power savings, and the space savings have all done wonderful things for the R9 Fury X, and will do so for other cards in the future as well.

Compared to the R9 290X then, AMD has done virtually everything they needed to do in order to right what was wrong, and to compete with an NVIDIA energized by the GTX Titan and the Maxwell architecture. As self-admittedly one of the harshest critics of the R9 290X and R9 290 due to the 290 series’ poor reference acoustic performance, I believe AMD has built an amazing card with the R9 Fury X. I dare say AMD finally “gets it” on card quality and so much more.

Had this card launched against the GTX Titan X a couple of months ago, we would be talking today about how AMD doesn’t quite dethrone the NVIDIA flagship, but instead serves as a massive spoiler, delivering so much of the GTX Titan X’s performance for a fraction of the cost. But, unfortunately for AMD, this is not what has happened. The competition for the R9 Fury X is not an overpriced GTX Titan X, but a well-priced GTX 980 Ti, which to add insult to injury launched first, even though it was in all likelihood NVIDIA’s reaction to the R9 Fury X.

The problem for AMD is that the R9 Fury X is only 90% of the way there, and without a price spoiler effect the R9 Fury X doesn’t go quite far enough. At 4K it trails the GTX 980 Ti by 4%, which is to say that AMD could not manage even a strict tie, let alone take the lead. To be fair to AMD, a 4% difference in absolute terms is unlikely to matter in the long run, and for most practical purposes the R9 Fury X is a viable alternative to the GTX 980 Ti at 4K. Nonetheless it does technically trail the GTX 980 Ti here, and that’s not the only issue that dogs such a capable card.

At 2560x1440 the card loses its status as a viable alternative. AMD’s performance deficit is over 10% at this point, and as we’ve seen in a couple of our games, AMD is hitting some very real CPU bottlenecking even on our high-end system. This bottlenecking only shows up at lower resolutions, thanks to the high framerates those resolutions afford, and it’s not a problem for 60Hz monitors. At the same time, however, AMD is also promoting 2560x1440@144Hz FreeSync monitors, and these CPU bottlenecking issues greatly undercut that push.

The bigger issue, I suppose, is that while the R9 Fury X is very fast, I don’t feel we’ve reached the point where 4K gaming on a single GPU is the best way to go; too often we still need to cut back on image quality to reach playable framerates. 4K is arguably still the domain of multi-GPU setups; meanwhile cards like the R9 Fury X and GTX 980 Ti are excellent cards for 2560x1440 gaming, or even 1080p gaming for owners who want to take advantage of the image quality improvements from Virtual Super Resolution.

The last issue that dogs AMD here is VRAM capacity. At the end of the day, first-generation HBM limits them to 4GB of VRAM, and while they’ve made a solid effort to work around the problem, there is only so much they can do. 4GB is enough right now, but I am concerned that R9 Fury X owners will run into VRAM capacity issues before the card is due for a replacement, even under an accelerated 2-year replacement schedule.

Once you get to a straight-up comparison, the problem AMD faces is that the GTX 980 Ti is the safer bet. On average it performs better at every resolution, it has more VRAM, it consumes a bit less power, and NVIDIA’s drivers are lean enough that we aren’t seeing CPU bottlenecking that would impact owners of 144Hz displays. To that end the R9 Fury X is by no means a bad card – in fact it’s quite a good card – but NVIDIA struck first and struck with a slightly better card, and this is the situation AMD must face. At the end of the day one could do just fine with the R9 Fury X, it’s just not what I believe to be the best card at $649.

With that said, the R9 Fury X does have some advantages that, at least in comparing reference cards to reference cards, NVIDIA cannot touch, and these advantages give the R9 Fury X a great niche to reside in. The acoustic performance is absolutely amazing, and while it’s not enough to overcome some of the card’s other issues overall, if you absolutely must have the lowest load noise possible from a reference card, the R9 Fury X should easily impress you. I doubt that even the forthcoming R9 Nano can match what AMD has done with the R9 Fury X in this respect. Meanwhile, although the radiator does present its own challenges, the smaller size of the card should be a boon to small system builders who need something a bit different from standard 10.5” cards. Throw a couple of these into a Micro-ATX SFF PC, and it will be the PSU, not the video cards, that becomes your biggest concern.

Ultimately I believe AMD deserves every bit of credit they get for the R9 Fury X. They have put together a solid card that shows an impressive improvement over what they gave us 2 years ago with the R9 290X. With that said, as someone who would like to see AMD succeed and prosper, the fact that they get so close only to be outmaneuvered by NVIDIA once again makes the current situation all the more painful; it’s one thing to lose to NVIDIA by feet, but losing by inches only reminds you of just how close they got, how they almost upset NVIDIA. At the end of the day I think AMD can at least take home credit for forcing the GTX 980 Ti into existence, which has benefited the wider hobbyist community. Still, looking at AMD’s situation I can’t help but wonder what happens from here, as it seems like AMD badly needed a win they won’t quite get.

Finally, with the launch of the R9 Fury X behind us, it’s time to turn our gaze towards the future, the very near future. The R9 Fury X’s younger sibling, the R9 Fury, launches in 2 weeks. Though certainly slower by virtue of its cut-down Fiji GPU, it is also $100 cheaper and features a more traditional air-cooled design. With NVIDIA still selling the 4GB GTX 980 for $500, the playing field is going to be much different below the R9 Fury X, so I am curious to see just how things shape up on the 14th.

Comments

  • looncraz - Friday, July 3, 2015

    We don't yet know how the Fury X will overclock with unlocked voltages.

    SLI is almost just as unreliable as CF, ever peruse the forums? That, and quite often you can get profiles from the wild wired web well before the companies release their support - especially on AMD's side.
  • chizow - Friday, July 3, 2015

    @looncraz

    We do know Fury X is an exceptionally poor overclocker at stock and already uses more power than the competition. Whose fault is it that we don't have proper overclocking capabilities when AMD was the one who publicly claimed this card was an "Overclocker's Dream"? Maybe they meant you could Overclock it, in your Dreams?

    SLI is not as unreliable as CF; Nvidia actually offers timely updates on Day 1 and works with the developers to implement SLI support. In cases where there isn't a Day 1 profile, SLI has always provided more granular control over SLI profile bits vs. AMD's black box approach of a loadable binary, or wholesale game profile copies (which can break other things, like AA compatibility bits).
  • silverblue - Friday, July 3, 2015

    No, he did actually mention the 980 Ti's excellent overclocking ability. Conversely, at no point did he mention Fury X's overclocking ability, presumably because there isn't any.
  • Refuge - Friday, July 3, 2015

    He does mention it, and does say that it isn't really possible until they get modified BIOSes with unlocked voltages.
  • e36Jeff - Thursday, July 2, 2015

    First off, it's 81W, not 120W (467-386). Second, unless you are running FurMark as your screen saver, it's pretty irrelevant. It merely serves to demonstrate the maximum amount of power the GPU is allowed to use (and given that the 980 Ti's is 1W less than in gaming, it indicates it is being artificially limited because it knows it's running FurMark).

    The important power number is the in game power usage, where the gap is 20W.
  • Ryan Smith - Thursday, July 2, 2015

    There is no "artificial" limiting on the GTX 980 Ti in FurMark. The card has a 250W limit, and it tends to hit it in both games and FurMark. Unlike the R9 Fury X, NVIDIA did not build a bunch of thermal/electrical headroom into the reference design.
  • kn00tcn - Thursday, July 2, 2015

    Because FurMark is normal usage, right!? HBM magically lowers the GPU core's power, right!? WTF is wrong with you?
  • nandnandnand - Thursday, July 2, 2015

    AMD's Fury X has failed. 980 Ti is simply better.

    In 2016 NVIDIA will ship GPUs with HBM version 2.0, which will have greater bandwidth and capacity than these HBM cards. AMD will be truly dead.
  • looncraz - Friday, July 3, 2015

    You do realize HBM was designed by AMD with Hynix, right? That is why AMD got first dibs.

    Want to see that kind of innovation again in the future? You best hope AMD sticks around, because they're the only ones innovating at all.

    nVidia is like Apple, they're good at making pretty looking products and throwing the best of what others created into making it work well, then they throw their software into the mix and call it a premium product.

    Intel hasn't innovated on the CPU front since the advent of the Pentium 4. Core * CPUs are derived from the Pentium M, which was derived from the Pentium Pro.
  • Kutark - Friday, July 3, 2015

    Man you are pegging the hipster meter BIG TIME. Get serious. "Intel hasn't innovated on the CPU front since the advent of the Pentium 4..." That has to be THE dumbest shit I've read in a long time.

    Say what you will about NVIDIA, but Maxwell is a pristinely engineered chip.

    While I agree with you that AMD sticking around is good, you can't be pissed at NVIDIA if they become a monopoly because AMD just can't resist buying tickets on the fail train...
