Meet The Radeon R9 Fury X

Right off the bat it is safe to say that we have not seen a high-end card quite like the R9 Fury X before. We have seen small cards before, we have seen liquid cooled cards before, and we have seen well-crafted cards before. But none of those attributes have come together before in a single card like they have with the R9 Fury X. AMD has spared no expense on their new single-GPU flagship and it shows.

If we had to describe the R9 Fury X in a nutshell, the only thing that would do it any justice is a very small card with a very big radiator. Liquid cooling is not by any means new (see: R9 295X2), but thanks to the space savings of HBM there hasn’t been a card before that has paired up a small board with a large radiator in this fashion. It is, arguably, a card designed for an entirely different paradigm than the dual-slot air cooled cards we have come to know over the last decade.

But first, let’s talk about build quality and naming. To cut right to the chase, NVIDIA’s initial GTX Titan project saw a large and unexpected amount of success in 2013. Named in honor of the Titan supercomputer, NVIDIA’s first major supercomputer win for their Tesla business, the GTX Titan was created to serve as a new high-end card line for the company. In the process the GTX Titan created for NVIDIA what amounted to a luxury category for a video card, priced at an unprecedented $1000, but at the same time introducing a level of build quality and performance for a blower-type air cooler that is unmatched to this day. By all reports the GTX Titan handily exceeded NVIDIA’s expectations, and though expensive, it established the viability of a higher quality video card.

With the R9 Fury X, AMD is looking to establish the same kind of luxury brand and image for their products. Similar to the Titan, AMD has dropped the number from the product name (though formally we still consider it part of the 300 series), opting instead to brand the product Fury. AMD has not been very clear on the rationale for the Fury name, and while they do not have a high-profile supercomputer to draw a name from, they do have other sources. Depending on what you want to believe, the name is either a throwback to AMD’s pre-Radeon (late 1990s) video card lineup, the ATI Rage Fury family, or a dig at NVIDIA’s Titan branding: in Greek mythology the Furies were deities of vengeance who had an interesting relationship with the Greek Titans, one that is completely unsuitable for publication in an all-ages technical magazine.

In any case, what isn’t in doubt is the quality of the R9 Fury X. AMD has opted to build the card to a similar luxury standard as the GTX Titan, with a focus on both the card itself and its somewhat disembodied radiator. For the R9 Fury X AMD has gone with metal: the card proper is composed of a multi-piece die-cast aluminum body that is built around the card’s PCB. While you won’t find any polycarbonate windows here (the pump isn’t exactly eye-catching), what you will find is soft-touch rubber along the front face of the card and its edges. Meanwhile on the back side of the card there is another soft-touch panel to serve as a backplate. Since the card lacks a fan of any kind, there are no cooling concerns here; with or without a backplate, putting R9 Fury X cards side-by-side doesn’t impact cooling in any meaningful manner.

Ultimately at this point AMD and NVIDIA are basically taking cues from the cell phone industry on design, and the end result is a card that looks good, and yes, even feels good in-hand. The R9 Fury X has a very distinctive silhouette that should be easily noticeable in any open or windowed case, and in case that fails, you would be hard-pressed to miss the lighting. The card features multiple LED-lit elements; the first is the Radeon logo, which lights up red when the card is powered on, while directly next to the PCIe power connectors is a bank of LEDs. These LEDs are designed to indicate the load on the card, with 8 LEDs for load and a final, 9th LED that indicates whether the card has gone into AMD’s ultra-deep sleep ZeroCore Power mode. The 8 load LEDs can even be configured for red or blue operation to color-coordinate with the rest of a case; the only color you can’t select is green, in theory because green is reserved for the sleep LED, though in practice I think it’s safe to point out that green is not AMD’s color…

Moving on, let’s talk about the R9 Fury X’s cooling apparatus: its closed loop liquid cooling system and its rather large radiator. One of AMD’s few success stories in 2014 was the R9 295X2, the company’s dual GPU Hawaii card. After numerous struggles with air cooling for both single and dual GPU cards – R9 290X, Radeon HD 6990, Radeon HD 7990 – and facing a 500W TDP for their dual-Hawaii card, AMD ditched straight air cooling in favor of a closed loop liquid cooler. Though the cooler AMD used was pushed to its limit by the high TDP of the R9 295X2, at the end of the day it did its job and did it well, effectively cooling the card while delivering acoustic performance simply unheard of for a reference dual GPU card.

For the R9 Fury X AMD needs to dissipate around 275W of heat. AMD’s reference cooler for the R9 290X was simply not very good, and even NVIDIA would be hard pressed to dissipate that much heat, as their Titan cooler is optimized for around 250W (and things quickly get worse when pushed too far past that). As a result AMD opted to once again go the closed loop liquid cooling (CLLC) route for the R9 Fury X, taking what they learned from the R9 295X2 and improving upon it.

The end result is a bigger, better CLLC that is the only source of cooling for the card. The R9 295X2 used a fan for the VRMs and VRAM, but for the R9 Fury X AMD has slaved everything to the CLLC. The CLLC in turn is even bigger than before; it’s still based around a 120mm radiator, but the combination of radiator and fan is now 60mm thick, and in all honesty about the thickest assembly we would expect to fit in our closed case testbed. As a result the radiator has an immense 500W rated cooling capacity, far outstripping the card’s 275W TBP, and without a doubt making it overkill for the R9 Fury X.
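
For a rough sense of how much margin that rated capacity leaves, here is a minimal arithmetic sketch in Python; the 500W and 275W figures are AMD’s published numbers, while the headroom and utilization math is our own back-of-the-envelope work:

    # Rough thermal headroom estimate for the R9 Fury X's CLLC.
    # Figures from AMD: 500W rated radiator capacity, 275W typical board power.
    RADIATOR_CAPACITY_W = 500
    TYPICAL_BOARD_POWER_W = 275

    headroom_w = RADIATOR_CAPACITY_W - TYPICAL_BOARD_POWER_W
    utilization = TYPICAL_BOARD_POWER_W / RADIATOR_CAPACITY_W

    print(f"Headroom: {headroom_w} W")        # 225 W of spare cooling capacity
    print(f"Utilization: {utilization:.0%}")  # the radiator is only ~55% loaded at typical power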

Cracking open the R9 Fury X’s housing to check out the pump block shows us that it is built by Cooler Master. Furthermore, in order to have the CLLC cool the GPU and its supporting discrete components, there is an interesting metal pipe in the loop, which serves as a liquid cooled heatpipe of sorts to draw heat away from the MOSFETs used in the card’s VRM setup. Otherwise the GPU and the HBM stacks situated next to it are covered by the pump block itself.

AMD’s official rating for the R9 Fury X’s cooling apparatus is that it should keep the card at 50C while playing games. In practice, what we have found in our closed case test bed is that the GPU temperature rises to 65C, at which point the CLLC fan ramps up very slightly (by about another 100RPM) and the card reaches equilibrium. The low operating temperature of the Fiji GPU is not only a feather in AMD’s cap, but an important part of the design of the card, as the low temperature keeps leakage power consumption down and improves AMD’s overall energy efficiency.

Meanwhile from an operational standpoint, unlike the R9 295X2, where the CLLC acted independently based on the temperature of the liquid, the R9 Fury X’s CLLC is slaved to the fan controls of the card. As a result it’s possible for the card (and users) to directly control the fan speed based on GPU/VRM temperatures. The independent CLLC was not a direct problem for the R9 295X2, but with the CLLC of the R9 Fury X now responsible for the VRMs as well, letting the card control the fan is a must.
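
To illustrate the difference in control schemes, here is a hypothetical GPU-temperature-slaved fan curve in Python. This is purely an illustrative sketch, not AMD’s firmware logic; the idle and maximum fan speeds and the 90C ceiling are assumptions, while the 50C and 65C figures come from AMD’s rating and our own testing respectively:

    # Illustrative sketch of a fan curve keyed to GPU temperature (not AMD's actual firmware).
    # The Fury X's CLLC fan responds to GPU/VRM temperatures rather than coolant temperature.
    IDLE_RPM = 1000     # assumed idle fan speed
    MAX_RPM = 2000      # assumed maximum fan speed
    RAMP_START_C = 50   # AMD's rated gaming temperature
    CEILING_C = 90      # assumed temperature at which the fan would reach full speed

    def fan_rpm(gpu_temp_c: float) -> int:
        """Linear ramp from idle speed at 50C to full speed at the assumed 90C ceiling."""
        if gpu_temp_c <= RAMP_START_C:
            return IDLE_RPM
        fraction = min((gpu_temp_c - RAMP_START_C) / (CEILING_C - RAMP_START_C), 1.0)
        return int(IDLE_RPM + fraction * (MAX_RPM - IDLE_RPM))

    # 1375 RPM in this sketch; the real card ramped only ~100 RPM at 65C, so AMD's actual curve is gentler.
    print(fan_rpm(65))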

Overall the acoustic performance of the R9 Fury X is unprecedented for a high-end card, as we’ll see in our benchmark section. Unfortunately the CLLC does have one drawback, and that is idle noise. Just as with the R9 295X2, there’s no such thing as a free lunch when it comes to moving around coolant, and as a result the pump makes more noise at idle than what you’d find on an air cooled card, be it blower or open air.

Moving on, let’s get back to the card itself. The R9 Fury X ships with an official Typical Board Power (TBP) of 275W, which is intended to represent the amount of power it will consume during an average gaming session. That said, the power delivery circuitry of the card is designed to deliver quite a bit more energy than that, as the card features a pair of 8-pin PCIe power sockets and has an official power delivery rating of 375W. And although AMD doesn’t specify a board limit in watts, based on empirical testing we were able to get up to an estimated 330W. As a result the card has plenty of power and thermal headroom if desired.
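
The 375W rating lines up with the standard PCIe power budget of 75W from the slot plus 150W per 8-pin connector; a quick sketch of that arithmetic, with our ~330W empirical estimate alongside for comparison:

    # Power delivery budget for the R9 Fury X, per standard PCIe power allowances.
    PCIE_SLOT_W = 75       # power available from the PCIe x16 slot
    EIGHT_PIN_W = 150      # power per 8-pin PCIe connector
    NUM_EIGHT_PIN = 2

    delivery_rating_w = PCIE_SLOT_W + NUM_EIGHT_PIN * EIGHT_PIN_W  # 375 W official rating
    typical_board_power_w = 275
    observed_peak_w = 330  # our empirical estimate from testing

    print(delivery_rating_w - typical_board_power_w)  # 100 W of headroom over the typical board power
    print(delivery_rating_w - observed_peak_w)        # ~45 W beyond the highest draw we measured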

Meanwhile the card’s VRM setup is a 6-phase design. AMD tells us that these VRMs can handle up to 400A, no doubt helped by the liquid cooling they receive. AMD’s out of the box overclocking support is limited – no voltage and no HBM clockspeed controls, only the power limit and GPU clockspeed – but it’s clear that this card was built to take some significant overclocking. To that end there is a dual BIOS switch present that can switch between a programmable BIOS and a fixed reference BIOS, and I can only imagine that AMD is expecting hardcore overclockers to look into BIOS modification to gain the necessary power and voltage controls for more extreme overclocking.
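
As a rough illustration of why that VRM rating leaves so much room, a quick sketch; the 400A figure is AMD’s, while the ~1.2V core voltage is our assumption of a typical GPU operating voltage, not a published specification:

    # Rough upper bound on VRM power delivery, assuming a typical ~1.2V GPU core voltage (our assumption).
    VRM_CURRENT_LIMIT_A = 400      # AMD's stated current capability
    ASSUMED_CORE_VOLTAGE_V = 1.2   # illustrative; actual voltage varies with clocks and binning

    vrm_power_ceiling_w = VRM_CURRENT_LIMIT_A * ASSUMED_CORE_VOLTAGE_V
    print(vrm_power_ceiling_w)     # ~480 W, well beyond the card's 275 W typical board power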

Taking a look at the naked PCB, it’s remarkable how small it is. The PCB measures just 7.5” in length, 3” shorter than the standard 10.5” found on most high-end reference cards. This smaller size is primarily enabled through the use of HBM, which brings the card’s VRAM on-package and uses relatively tiny stacks of memory as opposed to the large cluster of 16 GDDR5 chips required for the R9 290X. HBM also reduces the number of discrete components required for the power delivery system, as HBM has much simpler power delivery requirements. The end result is that a good deal of what would otherwise be board circuitry now resides on the Fiji package itself, with limited amounts of supporting circuitry forming the rest of the board.


Top: R9 Fury X. Bottom: R9 390X/290X

To show this off, AMD let us take pictures of a bare R9 Fury X PCB next to an R9 290X PCB, and the difference is simply staggering. As a result the R9 Fury X is going to enjoy an interesting niche as a compact, high-performance card. The large radiator does invite certain challenges, but I expect OEM system builders are going to have a fun time designing some Micro-ATX sized SFF PCs around this card.

Switching gears, the display I/O situation is an interesting one. Without a fan on the card itself there is no need for vents on the I/O bracket, and as a result we get a solid metal bracket with holes punched out for the card’s 4 display outputs. Here we find 3x DisplayPort 1.2a ports and a single HDMI 1.4a port. What you will not find is a DL-DVI port; after going down to 1 port on the Radeon HD 7970 and back up to 2 ports on the Radeon R9 290X, AMD has eliminated the port entirely. Physically, I believe this comes down to the fact that the size of the card combined with the pump doesn’t leave room for a DVI port on the second row, though I will also note that AMD announced a few years ago that they were targeting a 2015 date to begin removing DVI ports.

What this means is that the R9 Fury X can natively only drive newer DisplayPort and HDMI equipped monitors. While DVI-only monitors are rare in 2015 (essentially everything has an HDMI port), owners of DVI-only monitors will be in an interesting situation. With DisplayPort AMD has plenty of flexibility – it can essentially be converted to anything with the right adapter – so owners of DVI-only monitors will need either a cheap passive DP-to-SL-DVI adapter for single link DVI monitors, or a more expensive active DP-to-DL-DVI adapter for dual link DVI monitors. Cutting off DVI users was always going to be hard and the R9 Fury X doesn’t make it any easier, but on the other hand there’s no getting around the fact that the DVI connector is large, outdated, and space-inefficient in 2015.
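
To make the adapter situation concrete, here is a small decision helper; the categories mirror the paragraph above, and the cost characterizations are generalizations rather than references to specific products:

    # Which connection path a monitor owner needs with the Fury X's DisplayPort/HDMI-only output.
    def connection_path(monitor_input: str) -> str:
        paths = {
            "displayport": "native DisplayPort 1.2a",
            "hdmi": "native HDMI 1.4a",
            "sl-dvi": "inexpensive passive DisplayPort-to-SL-DVI adapter",
            "dl-dvi": "more expensive active DisplayPort-to-DL-DVI adapter",
        }
        return paths.get(monitor_input.lower(), "unsupported without additional hardware")

    print(connection_path("dl-dvi"))  # dual link DVI monitors need the pricier active adapter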

Wrapping things up, as far as build quality goes the R9 Fury X is without a doubt the best designed reference Radeon card to date. AMD has learned quite a bit from the R9 290X, and while there is a balance of factors in play here, there is no question that AMD has addressed the build quality issues of their past reference cards in a big way. We have moved on from the late 2000s and early 2010s, and the days of cards like the GTX 280, GTX 480, Radeon HD 6990, and Radeon R9 290X should be behind us. PC video cards, even high-end cards, do not and should not need to be that noisy ever again.

With that said, I do have some general concerns about the fact that the only cards to ship with a high-clocked, fully-enabled Fiji GPU will be liquid cooled. Until now air cooling has always been the baseline and liquid cooling the niche alternative, and while I strongly favor quieter cards, there is nonetheless the question of what it means when AMD tells us that the R9 Fury X will only be available with liquid cooling. After years of the limits of air cooling putting a lid on GPU power consumption, will AMD’s switch to a CLLC usher in a new power war, where everyone is able to once again ramp up power consumption thanks to the greater cooling capabilities of a CLLC?

In the meantime AMD considers the CLLC to be an advantage for them, not just for technical reasons but for marketing reasons as well. The bill of materials cost of the CLLC is quite high – likely around $80 even in volume for AMD – so don’t be surprised if AMD includes that in their cost calculus when promoting the card. They spent the money on a more effective cooler, and they want buyers to know that this is a cost baked into the base price of the card.

That said, it’s important here to note that this is nothing NVIDIA couldn’t replicate if they really wanted to. Their partners already sell CLLC cards as premium cards, which means AMD has to tread carefully here as NVIDIA could easily go CLLC at a lower price and erase AMD’s advantage, though not without taking a hit to their margins (something I suspect AMD would be just fine with).

Finally, as we’ve already hashed out the implications of 4GB of VRAM a few pages back when addressing the 4GB question, we won’t repeat ourselves at length here. But in summary, while 4GB of VRAM is enough for now, it is only just enough. The R9 Fury X is likely to face VRAM pressure in under 2 years.

458 Comments

  • D. Lister - Thursday, July 2, 2015

    "AMD had tessellation years before nVidia, but it went unused until DX11, by which time nVidia knew AMD's capabilities and intentionally designed a way to stay ahead in tessellation. AMD's own technology being used against it only because it released it so early. HBM, I fear, will be another example of this. AMD helped to develop HBM and interposer technologies and used them first, but I bet nVidia will benefit most from them."

    AMD is often first at announcing features. Nvidia is often first at implementing them properly. It is clever marketing vs clever engineering. At the end of the day, one gets more customers than the other.
  • sabrewings - Thursday, July 2, 2015

    While you're right that Nvidia paid for the chips used in 980 Tis, they're still most likely not fit for Titan X use and are cut down to remove the underperforming sections. Without really knowing what their GM200 yields are like, I'd be willing to bet the $1000 price of the Titan X was already paying for the 980 Ti chips. So, Nvidia gets to play with binned chips to sell at $650 while AMD has to rely on fully-enabled chips added to an expensive interposer with more expensive memory and a more expensive cooling solution to meet the same price point for performance. Nvidia definitely forced AMD into a corner here, so as I said I would say they won.

    Though, I don't necessarily say that AMD lost; they just made it look much harder to do what Nvidia was already doing, and Nvidia is making beaucoup cash at that. This only makes AMD's problems worse, as they won't get the volume to gain marketshare and they're not hitting the margins needed to heavily reinvest in R&D for the next round.
  • Kutark - Friday, July 3, 2015

    So basically what you're saying is Nvidia is a better run company with smarter people working there.
  • squngy - Friday, July 3, 2015

    "and they cost more per chip to produce than AMD's Fiji GPU."

    Unless AMD has a genie making it for them, that's impossible.
    Not only is Fiji larger, it also uses a totally new technology (HBM).
  • JumpingJack - Saturday, July 4, 2015

    "AMD had tessellation years before nVidia, but it went unused until DX11, by which time nVidia knew AMD's capabilities and intentionally designed a way to stay ahead in tessellation. AMD's own technology being used against it only because it released it so early. HBM, I fear, will be another example of this. AMD helped to develop HBM and interposer technologies and used them first, but I bet nVidia will benefit most from them."

    AMD fanboys make it sound like AMD can actually walk on water. AMD did work with Hynix, but the magic of HBM comes from the density of die stacking, a part in which AMD did nothing (they are no longer an actual chipmaker, as you probably know). As for interposers, this is not new technology; interposers are a well established technique for condensing an array of devices into one package.

    AMD deserves credit for bringing the technology to market, no doubt, but their actual IP contribution is quite small.
  • ianmills - Thursday, July 2, 2015

    Good that you are feeling better Ryan and thanks for the review :)
    That being said, AnandTech needs to keep us better informed when things come up.... The way this site handled it, though, is gonna lose this site readers...
  • Kristian Vättö - Thursday, July 2, 2015

    Ryan tweeted about the Fiji schedule several times and we were also open about it in the comments whenever someone asked, even though it wasn't relevant to the article in question. It's not like we were secretive about it, and I think a full article about an article delay would be a little overkill.
  • sabrewings - Thursday, July 2, 2015

    Those tweets are even featured on the site in the side bar. Not sure how much clearer it could get without an article about a delayed article.
  • testbug00 - Sunday, July 5, 2015

    Pipeline story... Dunno title, but, for text, explain it there. Have a link to THG as owned by same company now if readers want to read a review immediately.

    Twitter is non-ideal.
  • funkforce - Monday, July 6, 2015

    The problem isn't only with the delays, it is that since Ryan took over as Editor in Chief I suspect his workload is too large.
    Because this also happened with the Nvidia GTX 960 review. He told 5-6 people (including me) for 5 weeks that it would come, and then it didn’t, and he stopped responding to inquiries about it.
    Now in what way is that a good way to build a good relationship and trust between you and your readers?
    I love Ryan's writing, this article was one of the best I've read in a long time. But not everyone is good at everything, maybe Ryan needs to focus on only GPU reviews and not running the site or whatever his other responsibilities are as Edit. in Chief.

    Because the Reviews are what most ppl. come here for and what built this site. You guys are amazing, but AT never used to miss releasing articles the same day NDA was lifted in the past that I can remember. And promising things and then not delivering, sticking your head in the sand and not even apologizing isn’t a way to build up trust and uphold and strengthen the large following this site has.

    I love this site, been reading it since the 1st year it came out, and that's why I care and I want you to continue and prosper.
    Since a lot of ppl. can’t read the twitter feed, what you did here: http://www.anandtech.com/show/8923/nvidia-launches...
    Is the way to go if something comes up, but then you have to deliver on your promises.
