Final Words

Bringing this review to a close, when I first heard that AMD was going to build a full-performance dual Hawaii GPU solution, I was admittedly unsure what to expect. The power requirements of a dual Hawaii card would pose an interesting set of challenges for AMD, and AMD’s most recent high-end air coolers were not as effective as they should have been.

In that context AMD’s decision to build a card around a closed loop liquid cooling solution makes a lot of sense for what they wanted to achieve. Like any uncommon cooling solution, the semi-exotic nature of a CLLC is a double-edged sword that brings with it both benefits and drawbacks: the benefits being enhanced cooling performance, and the drawbacks being complexity and size. So it was clear from the start that, given AMD’s goals and their chips, the benefits they stood to gain could very well outweigh the drawbacks of going so far off the beaten path.

To that end the Radeon R9 295X2 is a beast, in every sense of the word. From a performance standpoint AMD has delivered on their goal of offering the full, unthrottled performance of a Radeon R9 290X Crossfire solution. AMD has called the 295X2 an “uncompromised” card, and that’s exactly what they have put together, making absolutely no performance compromises in putting a pair of Hawaii GPUs onto a single video card. In a sense it’s almost too simple – there are no real edge cases or other performance bottlenecks to discuss – but then again that’s exactly what it means to offer uncompromised performance.

“Beastly” is just as fitting when it comes to the card’s cooling. With a maximum noise level of 50dB the 295X2’s CLLC is unlike anything we’ve reviewed before, offering acoustic performance as good as or better than some of the best high-end cards of this generation despite the heavy cooling workload such a product calls for. That brings us to the other beastly aspect: the card’s 500W TDP. AMD has put together a card that can put out 500W of heat and still keep itself cooled, but there’s no getting around the fact that at half a kilowatt of power consumption the 295X2 draws more power than any other single card we’ve reviewed.

Taken altogether this puts the 295X2 in a very interesting spot. The performance offered by the 295X2 is the same performance offered by the 290X in Crossfire, no more and no less. This means that depending on whether we’re looking at 2K or 4K resolutions, the 295X2 either trails a cheaper set of GTX 780 Tis in SLI by 5%, or, at the kinds of resolutions that most require this much performance, exceeds those very same GeForce cards by 5%. On the other hand, NVIDIA still holds a consistent edge over AMD in frame pacing. But thanks to their XDMA engine, AMD's frame pacing is vastly improved compared to their prior dual-GPU cards and is now good enough overall (though there's definitely room for further improvement).

But more significantly, by its very nature as a CLLC equipped dual-GPU video card the 295X2 stands alone among current video cards. There’s nothing else like it in terms of design, and that admittedly makes it difficult to properly place the 295X2 in reference to other video cards. Do we talk about how it’s one of only a handful of dual-GPU cards? Or do we talk about the price? Or do we talk about the unconventional cooler?

However, perhaps it’s best to frame the 295X2 with respect to its competition, or rather the lack thereof. For all the benefits and drawbacks of AMD’s card, perhaps the most unexpected thing they have going for them is that they won’t be facing any real competition from NVIDIA. NVIDIA has announced their own dual-GPU card for later this month, the GeForce GTX Titan Z, but priced at $3000 and targeted more heavily at compute users than at gamers, the GTX Titan Z is going to reside in its own little niche, leaving the 295X2 alone in the market at half the price. We’ll see what GTX Titan Z brings to the table later this month, but no matter what, AMD is going to have an incredible edge on price that we expect will make most potential buyers think twice, despite the 295X2’s own $1500 price tag.

Ultimately, while this outcome does put the 295X2 in something of a “winner by default” position, it does not change the fact that AMD has put together a very solid card, and by far their best dual-GPU card yet. Between the price tag and the unconventional cooler it’s certainly a departure from the norm, but for those buyers who can afford and fit this beastly card, it sets a new and very high standard for just what a dual-GPU card should do.

131 Comments

  • mpdugas - Wednesday, April 9, 2014 - link

    Time for two power supplies in this kind of build...
  • rikm - Wednesday, April 9, 2014 - link

    huh?, no giveaway? why do I read this stuff?
    ok, seriously, love these reviews, but the thing I never understand is when they say Titan is better, but the charts seem to say the opposite, at least for compute.
  • lanskywalker - Wednesday, April 9, 2014 - link

    That card is a sexy beast.
  • jimjamjamie - Thursday, April 10, 2014 - link

    Great effort from AMD, I wish they would focus on efficiency though - I feel with the changing computing climate and the shift to mobile that power-hungry components should be niche, not the norm.

    Basically, a dual-750ti card would be lovely :)
  • IUU - Saturday, April 12, 2014 - link

    The sad thing about all this is that the lowest resolution for these cards is considered to be 2560x1440 (for those who understand).
    Bigger disappointment yet, that after so many years of high expectations, the GPU still stands as a huge piece of silicon inside the PC that's firmly chained by the IT industry to serve gamers only.
    Whatever the reason for no such consumer applications, this is a crime, mildly put.
  • RoboJ1M - Thursday, May 1, 2014 - link

    The 4870 stories that were written here by Anand were my most memorable and favourite.
    That and the SSD saga.

    Everybody loves a good Giant Killer story.

    But the "Small Die Strategy" has long since ended?
    When did that end?
    Why did that end? I mean, it worked so well, didn't it?
  • patrickjp93 - Friday, May 2, 2014 - link

    People should be warned: the performance of this card is nowhere close to what the benchmarks or limited tests suggest. Even on the newest Asrock Motherboard the PCI v3 lanes bottleneck this about 40%. If you're just going to sequentially transform the same data once it's on the card, yes, you have this performance, which is impressive for the base cost, though entirely lousy for the Flop/Watt. But, if you're going to attempt to be moving 8GB of data to and from the CPU and GPU continuously, this card performs marginally better than the 290. The busses are bridge chips are going to need to get much faster for these cards to be really useful for anything outside purely graphical applications in the future. It's pretty much a waste for GPGPU computing.
  • patrickjp93 - Friday, May 2, 2014 - link

    *The busses AND bridge chips...* Seriously what chat forum doesn't let you edit your comments?
  • Gizmosis350k - Sunday, May 4, 2014 - link

    I wonder if Quad CF works with these
  • Blitzninjasensei - Saturday, July 12, 2014 - link

    I'm trying to imagine what kind of person would have 4 of these and why, maybe EyeFinity with 4k? Even then your CPU would bottleneck way before that, you would need some kind of motherboard with dual CPU slots and a game that can take advantage of it.
