Final Words

Bringing this review to a close, when I first heard that AMD was going to build a full performance dual Hawaii GPU solution, I was admittedly unsure about what to expect. The power requirements for a dual Hawaii card would pose an interesting set of challenges for AMD, and AMD’s most recent high-end air coolers were not as effective as they should have been.

In that context AMD’s decision to build a card around a closed loop liquid cooling solution makes a lot of sense for what they wanted to achieve. Like any uncommon cooling solution, the semi-exotic nature of a CLLC is a double-edged sword that brings with it both benefits and drawbacks; the benefits being enhanced cooling performance, and the drawbacks being complexity and size. So it was clear from the start that, given AMD’s goals and their chips, the benefits they stood to gain could very well outweigh the drawbacks of going so far off of the beaten path.

To that end the Radeon R9 295X2 is a beast, and that goes for every sense of the word. From a performance standpoint AMD has delivered on their goal of offering the full, unthrottled performance of a Radeon R9 290X Crossfire solution. AMD has called the 295X2 an “uncompromised” card and that’s exactly what they have put together, making absolutely no performance compromises in putting a pair of Hawaii GPUs onto a single video card. In a sense it’s almost too simple – there are no real edge cases or other performance bottlenecks to discuss – but then again that’s exactly what it means to offer uncompromised performance.

“Beastly” is just as fitting for the card when it comes to its cooling. With a maximum noise level of 50dB, the 295X2’s CLLC is unlike anything we’ve reviewed before, offering acoustic performance as good as or better than some of the best high-end cards of this generation despite the heavy cooling workload such a product calls for. That brings us to the other beastly aspect: the card’s 500W TDP. AMD has put together a card that can put out 500W of heat and still keep itself cooled, but there’s no getting around the fact that at half a kilowatt in power consumption the 295X2 draws more power than any other single card we’ve reviewed before.

Taken altogether this puts the 295X2 in a very interesting spot. The performance offered by the 295X2 is the same performance offered by the 290X in Crossfire, no more and no less. This means that depending on whether we’re looking at 2K or 4K resolutions the 295X2 either trails a cheaper set of GTX 780 Tis in SLI by 5%, or at the kinds of resolutions that most require this much performance it can now exceed those very same GeForce cards by 5%. On the other hand NVIDIA still holds a consistent edge over AMD in frame pacing. But thanks to their XDMA engine AMD's frame pacing performance is vastly improved compared to their prior dual-GPU cards and is now good enough overall (though there's definitely room for further improvement).

But more significantly, by its very nature as a CLLC equipped dual-GPU video card the 295X2 stands alone among current video cards. There’s nothing else like it in terms of design, and that admittedly makes it difficult to properly place the 295X2 in reference to other video cards. Do we talk about how it’s one of only a handful of dual-GPU cards? Or do we talk about the price? Or do we talk about the unconventional cooler?

However, perhaps it’s best to frame the 295X2 with respect to its competition, or rather the lack thereof. For all the benefits and drawbacks of AMD’s card, perhaps the most unexpected thing they have going for them is that they won’t be facing any real competition from NVIDIA. NVIDIA has announced their own dual-GPU card for later this month, the GeForce GTX Titan Z, but priced at $3000 and targeted more heavily at compute users than at gamers, the GTX Titan Z is going to reside in its own little niche, leaving the 295X2 alone in the market at half the price. We’ll see what the GTX Titan Z brings to the table later this month, but no matter what, AMD is going to have an incredible edge on price that we expect will make most potential buyers think twice, despite the 295X2’s own $1500 price tag.

Ultimately, while this outcome does put the 295X2 in something of a “winner by default” position, it does not change the fact that AMD has put together a very solid card, and what’s by far their best dual-GPU card yet. Between the price tag and the unconventional cooler it’s certainly a departure from the norm, but for those buyers who can afford and fit this beastly card, it sets a new and very high standard for just what a dual-GPU card should do.


131 Comments


  • HalloweenJack - Tuesday, April 8, 2014 - link

Cheaper set of 780 Tis? Two of them is $1300 to $1400, and the 295X2 isn't even at retail yet....

Is AnandTech going to slate the Titan Z as much? Or are the pay cheques worth too much? Shame to see the bias; AnandTech used to be a good site before it sold out.
  • GreenOrbs - Tuesday, April 8, 2014 - link

Not seeing the bias--AnandTech is usually pretty fair. I think you have overlooked the fact that AMD is a sponsor, not NVIDIA. If anything, "slating" the Titan Z would be more consistent with your theory of "selling out."
  • nathanddrews - Tuesday, April 8, 2014 - link

    What bias?

    http://www.anandtech.com/bench/product/1187?vs=107...
    Two 780ti cards are cheaper than the 295x2, that's a fact.
    Two 780ti cards consume much less power than the 295x2, that's a fact.
    Two 780ti cards have better frame latency than the 295x2, that's a fact.
    Two 780ti cards have nearly identical performance to the 295x2, that's a fact.

    If someone was trying to decide between them, I'd recommend dual 780ti cards to save money and get similar performance. However, if that person only had a dual-slot available, it would be the 295x2 hands-down.

    The Titan Z isn't really any competition here - the 790 (790ti?) will be the 295x2's real competition. The real question is will NVIDIA price it less than or more than the 295x2?
  • PEJUman - Tuesday, April 8, 2014 - link

I don't think the target market for this stuff (295x2 or Titan Z) is single-GPU setups. As Ryan briefly mentioned, most people who are quite poor (myself included) will go with 780 Ti x 2 or 290X x 2. These cards are aimed at quads.

AMD has priced it appropriately: roughly equal performance potential for $3k of dual 295x2s vs. $6k for dual Titan Zs. Unfortunately, 4GB may not be enough for quads...

I've ventured into multi-GPU in the past; I find these setups rely too much on driver updates (see how poorly the 7990 runs nowadays, and AMD will be concentrating their resources on the 295x2). Never again.
  • Earballs - Wednesday, April 9, 2014 - link

With respect, any decision on what to buy should be made based on what your application is. Paper facts are worthless when they don't hold up to (your version of) real world tasks. Personally I've been searching for a good single card to make up for Titanfall's flaws with CF/SLI. Point is, be careful with your recommendations if they're based on facts. ;)

    Sidenote: I managed to pick up a used 290x for MSRP with the intention of adding another one once CF is fixed with Titanfall. That price:performance, which can be had today, skews the results of this round-up quite a bit IMO.
  • MisterIt - Tuesday, April 8, 2014 - link

By drawing that much power through the PCIe slot, won't it be a fire hazard? I've read multiple posts on bitcoin/scryptcoin mining forums about motherboards that caught fire due to using too many GPUs without a powered riser to lower the amount of power delivered through the PCIe slot.

Would AnandTech be willing to test AMD's claim by running the GPU at full load for a longer period of time in a fire-controlled environment?
  • Ryan Smith - Tuesday, April 8, 2014 - link

    The extra power is designed to be drawn off of the external power sockets, not the PCIe slot itself. It's roughly 215W + 215W + 75W, keeping the PCIe slot below its 75W limit.
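As a quick sanity check, the split described in the reply above can be sketched in a few lines. The per-rail numbers are the rough figures given in the reply, not official AMD board specifications:

```python
# Back-of-the-envelope check of the R9 295X2's power delivery split,
# using the approximate figures from the reply above.

EIGHT_PIN_W = 215    # approximate draw per 8-pin PCIe power connector
SLOT_W = 75          # draw through the PCIe slot itself
SLOT_LIMIT_W = 75    # PCIe spec power limit for the slot

# Two 8-pin connectors plus the slot make up the total board power.
total_w = 2 * EIGHT_PIN_W + SLOT_W

print(f"Total board power: {total_w}W")                # Total board power: 505W
print(f"Slot within limit: {SLOT_W <= SLOT_LIMIT_W}")  # Slot within limit: True
```

In other words, the card's roughly 500W load is carried almost entirely by the external connectors, with the slot itself kept at its specified limit.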
  • MisterIt - Tuesday, April 8, 2014 - link

    Hmm allright, thanks for the reply.
Still rather skeptical, but I'd guess there will be plenty of user reviews by the time I'm considering upgrading my own GPU anyway.
  • CiccioB - Tuesday, April 8, 2014 - link

Doesn't the 8-pin connector spec indicate a 150W max power draw? 215W is quite outside that limit.
  • Ryan Smith - Tuesday, April 8, 2014 - link

    Yes, but it's a bit more complex than that: http://www.anandtech.com/show/4209/amds-radeon-hd-...
