The Test

Starting with today’s article we’ve made a small change to our suite of games. We are replacing our last 2012 game, Hitman: Absolution, with another Square Enix title: the recently released Thief. Both games make use of many of the same graphical features, and both include a built-in benchmark that is a good approximation of a worst-case rendering load in the game, making Thief a solid replacement for the older Hitman.

Meanwhile we’ve also updated all of our benchmark results to reflect the latest drivers from AMD and NVIDIA. For all AMD cards we are using AMD’s R9 295X2 launch drivers, Catalyst 14.4. Catalyst 14.4 appears to be a new branch of AMD’s drivers, given its internal version number of 14.100; however, we have found very few performance changes in our tests.

As for NVIDIA cards, we’re using the just-launched 337.50 drivers. These drivers contain a collection of performance improvements for NVIDIA cards and coincidentally come at just the right time for NVIDIA to counter AMD’s latest product launch.

We also need to quickly note that because AMD’s Radeon R9 295X2 uses an external 120mm radiator, we’ve had to modify our testbed to house the card. For our R9 295X2 tests we have pulled our testbed’s rear 140mm fan and replaced it with the R9 295X2 radiator. All other tests have the 140mm fan installed as normal.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 295X2
AMD Radeon R9 290X
AMD Radeon R9 290
AMD Radeon HD 7990
AMD Radeon HD 6990
NVIDIA GeForce GTX Titan Black
NVIDIA GeForce GTX 780 Ti
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX 690
NVIDIA GeForce GTX 590
Video Drivers: NVIDIA Release 337.50 Beta
AMD Catalyst 14.4 Beta
OS: Windows 8.1 Pro

 

131 Comments

  • HalloweenJack - Tuesday, April 8, 2014 - link

    Cheaper set of 780 Tis? Two of them is $1300–$1400, and the 295 isn't even at retail yet....

    Is AnandTech going to slate the Titan Z as much? Or are the pay cheques worth too much? Shame to see the bias; AnandTech used to be a good site before it sold out.
  • GreenOrbs - Tuesday, April 8, 2014 - link

    Not seeing the bias -- AnandTech is usually pretty fair. I think you have overlooked the fact that AMD is a sponsor, not NVIDIA. If anything, "slating" the Titan Z would be more consistent with your theory of "selling out."
  • nathanddrews - Tuesday, April 8, 2014 - link

    What bias?

    http://www.anandtech.com/bench/product/1187?vs=107...
    Two 780ti cards are cheaper than the 295x2, that's a fact.
    Two 780ti cards consume much less power than the 295x2, that's a fact.
    Two 780ti cards have better frame latency than the 295x2, that's a fact.
    Two 780ti cards have nearly identical performance to the 295x2, that's a fact.

    If someone was trying to decide between them, I'd recommend dual 780ti cards to save money and get similar performance. However, if that person only had a dual-slot available, it would be the 295x2 hands-down.

    The Titan Z isn't really any competition here - the 790 (790ti?) will be the 295x2's real competition. The real question is will NVIDIA price it less than or more than the 295x2?
  • PEJUman - Tuesday, April 8, 2014 - link

    I don't think the target market for this stuff (295X2 or Titan Z) is single-GPU setups. As Ryan briefly mentioned, most people who are relatively poor (myself included) will go with 780 Ti x 2 or 290X x 2. These cards are aimed at quads.

    AMD has priced it appropriately: roughly equal performance potential for $3k in dual 295X2s vs. $6k for dual Titan Zs. Unfortunately, 4GB may not be enough for quads...

    I've ventured into multi-GPU in the past; I find these setups rely too much on driver updates (see how poorly the 7990 runs nowadays, and AMD will be concentrating its resources on the 295X2). Never again.
  • Earballs - Wednesday, April 9, 2014 - link

    With respect, any decision on what to buy should be made based on what your application is. Paper facts are worthless when they don't hold up to (your version of) real-world tasks. Personally, I've been searching for a good single card to make up for Titanfall's flaws with CF/SLI. Point is, be careful with your recommendations if they're based only on paper facts. ;)

    Sidenote: I managed to pick up a used 290x for MSRP with the intention of adding another one once CF is fixed with Titanfall. That price:performance, which can be had today, skews the results of this round-up quite a bit IMO.
  • MisterIt - Tuesday, April 8, 2014 - link

    By drawing that much power through the PCIe slot, won't it be a fire hazard? I've read multiple posts on bitcoin/scrypt-coin mining forums about motherboards catching fire due to running too many GPUs without a powered riser to reduce the amount of power delivered through the PCIe slot.

    Would AnandTech be willing to test AMD's claim by running the GPU at full load for a longer period of time in a fire-controlled environment?
  • Ryan Smith - Tuesday, April 8, 2014 - link

    The extra power is designed to be drawn off of the external power sockets, not the PCIe slot itself. It's roughly 215W + 215W + 75W, keeping the PCIe slot below its 75W limit.
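
    Ryan's split can be sanity-checked with quick arithmetic. In the sketch below, the 75W slot limit and 150W 8-pin rating are the nominal PCIe specification values, and the per-connector draws are the figures from his reply; the variable names are illustrative, not from any source.

    ```python
    # Sanity check of the R9 295X2 power split described above:
    # two 8-pin connectors plus the PCIe slot feed the board.
    PCIE_SLOT_LIMIT_W = 75    # PCIe x16 slot power limit per spec
    EIGHT_PIN_RATED_W = 150   # nominal rating of an 8-pin PCIe power connector

    draw = {"8-pin #1": 215, "8-pin #2": 215, "PCIe slot": 75}
    total = sum(draw.values())
    print(f"Total board power: {total}W")  # 505W

    # The slot itself stays within its 75W limit...
    assert draw["PCIe slot"] <= PCIE_SLOT_LIMIT_W

    # ...but each 8-pin connector is driven well past its 150W nominal rating.
    for name, watts in draw.items():
        if name.startswith("8-pin"):
            print(f"{name}: {watts}W ({watts - EIGHT_PIN_RATED_W:+d}W vs 150W rating)")
    ```

    The arithmetic makes the trade-off explicit: the slot is protected, while the headroom is taken from the external connectors instead.
    
    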
  • MisterIt - Tuesday, April 8, 2014 - link

    Hmm, alright, thanks for the reply.
    Still rather skeptical, but I guess there should be plenty of user reviews by the time I'm considering upgrading my own GPU anyway.
  • CiccioB - Tuesday, April 8, 2014 - link

    Don't the 8-pin connector specs indicate a 150W max power draw? 215W is well outside that limit.
  • Ryan Smith - Tuesday, April 8, 2014 - link

    Yes, but it's a bit more complex than that: http://www.anandtech.com/show/4209/amds-radeon-hd-...
