Metro: Last Light

As always, kicking off our look at performance is 4A Games’ latest entry in their Metro series of subterranean shooters, Metro: Last Light. The original Metro 2033 was a graphically punishing game for its time, and Metro: Last Light is equally punishing in its own right. On the other hand, it scales well with resolution and quality settings, so it’s still playable on lower-end hardware.

Metro: Last Light - 3840x2160 - High Quality

Metro: Last Light - 3840x2160 - Low Quality

Metro: Last Light - 2560x1440 - High Quality

Our first gaming benchmark pretty much sets the tone for what we’ll be seeing in this review. In building the 295X2, AMD set out to deliver a single card that could match the performance of the 290X “Uber” in Crossfire, and that is exactly what we see happening here. The 295X2 and the 290XU CF setup swap places due to run-to-run variation, but ultimately the two are tied, whether that puts them above the GTX 780 Ti SLI or below it.

As we’ve already seen with the 290X, thanks in part to AMD’s ROP advantage, AMD’s strong suit is very high resolutions. This leads to the 295X2 edging out the competition at 2160p, while being edged out itself at 1440p. Nonetheless, between the AMD and NVIDIA setups this is a very close fight thus far, and it will remain so throughout. As for Metro itself, even at the punishing resolution of 2160p, the 295X2 is fast enough to keep the game running above 50fps.

Comments

  • HalloweenJack - Tuesday, April 8, 2014 - link

    cheaper set of 780ti's? 2 of them is $1300-$1400 and the 295 isn't even in retail yet....

    Is anandtech going to slate the Titan Z as much? Or are the pay cheques worth too much? Shame to see the bias; anandtech used to be a good site before it sold out.
  • GreenOrbs - Tuesday, April 8, 2014 - link

    Not seeing the bias; Anandtech is usually pretty fair. I think you have overlooked the fact that AMD is a sponsor, not NVIDIA. If anything, "slating" the Titan Z would be more consistent with your theory of "selling out."
  • nathanddrews - Tuesday, April 8, 2014 - link

    What bias?

    http://www.anandtech.com/bench/product/1187?vs=107...
    Two 780ti cards are cheaper than the 295x2, that's a fact.
    Two 780ti cards consume much less power than the 295x2, that's a fact.
    Two 780ti cards have better frame latency than the 295x2, that's a fact.
    Two 780ti cards have nearly identical performance to the 295x2, that's a fact.

    If someone was trying to decide between them, I'd recommend dual 780ti cards to save money and get similar performance. However, if that person only had a dual-slot available, it would be the 295x2 hands-down.

    The Titan Z isn't really any competition here - the 790 (790 Ti?) will be the 295x2's real competition. The real question is whether NVIDIA will price it below or above the 295x2.
  • PEJUman - Tuesday, April 8, 2014 - link

    I don't think the target market for this stuff (295x2 or Titan Z) is single GPU slots. As Ryan briefly mentioned, most people who are quite poor (myself included) will go with 780 Ti x 2 or 290X x 2. These cards are aimed at quads.

    AMD has priced it appropriately: roughly equal perf. potential for $3k of dual 295x2s vs. $6k for dual Titan Zs. Unfortunately, 4GB may not be enough for quads...

    I've ventured into multi-GPU setups in the past, and I find they rely too much on driver updates (see how poorly the 7990 runs nowadays, and AMD will be concentrating their resources on the 295x2). Never again.
  • Earballs - Wednesday, April 9, 2014 - link

    With respect, any decision on what to buy should be made based on what your application is. Paper facts are worthless when they don't hold up to (your version of) real world tasks. Personally I've been searching for a good single card to make up for Titanfall's flaws with CF/SLI. Point is, be careful with your recommendations if they're based on facts. ;)

    Sidenote: I managed to pick up a used 290x for MSRP with the intention of adding another one once CF is fixed with Titanfall. That price:performance, which can be had today, skews the results of this round-up quite a bit IMO.
  • MisterIt - Tuesday, April 8, 2014 - link

    By drawing that much power from the PCIe lane, won't it be a fire hazard? I've read multiple posts on bitcoin/scryptcoin mining forums about motherboards that catch fire due to using too many GPUs without powered risers to lower the amount of power delivered through the PCIe lane.

    Would Anandtech be willing to test AMD's claim by running the GPU at full load for an extended period of time in a fire-controlled environment?
  • Ryan Smith - Tuesday, April 8, 2014 - link

    The extra power is designed to be drawn off of the external power sockets, not the PCIe slot itself. It's roughly 215W + 215W + 75W, keeping the PCIe slot below its 75W limit.
  • MisterIt - Tuesday, April 8, 2014 - link

    Hmm, alright, thanks for the reply.
    Still rather skeptical, but I guess there should be plenty of user reviews by the time I'm considering upgrading my own GPU anyway.
  • CiccioB - Tuesday, April 8, 2014 - link

    Don't the 8-pin Molex connector specs indicate a 150W max power draw? 215W is well beyond that limit.
  • Ryan Smith - Tuesday, April 8, 2014 - link

    Yes, but it's a bit more complex than that: http://www.anandtech.com/show/4209/amds-radeon-hd-...
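
To put some rough numbers on the power-delivery exchange above: the sketch below is a purely illustrative back-of-the-envelope calculation, not an official AMD figure or anything taken from the linked article. It splits the board power the way Ryan describes it (two 8-pin connectors at roughly 215W each plus up to 75W from the slot) and then estimates the current through each +12V pin of an 8-pin PCIe connector, assuming the usual three +12V supply pins per connector.

```python
# Back-of-the-envelope sketch of the 295X2 power split described in the comments above.
# All figures are illustrative assumptions, not official AMD specifications.

RAIL_VOLTAGE = 12.0          # volts on the PCIe auxiliary power rails
SLOT_LIMIT_W = 75.0          # PCIe x16 slot power limit per the spec
EIGHT_PIN_SPEC_W = 150.0     # nominal PCIe 8-pin connector rating per the spec
SUPPLY_PINS_PER_8PIN = 3     # an 8-pin PCIe connector carries three +12V supply pins

def per_pin_current(watts: float) -> float:
    """Current through each +12V pin of an 8-pin connector at a given draw."""
    return watts / RAIL_VOLTAGE / SUPPLY_PINS_PER_8PIN

# Ryan's rough split: 215W per 8-pin connector plus up to 75W from the slot.
eight_pin_draw = 215.0
board_power = 2 * eight_pin_draw + SLOT_LIMIT_W
print(f"Approximate board power: {board_power:.0f} W")                               # ~505 W

print(f"Per-pin current at the 150 W spec figure: {per_pin_current(EIGHT_PIN_SPEC_W):.1f} A")  # ~4.2 A
print(f"Per-pin current at 215 W per connector:   {per_pin_current(eight_pin_draw):.1f} A")    # ~6.0 A
```

Run that way, 215W per connector works out to roughly 6A per +12V pin versus about 4.2A at the nominal 150W spec figure, which is why the honest answer to the 150W question is "it's more complex than that": the nominal spec limit and what a given connector and power supply can actually deliver are two different things.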
