Metro: Last Light

As always, kicking off our look at performance is 4A Games’ latest entry in their Metro series of subterranean shooters, Metro: Last Light. The original Metro 2033 was a graphically punishing game for its time, and Metro: Last Light is just as punishing in its own right. On the other hand it scales well with resolution and quality settings, so it’s still playable on lower-end hardware.

For the bulk of our analysis we’re going to be focusing on our 2560x1440 results, as monitors at this resolution are what we expect the 290 to primarily be paired with. A single 290 may have the horsepower to drive 4K in at least some situations, but given the current cost of 4K monitors that’s going to be a much different usage scenario. The significant quality tradeoff required to make 4K playable on a single card means it makes far more sense to double up on GPUs, given that even a pair of 290Xs would still cost a fraction of a 4K, 60Hz monitor.

With that said, there are a couple of things that should be immediately obvious when looking at the performance of the 290.

  1. It’s incredibly fast for the price.
  2. Its performance is at times extremely close to the 290X’s.

To get right to the point, because of AMD’s fan speed modification the 290 doesn’t throttle in any of our games, not even Metro or Crysis 3. The 290X in comparison sees significant throttling in both of those games, and once fully warmed up it operates at clockspeeds well below its 1000MHz boost clock, or even the 290’s 947MHz boost clock. So rather than carrying a 5% clockspeed deficit as the official specs for these cards would indicate, the 290 for all intents and purposes clocks higher than the 290X. That clockspeed advantage offsets the loss of shader/texturing performance from the CU reduction, while giving the equally configured front-end and back-end a higher clockspeed than the 290X’s. In practice this means the 290 has over 100% of the 290X’s ROP/geometry performance, 100% of the memory bandwidth, and at least 91% of the shading performance.
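
To put rough numbers on that reasoning, here is a minimal back-of-the-envelope sketch. The CU counts (40 vs. 44) and the 290’s 947MHz boost clock are the cards’ official specifications; the 900MHz sustained clock assumed for the throttled 290X is purely a hypothetical figure for illustration, since the actual clockspeed varies by game and thermal conditions.

```python
# Rough throughput comparison of the R9 290 against a thermally throttled
# R9 290X, following the reasoning above. The 290X's 900MHz sustained clock
# is an assumption for illustration only; real throttled clocks vary.

r9_290  = {"cus": 40, "clock_mhz": 947}   # sustained at its full boost clock
r9_290x = {"cus": 44, "clock_mhz": 900}   # assumed (hypothetical) throttled clock

def shader_ratio(a, b):
    """Shader/texturing throughput scales with CU count x clockspeed."""
    return (a["cus"] * a["clock_mhz"]) / (b["cus"] * b["clock_mhz"])

def rop_geometry_ratio(a, b):
    """ROP/geometry hardware is identical on both cards, so only clockspeed matters."""
    return a["clock_mhz"] / b["clock_mhz"]

print(f"Shading:      {shader_ratio(r9_290, r9_290x):.0%} of the throttled 290X")
print(f"ROP/geometry: {rop_geometry_ratio(r9_290, r9_290x):.0%} of the throttled 290X")
print("Memory bandwidth: 100% (identical 512-bit GDDR5 configuration)")
# Even at equal clocks the shading floor would be 40/44, i.e. roughly 91%.
```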

So in games where we’re not significantly shader bound, and Metro at 2560 appears to be one such case, the 290 can trade blows with the 290X despite its inherent disadvantage. Now as we’ll see this is not going to be the case in every game, as not every game is GPU bound in the same manner and not every game throttles the 290X to the same degree, but it sets up a very interesting performance scenario. By pushing the 290 this hard, and by throwing any noise considerations out the window, AMD has created a card that can not only threaten the GTX 780, but can threaten the 290X too. As we’ll see by the end of our benchmarks, the 290 is only going to trail the 290X by an average of 3% at 2560x1440.

Anyhow, looking at Metro it’s a very strong start for the 290. At 55.5fps it’s essentially tied with the 290X and 12% ahead of the GTX 780. Or to make a comparison against the cards it’s actually priced closer to, the 290 is 34% faster than the GTX 770 and 31% faster than the 280X. AMD’s performance advantage will come crashing down once we revisit the power and noise aspects of the card, but on raw performance alone things look very good for the 290.

295 Comments

  • Morawka - Tuesday, November 5, 2013 - link

    Thanks for the review. Overall the performance numbers are great, but the heat and noise add an asterisk to every pro this card has.

    Custom coolers will help some, but this card is still pushing out less performance per watt than team green. Custom coolers will only offset maybe 10% of that noise and heat, which still leaves this card louder and hotter than team green.

    What makes Nvidia so attractive are their highly popular proprietary features such as Shadowplay (it's amazing) and Shield streaming. I know a lot of you could care less about Shield, but it is selling well and receiving rave reviews nonetheless. Those kinds of technologies are what keep me with Nvidia. I cannot stress how much I love Shadowplay. Being able to record anything without any sort of performance hit is amazing. And the best part: it's already encoded in H.264.
  • Morawka - Tuesday, November 5, 2013 - link

    I forgot to mention G-Sync, which Anand called "a game changer".
  • EJS1980 - Tuesday, November 5, 2013 - link

    The cooling solutions on these reference cards are simply atrocious, and in my opinion, completely unacceptable. Running your flagship GPUs at over 95C and 60dB while consuming upwards of 400W is nothing short of ridiculous. Taking the performance crown from Nvidia is fine and dandy, but we MUST look at what it took for AMD to do so.

    Call me an idealist, but I guess I'm alone in my thinking that next-gen GPUs should increase price/performance while simultaneously DECREASING heat, noise, and power consumption (not the other way around). Nvidia could just as easily release their GPUs with no noise/heat/TDP restrictions to increase performance, but do we really want our ASIC makers to do this?

    I for one DO NOT want to go down that road, and I can't be the only one...
  • DMCalloway - Tuesday, November 5, 2013 - link

    I agree about the price paid for the crown. We do still need competition to keep overall pricing low, and no, you aren't the only 'one'.
  • Mondozai - Friday, December 13, 2013 - link

    Yeah, but aftermarket coolers will fix this. Most people don't buy reference cards.

    Yet not a word about that. C'mon, EJS1980, you're a notorious buttboy for Nvidia.
  • dwade123 - Tuesday, November 5, 2013 - link

    Verdict: Extremely power inefficient.
  • jonjonjonj - Tuesday, November 5, 2013 - link

    Do you really care if it's power inefficient? As long as they can keep the temps reasonable and the performance justifies it, I say waste all the power you want.
  • DMCalloway - Wednesday, November 6, 2013 - link

    Power inefficiency = higher temps = more noise. Who cares about the power... I'm currently paying $0.14 per kWh.
  • jonjonjonj - Friday, November 15, 2013 - link

    If you are so concerned with power costs, maybe you shouldn't be buying expensive performance parts. I want max performance and I'm willing to pay for it.
  • James5mith - Tuesday, November 5, 2013 - link

    I love the Editor's reference to Futurama. Keep it up!
