Metro: Last Light

As always, kicking off our look at performance is 4A Games’ latest entry in their Metro series of subterranean shooters, Metro: Last Light. The original Metro 2033 was graphically punishing for its time, and Metro: Last Light is the same in its own right. On the other hand it scales well with resolution and quality settings, so it’s still playable on lower-end hardware.

For the bulk of our analysis we’re going to be focusing on our 2560x1440 results, as monitors at this resolution are what we expect the 290 to be primarily used with. A single 290 may have the horsepower to drive 4K in at least some situations, but given the current costs of 4K monitors that’s going to be a much different usage scenario. The significant quality tradeoffs required to make 4K playable on a single card mean that it makes far more sense to double up on GPUs, given that even a pair of 290Xs would still cost a fraction of a 4K, 60Hz monitor.

With that said, there are a couple of things that should be immediately obvious when looking at the performance of the 290.

  1. It’s incredibly fast for the price.
  2. Its performance is at times extremely close to that of the 290X.

To get right to the point, because of AMD’s fan speed modification the 290 doesn’t throttle in any of our games, not even Metro or Crysis 3. The 290X by comparison sees significant throttling in both of those games, and as a result once fully warmed up the 290X is operating at clockspeeds well below its 1000MHz boost clock, and even below the 290’s 947MHz boost clock. So rather than carrying the 5% clockspeed deficit the official specs for these cards would indicate, the 290 for all intents and purposes clocks higher than the 290X. That clockspeed advantage now offsets the loss of shader/texturing performance from the CU reduction, while delivering a higher clockspeed to the equally configured front-end and back-end. In practice this means the 290 has over 100% of the 290X’s ROP/geometry performance, 100% of its memory bandwidth, and at least 91% of its shading performance.
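For the curious, those percentages can be sketched out numerically. This is a rough illustration rather than our measurement methodology: the 44 and 40 CU counts are the cards' respective configurations, but the 900MHz sustained 290X clock is purely an assumed stand-in for whatever clockspeed the throttled card actually settles at (the only thing we can say for certain is that it falls below 947MHz).

```python
# Back-of-the-envelope throughput comparison: 290 vs. 290X.
# Shader/texturing throughput scales with CUs * clock; ROP/geometry
# throughput scales with clock alone (the front/back-ends are identical).

CU_290X, CU_290 = 44, 40            # compute units
BOOST_290X, BOOST_290 = 1000, 947   # official boost clocks, MHz

# On paper, with both cards at their rated boost clocks:
paper_shader_ratio = (CU_290 * BOOST_290) / (CU_290X * BOOST_290X)  # ~86%

# In practice the 290 holds 947MHz while the throttled 290X runs lower.
# 900MHz is an assumed illustrative value, not a measured one:
sustained_290x = 900
rop_geometry_ratio = BOOST_290 / sustained_290x                     # >100%
shader_ratio = (CU_290 * BOOST_290) / (CU_290X * sustained_290x)

print(f"paper shader ratio:     {paper_shader_ratio:.0%}")
print(f"sustained ROP/geometry: {rop_geometry_ratio:.0%}")
print(f"sustained shader ratio: {shader_ratio:.0%}")
```

The harder the 290X throttles, the further above 100% the ROP/geometry ratio climbs, while the shader ratio is bounded below by 40/44, or roughly 91%, once the 290's clock matches or exceeds the 290X's.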

So in games where we’re not significantly shader bound, and Metro at 2560 appears to be one such case, the 290 can trade blows with the 290X despite its inherent disadvantage. As we’ll see, this is not going to be the case in every game, as not every game is GPU bound in the same manner and not every game throttles the 290X to the same degree, but it sets up a very interesting performance scenario. By pushing the 290 this hard, and by throwing any noise considerations out the window, AMD has created a card that can not only threaten the GTX 780, but the 290X too. As we’ll see by the end of our benchmarks, the 290 is only going to trail the 290X by an average of 3% at 2560x1440.

Anyhow, looking at Metro, it’s a very strong start for the 290. At 55.5fps it’s essentially tied with the 290X and 12% ahead of the GTX 780. Or, to compare against the cards it’s actually priced closer to, the 290 is 34% faster than the GTX 770 and 31% faster than the 280X. AMD’s performance advantage will come crashing down once we revisit the power and noise aspects of the card, but on raw performance alone things are going to look very good for the 290.
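For reference, the "X% faster" figures in this review are simple ratios of average framerates. Only the 290's 55.5fps is quoted above; the GTX 780 figure below is a back-calculated, illustrative value consistent with the quoted 12% lead, not a published result.

```python
def lead(a_fps: float, b_fps: float) -> float:
    """Fractional performance lead of card A over card B: A/B - 1."""
    return a_fps / b_fps - 1

r9_290 = 55.5    # quoted average framerate at 2560x1440
gtx_780 = 49.6   # assumed value implied by the quoted 12% lead

print(f"290 vs GTX 780: {lead(r9_290, gtx_780):.0%} faster")
```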

Comments
  • HisDivineOrder - Tuesday, November 5, 2013 - link

    Haha, spoken like someone who's never heard a card this loud. I can't wait to see all these cards on sale on ebay and forums everywhere. "I tried it and it's not for me, sidegrading to a 780," they'll say.

    This card is so loud you're going to be shocked by it. It's going to blow people's minds and it may even convert a few fanboys.
  • Finally - Tuesday, November 5, 2013 - link

    If he buys one with a nice custom fan, there won't be anything left to complain about. Truly terrible outlook for an Nvidiot, isn't it?
  • TheJian - Tuesday, November 5, 2013 - link

    You're forgetting they are using ref NV also. You don't get that when you buy an NV card, and they come overclocked on top of quiet. Also this thing will draw the same watts no matter what. It remains to be seen how good a different cooler will actually be. Did AMD really choose two terrible fans for their product launch? Seriously? I'm wondering how much they can really fix this situation. AMD had to know this would cause bad reviews about noise nearly everywhere, even on AMD-loving sites. I can't believe they are completely dumb and chose a total piece of junk for the fan/heatsink here. I really think people are putting too much faith in a fix with a fan change. They are at 95C all day basically; how much fan do you need to fix that?

    If NV runs their gpus at 95 tomorrow (and cranked up even more to meet the noise they're getting here) these cards will both be spanked. You get a better cooler on NV cards that are NOT ref also.
  • jnad32 - Tuesday, November 5, 2013 - link

    The way I look at it, AMD is looking like an absolute genius. Everyone was ripping them on the 290X for being too hot and too loud anyway. So instead of keeping the sound levels down they just went for what they do best: price/performance. They are now blowing every other card out of the water. There isn't a card on the planet that can touch this card in price/performance. Yeah, it's loud as hell, but at least you have to think about it now just because of the price. What I really want to see is them unleash the 290X sound threshold and see what kind of raw numbers it can put up. Let's be honest, the only people who should buy reference cards are the ones who are putting water blocks on them.

    People have been saying this about the temps since launch, and I still don't get it. If AMD designed the chip to run at those temps, what's the big deal as long as it's not damaging anything?
  • swing848 - Tuesday, November 5, 2013 - link

    It will only get loud for me when playing games or the occasional benchmark. During games I wear headphones, and during benchmarks I can leave the room. I have a room dedicated to computer use and the house has good sound proofing, so, it will not bother other people.

    If I want it quiet I will use a water cooler with a large radiator and fan.

    It is better than dumping all the hot air from the video card into my case, even if it is well cooled with 200mm fans. I overclock my CPU and I do not want it, RAM, or chips on the motherboard to get any hotter than necessary.
  • zeock9 - Tuesday, November 5, 2013 - link

    The burning question on my mind at this point is why AMD is restricting board partners from releasing their own custom designed and obviously better performing coolers on this otherwise fantastic card?
  • techkitsune - Tuesday, November 5, 2013 - link

    They likely don't want to look bad.

    It's okay. It's tough doing thermal management. I cram 1,000w of LED into a 30mm x 30mm space. AMD doesn't have the cooling problems that I have. Nor does nVidia nor intel. They should be grateful. :D
  • HisDivineOrder - Tuesday, November 5, 2013 - link

    They don't have them yet. That's why they haven't made custom boards. They're just getting them right now. They're going with what they have, which right now are just the reference boards. In a month or so, they'll have QA'ed some solutions with pre-existing cooling options, assuming said cooling options are good enough to benefit these cards.

    The thing is, you have to know these cards are running REALLY, REALLY hot to hit these levels at 95 degrees, so... custom coolers may have a hard time handling these cards without some tweaks. Perhaps to get faster fans on there.

    Also, it takes time to redesign a board to add VRM's and the 290 and 290X are still very, very new. You're not going to get an MSI Lightning version overnight.

    It's a solid deal in price, but man it's a shame AMD didn't offer a better custom cooler more attuned to the very special needs of the 290 series. It's also a shame their board is being pushed so hard and so much above what it seems capable of doing with reasonable power levels.

    This is like the Bulldozer of GPU's.
  • techkitsune - Tuesday, November 5, 2013 - link

    AMD could have just spent a few more dollars and used copper instead of aluminum, I would think. They could have easily doubled or tripled thermal conductivity and thus not needed to run the reference cooler anywhere near as high, plus that would leave a LOT of extra overclocking room.

    I still would buy it for the extra $45 that would have likely entailed, though I do worry about weight at that point. My 9800 GTX+ was pretty hefty, to say the least.
  • TheinsanegamerN - Tuesday, November 5, 2013 - link

    THIS. why does amd, or heck, any manufacturer, insist on using aluminum fins on a 250 watt+ gpu? my old amd 2600xt had a full copper heatsink, and it was nowhere near as power hungry as this card (and it ran cool to boot. never over 47c).
    use the exact same heatsink, but make those fins copper. wonder how much lower the temps would go?
