Metro: Last Light

Kicking off our look at performance is 4A Games’ latest entry in their Metro series of subterranean shooters, Metro: Last Light. The original Metro 2033 was a graphically punishing game for its time, and Metro: Last Light is punishing in its own right. On the other hand, it scales well with resolution and quality settings, so it remains playable on lower-end hardware.

Metro: Last Light - 2560x1440 - High Quality

Metro: Last Light - 1920x1080 - Very High Quality

Metro: Last Light - 1920x1080 - High Quality

The first benchmark in our revised benchmark suite finds our 280X cards doing well for themselves, and surprisingly not all that far off from the final averages. Setting the baseline here, the Tahiti-based 280X performs between the original 7970 and the 7970 GHz Edition, as we expected, thanks to the 280X’s use of PowerTune Boost at lower clockspeeds than the 7970GE. This isn’t performance we haven’t seen before, but it’s very much worth keeping in mind that the 7970GE was a $400 card while the 280X is a $300 card; approaching the 7970GE’s performance for $100 less amounts to a significant price cut.

As for the immediate competitive comparison, we’ll be paying particular attention to 2560x1440, which should be the sweet spot resolution for this card. At 2560 we can see that the reference clocked 280X doesn’t just hang with the $400 GTX 770 but actually manages to edge it out by just over a frame per second. As a preface we’re going to see these two cards go back and forth throughout our benchmarks, but to be able to directly compete with NVIDIA’s fastest GK104 card for $100 less is a significant accomplishment for AMD.

Finally, let’s quickly talk about the Asus 280X versus the XFX 280X. Asus winning comes as no great shock given its factory overclock, but now we finally get to see the magnitude of the performance gains from that overclock. At 2560 we’re looking at just shy of a 9% performance gain, which exceeds both the boost clock overclock and the memory overclock. The specific performance gains will of course depend on the game in question, but this means that in at least one instance performance is also being lifted by the base clock overclock, the larger of Asus’s factory overclocks.
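The reasoning above is simple percentage arithmetic: if the observed frame-rate gain exceeds every individual component overclock, something beyond those clocks (here, the base clock bump improving sustained boost residency) must be contributing. A minimal Python sketch of that check, using placeholder clock and FPS figures for illustration only (the real reference and Asus factory figures are not restated here):

```python
def pct_gain(stock, overclocked):
    """Relative gain of an overclocked value over stock, as a percentage."""
    return (overclocked / stock - 1.0) * 100.0

# Hypothetical numbers for illustration -- substitute the actual reference
# and factory-overclocked figures for the cards being compared.
core_gain = pct_gain(1000, 1070)       # boost clock in MHz: ~7.0%
mem_gain = pct_gain(6000, 6400)        # memory data rate in MT/s: ~6.7%
fps_gain = pct_gain(55.0, 59.8)        # measured average FPS: ~8.7%

# If the FPS gain is larger than both component overclocks, some other
# factor (e.g. the base clock overclock) is contributing as well.
if fps_gain > core_gain and fps_gain > mem_gain:
    print(f"FPS gain {fps_gain:.1f}% exceeds core ({core_gain:.1f}%) "
          f"and memory ({mem_gain:.1f}%) overclocks")
```

With these placeholder figures the FPS gain outpaces both overclocks, matching the pattern described for the Asus card at 2560.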


151 Comments


  • alfredska - Wednesday, October 9, 2013 - link

    Yes, I made a couple mistakes in trimming the fat from Ryan's writing. I should have done another proof-read myself. This wasn't the point of my post, though.

    Ryan's review is littered with sentence pauses that drastically slow down your ability to read the article. Some examples are: starting too many sentences with words like "ultimately" or "meanwhile"; making needless references to earlier statements; using past or present perfect tense when just past or present tense is appropriate. I wrote the above example hoping that Ryan would put it next to his own writing and see whether he can 1) read it faster, and 2) retain more information from this version.

    I can accept a misspelling here and there and even some accidental word injections of which I was guilty. The fluidity needs work though. If the reader cannot glide easily between paragraphs, they will stop reading and just look at pictures.
  • chuck.norris.and.son - Tuesday, October 8, 2013 - link

    tl;dr :( blablabla

    Can't you nail it down: AMD or Nvidia? Which GFX card should I buy to play blockbusters like BF4?
  • ShieTar - Tuesday, October 8, 2013 - link

    How about not buying any new GFX card and investing the savings into books in order to improve your reading skills?
  • Will Robinson - Tuesday, October 8, 2013 - link

    Radeon 280X will be the sweet spot card to get for BF4.
    R9 290X will be the open class champ over GTX780 I suspect.
  • piroroadkill - Tuesday, October 8, 2013 - link

    The best AMD card you can buy with your money, simply because Battlefield 4 will eventually feature the Mantle renderer which is for GCN cards only, and will probably be a killer feature.
  • hrga - Tuesday, October 8, 2013 - link

    Most moronic branding ever (at least until the next one overruns it). They cut out the vanilla 7850 and the top-end 7950 from the HD7000 lineup and relabel the rest with a confusing R9/R7 scheme, unrealistically stupid marketing where nothing material stands behind those names.

    Another rebranding?
    - Yes. [thinking: what the heck did you expect, guys]

    Not most successful?
    - [thinking: depends on POV] Well, it's here to milk the most cash, as our CPU business hasn't produced anything valuable for three years. And we also must have something interesting to present in our slideshow presentations for investors. If we can't afford to produce a whole new lineup, we can always produce yet another rebranded line, just like nvidia. We always learn from our (cartel) competition, and customers don't seem to have any objections on that matter.

    So that's why you retain those moronically high prices?
    - We just adjust that according to our competition (cartel)

    But you never lowered prices for the HD7870, which today celebrates its second birthday and has been produced on a highly matured 28nm process for at least six months. Instead you just rebranded it for a second time, after the HD8860, so now we have the R9 270X too. Don't you think your customers would like to see some new designs while old products are put on discount prices?

    Or at least you could introduce that R9 270, which is the same old HD7870, at lower prices than today's HD7870 retail prices?!
    Instead we get higher prices for the same performance (source: Newegg http://imageshack.dk/imagesfree/xLh43741.jpg).
    And why the heck an R9 designation for this mediocre mainstream product?! You could weaselishly sell this c-rap at the end of 2011, but "Hello AMD!" it's the end of 2013.

    Pitcairn, used in the HD7800/HD8800 series, is a smaller chip than Evergreen in the HD5800, which only three years ago was produced on the troublesome early 40nm process, while this two-year-old design is now produced on the highly mature TSMC 28nm-HK node, for at least six months now, with far better yields. The HD5850 had the same or even lower prices at EOL (only a year after introduction) than today's two-year-old Pitcairn design. How do you explain that?
    - Well ...milking, you know ... When you have a good cartel environment like we have competing with nvidia, we can skyrocket prices. And you know even crappy Intel Knights Corner chips, today produced at 22nm, wouldn't be any cheaper, because Intel knows how to milk money on their tick-tock performance introductions, and they certainly wouldn't give up that experience in the case of the "Chip Previously Known as Larrabee" (CPKL)
  • labodhibo - Tuesday, October 8, 2013 - link

    Must read this.. totally different perspective:
    http://www.techspot.com/review/722-radeon-r9-270x-...
  • AssBall - Tuesday, October 8, 2013 - link

    Well done review. I kinda like what Asus did with its 280x version.

    Typo: "Asus calls it “CoolTech” and it’s essentially an effort to build a fan that’s blow a an axial fan and a blower (radial) fan at the same time,"
    [blow -> both?]
  • zlandar - Tuesday, October 8, 2013 - link

    This is why I wanted the Asus 770 card also in the recent 770 GTX roundup. The cooler design seems superior for single GPU purposes as long as you have the room for it in your case.
  • AxialLP7 - Tuesday, October 8, 2013 - link

    Not trying to be AFC, just want to make sense of this: "Asus calls it “CoolTech” and it’s essentially an effort to build a fan that’s blow a an axial fan and a blower (radial) fan at the same time, explaining the radial-like center and axial-like outer edge of the fan." Can someone help? This is in the "ASUS RADEON R9 280X DIRECTCU II TOP" section...
