
Metro: Last Light

As always, kicking off our look at performance is 4A Games’ latest entry in their Metro series of subterranean shooters, Metro: Last Light. The original Metro 2033 was a graphically punishing game for its time, and Metro: Last Light is demanding in its own right. On the other hand it scales well with resolution and quality settings, so it’s still playable on lower-end hardware.

For the bulk of our analysis we’re going to be focusing on our 2560x1440 results, as monitors at this resolution will be what we expect the 290X to be primarily used with. A single 290X may have the horsepower to drive 4K in at least some situations, but given the current costs of 4K monitors that’s going to be a much different usage scenario.

With that said, for 4K we’ve thrown in results for most games at both a high quality setting and a lower quality setting that makes it practical to run at 4K off of a single card. Given current monitor prices it won’t make a ton of sense to go with reduced quality settings just to save $550 – and consequently we may not keep the lower quality benchmarks around for future articles – but for the purposes of looking at a new GPU it’s useful to be able to look at single-GPU performance at framerates that are actually playable.

Starting off with Metro at 2560, the 290X hits the ground running on our first benchmark. At 55fps it’s just a bit shy of hitting that 60fps average we love to cling to, but among all of our single-GPU cards it is the fastest, beating even the traditional powerhouse that is GTX Titan. Consequently the performance difference between the 290X and GTX 780 (the 290X’s real competition) is even greater, with the 290X outpacing the GTX 780 by 13%, all the while being $100 cheaper. As we’ll see these results are a bit better than the overall average, but all told we’re not too far off. For as fast as GTX 780 is, 290X is going to be appreciably (if not significantly) faster.

290X also does well for itself compared to the Tahiti based 280X. At 2560 the 290X’s performance advantage stands at 31%, which as we alluded to earlier is greater than the increase in die size, offering solid proof that AMD has improved their performance per mm2 of silicon despite the fact that they’re still on the same 28nm manufacturing process. That 31% does come at a price increase of 83% however, which although normal for this price segment serves as a reminder that the performance increases offered by the fastest video cards with the biggest GPUs do not come cheaply.
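That performance-versus-price trade-off can be expressed as a quick back-of-the-envelope calculation. The sketch below uses the 31% performance delta and 83% price delta from the paragraph above; `perf_per_dollar_ratio` is a hypothetical helper, not anything from the review's test suite.

```python
# Rough sketch: relative performance per dollar of the 290X over the 280X,
# using the 31% performance gain and 83% price increase cited in the text.

def perf_per_dollar_ratio(perf_gain: float, price_gain: float) -> float:
    """Perf-per-dollar of the faster card relative to the cheaper one."""
    return (1 + perf_gain) / (1 + price_gain)

ratio = perf_per_dollar_ratio(0.31, 0.83)
print(f"290X delivers {ratio:.2f}x the perf-per-dollar of the 280X")
```

As expected for a halo part, the ratio comes out well under 1.0: you pay a premium per frame for the fastest card in the stack.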

Meanwhile for one final AMD comparison, let’s quickly look at the 290X in uber mode. As the 290X is unable to sustain the power/heat workload of a 1000MHz Hawaii GPU for an extended period of time, at its stock (quiet) settings it has to pull back on performance in order to meet reasonable operational parameters. Uber mode on the other hand represents what the 290X and Hawaii can do when fully unleashed; the noise costs won’t be pretty (as we’ll see), but in the process it builds on the 290X’s existing leads and increases them by another 5%. And that’s really going to be one of the central narratives for 290X once semi-custom and fully-custom cards come online: despite being a fully enabled part, the 290X does not give us everything Hawaii is truly capable of.

Moving on, let’s talk about multi-GPU setups and 4K. Metro is a solid reminder that not every game scales similarly across different GPUs, and for that matter that not every game is going to significantly benefit from multi-GPU setups. Metro for its part isn’t particularly hospitable to multi-GPU cards, with the best setup scaling by only 53% at 2560. This is better than some games that won’t scale at all, but it won’t be as good as those games that see a near-100% performance improvement. Which consequently is also why we dropped Metro as a power benchmark, as this level of scaling is a poor showcase for the power/temp/noise characteristics of a pair of video cards under full load.

The real story here of course is that it’s another strong showing for AMD at both 2560 and 4K. At 2560 the 290X CF sees better performance scaling than the GTX 780 SLI – 53% versus 41% – further extending the 290X’s lead. Bumping the resolution up to 4K makes things even more lopsided in AMD’s favor, as at this point the NVIDIA cards essentially fail to scale (picking up just 17%) while the 290X sees an even greater scaling factor of 63%. As such for those few who can afford to seriously chase 4K gaming, the 290X is the only viable option in this scenario. And at 50fps average for 4K at high quality, 4K gaming at reasonable (though not maximum) quality settings is in fact attainable when it comes to Metro.
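The scaling figures quoted above all come from the same simple arithmetic: the fractional gain a second card adds over a single card. A minimal sketch, where the 84 fps CrossFire figure is a hypothetical value chosen to illustrate roughly 53% scaling over the 290X's 55 fps single-card result:

```python
# Multi-GPU "scaling" as used above: the extra performance a second card
# adds over one card (1.0 would be perfect 100% scaling).

def scaling(single_fps: float, dual_fps: float) -> float:
    """Fractional framerate gain from adding a second GPU."""
    return dual_fps / single_fps - 1

# e.g. 55 fps on one 290X vs. a hypothetical 84 fps in CrossFire:
print(f"{scaling(55, 84):.0%}")  # -> 53%
```

The same formula explains why Metro was dropped as a power benchmark: at only ~53% scaling, the second card is drawing full power while delivering well under half of its theoretical contribution.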

Meanwhile for single-GPU configurations, 4K is viable, but only at Metro’s lowest quality levels. This will be the first of many games where such a thing is possible, and the first of many games where going up to 4K in this manner further improves on AMD’s lead. Again, we’re not of the opinion that 4K at these low quality settings is a good way to play games, but it does provide some insight into, and validation of, AMD’s claims that their hardware is better suited for 4K gaming.


  • Sandcat - Friday, October 25, 2013 - link

    That depends on what you define as 'acceptable frame rates'. Yeah, you do need a $500 card if you have a high refresh rate monitor and use it for 3D games, or just improved smoothness in non-3D games. A single 780 with my brother's 144Hz Asus monitor is required to get ~90 fps (i7-930 @ 4.0) in BF3 on Ultra with MSAA.

    The 290X almost requires liquid...the noise is offensive. Kudos to those with the equipment, but really, AMD cheaped out on the cooler in order to hit the price point. Good move, imho, but too loud for me.
    Reply
  • hoboville - Thursday, October 24, 2013 - link

    Yup, and it's hot. It will be worth buying once the manufacturers can add their own coolers and heat pipes.

    AMD has always been slower at lower res, but better in the 3x1080p to 6x1080p arena. They have always aimed for high-bandwidth memory, which always performs better at high res. This is good for you as a buyer because it means you'll get better scaling at high res. It's essentially forward-looking tech, which is good for those who will be upgrading monitors in the next few years when 1440p IPS starts to be more affordable. At low res the bottleneck isn't RAM, but compute power. Regardless, buying a Titan / 780 / 290X for anything less than 1440p is silly, you'll be way past the 60-70 fps human eye limit anyway.
    Reply
  • eddieveenstra - Sunday, October 27, 2013 - link

    Maybe 60-70fps is the limit, but at 120Hz, 60fps will give noticeable lag. 75 is about the minimum. That, or I have eagle eyes. The 780 GTX still dips into low framerates at 120Hz (1920x1080). So the whole debate about the Titan or 780 being overkill @1080p is just nonsense. (780 GTX 120Hz gamer here)
    Reply
  • hoboville - Sunday, October 27, 2013 - link

    That really depends a lot on your monitor. When they talked about G-Sync and frame lag and smoothness, they mentioned that when FPS doesn't exactly match the refresh rate you get latency and bad frame timing. That you have this problem with a 120 Hz monitor is no surprise, as at anything less than 120 FPS you'll see some form of stuttering. When we talk about FPS > refresh rate then you won't notice this. At home I use 2048x1152 @ 60 Hz, and beyond 60 FPS all the extra frames are dropped, whereas in your case you'll have some frames "hang" when you are getting less than 120 FPS, because the frames have to "sit" on the screen for an interval until the next one is displayed. This appears to be stuttering, and you need to get a higher FPS from the game in order for the frame delivery to appear smoother. This is because apparent delay decreases as a ratio of [delivered frames (FPS) / monitor refresh speed]. Once the ratio is small enough, you can no longer detect apparent delay. In essence 120 Hz was a bad idea, unless you get G-Sync (which means a new monitor).

    Get a good 1440p IPS at 60 Hz and you won't have that problem, and the image fidelity will make you wonder why you ever bought a monitor with 56% of 1440p pixels in the first place...
    Reply
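The frame "hang" behavior described in the comment above can be sketched numerically. On a fixed-refresh display (no G-Sync), each rendered frame must occupy a whole number of refresh intervals, so any framerate below the refresh rate leaves some frames on screen for two or more ticks. A minimal model with illustrative numbers; `frame_hold_intervals` is a hypothetical helper, not a real display API:

```python
# Minimal model of frame delivery on a fixed-refresh (no G-Sync) display:
# a frame rendered at `fps` occupies ceil(refresh_hz / fps) refresh ticks.
import math

def frame_hold_intervals(fps: float, refresh_hz: float) -> int:
    """How many refresh intervals a typical frame stays on screen."""
    return math.ceil(refresh_hz / fps)

# 60 fps on a 120 Hz panel: every frame sits for 2 refreshes, while
# 120 fps lines up 1:1 with the refresh rate.
print(frame_hold_intervals(60, 120))   # -> 2
print(frame_hold_intervals(120, 120))  # -> 1
```

At 90 fps on the same 120 Hz panel the result is uneven (some frames hold for one tick, some for two), which is the stutter being described; only an adaptive-sync display removes that quantization.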
  • eddieveenstra - Sunday, October 27, 2013 - link

    To be honest, I would never think about going back to 60Hz. I love 120Hz but don't know a thing about IPS monitors. Thanks for the response....

    Just checked it and that sounds good. When they become more affordable I will start thinking about that. Seems like IPS monitors are better with colors and have less blur @60Hz than TN. link: http://en.wikipedia.org/wiki/IPS_panel
    Reply
  • Spunjji - Friday, October 25, 2013 - link

    Step 1) Take data irrespective of different collection methods.

    Step 2) Perform average of data.

    Step 3) Completely useless results!

    Congratulations, sir; you have broken Science.
    Reply
  • nutingut - Saturday, October 26, 2013 - link

    But who cares if you can play at 90 vs 100 fps?
    Reply
  • MousE007 - Thursday, October 24, 2013 - link

    Very true, but remember, the only reason nvidia prices their cards where they are is because they could. (Eg Intel CPUs v AMD) Having said that, I truly welcome the competition as it makes it better for all of us, regardless of which side of the fence you sit.
    Reply
  • valkyrie743 - Thursday, October 24, 2013 - link

    The card runs at 95C and sucks power like no tomorrow. It only beats the 780 by a very little, and does not overclock well.

    http://www.youtube.com/watch?v=-lZ3Z6Niir4
    and
    http://www.youtube.com/watch?v=3OHKWMgBhvA

    http://www.overclock3d.net/reviews/gpu_displays/am...

    I like his review. It's honest and shows the facts. I'm not an Nvidia fanboy nor an AMD fanboy, but I'll take Nvidia right now over AMD.

    I do like how this card is priced, and the performance for the price makes the Titan not worth 1000 bucks (or the 850 bucks it goes for used on forums). But as for the 780: if you get a non-reference 780, it will be faster than the 290X and put out LESS heat and LESS noise, as well as use less power.

    Plus the GTX 780 Ti is coming out in mid November, which will probably cut the cost of the current 780 to $550; that card would probably be around $600 and beat this card even more.
    Reply
  • jljaynes - Friday, October 25, 2013 - link

    You say the review sticks to the facts - yet he starts off talking about how ugly the card is, so it needs to beat a Titan, and in the next sentence he says the R9 290X will cost $699.

    he sure seems to stick with the facts.
    Reply
