Metro: Last Light

As always, kicking off our look at performance is 4A Games' latest entry in their Metro series of subterranean shooters, Metro: Last Light. The original Metro 2033 was a graphically punishing game for its time, and Metro: Last Light is every bit as demanding in its own right. On the other hand it scales well with resolution and quality settings, so it's still playable on lower-end hardware.

Metro: Last Light - 3840x2160 - High Quality

Metro: Last Light - 3840x2160 - Medium Quality

Metro: Last Light - 2560x1440 - High Quality

Metro: Last Light - 1920x1080 - Very High Quality

As has become customary for our last couple of high-end video card reviews, we're going to be running all of our 4K video card benchmarks at both a high quality level and a lower quality level. In practice not even GTX 980 is going to be fast enough to comfortably play most of these games at 3840x2160 with everything cranked up – that is going to be multi-GPU territory – so for that reason we're including a lower quality setting to showcase just what performance looks like at settings more realistic for a single GPU.

GTX 980 comes out swinging in our first set of benchmarks. If there was any doubt that it could surpass the likes of R9 290XU and GTX 780 Ti, then this first benchmark is a great place to set those doubts to rest. At all resolutions and quality settings it comes out on top, surpassing NVIDIA's former consumer flagship by anywhere from a few percent up to 12% at 4K with high quality settings. Against the R9 290XU, meanwhile, it holds a consistent 13% lead at both 2560 and 4K Medium.

In absolute terms this is enough performance to keep its average framerates well over 60fps at 2560, and even at 3840 Medium it falls just short of crossing the 60fps mark. High quality mode will take the wind out of GTX 980's sails though, pushing framerates back into the borderline 30fps range.

Looking at NVIDIA's last-generation parts for a moment, the performance gains over the lower-tier GK110-based GTX 780 are around 25-35%. This is about where you'd expect to see a new GTX x80 card given NVIDIA's quasi-regular two-year performance upgrade cadence. And when extended out to a full two years, the performance advantage over GTX 680 is anywhere between 60% and 92% depending on the resolution we're looking at. NVIDIA proclaims that GTX 980 will achieve 2x the performance per watt of GTX 680, and since GTX 980 is designed to operate at a lower TDP than GTX 680, it follows that performance over GTX 680 won't quite be doubled in most cases, just as we're seeing here.
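The arithmetic behind that last point can be sketched out quickly. Using NVIDIA's official board TDPs for the two cards, and assuming (simplistically) that sustained power draw scales with TDP, doubled performance per watt at a lower TDP works out to well under a 2x performance gain:

```python
# Back-of-the-envelope check of NVIDIA's 2x performance-per-watt claim.
# TDPs are NVIDIA's official board specs; treating TDP as a proxy for
# actual power draw is an assumption real cards only approximate.

GTX_680_TDP_W = 195.0      # GTX 680 board TDP
GTX_980_TDP_W = 165.0      # GTX 980 board TDP
PERF_PER_WATT_GAIN = 2.0   # NVIDIA's claimed Maxwell-over-Kepler gain

# Performance = (perf/watt) * watts, so the implied scaling factor is:
perf_ratio = PERF_PER_WATT_GAIN * (GTX_980_TDP_W / GTX_680_TDP_W)
print(f"Implied performance vs. GTX 680: {perf_ratio:.2f}x")
```

That works out to roughly 1.69x, which lines up reasonably well with the 60-92% range we're measuring across resolutions.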


274 Comments


  • Sttm - Thursday, September 18, 2014 - link

    "How will AMD and NVIDIA solve the problem they face and bring newer, better products to the market?"

    My suggestion is they send their CEOs over to Intel to beg on their knees for access to their 14nm process. This is getting silly, GPUs shouldn't be 4 years behind CPUs on process node. Someone cut Intel a big fat check and get this done already.
  • joepaxxx - Thursday, September 18, 2014 - link

    It's not just about having access to the process technology and fab. The cost of actually designing and verifying an SoC at nodes past 28nm is approaching the breaking point for most markets; that's why companies aren't jumping onto them. I saw one estimate of $500 million for development of a 16/14nm device. You'd better have a pretty good lock on the market to spend that kind of money.
  • extide - Friday, September 19, 2014 - link

    Yeah, but the GPU market is not one of those markets where the verification cost will break the bank, dude.
  • Samus - Friday, September 19, 2014 - link

    Seriously, nVidia's market cap is $10 billion; they can spend a tiny fortune moving to 20nm and beyond...if they want to.

    I just don't think they want to saturate their previous products with such leaps and bounds in performance while also absolutely destroying their competition.

    Moving to a smaller process isn't out of nVidia's reach, I just don't think they have a competitive incentive to spend the money on it. They've already been accused of becoming a monopoly after purchasing 3dfx, and it'd be painful if AMD/ATI exited the PC graphics market because nVidia's Maxwell cards, being twice as efficient as GCN, were priced identically.
  • bernstein - Friday, September 19, 2014 - link

    atm it is out of reach for them, at least from a financial perspective.
    while it would be awesome to have maxwell designed for & produced on intel's 14nm process, intel doesn't even have the capacity to produce all of their own cpus... until fall 2015 (broadwell xeon-ep release)...
  • kron123456789 - Friday, September 19, 2014 - link

    "it also marks the end of support for NVIDIA’s D3D10 GPUs: the 8, 9, 100, 200, and 300 series. Beginning with R343 these products are no longer supported in new driver branches and have been moved to legacy status." - This is it. The time has come to buy a new card to replace my GeForce 9800GT :)
  • bobwya - Friday, September 19, 2014 - link

    Such a modern card - why bother :-) The 980 will finally replace my 8800 GTX. Now that's a genuinely old card!!
    Actually I mainly need to do the upgrade because the power bills are so ridiculous for the 8800 GTX! For pity's sake, the card only has one power profile (high power usage).
  • djscrew - Friday, September 19, 2014 - link

    Like +1
  • kron123456789 - Saturday, September 20, 2014 - link

    Oh yeah, modern :) It's only 6 years old) But it can handle even Tomb Raider at 1080p with 30-40fps at medium settings :)
  • SkyBill40 - Saturday, September 20, 2014 - link

    I've got an 8800 GTS 640MB still running in my mom's rig that's far more than what she'd ever need. Despite getting great performance from my MSI 660Ti OC 2GB Power Edition, it might be time to consider moving up the ladder since finding another identical card at a decent price for SLI likely wouldn't be worth the effort.

    So, either I sell off this 660Ti, give it to her, or hold onto it for an HTPC build at some point down the line. Decisions, decisions. :)
