Crysis 3

Still one of our most punishing benchmarks, Crysis 3 needs no introduction. With Crysis 3, Crytek went back to trying to kill computers, and the game still holds the “most punishing shooter” title in our benchmark suite. Only a handful of setups can even run Crysis 3 at its highest (Very High) settings, and that’s still without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2014.

Crysis 3 - 3840x2160 - High Quality + FXAA

Crysis 3 - 3840x2160 - Low Quality + FXAA

Crysis 3 - 2560x1440 - High Quality + FXAA

Crysis 3 - 1920x1080 - High Quality + FXAA

Always a punishing game, Crysis 3 ends up being one of the only titles where the GTX 980 doesn’t take a meaningful lead over the GTX 780 Ti. To be clear, the GTX 980 wins most of these benchmarks, but not all of them, and even when it does win the GTX 780 Ti is never far behind. As a result the GTX 980’s lead over the GTX 780 Ti and the rest of our single-GPU video cards is never more than a few percent, even at 4K. At 1440p the tables are turned, with the GTX 980 taking a 3% deficit; this is the only time the GTX 980 will lose to NVIDIA’s previous-generation consumer flagship.

As for the comparison versus AMD’s cards, NVIDIA has been doing well in Crysis 3, and that extends to the GTX 980 as well. The GTX 980 takes a 10-20% lead over the R9 290XU depending on the resolution, with its advantage shrinking as the resolution grows. During the launch of the R9 290 series we saw that AMD tended to do better than NVIDIA at higher resolutions, and while this pattern has narrowed some, it has not gone away. AMD is still the most likely to pull even with the GTX 980 at 4K resolutions, despite the additional ROPs available to the GTX 980.

This will also be the worst showing for the GTX 980 relative to the GTX 680. The GTX 980 is still well in the lead, but below 4K that lead is just 44%. NVIDIA can’t even do 50% better than the GTX 680 in this game until we finally push the GTX 680 out of its comfort zone at 4K.

All of this points to Crysis 3 being very shader limited at these settings. NVIDIA has significantly improved their CUDA core occupancy on Maxwell, but in these extreme situations the GTX 980 will still struggle with its CUDA core deficit versus GK110, or its limited 33% increase in CUDA cores versus the GTX 680. If anything this is a feather in Kepler’s cap, showing that it’s not entirely outclassed when given a workload that maps well to its more ILP-sensitive shader architecture.

Crysis 3 - Delta Percentages

Crysis 3 - Surround/4K - Delta Percentages

The delta percentage story continues to be unremarkable with Crysis 3. GTX 980 does technically fare a bit worse, but it’s still well under 3%. Keep in mind that delta percentages do become more sensitive at higher framerates (there is less absolute time to pace frames), so a slight increase here is not unexpected.
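For readers unfamiliar with the metric, a delta percentage is a frame pacing measure: the average frame-to-frame change in frame time, expressed as a percentage of the average frame time. The sketch below shows one plausible way to compute it (the exact definition used in our charts may differ in details; the function name and sample data are illustrative only):

```python
# Sketch of a frame-pacing "delta percentage" metric: the mean absolute
# frame-to-frame time difference as a share of the mean frame time.
# Lower is better; a few percent or less indicates smooth pacing.

def delta_percentage(frame_times_ms):
    """frame_times_ms: per-frame render times in milliseconds."""
    if len(frame_times_ms) < 2:
        return 0.0
    # Absolute change in frame time between each consecutive pair of frames
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_frame = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * mean_delta / mean_frame

# Smooth pacing: tiny swings relative to frame time -> low percentage
smooth = [16.7, 16.9, 16.6, 16.8, 16.7]
# Stuttery pacing: similar average frame time, large swings -> high percentage
stutter = [12.0, 22.0, 11.0, 23.0, 15.5]
print(delta_percentage(smooth))   # low single digits
print(delta_percentage(stutter))  # far above any acceptable threshold
```

This also illustrates why the metric gets more sensitive at higher framerates: the denominator (average frame time) shrinks, so the same absolute variation in milliseconds produces a larger percentage.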

Comments

  • Viewgamer - Friday, September 19, 2014 - link

    To Ryan Smith: How can the GTX 980 possibly have a 165W TDP when it actually consumes 8 watts more than the 195W TDP GTX 680? Please explain. Did Nvidia just play games with the figures to make them look more impressive?
  • ArmedandDangerous - Friday, September 19, 2014 - link

    TDP =/= Power consumption although they are related. TDP is the amount of heat it will output.
  • Carrier - Friday, September 19, 2014 - link

    You're right, power consumption and heat output are related. That's because they're one and the same! What else could that electricity be converted to? Light? A massive magnetic field? Mechanical energy? (The fan, slightly, but the transistors aren't going anywhere.)
  • Laststop311 - Friday, September 19, 2014 - link

    no they aren't the same. Not all the electricity used is converted to heat. This is where the word EFFICIENCY comes into play. Yes it is related in a way, but maxwell is more efficient with the electricity it draws, using more of it and losing less of it to converted heat output. It's all in its design.
  • bernstein - Friday, September 19, 2014 - link

    bullshit. since a gpu doesn't do chemical nor mechanical transformations all the energy used is converted to heat (by way of moving electrons around). efficiency in a gpu means how much energy is used for a fixed set of calculations (for example: flops)
  • Senpuu - Friday, September 19, 2014 - link

    It's okay to be ignorant, but not ignorant and belligerent.
  • bebimbap - Friday, September 19, 2014 - link

    there is "work" being done, as transistors have to "flip" by use of electrons. Even if you don't believe that "input energy =\= output heat" think of it this way
    100w incandescent bulb produces X amount of useful light
    18w florescent bulb also produces X amount of useful light

    in this sense the florescent bulb is much more efficient as it uses only 18w to produce the same light as the 100w incandescent. so if we say they produce the same amount of heat, then
    100w florescent would produce ~5x the light of a 100w incandescent.
  • Laststop311 - Saturday, September 20, 2014 - link

    ur so smart bro
  • Guspaz - Friday, September 19, 2014 - link

    The power draw figures in this article are overall system power draw, not GPU power draw. Since the 980 offers significantly more performance than the 680, it's cranking out more frames, which causes the CPU to work harder to keep up. As a result, the CPU power draw increases, counteracting the benefits of lower GPU power draw.
  • Carrier - Friday, September 19, 2014 - link

    I don't think that can explain the whole difference. It performs similarly to a 780 Ti in Crysis 3, so the difference in power consumption can only come from the card. The 980 is rated 85W less in TDP but consumes only 68W less at the wall. The discrepancy gets worse when you add losses in the power supply.

    My guess is the TDP is rated at nominal clock rate, which is cheating a little because the card consistently runs much higher than nominal because of the boost.
