Crysis 3

Still one of our most punishing benchmarks, Crysis 3 needs no introduction. With Crysis 3, Crytek has gone back to trying to kill computers, and the game still holds the “most punishing shooter” title in our benchmark suite. Only a handful of setups can even run Crysis 3 at its highest (Very High) settings, and that’s still without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2014.

Crysis 3 - 3840x2160 - High Quality + FXAA

Crysis 3 - 3840x2160 - Low Quality + FXAA

Crysis 3 - 2560x1440 - High Quality + FXAA

Crysis 3 - 1920x1080 - High Quality + FXAA

Always a punishing game, Crysis 3 ends up being one of the only games where the GTX 980 doesn’t take a meaningful lead over the GTX 780 Ti. To be clear, the GTX 980 wins most of these benchmarks, but not all of them, and even when it does win the GTX 780 Ti is never far behind. As a result, the GTX 980’s lead over the GTX 780 Ti and the rest of our single-GPU video cards is never more than a few percent, even at 4K. At 1440p, meanwhile, the tables are turned outright, with the GTX 980 taking a 3% deficit. This is the only time the GTX 980 will lose to NVIDIA’s previous-generation consumer flagship.

As for the comparison versus AMD’s cards, NVIDIA has been doing well in Crysis 3 and that extends to the GTX 980 as well. The GTX 980 takes a 10-20% lead over the R9 290XU depending on the resolution, with its advantage shrinking as the resolution grows. During the launch of the R9 290 series we saw that AMD tended to do better than NVIDIA at higher resolutions, and while this gap has narrowed somewhat, it has not gone away. AMD is still the most likely to pull even with the GTX 980 at 4K resolutions, despite the additional ROPs available to the GTX 980.

This will also be the worst showing for the GTX 980 relative to the GTX 680. The GTX 980 is still well in the lead, but below 4K that lead is just 44%; NVIDIA can’t manage even a 50% advantage over the GTX 680 in this game until we finally push the older card out of its comfort zone at 4K.

All of this points to Crysis 3 being very shader limited at these settings. NVIDIA has significantly improved CUDA core occupancy on Maxwell, but in these extreme situations the GTX 980 will still struggle with its CUDA core deficit versus GK110, or with the limited 33% increase in CUDA cores versus the GTX 680. If anything this is a feather in Kepler’s cap, showing that it’s not entirely outclassed when given a workload that maps well to its more ILP-sensitive shader architecture.

Crysis 3 - Delta Percentages

Crysis 3 - Surround/4K - Delta Percentages

The delta percentage story continues to be unremarkable with Crysis 3. GTX 980 does technically fare a bit worse, but it’s still well under 3%. Keep in mind that delta percentages do become more sensitive at higher framerates (there is less absolute time to pace frames), so a slight increase here is not unexpected.
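As a refresher, our delta percentages express how much consecutive frame times vary relative to the average frame time; lower means smoother pacing. A minimal sketch of the idea in Python (the exact formula here is an assumption for illustration, not taken from this article):

```python
def delta_percentage(frame_times_ms):
    """Mean absolute frame-to-frame delta as a percentage of the mean
    frame time. Assumed reconstruction of the metric's general idea."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * mean_delta / mean_frame_time

# Perfectly even pacing scores 0%; alternating fast/slow frames score high,
# even though both runs average roughly 60fps.
smooth = [16.7, 16.7, 16.7, 16.7, 16.7, 16.7]
uneven = [10.0, 23.0, 11.0, 22.0, 12.0, 21.0]
print(delta_percentage(smooth))  # 0.0
print(delta_percentage(uneven))
```

Note how the mean frame time sits in the denominator: at higher framerates the average frame time shrinks, so the same absolute millisecond of jitter produces a larger percentage, which is why the metric grows more sensitive as cards get faster.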

274 Comments

  • atlantico - Friday, September 19, 2014 - link

    I'm sorry, but I couldn't care less about power efficiency on an enthusiast GPU unit. The 780Ti was a 250W card and that is a great card because it performs well. It delivers results.

    I have a desktop computer, a full ATX tower. Not a laptop. PSUs are cheap enough that it's not even a question.

    So please, stuff the power requirements of this GTX 980. The fact is, if it sucked 250W and was more powerful, then it would have been a better card.
  • A5 - Friday, September 19, 2014 - link

    They'll be more than happy to sell you a $1000 GM210 Titan Black Ultra GTX, I'm sure.

    Fact is that enthusiast cards aren't really where they make their money anymore, and they're orienting their R&D accordingly.
  • Fallen Kell - Friday, September 19, 2014 - link

    Exactly. Not only that, the "real" money is in getting the cards in OEM systems which sell hundreds of thousands of units. And those are very power and cooling specific.
  • Antronman - Sunday, September 21, 2014 - link

    Yep, yep, and yep again.

    For OEMs, the difference between spending $10 more or less per unit is huge.

    More efficient cards mean less power needed from the PSU. It's one of the reasons why GeForce cards are so much more popular in OEM systems.

    I have to disagree with the statement about enthusiast cards not being of value to Nvidia.

    Many people are of the opinion that Nvidia has always had better performance than AMD/ATI.
  • Tikcus9666 - Friday, September 19, 2014 - link

    For desktop cards, power consumption is meaningless to the 99%. Price/performance is much more important. If Card A uses 50W more under full load than Card B but performs around the same and is £50 cheaper to buy, then at a cost of 15p per kWh it would take 6,666 hours of running to get your £50 back. Add to this that if Card A dumps more heat into the room, in winter months your heating system will use less energy, meaning it takes even longer to get your cash back. tl;dr: wattage is only important in laptops, tablets, and things that need batteries to run.
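The payback arithmetic in the comment above holds up; a quick sketch (the helper name and default tariff are illustrative assumptions):

```python
def payback_hours(extra_watts, price_saving_gbp, tariff_gbp_per_kwh=0.15):
    """Hours of full load before the hungrier (but cheaper) card's extra
    electricity cost cancels out its purchase-price saving."""
    extra_kw = extra_watts / 1000.0
    cost_per_hour = extra_kw * tariff_gbp_per_kwh  # extra pounds per hour at full load
    return price_saving_gbp / cost_per_hour

# 50W extra draw, £50 cheaper, 15p/kWh: roughly the 6,666 hours quoted above
print(round(payback_hours(50, 50.0)))
```

At a few hours of gaming a day, that break-even point sits years away, which is the commenter's underlying argument.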
  • jwcalla - Friday, September 19, 2014 - link

    At least in this case it appears the power efficiency allows for a decent overclock. So you can get more performance and heat up your room at the same time.

    Of course I'm sure they're leaving some performance on the table for a refresh next year. Pascal is still a long way off, so they have to extend Maxwell's lifespan. Same deal as with Fermi and Kepler.
  • Icehawk - Friday, September 19, 2014 - link

    When I built my current mATX box, one criterion was that it be silent, or nearly so, while still being a full-power rig (OC'd i7 & 670), and the limitation really is GPU draw - thankfully NVIDIA's power draw had dropped enough by the 6xx series that I was able to use a fanless PSU and get my machine dead silent. I am glad I don't need a tower box that sounds like a jet anymore :)

    I would love to see them offer a high-TDP, better-cooled option though, for the uber users who don't care about cost, heat, or noise and are just looking for the max performance to drive those 4K/surround setups.
  • Yojimbo - Friday, September 19, 2014 - link

    I agree that power consumption in itself isn't so important to most consumer desktop users, as long as they don't require extra purchases to accommodate the cards. But since power consumption and noise seem to be directly related for GPUs, power efficiency is actually an important consideration for a fair number of consumer desktop users.
  • RaistlinZ - Sunday, September 21, 2014 - link

    Yeah, but they're still limited by the 250W spec. So the only way to give us more and more powerful GPUs while staying within 250W is to increase efficiency.
  • kallogan - Friday, September 19, 2014 - link

    dat beast
