Crysis 3

Crysis 3 needs no introduction. With it, Crytek went back to trying to kill computers, and the game still holds the “most punishing shooter” title in our benchmark suite. Only a handful of setups can even run Crysis 3 at its highest (Very High) settings, and that’s still without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2014.

[Charts: Crysis 3 at 3840x2160 (High Quality + FXAA), 3840x2160 (Low Quality + FXAA), 2560x1440 (High Quality + FXAA), and 1920x1080 (High Quality + FXAA)]

Always a punishing game, Crysis 3 ends up being one of the only games in which the GTX 980 doesn’t take a meaningful lead over the GTX 780 Ti. To be clear, the GTX 980 wins most of these benchmarks, but not all of them, and even when it does win, the GTX 780 Ti is never far behind. As a result the GTX 980’s lead over the GTX 780 Ti and the rest of our single-GPU video cards is never more than a few percent, even at 4K. At 1440p, meanwhile, the tables are turned, with the GTX 980 taking a 3% deficit. This is the only time the GTX 980 will lose to NVIDIA’s previous-generation consumer flagship.

As for the comparison against AMD’s cards, NVIDIA has traditionally done well in Crysis 3, and that extends to the GTX 980 as well. The GTX 980 takes a 10-20% lead over the R9 290XU depending on the resolution, with its advantage shrinking as the resolution grows. During the launch of the R9 290 series we saw that AMD tended to do better than NVIDIA at higher resolutions, and while this gap has narrowed some, it has not gone away. AMD remains the most likely to pull even with the GTX 980 at 4K, despite the additional ROPs available to the GTX 980.

This will also be the worst showing for the GTX 980 relative to the GTX 680. The GTX 980 is still well in the lead, but below 4K that lead is just 44%; NVIDIA can’t manage even a 50% advantage over the GTX 680 in this game until we finally push the GTX 680 out of its comfort zone at 4K.
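For clarity, the leads quoted throughout this review are simple framerate ratios. A quick Python sketch with hypothetical numbers (not our measured results) shows the arithmetic:

    # Hypothetical framerates for illustration only, not measured results
    gtx980_fps = 43.2
    gtx680_fps = 30.0

    # "Lead" here is how much faster one card is than the other, in percent
    lead = (gtx980_fps / gtx680_fps - 1) * 100
    print(f"GTX 980 lead over GTX 680: {lead:.0f}%")  # prints 44%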

All of this points to Crysis 3 being heavily shader limited at these settings. NVIDIA has significantly improved CUDA core occupancy on Maxwell, but in these extreme situations the GTX 980 will still struggle with its CUDA core deficit versus GK110, and with its comparatively modest 33% increase in CUDA cores versus the GTX 680. If anything this is a feather in Kepler’s cap, showing that it’s not entirely outclassed when given a workload that maps well to its more ILP-sensitive shader architecture.

[Charts: Crysis 3 - Delta Percentages; Crysis 3 - Surround/4K - Delta Percentages]

The delta percentage story continues to be unremarkable with Crysis 3. GTX 980 does technically fare a bit worse, but it’s still well under 3%. Keep in mind that delta percentages do become more sensitive at higher framerates (there is less absolute time to pace frames), so a slight increase here is not unexpected.
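To make the metric concrete, here is a minimal Python sketch of one plausible formulation of a delta percentage: the average absolute frame-to-frame change in frame time, expressed as a percentage of the average frame time. This is an illustrative assumption about the general shape of the metric, not necessarily the exact methodology behind these charts:

    def delta_percentage(frame_times_ms):
        # Mean absolute frame-to-frame delta as a percentage of the mean
        # frame time; lower means more consistent frame pacing
        deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
        mean_delta = sum(deltas) / len(deltas)
        mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
        return 100.0 * mean_delta / mean_frame_time

    # Two runs averaging roughly 60fps: one perfectly even, one with a single hitch
    even = [16.7] * 100
    hitchy = [16.7] * 50 + [33.4] + [16.7] * 49
    print(delta_percentage(even))    # 0.0
    print(delta_percentage(hitchy))  # ~2.0; the lone hitch shows up

Because the mean frame time shrinks as framerates rise, the same absolute jitter produces a larger percentage, which is the added sensitivity mentioned above.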

Comments

  • squngy - Wednesday, November 19, 2014

    It is explained in the article.

    Because the GTX 980 renders so many more frames, the CPU is worked a lot harder. The wattage in those charts is for the whole system, so when the CPU draws more power it makes it harder to directly compare the GPUs.
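To put hypothetical numbers on that point (made-up wattages, purely illustrative): because the meter reads whole-system draw, the gap at the wall can understate the gap between the GPUs themselves.

    # Made-up wattages, purely illustrative
    gpu_a, gpu_b = 165, 250                  # efficient card vs. power-hungry card
    cpu_feeding_a, cpu_feeding_b = 110, 80   # the CPU works harder at higher framerates
    rest_of_system = 50

    system_a = gpu_a + cpu_feeding_a + rest_of_system  # 325 W at the wall
    system_b = gpu_b + cpu_feeding_b + rest_of_system  # 380 W at the wall
    print(system_b - system_a)  # 55 W gap at the wall...
    print(gpu_b - gpu_a)        # ...versus an 85 W gap between the GPUs alone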
  • galta - Friday, September 19, 2014

    The simple fact is that a GPU more powerful than a GTX 980 does not make sense right now, no matter how much we would love to see it.
    See, most folks are still gaming @ 1080, and some of us are moving up to 1440. Under these scenarios, a GTX 980 is more than enough, even with quality settings maxed out. Early reviews show that it can even handle 4K with moderate settings, and we should expect further performance gains as drivers improve.
    Maybe in a year or two, when 4K monitors become more relevant, a more powerful GPU would make sense. Right now it simply doesn't.
    For the moment, nVidia's move is smart and commendable: power efficiency!
    I mean, such a powerful card at only 165W! If you are crazy/wealthy enough to have two of them in SLI, you can cut your power demand by 170W, with corresponding gains in temps and/or noise, and a less expensive PSU if you're building from scratch.
    In the end, are these new cards great? Of course they are!
    Does it make sense to upgrade right now? Only if you're running a 5xx or 6xx series card, or if your demands have increased dramatically (multi-monitor setup, higher res, etc.).
  • Margalus - Friday, September 19, 2014

    A more powerful gpu does make sense. Some people like to play their games with triple monitors, or more. A single gpu that could play at 7680x1440 with all settings maxed out would be nice.
  • galta - Saturday, September 20, 2014

    How many of us demand such power? The ones who really do can go SLI and OC the cards.
    nVidia would be spending billions for a card that would sell thousands. As I said: we would love the card, but it still makes no sense.
    Again, I would love to see it, but in the foreseeable future I won't need it. I'm happier with the noise, power and heat efficiency.
  • Da W - Monday, September 22, 2014

    Here's one that demands such power. I play 3600*1920 using 3 screens, almost 4k, 1/3 the budget, and still useful for, you know, working.
    Don't want sli/crossfire. Don't want a space heater either.
  • bebimbap - Saturday, September 20, 2014

    Gaming at 1080@144, or at 1080 with a min fps of 120 for ULMB, is no joke when it comes to GPU requirements. Most modern games max out at 80-90fps on an OC'd GTX 670; you need at least an OC'd GTX 770-780, and I'd recommend a 780 Ti. And though a 24" 1080 might seem "small", you only have so much focus. You can't focus on peripheral vision; you'd have to move your eyes to focus on another piece of the screen. The 24"-27" size seems perfect so you don't have to move your eyes/head much or at all.

    The next step is 1440@144, or a min fps of 120, which requires more GPU than 4K@60. As 1440 is about 2x the pixels of 1080, you'd need a GPU about 2x as powerful. So you can see why nvidia must put out a powerful card at a moderate price point: they need it for their 144Hz G-Sync tech and 3D Vision.

    IMO the ppi race isn't as beneficial as higher refresh rates. For TVs, manufacturers are playing this game of misinformation so consumers get the short end of the stick, but having a monitor running at 144hz is a world of difference compared to 60hz for me; you can tell just from the mouse cursor moving across the screen. As I age I realize every day that my eyes will never be as good as yesterday, and knowing that I'd take a 27" 1440p @ 144hz any day over a 28" 5k @ 60hz.
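A quick back-of-the-envelope check supports the claim above that 1440p@144 demands more raw pixel throughput than 4K@60 (treating required GPU power as roughly proportional to pixels per second, which is of course a simplification):

    # Pixels per second needed to sustain each target
    qhd_144 = 2560 * 1440 * 144   # ~531 million pixels/sec
    uhd_60  = 3840 * 2160 * 60    # ~498 million pixels/sec
    print(qhd_144 / uhd_60)       # ~1.07, so 1440p@144 is the heavier load

For the record, 2560x1440 has 1.78x the pixels of 1920x1080, so "about 2x" is in the right ballpark.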
  • Laststop311 - Sunday, September 21, 2014

    Well, it all depends on viewing distance. I currently game on a 30" 2560x1600 Dell U3014; since it's larger I can sit further away and still have just as good an experience as with a 24" or 27" that's closer. So you can't just say larger monitors mean you can't focus on it all, because you can at a further distance.
  • theuglyman0war - Monday, September 22, 2014

    The power of the newest technology is and has always been an illusion, because the creation of games will always be an exercise in "compromise". Even a game like WOW that isn't crippled by console considerations is created for the lowest common denominator demographic in the PC hardware population. In other words... (if u buy it they will make it vs. if they make it I will upgrade). And that's besides the unlimited reach of an open world's "possible" textures and vtx counts.
    "Some" artists are of the opinion that more hardware power would result in a less aggressive graphics budget! (When the time spent wrangling a synced normal-mapped representation of a high-resolution sculpt, or tracking seam problems in lightmapped approximations of complex illumination with long bake times, can take longer than simply using that original complexity.) The compromise can take more time than if we had hardware that could keep up with an artist's imagination.
    In which case I gotta wonder about the imagination of the end user who really believes his hardware is the end of any graphics progress?
  • ppi - Friday, September 19, 2014

    On desktop, all AMD needs to do is lower prices and perhaps release an OC'd 290X to match the 980's performance. It will reduce their margins, but they won't be irrelevant in the market, like in CPUs vs. Intel (where AMD's most powerful beasts barely touch Intel's low-end, apart from some specific multi-threaded cases).

    Why so simple? On desktop:
    - Performance is still the #1 factor - if you offer more for the money, you win
    - Noise can be easily resolved via open air coolers
    - Power consumption is not such a big deal

    So... if an AMD card at a given price is only as fast as Maxwell, then it's clearly the worse choice. But what if it's faster?

    In mobile, however, they are screwed in a big way, unless they have something REAL good up their sleeve (looking at Tonga, I do not think they do; I am convinced AMD intended to pull off another HD5870 (i.e. be first on the new process node), but it apparently did not work out this time around).
  • Friendly0Fire - Friday, September 19, 2014

    The 290X already is effectively an overclocked 290 though. I'm not sure they'd be able to crank up power consumption reliably without running into heat dissipation or power draw limits.

    Also, they'd have to invest in making a good reference cooler.
