Crysis 3

Still one of our most punishing benchmarks, Crysis 3 needs no introduction. With it, Crytek has gone back to trying to kill computers, and the game still holds the “most punishing shooter” title in our benchmark suite. Only in a handful of setups can we even run Crysis 3 at its highest (Very High) settings, and that’s still without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2013.

Much like Battlefield 3, at 2560 it’s a neck-and-neck race between the 290X and the GTX 780. At 52fps neither card stands apart, and in traditional Crysis fashion neither is fast enough to pull off 60fps here – never mind that we’re not even at the highest quality levels.

Meanwhile if we bump up the resolution to 4K, things get ugly, both in the literal and figurative senses. Even at the game’s lowest quality settings neither card can get out of the 40s, though as usual the 290X pulls ahead in performance at this resolution.

As such, for 60fps+ on Crysis 3 we’ll have to resort to AFR, which gives us some interesting results depending on which resolution we’re looking at. At 2560 it’s actually the GTX 780 SLI that pulls ahead, beating the 290X CF in scaling. At 4K, however, it’s the 290X CF that pulls ahead, enjoying a 53% scaling factor to the GTX 780 SLI’s 40%. Interestingly, both setups see a reduction in scaling factors here versus 2560, despite having no problem reaching full GPU utilization. Something about Crysis 3, most likely the sheer workload the game throws at our GPUs, is really bogging things down at 4K. To AMD’s credit though, despite the poorer scaling factor at 4K, the 290X CF in uber mode is just fast enough to hit 60fps at Medium quality, and not a frame more.
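For anyone wanting to reproduce the math, the scaling factor here is simply the dual-card framerate expressed as a percentage gain over a single card. Below is a minimal Python sketch of that arithmetic; the framerates in the example are hypothetical placeholders rather than our recorded results.

# Minimal sketch (not our benchmarking tooling): AFR scaling factor from
# average framerates. The inputs below are hypothetical placeholder numbers.
def afr_scaling(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Percentage improvement the second GPU adds over a single card."""
    return (dual_gpu_fps / single_gpu_fps - 1.0) * 100.0

# e.g. a card averaging 39.2fps alone and 60fps in CF scales roughly 53%
print(f"{afr_scaling(39.2, 60.0):.0f}% scaling")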

Moving on to our look at delta percentages, all of our AFR setups are acceptable here, but nothing is doing well. 20-21% variance is the order of the day, a far cry from the 1-2% variance of single-card setups. This is one of those games where both vendors need to do their homework, as we’re going to be seeing a lot more of CryEngine 3 over the coming years.

As for 4K, things are no better, but at least they’re no worse.
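For readers unfamiliar with the delta percentage metric, the rough idea is the average frame-to-frame swing in frame times, expressed as a percentage of the average frame time. The Python sketch below illustrates the concept under that assumption; it is an illustration only, not the script that produced our charts, and the frame times are made-up examples.

# Illustrative sketch only, assuming delta percentage = mean absolute
# frame-to-frame time difference relative to the mean frame time.
def delta_percentage(frame_times_ms: list[float]) -> float:
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * mean_delta / mean_frame_time

# A single card with even pacing lands near 1%...
print(round(delta_percentage([16.0, 16.3, 16.1, 16.2]), 1))
# ...while an AFR setup alternating fast and slow frames lands far higher.
print(round(delta_percentage([12.0, 20.0, 12.0, 20.0]), 1))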

396 Comments

  • Spunjji - Friday, October 25, 2013 - link

    Word.
  • extide - Thursday, October 24, 2013 - link

    That doesn't mean that AMD can't come up with a solution that might even be compatible with G-Sync... Time will tell...
  • piroroadkill - Friday, October 25, 2013 - link

    That would not be in NVIDIA's best interests. If a lot of machines (AMD, Intel) won't support it, why would you buy a screen for a specific graphics card? Later down the line, maybe something like the R9 290X comes out, and you can save a TON of money on a high performing graphics card from another team.

    It doesn't make sense.

    For NVIDIA, their best bet at getting this out there and making the most money from it, is licensing it.
  • Mstngs351 - Sunday, November 3, 2013 - link

    Well it depends on the buyer. I've bounced between AMD and Nvidia (to be upfront I've had more Nvidia cards) and I've been wanting to step up to a larger 1440 monitor. I will be sure that it supports Gsync as it looks to be one of the more exciting recent developments.

    So although you are correct that not a lot of folks will buy an extra monitor just for Gsync, there are a lot of us who have been waiting for an excuse. :P
  • nutingut - Saturday, October 26, 2013 - link

    Haha, that would be something for the cartel office then, I figure.
  • elajt_1 - Sunday, October 27, 2013 - link

    This doesn't prevent AMD from making something similar, if Nvidia decides not to make it open.
  • hoboville - Thursday, October 24, 2013 - link

    Gsync will require you to buy a new monitor. Dropping more money on graphics and smoothness will apply at the high end and for those with big wallets, but for the rest of us there's little point to jumping into Gsync.

    In 3-4 years when IPS 2560x1440 has matured to the point where it's both mainstream (cheap) and capable of delivering low-latency ghosting-free images, then Gsync will be a big deal, but right now only a small percentage of the population have invested in 1440p.

    The fact is, most people have been sitting on their 1080p screens for 3+ years and probably will for another 3 unless those same screens fail--$500+ for a desktop monitor is a lot to justify. Once the monitor upgrades start en masse, then Gsync will be a market changer because AMD will not have anything to compete with.
  • misfit410 - Thursday, October 24, 2013 - link

    G-String didn't kill anything, I'm not about to give up my Dell Ultrasharp for another Proprietary Nvidia tool.
  • anubis44 - Tuesday, October 29, 2013 - link

    Agreed. G-sync is a stupid solution to a non-existent problem. If you have a fast enough frame rate, there's nothing to fix.
  • MADDER1 - Thursday, October 24, 2013 - link

    Mantle could be for either higher frame rate or more detail. Gsync sounds like just frame rate.
