Crysis

Up next is our legacy title for 2013/2014, Crysis: Warhead. The stand-alone expansion to 2007’s Crysis, Crysis: Warhead is now over 5 years old and can still beat most systems down. Crysis was intended to be future-looking as far as performance and visual quality go, and it has clearly achieved that. We’ve only now reached the point where single-GPU cards can hit 60fps at 1920 with 4xAA, never mind 2560 and beyond.

Unlike in games such as Battlefield 3, AMD’s GCN cards have always excelled in Crysis: Warhead, and as a result the 290X tops our single-GPU charts at every resolution and every setting. At 2560 this is a 15% performance advantage for the 290X, pushing past the GTX 780 and GTX Titan to become the only card to break into the 50fps range. At 4K that grows to a 22% advantage, which sees the 290X and Titan become the only cards to even crack 40fps.

But of course if you want 60fps in either scenario, you need two GPUs, at which point the 290X’s initial performance advantage, coupled with its AFR scaling advantage (77/81% versus 70%), only widens the gap between the 290X CF and GTX 780 SLI. Either configuration, however, will get you above 60fps at both resolutions.
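As a quick illustration of how that compounding works, here’s a back-of-the-envelope sketch. The single-card framerates below are assumed values for demonstration purposes, not our actual chart numbers; only the scaling factors are the ones quoted above.

```python
# Illustrative sketch only (assumed single-card framerates, not our chart numbers):
# how a single-GPU lead compounds with a better AFR scaling factor in a dual-GPU setup.

def dual_gpu_fps(single_fps, afr_scaling):
    """AFR estimate: the second GPU adds afr_scaling (0..1) of one card's framerate."""
    return single_fps * (1.0 + afr_scaling)

# Hypothetical single-card results at 2560, with the scaling figures quoted above.
r290x_single, gtx780_single = 50.0, 43.5        # roughly a 15% single-GPU advantage
r290x_cf = dual_gpu_fps(r290x_single, 0.77)     # 290X CF scaling ~77%
gtx780_sli = dual_gpu_fps(gtx780_single, 0.70)  # GTX 780 SLI scaling ~70%

print(f"290X CF:      {r290x_cf:.1f} fps")                        # ~88.5 fps
print(f"GTX 780 SLI:  {gtx780_sli:.1f} fps")                      # ~74 fps
print(f"Dual-GPU gap: {(r290x_cf / gtx780_sli - 1) * 100:.0f}%")  # ~20%
```

In this hypothetical, a roughly 15% single-card lead grows to roughly a 20% dual-card lead once the better scaling factor is applied, which is the widening effect described above.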

Meanwhile the performance advantage of the 290X over the 280X is lower here than in most games. At 2560 it’s just a 26% gain, a bit short of the 30% average. The 290X significantly bulks up on everything short of memory bandwidth and rasterization versus the 280X, so the list of potential bottlenecks in this scenario is relatively short.

Interestingly, despite the 290X’s stellar performance when it comes to average framerates, its advantage in minimum framerates is more muted. The 290X still beats the GTX 780, but only by 4% at 2560. We’re not CPU bottlenecked, as evidenced by the AFR scaling, so there’s something about Crysis that leads to the 290X crashing a bit harder in the most strenuous scenes.
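For readers wondering how the average and minimum figures can diverge like this, here’s a minimal sketch (with made-up frame times, not our benchmark data) of how both numbers fall out of a frame-time trace; a handful of slow frames in the most strenuous scenes drags the minimum down far more than it drags the average.

```python
# Minimal illustration with hypothetical frame times (in ms), not our benchmark data:
# the average reflects the whole run, while the minimum is set by the slowest frame.

frame_times_ms = [18.5, 19.0, 21.2, 35.8, 40.1, 19.4, 18.9, 33.0]  # made-up trace

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)  # ~38.9 fps
min_fps = 1000.0 / max(frame_times_ms)                        # ~24.9 fps

print(f"Average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
```

A card that is much faster on the easy frames but only slightly faster on the few heaviest frames will show exactly this pattern: a large average-framerate lead and a much smaller minimum-framerate lead.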

396 Comments

  • Spunjji - Friday, October 25, 2013 - link

    Word.
  • extide - Thursday, October 24, 2013 - link

    That doesn't mean that AMD can't come up with a solution that might even be compatible with G-Sync... Time will tell...
  • piroroadkill - Friday, October 25, 2013 - link

    That would not be in NVIDIA's best interests. If a lot of machines (AMD, Intel) won't support it, why would you buy a screen for a specific graphics card? Later down the line, maybe something like the R9 290X comes out, and you can save a TON of money on a high performing graphics card from another team.

    It doesn't make sense.

    For NVIDIA, their best bet at getting this out there and making the most money from it is licensing it.
  • Mstngs351 - Sunday, November 3, 2013 - link

    Well it depends on the buyer. I've bounced between AMD and Nvidia (to be upfront I've had more Nvidia cards) and I've been wanting to step up to a larger 1440 monitor. I will be sure that it supports Gsync as it looks to be one of the more exciting recent developments.

    So although you are correct that not a lot of folks will buy an extra monitor just for Gsync, there are a lot of us who have been waiting for an excuse. :P
  • nutingut - Saturday, October 26, 2013 - link

    Haha, that would be something for the cartel office then, I figure.
  • elajt_1 - Sunday, October 27, 2013 - link

    This doesn't prevent AMD from making something similar if Nvidia decides not to make it open.
  • hoboville - Thursday, October 24, 2013 - link

    Gsync will require you to buy a new monitor. Dropping more money on graphics and smoothness will apply at the high end and for those with big wallets, but for the rest of us there's little point to jumping into Gsync.

    In 3-4 years when IPS 2560x1440 has matured to the point where it's both mainstream (cheap) and capable of delivering low-latency ghosting-free images, then Gsync will be a big deal, but right now only a small percentage of the population have invested in 1440p.

    The fact is, most people have been sitting on their 1080p screens for 3+ years and probably will for another 3 unless those same screens fail; $500+ for a desktop monitor is a lot to justify. Once the monitor upgrades start en masse, then Gsync will be a market changer because AMD will not have anything to compete with.
  • misfit410 - Thursday, October 24, 2013 - link

    G-String didn't kill anything; I'm not about to give up my Dell Ultrasharp for another proprietary Nvidia tool.
  • anubis44 - Tuesday, October 29, 2013 - link

    Agreed. G-sync is a stupid solution to a non-existent problem. If you have a fast enough frame rate, there's nothing to fix.
  • MADDER1 - Thursday, October 24, 2013 - link

    Mantle could be for either higher frame rate or more detail. Gsync sounds like just frame rate.
