Crysis 3

Still one of our most punishing benchmarks, Crysis 3 needs no introduction. With Crysis 3, Crytek has gone back to trying to kill computers, and the game still holds the "most punishing shooter" title in our benchmark suite. Only a handful of setups can even run Crysis 3 at its highest (Very High) settings, and that's still without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2013.

Much like Battlefield 3, at 2560 it's a neck-and-neck race between the 290X and the GTX 780. At 52fps neither card stands apart, and in traditional Crysis fashion neither card is fast enough to pull off 60fps here, never mind that we're not even at the highest quality levels.

Meanwhile, if we bump the resolution up to 4K, things get ugly in both the literal and figurative senses. Even at the game's lowest quality settings neither card can get out of the 40s, though as usual the 290X pulls ahead in performance at this resolution.

As such, for 60fps+ on Crysis 3 we'll have to resort to AFR, which gives us some interesting results depending on which resolution we're looking at. At 2560 it's actually the GTX 780 SLI that pulls ahead, beating the 290X CF in scaling. At 4K, however, it's the 290X CF that pulls ahead, enjoying a 53% scaling factor to the GTX 780 SLI's 40%. Interestingly, both setups see a reduction in scaling factors here versus 2560, even though both have no problem reaching full utilization. Something about Crysis 3, most likely the sheer workload the game throws at our GPUs, is really bogging things down at 4K. Still, to AMD's credit, despite the poorer scaling factor at 4K the 290X CF in uber mode is just fast enough to hit 60fps at Medium quality, and not a frame more.
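
The scaling factors we quote here boil down to simple arithmetic: the AFR configuration's percentage gain over a single card. A minimal sketch of the calculation, using assumed frame rates for illustration rather than our measured results:

```python
# Minimal sketch of how an AFR scaling factor is derived: the percentage
# gain of a dual-card (CF/SLI) result over the single-card result.
# The frame rates used here are assumed for illustration only.

def scaling_factor(single_fps, afr_fps):
    return 100.0 * (afr_fps / single_fps - 1.0)

# A card averaging 40 fps whose CrossFire pairing reaches 61.2 fps scales
# by ~53%; a pairing that only reaches 56 fps scales by ~40%.
print(scaling_factor(40.0, 61.2))  # ~53.0
print(scaling_factor(40.0, 56.0))  # ~40.0
```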

Moving on to our look at delta percentages, all of our AFR setups are acceptable here, but none of them are doing well. 20-21% variance is the order of the day, a far cry from the 1-2% variance of single-card setups. This is one of those games where both vendors need to do their homework, as we're going to be seeing a lot more of CryEngine 3 over the coming years.

As for 4K, things are no better, but at least they're no worse.
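
For reference, a rough sketch of how a delta percentage metric of this sort can be computed, assuming it's defined as the mean absolute change between consecutive frame times as a share of the mean frame time; the frame times below are made up for illustration:

```python
# Rough sketch of a frame time consistency ("delta percentage") metric:
# the mean absolute change between consecutive frame times, expressed as
# a percentage of the mean frame time. Sample data is illustrative only.

def delta_percentage(frame_times_ms):
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * mean_delta / mean_frame_time

steady = [16.6, 16.8, 16.7, 16.5, 16.9]  # single card: even frame pacing
afr    = [15.0, 18.0, 15.2, 18.1, 14.9]  # AFR: alternating fast/slow frames

print(round(delta_percentage(steady), 1))  # ~1.3, in the 1-2% range
print(round(delta_percentage(afr), 1))     # ~18.3, near the ~20% range
```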

Comments

  • itchyartist - Thursday, October 24, 2013

    Incredible performance and value from AMD!

    The fastest single chip video card in the world. Overall it is faster than the nvidia Titan and only $549! Almost half the price!

    Truly great to see the best performance around at a cost that is not bending you over. Battlefield 4 with AMD Mantle is just around the corner. These new 290X GPUs are going to be the uncontested Kings of the Hill for Battlefield 4. Free Battlefield game with the 290X too. Must buy.

    Incredible!
  • Berzerker7 - Thursday, October 24, 2013

    ...really? The card is $600. You reek of AMD PR.
  • Novulux - Thursday, October 24, 2013

    It says $549 in this very review?
  • Berzerker7 - Thursday, October 24, 2013

    It does indeed. His post still smells like a pre-written script.
  • siliconwizard - Thursday, October 24, 2013

    Like all the reviews state, the GTX Titan is now irrelevant. The 290X took the crown and saved the wallet.
  • siliconwizard - Thursday, October 24, 2013

    Thinking that sphere toucher's comment is accurate. Bit of salt here over AMD taking over the high-end slot and ridiculing the Titan card. Only going to get worse once the Mantle-enabled games are released. Nvidia is finished for Battlefield 4. Crushed by AMD, the 290X and Mantle.
  • MousE007 - Thursday, October 24, 2013

    Mantle..... lol, nvidia G-Sync just killed AMD
  • ninjaquick - Thursday, October 24, 2013

    lol? A G-Sync type solution is a good candidate for being integrated into a VESA standard and made part of the display's information that is exchanged through DP/HDMI/DVI, so all AMD would need to do is make sure their drivers are aware that they can send frames to the screen as soon as they are finished. The best part would be that, with the whole Mantle deal, AMD would probably expose this to the developer, allowing them to determine when frames are 'G-Sync'd' and when they are not.
  • MousE007 - Thursday, October 24, 2013

    No, there is a "handshake" between the GPU and the monitor or TV; it will not be supported with any other brand.
  • inighthawki - Thursday, October 24, 2013

    You do realize that it can still be put into the VESA standard, right? Then only GPUs supporting the standard can take advantage of it. Also ANYONE who believes that GSync OR Mantle is going to "kill the other" is just an idiot.
