Crysis

Up next is our legacy title for 2013/2014, Crysis: Warhead. The stand-alone expansion to 2007’s Crysis, Crysis: Warhead is now over 5 years old and can still beat most systems down. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that. Only now have single-GPU cards arrived that can hit 60fps at 1920 with 4xAA, never mind 2560 and beyond.

Unlike in games such as Battlefield 3, AMD’s GCN cards have always excelled in Crysis: Warhead, and as a result the 290X tops our single-GPU charts at every resolution and every quality setting. At 2560 the 290X holds a 15% performance advantage, pushing past the GTX 780 and GTX Titan to become the only card to break into the 50fps range. At 4K that grows to a 22% advantage, which sees the 290X and Titan become the only cards to even crack 40fps.

But of course if you want 60fps in either scenario, you will need two GPUs. At that point the 290X’s initial performance advantage, coupled with its AFR scaling advantage (77/81% versus 70%), only widens the gap between 290X CF and GTX 780 SLI. Either configuration, though, will get you above 60fps at both resolutions.
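As an aside, the AFR scaling figures we quote are simply the dual-GPU framerate expressed as a percentage gain over a single card of the same model. A minimal sketch of that calculation in Python, using hypothetical framerates rather than our measured results:

    def afr_scaling(single_fps: float, dual_fps: float) -> float:
        """AFR scaling: the percentage gain of a dual-GPU (CrossFire/SLI)
        configuration over a single card of the same model."""
        return (dual_fps / single_fps - 1.0) * 100.0

    # Hypothetical framerates for illustration only, not our measured data.
    print(f"{afr_scaling(50.0, 89.0):.0f}%")  # 78%, in the ballpark of the 77-81% cited above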

Meanwhile the 290X’s performance advantage over the 280X is smaller here than in most games. At 2560 it’s just a 26% gain, a bit short of the 30% average. The 290X significantly bulks up on everything short of memory bandwidth and rasterization versus the 280X, so the list of potential bottlenecks is relatively short in this scenario; the below-average gain suggests one of those two less-improved areas is the limiting factor here.

Interestingly, despite the 290X’s stellar performance when it comes to average framerates, its advantage in minimum framerates is more muted. The 290X still beats the GTX 780, but only by 4% at 2560. We’re not CPU bottlenecked, as evidenced by the AFR scaling, so there’s something about Crysis that causes the 290X’s framerate to drop a bit harder in the most strenuous scenes.

Comments

  • mr_tawan - Tuesday, November 5, 2013 - link

    The AMD card may suffer from a loud cooler. Let's just hope that the OEM versions ship with quieter coolers.
  • 1Angelreloaded - Monday, November 11, 2013 - link

    I have to be honest here: it is a beast. In fact, the only thing in my mind holding it back is the lack of feature sets compared to NVidia, namely PhysX. To me that's a bit of a deal breaker, when for $150 more the 780 Ti gives me that with a lower TDP and sound profile. We can only pull so much from one 120V breaker without tripping it, and modification is a deal breaker for some people due to where they live.

    Honestly, what I really need to see from a site is 4K gaming at max settings, plus 1600p/1200p/1080p benchmarks with single cards as well as SLI/Crossfire, to see how they scale against each other. To be clear, I'd also want a benchmark using Skyrim modded to the gills with texture packs, to fully see how the VRAM might affect these cards in games from this next-gen era. The consoles can manage higher texture resolutions natively now, and ultimately this will affect PC performance as the standard moves from 1-2K textures to double that, 4K, or in a select few cases even 8K. With a native 64-bit architecture you can also draw more system RAM into the equation, where Skyrim currently dies past about 3.5GB.

    With Maxwell coming out, with a shared memory pool and a microprocessor core on the die itself, plus G-Sync for smoothness, we might see an over-engineered GPU capable of much, much more than we thought. ATI has their own ideas which will progress too; I have a strong feeling Hawaii is actually a reject of sorts, because they have to compete with Maxwell and engineer more into the cards themselves.
  • marceloviana - Monday, November 25, 2013 - link

    I'm just wondering why this card comes with 32Gb of GDDR5 yet only 4GB shows up. The PCB shows 16 Elpida EDW2032BBBG chips (2Gb each). This amount of memory would help a lot in large scenes with V-Ray RT.
  • Mat3 - Thursday, March 13, 2014 - link

    I don't get it. It's supposed to have 11 compute units per shader engine, making 44 on the entire chip. But the 2nd picture says each shader engine can only have up to 9 compute units...?
  • Mat3 - Thursday, March 13, 2014 - link

    2nd picture on page three I mean.
  • sanaris - Monday, April 14, 2014 - link

    Who cares? This card was never meant for compute.

    It's supposed to be "cheap but decent".
    Initially they set a ridiculous price, but now it goes for around $200-350 on eBay.
    For $200 it's worth its price, because it can only be used to play games.
    Anyone who wants to play games at medium quality (not future ones) may prefer it.
