Crysis

Up next is our legacy title for 2013/2014, Crysis: Warhead. A stand-alone expansion to 2007's Crysis, Warhead is now over 5 years old and can still beat most systems down. Crysis was intended to be forward-looking in both performance and visual quality, and it has clearly achieved that. We've only now reached the point where single-GPU cards can hit 60fps at 1920 with 4xAA, never mind 2560 and beyond.

Unlike Battlefield 3, Crysis: Warhead is a game where AMD's GCN cards have always excelled, and as a result the 290X tops our single-GPU charts at every resolution and every setting. At 2560 the 290X holds a 15% performance advantage, pushing past the GTX 780 and GTX Titan to be the only card to break into the 50fps range. At 4K that grows to a 22% performance advantage, which sees the 290X and Titan become the only cards to even crack 40fps.

But of course if you want 60fps in either scenario, you need two GPUs. At that point the 290X's initial performance advantage, coupled with its AFR scaling advantage (77%/81% versus 70%), only widens the gap between 290X CF and GTX 780 SLI. Either configuration will get you above 60fps at either resolution.
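The AFR scaling figures quoted above are simply the percentage gain the second GPU delivers over a single card. A minimal sketch of that arithmetic (the fps figures below are illustrative, not taken from the benchmark data):

```python
def afr_scaling(single_fps: float, dual_fps: float) -> float:
    """Return AFR (alternate frame rendering) scaling: the percent
    gain of the dual-GPU result over a single card."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical numbers: a card averaging 50fps alone and 88.5fps in a
# dual-GPU configuration would show 77% scaling.
print(round(afr_scaling(50.0, 88.5)))  # → 77
```

Perfect scaling would be 100%, i.e. the dual-GPU setup exactly doubling the single-card framerate; real AFR setups fall short of that due to driver overhead and inter-frame dependencies.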

Meanwhile the performance advantage of the 290X over the 280X is lower here than it is in most games. At 2560 it's just a 26% gain, a bit short of the 30% average. The 290X significantly bulks up on everything short of memory bandwidth and rasterization versus the 280X, so the list of potential bottlenecks is relatively short in this scenario.

Interestingly, despite the 290X's stellar performance when it comes to average framerates, its advantage in minimum framerates is more muted. The 290X still beats the GTX 780, but only by 4% at 2560. We're not CPU bottlenecked, as evidenced by the AFR scaling, so there's something about Crysis that leads to the 290X dipping a bit harder in the most strenuous scenes.

Comments

  • Blamcore - Friday, October 25, 2013 - link

    Wow, I was just remarking yesterday that NV fanbois had sunk to the level of Apple fanbois, when I was seeing the argument "you just like AMD because you can't afford NV" on a few boards. Now here is the Apple fanbois' famous argument: "my company is better because they have a higher profit margin." Gratz, your unreasonable bias just went up a level!
    I know, you aren't a fanboy, you're really a business expert here to recommend that a company should gain market share by releasing a card roughly equal to what its competitor has had out for months and pricing it the same as they do! Maybe they could have asked $650 if they had released it last January.
  • puppies - Saturday, October 26, 2013 - link

    R+D costs come from the sale price of the card. Are you trying to claim a $300 GPU costs $300 in materials? R+D costs are also offset by the fact that shrinking the process lets the manufacturer get more chips per wafer each time.

    Look at Intel and AMD: their chips don't go up in price each time they get faster; they stay at the same price point. The last 2 cards I have bought have been Nvidia, but the next one will be AMD at this rate. I expect a 660TI to be faster and more energy efficient than a 560TI at the same price point WHEN IT IS RELEASED, and I think a lot of people are in the same boat. Nvidia is trying to push people into spending more each time they release a new model lineup and it stinks.

    I don't care if a 660 is faster than a 560TI, forcing people to move down the GPU lineup just smacks of NVIDIA price gouging.
  • Samus - Thursday, October 24, 2013 - link

    I have to disagree with you Berzerker. Although his post clearly "overpromotes" the 290, it is incredible value when you consider it is faster and cheaper (by hundreds of dollars) than the Titan.

    -Geforce 660TI owner
  • Laststop311 - Thursday, October 24, 2013 - link

    For people that value a quiet computer, this card is trash
  • Spunjji - Friday, October 25, 2013 - link

    For people that value a quiet computer, all stock coolers are useless.

    People that value a truly quiet computer won't even be playing at this end of the GPU market.
  • Samus - Friday, October 25, 2013 - link

    This card is a great candidate for water cooling since the back of the PCB is essentially empty. Water cooling the face side is cheaper/easier, and this card can clearly use it.
  • HisDivineOrder - Friday, October 25, 2013 - link

    He didn't say "silent." He said "quiet." I'd argue the Titan/780/690 coolers were all "quiet," but not "silent."

    Since he said quiet, I don't think it's unreasonable to expect a certain level of "quiet" at the $500+ range of discrete cards.
  • Nenad - Friday, October 25, 2013 - link

    The 780 with the stock cooler is not useless, and it IS quiet (it is not 'silent').
    BTW, going by the posted numbers, it seems the 290X will be TWICE as noisy as the GTX 780?
  • ballfeeler - Thursday, October 24, 2013 - link

    Methinks Berzerker7 is just salty and perhaps partial to nvidia.  Nothing itchy wrote is inaccurate, including the $550 price that Salty-Berzerker7 claimed was $600. 

    - Fastest card? Yup
    - Free game? Yup
    - Pooped all over Titan? Yup

    Do not be salty mr. Berzerker7.  AMD just roundhouse kicked nvidia square in the gonads with performance above Titan for half the price.
  • Shark321 - Thursday, October 24, 2013 - link

    At 1080p it's actually slower than Titan if you average all reviews across the sites, with some reviews even putting it slightly slower than the 780. It's also the loudest card ever produced after 30 minutes of playing (9.6 sone in Battlefield 3 according to PCGamesExtreme). With this noise it's not acceptable, and there will be no other coolers for the time being.
