Crysis: Warhead

Up next is our legacy title for 2013/2014, Crysis: Warhead. A stand-alone expansion to 2007's Crysis, Crysis: Warhead is now over five years old and can still beat most systems down. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that. We've only just reached the point where single-GPU cards can hit 60fps at 1920 with 4xAA, never mind 2560 and beyond.

Unlike games such as Battlefield 3, AMD's GCN cards have always excelled at Crysis: Warhead, and as a result it's a good game for the 290 right off the bat. Furthermore, because the 290X throttles so heavily here, coupled with this game's love of ROP performance, the 290 actually beats the 290X, if only marginally. 0.5fps is within our experimental variation (even though this benchmark is looped multiple times), but it goes to show just how close the 290 and 290X can be, and how powerful higher average clockspeeds can be in ROP or geometry bound scenarios. Graphics rendering may be embarrassingly parallel in general, but sometimes a bit narrower and a bit higher clocked can be the path to better performance.
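
To put some rough numbers to that, theoretical pixel fillrate scales directly with ROP count and clockspeed, so a card that sustains a higher clock can out-fill a nominally faster card that throttles. The sketch below uses Hawaii's 64 ROPs, which both cards share; the sustained clocks are illustrative assumptions for a throttling scenario, not values measured in our testing.

```python
# Back-of-the-envelope pixel fillrate: ROPs x clock. The ROP counts are the
# official Hawaii specs; the sustained clocks are illustrative assumptions,
# not measurements from this review.

def pixel_fillrate_gpix(rops: int, clock_mhz: float) -> float:
    """Theoretical pixel fillrate in Gpixels/s."""
    return rops * clock_mhz / 1000.0

cards = {
    #          (ROPs, assumed sustained clock in MHz over a long benchmark loop)
    "R9 290":  (64, 940),  # assumed to hold near its 947MHz boost clock
    "R9 290X": (64, 880),  # assumed to throttle below its 1000MHz boost clock
}

for name, (rops, clock) in cards.items():
    rate = pixel_fillrate_gpix(rops, clock)
    print(f"{name}: {rate:.1f} Gpix/s ({rops} ROPs @ {clock}MHz)")
```

Under these assumed clocks the 290 comes out ahead on fillrate despite its lower on-paper specs, which is the same dynamic we see play out in this benchmark.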

Meanwhile, because the 290 does so well here, it scores another sizable victory over the GTX 780, beating it by 16%. Further down the line, the GTX 770 is beaten by 46% and the 280X by 27%.

Moving on to our minimum framerates, the 290 actually extends its lead over the 290X. Minimum framerates aren't as reliable as average framerates, even in Crysis, so our experimental variation is going to be higher here, but this once again shows the advantage the 290 enjoys from being clocked higher than the 290X under a sustained workload. On the other hand, the GTX 780 catches up slightly, closing the gap to 10%.
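
As a rough illustration of why minimums are noisier: the average is taken over thousands of frames, so a single hitch barely moves it, while the minimum is set entirely by the worst single frame. The sketch below uses synthetic frame times, not data from our benchmark runs.

```python
import random
import statistics

# Minimal sketch of why minimum framerates vary more run-to-run than
# averages. Frame times are synthetic: ~16.7ms with small noise, plus a
# few random spikes standing in for hitches during a benchmark loop.
random.seed(1)

def one_run(n_frames=2000):
    frames = [random.gauss(16.7, 0.4) for _ in range(n_frames)]
    # A handful of random spikes (loading hitches, background work, etc.)
    for _ in range(3):
        frames[random.randrange(n_frames)] += random.uniform(3, 12)
    return frames

for i in range(3):
    frames = one_run()
    avg_fps = 1000.0 / statistics.mean(frames)  # thousands of frames: spikes barely move this
    min_fps = 1000.0 / max(frames)              # set entirely by the single worst frame
    print(f"run {i + 1}: avg {avg_fps:.1f} fps, min {min_fps:.1f} fps")
```

Across the three simulated runs the averages land within a fraction of an fps of each other, while the minimums swing by several fps depending on how bad the worst spike happened to be.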

Comments

  • TempAccount007 - Saturday, November 9, 2013 - link

    What part of REFERENCE COOLER do you not understand?
  • johnny_boy - Wednesday, November 13, 2013 - link

    The "if" isn't so big, I think. A lot of gamers already have blocks for their graphics cards, or don't care much about the additional noise, or want a block anyway at some point, and the 290 presents an opportunity to get one now (and then cooling is quieter/better than the competing nVidia cards for the same price when figuring in the watercooling costs for the AMD card). I'd rather get the 290 (over the 780) and use my current watercooling solution. If I didn't have watercooling, I'd still rather buy the 290 and upgrade to watercooling.
  • tgirgis - Thursday, February 20, 2014 - link

    That's really extremely one-sided. First of all, AMD already has a response to G-Sync (their version for now has been dubbed "FreeSync", but no idea if that nomenclature is final), they have TressFX (which, at the moment, does look better than Nvidia's "HairWorks", but Nvidia will probably catch up soon), and they've got Mantle, which is definitely a massive advantage.

    Not to mention the R9 290 comes with 4GB of VRAM, as opposed to the GTX 780's 3GB, though it's really not a huge issue except in 4K gaming. Finally, Shield compatibility isn't really a benefit; it's a $250 handheld game system, so it's only beneficial if you're interested in purchasing one of those, as opposed to being an included feature.

    Nvidia is not without its advantages, however: they still have lower power consumption and thermals, which is great for mini-ITX systems (although manufacturers' custom-cooled cards can help bridge the gap for thermals), and they do still have PhysX.

    If Mantle keeps going the way it is now, Nvidia might be forced to pay royalties to AMD, similar to how they did with Intel a few years back. If anything, AMD should throw "Allow us to use PhysX" into the negotiations :)
  • slickr - Tuesday, November 5, 2013 - link

    Oh yeah, Nvidia at this point has no choice but to lower its prices again. I mean, for $400 this card is amazing. It performs on the same level as the $1000 Titan and the $550 290X, so that's giant performance at a very cheap price.

    Even with the high noise (just wait 2 weeks for custom coolers), this card blows the GTX 780 out of the water; the performance is so much better.

    I think if Nvidia wants to stay in the competition they would need to cut the GTX 780 price to at least $400 as well and try to get sales on better acoustics and lower power consumption, but if it were just performance in question they would need to lower the price of the 780 to $350 or 300 euros.

    Of course, that would mean the 770 should get a price reduction as well, to around $270.
  • holdingitdown - Tuesday, November 5, 2013 - link

    Yes, this card is incredibly disruptive. The performance makes the 780 look like a mess. Expect to see at least another $100 slashed off the 780, and the 770 needs a little more taken off as well.

    The R9 290 is a monster!
  • crispyitchy - Tuesday, November 5, 2013 - link

    Best card released yet, as far as I am concerned.

    The noise profile is not perfect, but every card is noisy to one degree or another once you're gaming.

    What is perfect is the giant performance at this price.

    Newegg here I COME
  • Wreckage - Tuesday, November 5, 2013 - link

    I doubt NVIDIA will cut their price. This card is so loud that most people will stay away and get a 780 or 770. AMD is so desperate to increase performance that they sacrifice everything else. It's like the last sad days of 3DFX.
  • Da W - Tuesday, November 5, 2013 - link

    Remember what happened after 3dfx died? Higher prices and mediocre performance.
    I'd buy AMD if only to keep them alive and force Nvidia to drop their prices.
  • HisDivineOrder - Tuesday, November 5, 2013 - link

    Actually, traditionally, 3dfx was overpriced until the very end. ATI was always there competing with nVidia and 3dfx, anyway.

    So competition has existed for as long as we've had discrete GPUs in any meaningful way. It's AMD that wants to end competition by standardizing high-performance PC gaming around a GCN-based API only they can use meaningfully.
