Crysis: Warhead

Up next is our legacy title for 2013/2014, Crysis: Warhead. A stand-alone expansion to 2007’s Crysis, Crysis: Warhead is now over five years old and can still bring most systems to their knees. Crysis was intended to be forward-looking as far as performance and visual quality go, and it has clearly achieved that. We’ve only now reached the point where single-GPU cards can hit 60fps at 1920 with 4xAA, never mind 2560 and beyond.

Whereas Battlefield 3 is a game that traditionally favors NVIDIA, Crysis: Warhead has generally favored AMD in this generation, making this an uphill battle for NVIDIA. The 290X was able to beat the GTX Titan here at 2560, but with the additional performance offered by the GTX 780 Ti, NVIDIA is once again at the top, though only by a margin of under 2fps (or 2%). Compared to NVIDIA’s other cards, Crysis: Warhead is another consistent game for the GTX 780 Ti, with NVIDIA’s latest card beating the GTX Titan and GTX 780 by 9% and 18% respectively.

Moving on, even when we double up on cards the GTX 780 Ti and 290X remain close. At 2560 it’s a virtual tie at 87fps apiece, while at 4K the GTX 780 Ti SLI setup takes a slight lead.

As for our minimum framerates under Crysis: Warhead, NVIDIA does end up breaking the deadlock here. The GTX 780 Ti holds a slight performance advantage, beating the 290X by several percent and pushing its minimum framerate above 40fps.

302 Comments

  • 1Angelreloaded - Saturday, November 16, 2013 - link

False. I hit the 3.5GB limit quite a few times due to it being a 32-bit game. Now, if they are 64-bit games then yes, they will use more than 3GB for textures and draw distance, but meh, you know what you're talking about... right?
  • ahlan - Friday, November 8, 2013 - link

    Damage control, Nvidia fanboy! Nvidia fanboys are as delusional as MS and Apple fanboys...

    Keep paying more for the same performance...
  • dylan522p - Thursday, November 7, 2013 - link

    Not at all. In quiet mode it runs hotter, is louder 95% of the time, and uses more power.
  • dylan522p - Thursday, November 7, 2013 - link

    And performs significantly worse.
  • DMCalloway - Thursday, November 7, 2013 - link

    Definition of upsetting: early GTX 780 adopters are now able to purchase a 'true' GTX 780 at the same price point previous GTX 780s were at launch. Nvidia sat back, took everyone's cash, and now, to remain competitive, finally releases a fully enabled chip..... wow
  • Spunjji - Thursday, November 7, 2013 - link

    I think early adopters on both sides got dicked here. The R9 290 makes everything else look like a joke in terms of pricing, for all its manifest flaws.
  • dylan522p - Thursday, November 7, 2013 - link

    I would rather not have the 480v2 in my machine.
  • Yojimbo - Thursday, November 7, 2013 - link

    And next year they'll release something even faster at the same price point. You can't have both increasing performance per price over time and hardware that never becomes a comparatively bad deal in the future. People who bought the GTX 780 when it came out got 5 to 6 months of use out of a card which is now ~15% slower than what's available at the same price point.
  • ShieTar - Friday, November 8, 2013 - link

    In other words: Nvidia did what absolutely every other CPU & GPU provider has also done over the last 30 years? Wow indeed.

    Everybody wants to bring the most profitable product possible to the market. That means you need to be good enough to interest customers and cheap enough to be affordable. And you don't get better or cheaper unless something changes the market, e.g. competition.
  • extide - Thursday, November 7, 2013 - link

    You stated the 290x is "unable to compete with an older architecture." That is false. LOL
