Crysis Warhead Performance

This game is another killer at 2560x1600. Only multi-GPU solutions will cut it at this resolution with Gamer settings and Enthusiast shaders enabled. Once (if) the 64-bit patch is released, we should see some general performance improvement as well, but graphics limitations are the name of the game here. Crysis was tough on PCs, and while Warhead is a bit more optimized, this one is still a system killer.



With a 30" display, a minimum of a GTX 295 is required for playability at the settings we chose. Of course, dropping the settings down a bit will certainly help, but even on smaller panels a high-end single card will be desirable for the highest image quality settings. Luckily for fans, running in DX9 mode with slightly lower settings is very playable and not much of a sacrifice.

Crysis Warhead seems to favor SLI over CrossFire as the single 4870 1GB leads the GTX 260 while the GTX 260 SLI wins out over the 4870 X2. This isn't so much a question of what architecture handles the game better as what multi-GPU solution handles the game better. Either way, it comes out in NVIDIA's favor.


100 Comments


  • Gasaraki88 - Monday, January 12, 2009 - link

    It's stupid to get this card if you don't have a 30" monitor and a high-end CPU. They are testing the video card here, not CPUs. Testing on a slower CPU will just show every card pegged at the same frame rate.

    This review was fine, thanks. =)
  • SiliconDoc - Thursday, January 15, 2009 - link

    Gee, suddenly the endlessly bragged-about "folding" means absolutely zero (ATI cards suck at it, BTW)... and you've discounted CUDA, and forgotten about PhysX... and spit in the face of hundreds of thousands of single PCIe x16 motherboard owners.
    Now go to the 4870x2 reviews and type that same crap you typed above - because I KNOW none of you were saying it THEN on the 4870x2 reviews...
    In fact, I WAS THE ONLY ONE who said it at those reviews... the main reason BEING THE 4870X2 WAS RECEIVING ENDLESS PRAISE FOR WINNING IN THAT ONE 2560X resolution....
    Yes, they couldn't stop lauding it up over how it excelled at 2560x - oh the endless praise and the fanboys drooling and claiming top prize...
    Now all of a sudden, when the red card gets SPANKED hard....
    ____________________________________________________________

    Yes, when I posted it at the red reviews - I didn't have folding or PhysX to fall back on... to NEGATE that... not to mention EXCELLENT dual GPU usage and gaming profiles OUT OF THE BOX.

    The facts are, the 4870x2 had better be at least 100 bucks cheaper, or more - who wants all the hassles ?
  • TheDoc9 - Tuesday, January 13, 2009 - link

    It wasn't fine for me, and I don't believe that this card should only be purchased by those with a 30" monitor and a bleeding-edge CPU. Someone with a fast Core proc might be able to find some use for this product vs. the next-slowest alternative. Prove to me I'm wrong.
  • Nfarce - Tuesday, January 13, 2009 - link

    Ok. I can look at many data results on this website with the GTX280 paired with an i7 and a stock clocked E8500 and do some interpolation of said data into the results here.

    See Exhibit A from the Jan. 8 article on AMD's Phenom II X4 940 & 920 using a single GTX280:

    Crysis Warhead @ 1680x1050 (mainstream quality, enthusiast on) results:

    Stock E7200 @ 2.53 GHz-> 66.2 fps
    Stock E8600 @ 3.30 GHz-> 84.0 fps
    Stock i965 @ 3.20 GHz-> 86.8 fps

    Now back to this report with the same game resolution (but using gamer quality with enthusiast on) with a single GTX280:

    Stock E7200 @ 2.53 GHz -> ???
    Stock E8600 @ 3.30 GHz -> ???
    Stock i965 @ 3.20 GHz -> 36.6 fps

    Now using the GTX295:

    Stock E7200 @ 2.53 GHz -> ???
    Stock E8600 @ 3.30 GHz -> ???
    Stock i965 @ 3.20 GHz -> 53.1fps

    With the above data, it shouldn't take an M.I.T. PhD to reasonably estimate the potential figures with slower CPUs and lower resolutions.
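
    The interpolation the comment above suggests can be sketched in a few lines. This is a rough estimate, not a measurement: it assumes frame rate scales with the same CPU ratio at the heavier gamer-quality settings, which overstates the CPU's impact since those runs are more GPU-bound, so real numbers would likely land between these estimates and the i965 figures.

    ```python
    # CPU-scaling baseline: Jan. 8 Phenom II article, single GTX 280,
    # Crysis Warhead @ 1680x1050, mainstream quality, enthusiast shaders on.
    baseline_fps = {"E7200": 66.2, "E8600": 84.0, "i965": 86.8}

    # This review's measured i965 results at the same resolution,
    # gamer quality with enthusiast shaders on.
    measured_i965 = {"GTX 280": 36.6, "GTX 295": 53.1}

    # Scale each card's i965 result by the baseline CPU ratio to fill in
    # the "???" rows from the comment above.
    for card, i965_fps in measured_i965.items():
        for cpu, fps in baseline_fps.items():
            estimate = i965_fps * fps / baseline_fps["i965"]
            print(f"{card} + {cpu}: ~{estimate:.1f} fps")
    ```

    With these numbers, the E7200 estimate works out to roughly 27.9 fps on the GTX 280 and 40.5 fps on the GTX 295 — consistent with the comment's point that a slower CPU mostly lowers the ceiling rather than reordering the cards.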
  • TheDoc9 - Wednesday, January 14, 2009 - link

    That actually is informative.
  • A5 - Monday, January 12, 2009 - link

    If you're not playing at 25x16, this card isn't going to make anything playable that isn't already on an existing, cheaper solution.

    In that same vein, the people who will drop $500 on a video card will most likely have a high-end CPU - there isn't a reason to test it on systems that aren't the top of the heap. They're testing the card, not the whole system - the conclusion to be made is that on any given set of hardware, Card X will outperform Card Y.
  • SiliconDoc - Monday, January 12, 2009 - link

    Just like the bloated pig 4870x2 has been for so many months - a costly, outlandish, unnecessary, wild good-for-nothing at a bad price point, unless you're playing at 2560x - and let's add, at that level it's sometimes not even playable frame rates anyway.
    Glad to see people have finally come to realize what a piece of crap the 4870x2 solution is - thanks NVidia for finally straightening so many out.
    This is great.
  • Hxx - Tuesday, January 13, 2009 - link

    You obviously have no clue about video cards or you cannot afford one, which explains your attitude. First off, the 4870 X2 is an awesome card, much faster than any other card available except the 295. Second, it is reasonably priced at $400 after MIR, which is not bad for a high-end product. This card can run every game out there just as well as the GTX 295. There is no difference between the two, because once a game runs at 35 fps or above you will not notice a difference. In other words, the 16 fps gap in CoD5 between the cards has no value because the game plays fine at 40 fps. The GTX 295 priced at $500 is a waste of money. And BTW, I am not an ATI fanboy - I own a GTX 280.
  • strikeback03 - Wednesday, January 14, 2009 - link

    The 4870x2 launched at $550, so unless you need a new card RIGHT NOW you can wait until the initial rush on the GTX295 is over and the price settles down some.
  • SiliconDoc - Tuesday, January 13, 2009 - link

    So tell me about the forced dual mode in ATI ? Oh that's right mr know it all, they don't have that.
    SLI does.
    Yes, I obviously know nothing, but you made a fool of yourself.
    BTW - please give me the link for the $400.00 4870x2 - because I will go buy one - then post all the driver and crash issues.

    Waiting....
