Crysis Warhead Performance

This game is another killer at 2560x1600. Only multi-GPU solutions will cut it at this resolution with Gamer settings and Enthusiast shaders enabled. Once (if) the 64-bit patch is released, we should see some general performance improvement as well, but graphics limitations are the name of the game here. Crysis was tough on PCs, and while Warhead is a bit better optimized, it still brings high-end systems to their knees.


[Line graph and results table: Crysis Warhead performance]

With a 30" display, a minimum of a GTX 295 is required for playability with the settings we chose. Of course, dropping the settings down a bit will certainly help out, but even on smaller panel sizes a high end single card will be desireable for the highest image quality settings. Luckily for fans, running in DX9 mode with slightly lower settings is very playable and not that much of a sacrifice.

Crysis Warhead seems to favor SLI over CrossFire as the single 4870 1GB leads the GTX 260 while the GTX 260 SLI wins out over the 4870 X2. This isn't so much a question of what architecture handles the game better as what multi-GPU solution handles the game better. Either way, it comes out in NVIDIA's favor.
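
To put that comparison in concrete terms, the metric at issue is multi-GPU scaling: the dual-GPU frame rate divided by the single-GPU frame rate. Here is a minimal sketch of that math; the frame rates in it are hypothetical placeholders, not our charted results.

    # Minimal sketch of the multi-GPU scaling comparison described above.
    # All fps values are hypothetical placeholders, not measured data.
    def scaling_factor(single_fps: float, dual_fps: float) -> float:
        """Dual-GPU fps over single-GPU fps; 2.0 would be perfect scaling."""
        return dual_fps / single_fps

    # Hypothetical illustration: a slower single card can still win as a
    # pair if its scaling is better, which is how GTX 260 SLI can beat
    # the 4870 X2 even though the single 4870 1GB leads the GTX 260.
    print(scaling_factor(30.0, 48.0))  # e.g. GTX 260 SLI: 1.60x
    print(scaling_factor(33.0, 46.0))  # e.g. 4870 X2 (CrossFire): ~1.39x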

Comments

  • SiliconDoc - Monday, January 12, 2009 - link

    Pssst! The GTX 295 wins hands down in both those departments... that's why it's strangely "left out of the equation".
    (Most of the other sites already reported on that - heck, it was in NVIDIA's literature - and no, they didn't lie - oh well, better luck next time.)
  • Amiga500 - Wednesday, January 14, 2009 - link

    Well... to be honest...

    If leaving out power consumption pisses people like you off - good one anandtech!

    (I guess your nice and content your nvidia e-penis can now roam unopposed?)
  • SiliconDoc - Thursday, January 15, 2009 - link

    First of all, it's "you're" when you are referring to my E-PENIS. (Yes, please also capitalize ALL the letters to be properly indicative of size.)
    Second, what were you whining about?
    Third, if you'd like to refute my points, please make an attempt instead of calling names.
    Fourth, now you've fallen to the lowest common denominator, pretending to hate a fellow internet poster and supporting shoddy, slacker work, with your own false fantasy about my temperament concerning power and the article.
    What you failed to realize is that my pointing out the NVIDIA advantage in that area has actually pissed you off, because the fanboy issue is at your end, since you can't stand the simple truth.
    That makes your epeenie very tiny.
  • kk650 - Tuesday, January 13, 2009 - link

    Remove yourself from the gene pool, fuckwit americunt
  • darckhart - Monday, January 12, 2009 - link

    from all the games where the gtx295 beats the 4870x2, it's only a 3-5 fps win. i don't see how that "gets the nod even considering price." at best, that's $10 per frame. i think we need to see thermals and power draw (i don't recall if you talked about these in the earlier article) to better justify that extra $50.
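
    (As a rough check of that cost-per-frame math, using the ~$50 price gap and the 3-5 fps leads cited in this comment:)

        # Cost per extra frame: GTX 295 vs. 4870 X2, using the ~$50
        # price difference and 3-5 fps leads cited in the comment above.
        price_delta = 50  # USD, approximate street-price gap

        for fps_lead in (3, 4, 5):
            print(f"{fps_lead} fps lead -> ${price_delta / fps_lead:.2f} per extra frame")
        # Prints $16.67, $12.50, and $10.00, matching "at best, that's $10 per frame".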
  • JarredWalton - Tuesday, January 13, 2009 - link

    I bought a 4870 X2 a couple months back... if I had had the option of the GTX 295, it would have been my pick for sure. I wanted a single-card, dual-GPU solution (because I've got a decent platform but am tired of running two video cards). I cannot overstate the importance of drivers, and frankly ATI's drivers still disappoint on a regular basis. Until and unless ATI can get driver profiles for CrossFire into their drivers, I don't think I can ever feel happy with the solution. Also, the 4870 X2 drivers need to bring back the "CrossFire disable" option; we all know there are two GPUs, and there are still occasional games where CrossFire degrades performance over a single GPU.
  • TheDoc9 - Monday, January 12, 2009 - link

    Definitely a half-ass review, something I don't expect from Anandtech. Something more to come later?

    Many questions on these cards can still be asked:
    - Testing at other resolutions, not just a recommendation to stay away unless playing at 2560 res. on a 30" monitor.
    - Testing on other rigs, such as a mid-range quad core and a dual core, to give an idea of how the cards might perform for those of us who don't own a mega-clocked i7.

    I don't like to sound negative, but honestly there was no enthusiasm in this preview/review/snapshot/whatever it's supposed to be. It's kind of disappointing that every other major site has had complete reviews since launch day. Was this written on the plane trip home?
  • bigonexeon - Friday, January 16, 2009 - link

    I don't see the point of using an Intel i7, as Intel released an article saying the i7's L3 cache has a memory leak, and the only attempted fix is a software patch they aren't going to issue until the second generation of i7s. Also, why compare standard cards to a newer card? Why not put the older cards' newer designs - the Superclocked/XXX editions, the latest revisions of the old models - up against the newer cards?
  • theprodigalrebel - Monday, January 12, 2009 - link

    Might want to take a second look at the line graph and the table below it.
  • Iketh - Monday, January 12, 2009 - link

    This article was just right. I had no enthusiasm to read about this card because there isn't anything to get excited about. Apparently Derek didn't either. I'm sure there will be that enthusiasm again when a next-gen card appears and there is something new to talk about.

    It's also having to follow the Phenom II article.
