Crysis Warhead Performance

This game is another killer at 2560x1600. Only multi-GPU solutions will cut it at this resolution with Gamer settings and Enthusiast shaders enabled. Once (if) the 64-bit patch is released, we should see some general performance improvement as well, but graphics limitations are the name of the game here. Crysis was tough on PCs, and while Warhead is a bit more optimized, this one is still a system killer.


With a 30" display, a minimum of a GTX 295 is required for playability with the settings we chose. Of course, dropping the settings down a bit will certainly help out, but even on smaller panel sizes a high end single card will be desireable for the highest image quality settings. Luckily for fans, running in DX9 mode with slightly lower settings is very playable and not that much of a sacrifice.

Crysis Warhead seems to favor SLI over CrossFire: the single 4870 1GB leads the GTX 260, yet GTX 260 SLI wins out over the 4870 X2. This isn't so much a question of which architecture handles the game better as which multi-GPU solution handles it better. Either way, it comes out in NVIDIA's favor.

Comments

  • strikeback03 - Thursday, January 15, 2009 - link

    Just check the first page of comments.

    From CyberHawk:
    Been waiting for this one...
    ... but I find the response a bit cold.

    It's the fastest card, for God's sake!


    From formulav8:
    Yeps, this is one of the worst reviews Anand himself has ever done. He continues to praise nVidia, who just a month or two ago was charging $600 for their cards.

    Give credit where credit is due. He even harps on a SidePort feature that doesn't mean much now, and AMD says it didn't provide any real benefit even when it was enabled.

    I've been a member of this site since 2000 and am disappointed at how bad the reviews here are getting, especially when they have a biased tone to them.


    Those are just two examples from the first page of comments on an article you yourself pointed out. Just for kicks, here is one from another article (4830 launch):

    From Butterbean:
    I jumped to "Final Words" and boom - no info on 4830 but right into steering people to an Nvidia card. That's so Anandtech.

    I still state that no matter whose hardware they review or what they say, someone will accuse them of bias.
  • SiliconDoc - Tuesday, January 13, 2009 - link

    AMEN. Good post.
  • jmg873 - Tuesday, January 13, 2009 - link

    You stated in the article that the GTX 260 SLI beating out the 4870 X2 showed that SLI was superior to CrossFire. I don't have an opinion at this point on which is better, but saying that SLI beats CrossFire based on that isn't accurate. The 4870 X2 isn't two 4870s in CrossFire; it's two of them basically "welded" together on one card. Two 4870s in CrossFire would probably yield a different result, maybe worse, maybe better.
  • Jovec - Tuesday, January 13, 2009 - link

    My understanding is that it is still CrossFire/SLI for dual-GPU, single-slot cards like the 295 and 4870 X2. The advantage of such cards is that you don't need a CF/SLI motherboard, while also being a bit cheaper than purchasing two separate cards. You could also go quad with these cards on a CF/SLI motherboard.
  • JarredWalton - Tuesday, January 13, 2009 - link

    Just because ATI disables the "CrossFire" tab with the 4870X2 doesn't mean it isn't CrossFire. Trust me: I bought one and I'm disappointed with the lack of performance in new games on a regular basis. I'm thinking I need to start playing games around four months after they launch, just so I can be relatively sure of getting working CF support - and that's only when a game is an A-list title. There are still plenty of games where CrossFire can negatively impact performance or cause other quirkiness (Penny Arcade Adventures comes to mind immediately).
  • af530 - Wednesday, January 14, 2009 - link

    Die of aids you shithead
  • thevisitor - Tuesday, January 13, 2009 - link

    PLEASE LEARN!

    The first graph in this review should be dollars per frame,
    but you, Anand, cannot show it, right?
    Because then everyone would see how NVIDIA sells total crap, AGAIN.

    Price is the deciding factor for 99% of buyers, so you must consider it the next time you write something!
  • Kroneborge - Tuesday, January 13, 2009 - link

    Different people care about different things when choosing a product. For some dollar per frame might be the most important thing, for others (especially at the high end) all they care about is having the most powerful product, and so they gladly pay a high premium for that last little extra bit.

    Neither is wrong; what's right for one person can be wrong for another.

    A reviewer's job is to tell you about the performance; it's up to you to decide whether that's worth the money. They can't make up your mind for you.
  • elerick - Tuesday, January 13, 2009 - link

    Graphics are in a transition phase right now. With the economy ditching the high end, the vendors are forced to compete in the midrange. That is why the competition is much more severe in the 4850/GTX 260 camp.

    It's sad that Anand's readers have to blog and flame everything that is written all the time. If they leave power consumption out of the *initial* launch review, it is for good reason; perhaps they are forced to get the review out or it will become yesterday's news. The most important thing here is benchmarks. Power consumption and SLI numbers will follow once they can get their hands on more video cards.

    I'm tired of reading comments where everyone just bitches about everything. Grow up.

    I look forward to reading your next review, Cheers!
  • araczynski - Tuesday, January 13, 2009 - link

    NVIDIA still sucking on price, as usual.
