Crysis Warhead Analysis

Crysis Warhead, the sequel that retells the original's story from a different perspective, does a great job of improving on the Crysis engine, balancing performance against playability while remaining forward-looking (though it lacks the native 64-bit runtime of the original game). We push the settings pretty high even though we don't turn them all the way up: everything is set to "Gamer" quality with the exception of Shaders, which are set to "Enthusiast".




[Performance graphs: 1680x1050 / 1920x1200 / 2560x1600]


Like CoD, Crysis favors NVIDIA hardware. The settings we're running require more than a single Radeon HD 4850 or GeForce 9800 GTX+ even at 1680x1050, so many gamers will likely be running lower settings than these. As with CoD, SLI sweeps this benchmark in terms of performance. The Radeon HD 4870 CrossFire setup pushes up against GeForce GTX 260 SLI, but a Core 216 or an overclocked GTX 260 pairing would easily put some distance between them.




[Multi-GPU scaling graphs: 1680x1050 / 1920x1200 / 2560x1600]


In terms of scaling, SLI looks better at lower resolutions, while CrossFire turns up the heat as resolution increases. Although the 4850 scales at over 77% (which is very good), the higher baseline performance of the NVIDIA cards keeps this from making the impact it could. At the same time, configurations with two 4850 cards perform on par with the GTX 285 and offer much better value.
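For clarity, the scaling figure quoted above is the extra performance the second GPU delivers relative to a perfect doubling. A minimal sketch of that arithmetic (the FPS numbers below are illustrative placeholders, not our measured results):

```python
# Multi-GPU scaling efficiency: a hypothetical helper, not our actual
# benchmarking methodology. 100% would mean dual-GPU exactly doubles
# single-GPU frame rate.

def scaling_efficiency(single_fps: float, dual_fps: float) -> float:
    """Percent of the second GPU's theoretical contribution realized."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Example: a card averaging 30 FPS alone and 53.1 FPS in a pair
# scales at 77%, in line with the figure quoted in the text.
print(round(scaling_efficiency(30.0, 53.1), 1))  # 77.0
```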




[Value (performance per dollar) graphs: 1680x1050 / 1920x1200 / 2560x1600]


Our performance data and our value data show that, in this case, AMD's approach to single-card multi-GPU at the high end is effective. The 4850 X2 2GB can be had from Newegg for less than $300, which is more than $50 cheaper than a single-GPU NVIDIA solution that delivers the same performance where it counts.
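The value comparison above boils down to frames per second per dollar. A quick sketch using the approximate prices from the text ($300 for the 4850 X2 2GB vs. roughly $350 for the single-GPU NVIDIA alternative); the FPS figure is a hypothetical placeholder standing in for equal performance:

```python
# Price/performance sketch. Prices are the approximate figures from the
# text; the 45 FPS average is an illustrative placeholder, since both
# cards are assumed to perform the same "where it counts".

def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    return avg_fps / price_usd

radeon_value = fps_per_dollar(45.0, 300.0)   # 4850 X2 2GB at ~$300
nvidia_value = fps_per_dollar(45.0, 350.0)   # comparable card at ~$350

# At equal performance, the cheaper card wins on value.
assert radeon_value > nvidia_value
```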

It's interesting to note that AMD's two-card solutions tended to scale better than the single-card multi-GPU options here. Where memory is a limiter, we see our higher-memory single-card options scaling better; in this case, it looks like memory isn't as large a bottleneck as something else. We can't say for certain, but our guess is the PCIe bus: with both cards getting a full x16 slot, each GPU is able to communicate more efficiently with the host, and that appears to benefit Crysis performance.

At 1920x1200, the only single-card solutions that remain playable are the GTX 280 and GTX 285. Getting good performance on a 30" monitor requires either a GTX 295 or two GTX 280/285s in SLI. Nothing else passes the test at the highest resolution we tested.

Comments

  • Nighttrojan - Wednesday, February 25, 2009 - link

    Well, the problem is that I tried it at the same settings too, the 7600gt was still faster. The difference is so great that it takes about 15s for the 4870 to even process a change in settings while with the 7600 gt it's practically instant.
  • mrmarks - Tuesday, February 24, 2009 - link

    Is bolting two video cards together really necessary? I've played Call of Duty at max settings on my two year old midrange card and it ran beautifully. These cards seemed to be designed for games not of today or tomorrow, but rather for games that may never exist. Few store fronts sell pc games today, and many of the games produced today are not terribly graphics intensive. Also, most popular pc games are available on consoles, which are much more practical. I know this sounds negative but the truth is that the video card manufactures are just ignoring the current pc game market.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Geeze dude, there are hundreds of PC games out every year. I suppose the couch console has a wider selection, but it's not like we're dead space here. (hahah)
    I guess it's easier to sell to Santa for the little curtain climbers when even Daddy has a resistence problem. ( Believe me, it's a standing joke amongst friends).
    Then we have the kiddies screwing up the computer issue - not a problem when the household heads do so... the console keeps the PC functional, so to speak.
    But really, lots of game choices for the PC - the biggest retailer in the USA - you know whom - has loads of PC games. Ever heard of best buy - aww just forget it.
    Just go whine somewhere else would ya ?
  • Elfear - Tuesday, February 24, 2009 - link

    ^ Go take a 2nd look at the 2560x1600 res graphs. In almost every case the single cards are struggling to keep their heads above water and in a few cases failing miserably. Ideally, minimum framerates would be >60fps in all of today's games for that buttery-smooth feel but we are far from that with even dual graphics cards.

    I agree with you that dual graphics cards are not needed by most gamers but to say that there is no need period is ignoring reality.
  • JPForums - Tuesday, February 24, 2009 - link

    I'm confused about your opinion on AMD drivers. You made this comment in this article:

    "Because of AMD's driver issues, we often have to wait when new games come out to enjoy proper CrossFire scaling with them. And when hotfixes come out it takes more than a month (usually more like two) to fully integrate changes into a WHQL driver."

    However, you've also made comments in multiple articles similar to this one from GPU Transcoding Throwdown: Elemental's Badaboom vs. AMD's Avivo Video Converter on December 15th, 2008:

    "The train wreck that has been the last few months of Catalyst has happened before and it will happen again as long as AMD puts too many resources into pushing drivers out every month and not enough into making sure those drivers are of high enough quality."

    So on the one hand it seems you are criticizing AMD for not releasing WHQL drivers soon enough. While on the other hand, you seem to want them to take more time with their drivers. Anandtech is not the kind of site to post contradicting opinions on a whim, so I have to assume that your opinion simply hasn't been stated clearly (or I just missed it).

    I remember a comment from an article to the effect of nVidia has a good driver release model. It is interesting to note that nVidia's driver release schedule over the last 6+ months hasn't been all that different from AMD's schedule.

    nVidia's driver release schedule:

    182.06 February 18, 2009
    181.22 January 22, 2009
    181.20 January 8, 2009
    180.48 November 19, 2008
    178.24 October 15, 2008
    178.13 September 25, 2008
    177.41 June 26, 2008
    177.35 June 17, 2008

    182.05b February 10, 2009
    181.22b January 16, 2009

    While nVidia doesn't have a specific release date, they have on average put an update out nearly every month. Being a user of more nVidia hardware than AMD hardware, I know that nVidia uses beta drivers in a similar fashion to how AMD uses hotfixes.

    In my opinion, AMD needs to ditch the set release date. They should try to target updates about once a month, but release them only when they are ready. I do think their driver team is likely understaffed, but I think nVidia's The Way It's Meant to Be Played program plays a significant role in why nVidia can support certain games well at launch and AMD can't. Though, I can't say whether it's simply good nVidia developer relations or forced negligence towards AMD. I don't really have a problem with them releasing hotfixes, especially given that AMD has had issues supporting some games at launch, but if their driver team and/or developer relations were better, they wouldn't need them as often.

    I would like to hear more clearly stated opinions from Derek and Anand on this subject. Thanks.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Here's a clearly stated opinion. ATI KEEPS ******* IT UP, while their competition does a good job.
  • Patrick Wolf - Tuesday, February 24, 2009 - link

    What's the word on overclocking these multi-gpu setups? I can only speak for the 9800 GX2 but it's a very good OCer. EVGA GX2's lowest end model is 600/1500/1000. The SSC edition is 675/?/1100. Same card, just factory overclocked.

    I was running the lowest end model @ a stable 715/1787/1050. Currently testing out 720/1800/1065 (w/ the GX2's cover removed) with good results.
  • mpk1980 - Tuesday, February 24, 2009 - link

    i think the 2 gtx 260's in Sli are a good value right now...that setup comes in the top of the charts most of the time competing with the 280 and 285 sli setups and right now...can be had for less than 400 at newegg after rebate (380ish i believe) which is just as cheap at 1 gtx 285 and 120 cheaper than a 295.....i dont think you can go wrong with that right now....and i cant wait to see 3 of those badboys in tri sli :)
  • gigahertz20 - Monday, February 23, 2009 - link

    Very few people (like around 1%) that read this article play games at 2560x1600, because that resolution requires a CRT monitor or a super high end LCD monitor (not even sure where to get an LCD that goes that high). I realize this article wanted to push the SLI and crossfire video card configurations and see what kind of FPS they would get at that resolution, but the FPS graphs in this article should be set at 1920x1200 by default, not 2560x1600, since that resolution is useless to almost every gamer.
  • 7Enigma - Tuesday, February 24, 2009 - link

    I think the point was to show that UNLESS you are running >24" monitors there is no REASON to have CF/SLI. I would normally agree with you but in this particular review I agree with the writer.

    I still think the broken line graphs showing all resolutions on the same x/y axis was more beneficial.
