Crysis Warhead Analysis

Crysis Warhead, the sequel that retells the original's story from a different perspective, does a great job of improving on the Crysis engine, balancing performance against playability while remaining forward-looking (though it lacks the native 64-bit executable of the original game). We push the settings fairly high without turning everything all the way up: every option is set to "Gamer" quality with the exception of Shaders, which are set to "Enthusiast."

[Chart: Crysis Warhead performance at 1680x1050 / 1920x1200 / 2560x1600]


Like CoD, Crysis favors NVIDIA hardware. The settings we're running require more than a single Radeon HD 4850 or GeForce 9800 GTX+ even at 1680x1050, so many gamers will likely be playing at lower settings than these. As with CoD, SLI sweeps this benchmark in terms of performance. Radeon HD 4870 CrossFire pushes up against the GeForce GTX 260 SLI setup, but a Core 216 or overclocked GTX 260 pair would easily put some distance between them.

[Chart: Crysis Warhead multi-GPU scaling at 1680x1050 / 1920x1200 / 2560x1600]


In terms of scaling, SLI looks better at lower resolutions while CrossFire puts the heat on as resolution increases. Although the 4850 scales at over 77% (which is very good), the higher baseline of the NVIDIA cards keeps this from making the impact it could. At the same time, configurations with two 4850 cards perform on par with the GTX 285 and offer much better value.
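The scaling figure above is the fraction of a second GPU's theoretical contribution that is actually realized. A minimal sketch of that arithmetic, using placeholder frame rates purely for illustration (these are not our measured numbers):

```python
def scaling_efficiency(single_fps: float, dual_fps: float) -> float:
    """Multi-GPU scaling: how much of a hypothetical 2x speedup
    the second card actually delivers (1.0 would be perfect)."""
    return (dual_fps - single_fps) / single_fps

# Hypothetical example: a single card at 30 fps, a CrossFire
# pair at 53.2 fps -> the second card adds ~77% of a full card.
print(f"{scaling_efficiency(30.0, 53.2):.0%}")  # prints "77%"
```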

[Chart: Crysis Warhead performance per dollar at 1680x1050 / 1920x1200 / 2560x1600]


Our performance data and our value data show that in this case, AMD's approach to single-card multi-GPU on the high end is effective. The 4850 X2 2GB can be had from Newegg for less than $300, which is more than $50 cheaper than a single-GPU NVIDIA solution that gets you the same performance where it counts.
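The value comparison comes down to simple frames-per-dollar arithmetic. A quick sketch, with placeholder prices and frame rates that stand in for the real figures (assumptions for illustration only):

```python
def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Value metric: average frame rate divided by street price."""
    return avg_fps / price_usd

# Hypothetical inputs: two cards delivering the same ~50 fps,
# one at $299 (the X2) and one at $350 (a single-GPU card).
cards = {
    "4850 X2 2GB": (50.0, 299.0),
    "single-GPU alternative": (50.0, 350.0),
}
for name, (fps, price) in cards.items():
    print(f"{name}: {fps_per_dollar(fps, price):.3f} fps/$")
```

At equal performance, the cheaper card necessarily wins this metric; the interesting cases are where performance and price move in opposite directions.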

It's interesting to note that AMD's two-card solutions tended to scale better than the single-card multi-GPU options here. Where memory is the limiter, we see our higher-memory single-card options scaling better; in this case, memory doesn't appear to be the main bottleneck. We can't say for certain, but our guess is that it's the PCIe bus: with both cards getting a full x16 slot, each GPU is able to communicate more efficiently with the host, and that seems to benefit Crysis performance.

At 1920x1200, the only single card solutions that remain playable are the GTX 280 and GTX 285. Getting good performance on a 30" monitor requires either a GTX 295 or 2x GTX 280/285s. Nothing else passes the test at the highest resolution we tested.

Comments (95)

  • kmmatney - Monday, February 23, 2009 - link

    Especially at the 1920 x 1200 resolution - that resolution is becoming a sweet spot nowadays.
  • just4U - Monday, February 23, 2009 - link

    I disagree. I see people finally moving away from their older 17-19" flat panels directly into 22" wide screens. 24" and 1920x1200 resolutions are nowhere near the norm.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Correct, but he said sweet spot because his/her wallet is just getting bulgy enough to contemplate a movement in that direction... so - even he/she is sadly stuck at "the end user resolution"...
    lol
    Yes, oh well. I'm sure everyone is driving a Maserati until you open their garage door....or golly that "EVO" just disappeared... must have been stolen.
  • DerekWilson - Monday, February 23, 2009 - link

    The 1GB version should perform very similarly to the two 4850 cards in CrossFire.

    The short answer is that the 1GB version won't have what it takes for 2560x1600 but it might work out well for lower resolutions.

    We don't have a 1GB version, so we can't get more specific than that, though this is enough data to make a purchasing decision -- just look at the 4850 CrossFire option and take into consideration the cheaper price on the 1GB X2.
  • politbureau - Tuesday, June 1, 2010 - link

    I realize this is an older article, however I always find it interesting to read when upgrading cards.

    While I find it admirable that Derek has compared the 'older' GTX 280 SLI scaling, it is unfortunate that he hasn't pointed out that it should perform identically to the GTX 285s if the clocks were the same.

    This was also passed over in the "worthy successor" article, where it does not compare clock for clock numbers - an obvious test, if we want to discover the full value of the die shrink.

    I recently 'upgraded' to 3 GTX 285s from 3 GTX 280s through a warranty program with the manufacturer, and there is little to no difference in performance between the 2 setups. While cabling is more convenient (no 6-to-8-pin adapters), the 285s won't clock any better than my 280s would, Vantage scores are within a couple hundred points of each other at the same clocks (the 280s actually leading), and the temperature and fan speed of the new cards haven't improved.

    I think this is a valuable point in an article that compares performance per dollar, and while slightly outside the scope of the article, I think it's a probative observation to make.
