Call of Duty World at War Analysis

As with previous CoD installments, this game tends to favor NVIDIA hardware. World at War's updated graphics engine looks quite good while still delivering strong performance and scaling well across a range of hardware.




[Chart: Call of Duty World at War performance at 1680x1050, 1920x1200, and 2560x1600]


In this test, even though we disabled the frame rate limit and vsync, single-GPU solutions seem limited to around 60 frames per second. This is part of why we see better-than-linear scaling with more than one GPU in some cases: it's not magic, it's that single-card performance isn't as high as it should be. We don't stop seeing artificial limits on single-GPU performance until 2560x1600.
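To put numbers on that cap effect, here is a small worked sketch (all fps figures are hypothetical, assumed only for illustration) of how a roughly 60 fps single-GPU ceiling can make two-GPU scaling look better than linear:

```python
# Hypothetical numbers illustrating how a ~60 fps single-GPU cap can make
# multi-GPU scaling look better than linear. Only the 60 fps ceiling comes
# from the text; the other figures are assumptions for the arithmetic.

capped_single = 60.0    # what the benchmark reports for one GPU (fps)
true_single = 75.0      # assumed uncapped single-GPU potential (fps)
dual_gpu = 130.0        # assumed two-GPU result, unaffected by the cap (fps)

apparent = dual_gpu / capped_single   # 2.17x -- looks "beyond linear"
actual = dual_gpu / true_single       # 1.73x -- the honest scaling factor

print(f"apparent scaling: {apparent:.2f}x, actual scaling: {actual:.2f}x")
```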




[Chart: multi-GPU scaling at 1680x1050, 1920x1200, and 2560x1600]


SLI rules this benchmark, with GT200-based parts coming out on top across the board. The game scales very well with multiple GPUs, most of the time coming in at over 80% (the exception is the 9800 GTX+ at 2560x1600). At the higher resolutions, AMD's multi-GPU options do scale better than their SLI counterparts, but baseline NVIDIA performance is so much higher that it doesn't make a big practical difference.
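For reference, the scaling percentage quoted here can be computed as the gain from the second GPU relative to a perfect doubling. A minimal sketch, with placeholder fps values rather than the article's measurements:

```python
# A minimal sketch of the scaling figure discussed above: the gain from a
# second GPU expressed as a fraction of a perfect 2x speedup. The fps values
# in the example are placeholders, not measurements from the article.

def dual_gpu_scaling(single_fps: float, dual_fps: float) -> float:
    """Return two-GPU scaling as a fraction of an ideal 2x speedup."""
    return dual_fps / single_fps - 1.0

# e.g. going from 55 fps to 100 fps is (100/55 - 1) = ~82% scaling
print(f"{dual_gpu_scaling(55.0, 100.0):.0%}")
```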




[Chart: performance per dollar at 1680x1050, 1920x1200, and 2560x1600]


In terms of value, the 9800 GTX+ (at today's prices) leads the way in CoD. Though it offers the most frames per second per dollar, it is also a good example of why both absolute performance and value need to be considered: it only barely squeaks by as playable at 2560x1600.
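The value metric here is simply average frame rate divided by price. A quick sketch with hypothetical cards and prices (not the article's data) shows how a cheaper card can lead on fps per dollar while trailing on absolute performance:

```python
# A quick sketch of the frames-per-second-per-dollar value metric. The card
# prices and frame rates below are hypothetical placeholders chosen only to
# show the tradeoff; they are not the article's measured data.

def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Frames per second delivered per dollar spent on the card."""
    return avg_fps / price_usd

cards = {
    "cheaper card": (60.0, 150.0),   # wins on value...
    "faster card": (85.0, 330.0),    # ...wins on absolute performance
}
for name, (fps, price) in cards.items():
    print(f"{name}: {fps_per_dollar(fps, price):.3f} fps/$")
```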

Because we see very good performance across the board, multiple GPUs are not required even for the highest settings at the highest resolution. The only card that isn't quite up to the task at 2560x1600 is the Radeon HD 4850 (though the 4870 512MB and 9800 GTX+ are both borderline).

Comments (95)

  • kmmatney - Monday, February 23, 2009 - link

    Especially at 1920 x 1200 - that resolution is becoming a sweet spot nowadays.
  • just4U - Monday, February 23, 2009 - link

    I disagree. I see people finally moving away from their older 17-19" flat panels directly into 22" widescreens. 24" displays and 1920x1200 resolutions are nowhere near the norm.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Correct, but he said sweet spot because his/her wallet is just getting bulgy enough to contemplate a move in that direction... so - even he/she is sadly stuck at "the end user resolution"...
    lol
    Yes, oh well. I'm sure everyone is driving a Maserati until you open their garage door... or golly, that "EVO" just disappeared... must have been stolen.
  • DerekWilson - Monday, February 23, 2009 - link

    The 1GB version should perform very similarly to the two 4850 cards in CrossFire.

    The short answer is that the 1GB version won't have what it takes for 2560x1600 but it might work out well for lower resolutions.

    We don't have a 1GB version, so we can't get more specific than that, though this is enough data to make a purchasing decision -- just look at the 4850 CrossFire option and take into consideration the cheaper price on the 1GB X2.
  • politbureau - Tuesday, June 1, 2010 - link

    I realize this is an older article; however, I always find it interesting to read when upgrading cards.

    While I find it admirable that Derek has compared the 'older' GTX 280 SLI scaling, it is unfortunate that he hasn't pointed out that it should perform identically to the GTX 285s if the clocks were the same.

    This was also passed over in the "worthy successor" article, which does not compare clock-for-clock numbers - an obvious test if we want to discover the full value of the die shrink.

    I recently 'upgraded' to 3 GTX 285s from 3 GTX 280s through a warranty program with the manufacturer, and there is little to no difference in performance between the two setups. While cabling is more convenient (no 6-to-8-pin adapters), the 285s won't clock any better than my 280s would, Vantage scores are within a couple hundred points of each other at the same clocks (the 280s actually leading), and the temperature and fan speed of the new cards haven't improved.

    I think this is a valuable point in an article that compares performance per dollar, and while it's slightly outside the article's scope, it's a probative observation to make.
