Call of Duty: World at War Analysis

This game, as with previous CoD installments, tends to favor NVIDIA hardware. The updated graphics engine in World at War looks quite good while still delivering solid performance and scaling well.




[Performance charts: average frame rate at 1680x1050, 1920x1200, and 2560x1600]


In this test, even though we disabled the frame rate limit and vsync, single GPU solutions seem limited to around 60 frames per second. This is part of why we see greater-than-linear scaling with more than one GPU in some cases: it's not magic, it's that single card performance isn't as high as it should be. Single GPU results don't escape this artificial limit until 2560x1600.
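To see why a frame rate ceiling can make multi-GPU scaling look better than linear, here is a minimal Python sketch; the frame rates in it are purely hypothetical numbers chosen for illustration, not our measured results.

    # Hypothetical figures for illustration only -- not measured data.
    frame_cap = 60.0     # apparent single-GPU ceiling observed in this test
    true_single = 75.0   # what a single card might do without the cap
    dual_gpu = 130.0     # a dual-GPU result (the cap doesn't bind here)

    capped_single = min(true_single, frame_cap)   # what the benchmark reports
    apparent = dual_gpu / capped_single           # scaling vs. the capped number
    actual = dual_gpu / true_single               # scaling vs. real capability

    print(f"apparent scaling: {apparent:.2f}x")   # 2.17x -- looks beyond linear
    print(f"actual scaling:   {actual:.2f}x")     # 1.73x -- merely very good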




[Charts: multi-GPU scaling at 1680x1050, 1920x1200, and 2560x1600]


SLI rules this benchmark, with GT 200 based parts coming out on top across the board. This game scales very well with multiple GPUs, most of the time coming in over 80% (the exception is the 9800 GTX+ at 2560x1600). At higher resolutions, the AMD multi-GPU options do scale better than their SLI counterparts, but the baseline NVIDIA performance is so much higher that it really doesn't make a big practical difference.
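The scaling figure here is, in the usual sense, the extra performance the second GPU adds relative to one card, so 100% would mean a perfect doubling. A quick sketch, again with made-up frame rates:

    # Hypothetical figures for illustration only.
    single_fps = 50.0
    dual_fps = 92.0

    # 100% would mean a perfect 2x speedup from adding the second GPU.
    scaling = (dual_fps / single_fps - 1.0) * 100.0
    print(f"multi-GPU scaling: {scaling:.0f}%")   # 84% -> "over 80%"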




[Charts: performance per dollar at 1680x1050, 1920x1200, and 2560x1600]


In terms of value, the 9800 GTX+ (at today's prices) leads the way in CoD. Though it offers the most frames per second per dollar, it is also a good example of the need to weigh both absolute performance and value: it only barely squeaks by as playable at 2560x1600.
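As a reminder, the value metric is simply measured performance divided by street price. A minimal sketch with hypothetical prices and frame rates (not the figures from our charts):

    # Hypothetical prices and frame rates for illustration only.
    cards = {
        "9800 GTX+": (130.0, 48.0),   # (street price in $, average FPS)
        "GTX 280":   (330.0, 75.0),
    }

    for name, (price, fps) in cards.items():
        print(f"{name}: {fps / price:.3f} FPS per dollar")
    # The cheaper card can lead in FPS per dollar while trailing badly
    # in absolute frame rate -- exactly the tension described above.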

Because we see very good performance across the board, multiple GPUs are not required even for the highest settings at the highest resolution. The only card that isn't quite up to the task at 2560x1600 is the Radeon HD 4850 (though the 4870 512MB and 9800 GTX+ are both borderline).

Comments

  • MamiyaOtaru - Tuesday, February 24, 2009 - link

    So we have to be perfect in every way to point out errors? NBA players shouldn't listen to their coaches because their coaches can't play as well as they do? Game reviewers shouldn't trash a game because they couldn't make a better one?
  • ggathagan - Tuesday, February 24, 2009 - link

    When it comes to grammatical errors as insignificant as the ones pointed out, yes.
    If you're going to be that critical, then you'd best check your own grammar.
  • cptnjarhead - Wednesday, February 25, 2009 - link

    Grammar shmammar, you guys need to move out of your mom’s basement and get laid. :)
  • bigboxes - Wednesday, February 25, 2009 - link

    +1
  • stym - Monday, February 23, 2009 - link

    I am curious to see how a pair of Radeon 4830s would perform in this lineup. A single one is quite weak at those resolutions, but I am willing to bet a pair of them would hold their own against a single GTX 280.
    Oh, and it would be much cheaper, too ($180 including the bridge).

    Could you possibly include that setup next?
  • DerekWilson - Monday, February 23, 2009 - link

    You are right that a single 4830 won't be able to perform on par with these guys ... but I don't think two of them would really be worth it against the GTX 280 except maybe at lower resolutions. The 1GB 4830 will run you at least $145, so you're looking at $290 for two of them, and the 4850 X2 2GB is the same price. The 512MB 4830 will be limited by memory usage at higher resolutions just like the 4850 512MB.

    We might look at the 4830 in CrossFire internally and see if it warrants an update, but so far it isn't in the roadmap for the rest of the series.
  • stym - Monday, February 23, 2009 - link

    I was thinking 512MB 4830s, which are in the $90~$110 price range. That price range is the only reason I mention them, because it puts the price tag of a pair of them in the exact same range as a Radeon 4850 1GB or even a GTX 260.

    You said that a 4850 1GB doesn't make sense, and that's even more obvious for a 4830.

  • pmonti80 - Monday, February 23, 2009 - link

    I too find that this would be an interesting matchup at the $200+ price tag.
  • wilkinb - Monday, February 23, 2009 - link

    Why not just drop AoC? It was bad when it came out, has always had issues and odd results, and no one I know played it for more than 2 months...

    If you want to have an MMO, why not use one that people play? And maybe one that's more mature in its development...

    I know you will say it adds value, but you don't know if it's bad code or showing a different view.
  • ajoyner - Tuesday, February 24, 2009 - link

    Most of the issues with the game are gone. There are currently no other MMOs out there that have the graphics or combat system to tax a GPU like this game. Your comment on testing a game that people play is very subjective. There are many MMOs out there that I would not touch....WoW, cough, cough.....but that doesn't mean other people don't enjoy them. I think having this game as one that is regularly benchmarked adds a great deal of value to the article.
