Call of Duty: World at War Analysis

This game, like previous CoD installments, tends to favor NVIDIA hardware. World at War's updated graphics engine looks quite good while still delivering solid performance and scaling well across hardware.




[Graphs: 1680x1050 / 1920x1200 / 2560x1600]


In this test, even though we disabled the frame rate limit and vsync, single GPU solutions seem limited to around 60 frames per second. This is part of why we see better-than-linear scaling with more than one GPU in some cases: it isn't magic, it's that single card performance isn't as high as it should be. The artificial limits on single GPU performance only disappear at 2560x1600.
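
To make the arithmetic concrete, here is a minimal Python sketch of how a capped single-GPU baseline can inflate apparent scaling. The frame rates below are made up for illustration and are not numbers from our benchmark.

    # Hypothetical frame rates, purely for illustration.
    def scaling(baseline_fps: float, multi_gpu_fps: float) -> float:
        """Apparent speedup of a multi-GPU setup over a single-card baseline."""
        return multi_gpu_fps / baseline_fps

    capped_single   = 60.0   # single card held to ~60 FPS by an internal limit
    uncapped_single = 75.0   # what the same card might manage without the limit (assumed)
    dual_gpu        = 130.0  # dual-GPU result, unaffected by the cap (assumed)

    print(f"Apparent scaling: {scaling(capped_single, dual_gpu):.2f}x")    # 2.17x, "beyond linear"
    print(f"True scaling:     {scaling(uncapped_single, dual_gpu):.2f}x")  # 1.73x, merely very good

The same dual-GPU result only looks super-linear because the single-card number it is divided by was artificially held down.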




[Graphs: 1680x1050 / 1920x1200 / 2560x1600]


SLI rules this benchmark, with GT200-based parts coming out on top across the board. This game scales very well with multiple GPUs, most of the time coming in over 80% (the exception is the 9800 GTX+ at 2560x1600). At higher resolutions, the AMD multi-GPU options do scale better than their SLI counterparts, but the baseline NVIDIA performance is so much higher that it doesn't make a big practical difference.




[Graphs: 1680x1050 / 1920x1200 / 2560x1600]


In terms of value, the 9800 GTX+ (at today's prices) leads the way in CoD. Though it offers the most frames per second per dollar, it is a good example of the need to account for both absolute performance and value: it only barely squeaks by as playable at 2560x1600.
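
The value metric here is simply frames per second divided by street price. A quick sketch with placeholder prices and frame rates (not data from this review):

    # Placeholder numbers, purely for illustration.
    cards = {
        "Card A": {"fps": 55.0, "price": 150.0},
        "Card B": {"fps": 70.0, "price": 260.0},
    }

    for name, data in cards.items():
        value = data["fps"] / data["price"]  # frames per second per dollar
        playable = data["fps"] >= 30.0       # rough playability cutoff
        print(f"{name}: {value:.3f} FPS/$, playable: {playable}")

Card A wins on value here while sitting much closer to the playability line, which is exactly the tension the 9800 GTX+ illustrates at 2560x1600.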

Because we see very good performance across the board, multiple GPUs are not required even for the highest settings at the highest resolution. The only card that isn't quite up to the task at 2560x1600 is the Radeon HD 4850 (though the 4870 512MB and 9800 GTX+ are both borderline).

Comments

  • Nighttrojan - Wednesday, February 25, 2009 - link

    Well, the problem is that I tried it at the same settings too, and the 7600 GT was still faster. The difference is so great that it takes about 15 seconds for the 4870 to even process a change in settings, while with the 7600 GT it's practically instant.
  • mrmarks - Tuesday, February 24, 2009 - link

    Is bolting two video cards together really necessary? I've played Call of Duty at max settings on my two-year-old midrange card and it ran beautifully. These cards seem to be designed for games not of today or tomorrow, but rather for games that may never exist. Few storefronts sell PC games today, and many of the games produced today are not terribly graphics intensive. Also, most popular PC games are available on consoles, which are much more practical. I know this sounds negative, but the truth is that the video card manufacturers are just ignoring the current PC game market.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Geeze dude, there are hundreds of PC games out every year. I suppose the couch console has a wider selection, but it's not like we're dead space here. (hahah)
    I guess it's easier to sell to Santa for the little curtain climbers when even Daddy has a resistance problem. (Believe me, it's a standing joke amongst friends.)
    Then we have the kiddies screwing up the computer issue - not a problem when the household heads do so... the console keeps the PC functional, so to speak.
    But really, there are lots of game choices for the PC - the biggest retailer in the USA - you know who - has loads of PC games. Ever heard of Best Buy - aww, just forget it.
    Just go whine somewhere else, would ya?
  • Elfear - Tuesday, February 24, 2009 - link

    ^ Go take a 2nd look at the 2560x1600 res graphs. In almost every case the single cards are struggling to keep their heads above water and in a few cases failing miserably. Ideally, minimum framerates would be >60fps in all of today's games for that buttery-smooth feel but we are far from that with even dual graphics cards.

    I agree with you that dual graphics cards are not needed by most gamers but to say that there is no need period is ignoring reality.
  • JPForums - Tuesday, February 24, 2009 - link

    I'm confused about your opinion on AMD drivers. You made this comment in this article:

    "Because of AMD's driver issues, we often have to wait when new games come out to enjoy proper CrossFire scaling with them. And when hotfixes come out it takes more than a month (usually more like two) to fully integrate changes into a WHQL driver."

    However, you've also made comments in multiple articles similar to this one from GPU Transcoding Throwdown: Elemental's Badaboom vs. AMD's Avivo Video Converter on December 15th, 2008:

    "The train wreck that has been the last few months of Catalyst has happened before and it will happen again as long as AMD puts too many resources into pushing drivers out every month and not enough into making sure those drivers are of high enough quality."

    So on the one hand, it seems you are criticizing AMD for not releasing WHQL drivers soon enough, while on the other hand you seem to want them to take more time with their drivers. Anandtech is not the kind of site to post contradictory opinions on a whim, so I have to assume that your opinion simply hasn't been stated clearly (or I just missed it).

    I remember a comment from an article to the effect that nVidia has a good driver release model. It is interesting to note that nVidia's driver release schedule over the last 6+ months hasn't been all that different from AMD's.

    nVidia's driver release schedule:

    182.06 February 18, 2009
    181.22 January 22, 2009
    181.20 January 8, 2009
    180.48 November 19, 2008
    178.24 October 15, 2008
    178.13 September 25, 2008
    177.41 June 26, 2008
    177.35 June 17, 2008

    182.05b February 10, 2009
    181.22b January 16, 2009

    While nVidia doesn't have a specific release date, they have on average put an update out nearly every month. As a user of more nVidia hardware than AMD hardware, I know that nVidia uses beta drivers in a similar fashion to how AMD uses hotfixes.

    In my opinion, AMD needs to ditch the set release date. They should try to target updates about once a month, but release them only when they are ready. I do think their driver team is likely understaffed, but I think nVidia's The Way It's Meant to be Played program plays a significant role in why nVidia can support certain games well at launch and AMD can't. Though, I can't say whether it's simply good nVidia developer relations or forced negligence towards AMD. I don't really have a problem with them releasing hotfixes, especially given that AMD has had issues supporting some games at launch, but if their driver team and/or developer relations were better, they wouldn't need them as often.

    I would like to hear more clearly stated opinions from Derek and Anand on this subject. Thanks.
  • SiliconDoc - Wednesday, March 18, 2009 - link

    Here's a clearly stated opinion. ATI KEEPS ******* IT UP, while their competition does a good job.
  • Patrick Wolf - Tuesday, February 24, 2009 - link

    What's the word on overclocking these multi-GPU setups? I can only speak for the 9800 GX2, but it's a very good OCer. EVGA's lowest-end GX2 model is 600/1500/1000. The SSC edition is 675/?/1100. Same card, just factory overclocked.

    I was running the lowest-end model at a stable 715/1787/1050, and I'm currently testing out 720/1800/1065 (with the GX2's cover removed) with good results.
  • mpk1980 - Tuesday, February 24, 2009 - link

    I think two GTX 260s in SLI are a good value right now. That setup comes in at the top of the charts most of the time, competing with the 280 and 285 SLI setups, and right now it can be had for less than $400 at Newegg after rebate ($380ish, I believe), which is just as cheap as one GTX 285 and $120 cheaper than a 295. I don't think you can go wrong with that right now, and I can't wait to see three of those bad boys in tri-SLI :)
  • gigahertz20 - Monday, February 23, 2009 - link

    Very few people (around 1%) who read this article play games at 2560x1600, because that resolution requires a CRT monitor or a super high-end LCD monitor (not even sure where to get an LCD that goes that high). I realize this article wanted to push the SLI and CrossFire video card configurations and see what kind of FPS they would get at that resolution, but the FPS graphs in this article should be set at 1920x1200 by default, not 2560x1600, since that resolution is useless to almost every gamer.
  • 7Enigma - Tuesday, February 24, 2009 - link

    I think the point was to show that UNLESS you are running >24" monitors, there is no REASON to have CF/SLI. I would normally agree with you, but in this particular review I agree with the writer.

    I still think the broken line graphs showing all resolutions on the same x/y axis were more beneficial.
