Call of Duty: World at War


While not our favorite Call of Duty game, World at War certainly improves upon the graphics quality of previous entries. We play through the first few minutes of the Semper Fi level by following a repeatable course and capture our performance results with FRAPS. We set the various graphics and texture options to their highest settings with AA at 2x and AF at 8x.
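
For anyone who wants to replicate this methodology at home, here is a minimal sketch of how average and minimum frame rates can be pulled from a FRAPS frametimes log. The file name and the exact CSV layout (a header row followed by one cumulative millisecond timestamp per frame) are assumptions for illustration, not a description of our internal tooling.

```python
# Minimal sketch, assuming a FRAPS frametimes CSV laid out as a header row
# followed by "frame index, cumulative time in ms" per line.
import csv
from collections import Counter

def fps_stats(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the header row
        times_ms = [float(row[1]) for row in reader]

    # Average FPS: frames elapsed divided by seconds elapsed.
    duration_s = (times_ms[-1] - times_ms[0]) / 1000.0
    avg_fps = (len(times_ms) - 1) / duration_s

    # Minimum FPS: bucket frames into whole-second bins and take the
    # sparsest full bin, i.e. the worst one-second stretch of the run.
    bins = Counter(int((t - times_ms[0]) / 1000.0) for t in times_ms)
    min_fps = min(bins[i] for i in range(int(duration_s)))
    return avg_fps, min_fps

# Hypothetical log name; FRAPS names the log after the game executable.
print(fps_stats("codwaw frametimes.csv"))
```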

Call of Duty: World at War - Semper Fi (1680x1050)

This game is not particularly hard on either the GPU or CPU, but we do hit a hard cap at 94fps. At 1680x1050 the Phenom II platform is able to match either Intel platform in single-card and CrossFire modes, although minimum frame rates slightly favor Intel. When overclocked, the Phenom II is only about 2% slower in average frame rates, but its minimum frame rates are 20% lower.

Adding a second card for CrossFire operation improves average frame rates by 12%, while minimum frame rates decrease by 2% for the Phenom II. The Intel Q9550 sees an 11% increase in average frame rates and 4% in minimum frame rates. The Core i7's average frame rates improve by 13% while minimum rates decrease by 12%. Overclocking our processors resulted in a 2% to 5% improvement in average frame rates, with the Q9550 benefiting the most.

Call of Duty: World at War - Semper Fi (1920x1200)

We have roughly the same performance results at 1920x1200 when comparing the platforms. The Phenom II is competitive with the Intel platforms in single-card and CrossFire operation, though its minimum frame rates in CrossFire mode trail the Intel solutions by around 5% on average. Once we overclock the CPUs, the minimum frame rate is about 16% lower on the Phenom II compared to the Intel products.

Installing a second card for CrossFire operation improves average frame rates by 27% and minimum frame rates by 16% for the Phenom II. The Intel Q9550 sees a 28% increase in average frame rates and 24% in minimum frame rates. The Core i7's average frame rates improve by 29% and minimum rates by 30%. Overclocking our processors resulted in a 4% to 5% improvement in average frame rates, with the Q9550 benefiting the most.
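
As an aside, the scaling figures quoted here and above are simple relative deltas. A quick sketch with placeholder numbers (not our measured data) shows the arithmetic:

```python
# Illustrative arithmetic only; the FPS values are placeholders.
def pct_change(before, after):
    return (after - before) / before * 100.0

single_avg, cf_avg = 60.0, 76.2   # hypothetical single-card vs. CrossFire averages
print(f"CrossFire avg-FPS scaling: {pct_change(single_avg, cf_avg):+.0f}%")  # +27%
```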

We did not notice any difference in gameplay quality at either resolution between the platforms after playing through several of the levels. Each platform offered a very smooth and fluid gaming experience. We thought the higher minimum frame rates on the Intel systems would be noticeable during the action scenes in the jungle, but we honestly could not tell the systems apart during testing.

Comments

  • Joe Schmoe - Tuesday, February 3, 2009

    This was a very good article. I'm not quite ready to build a new system just yet, but it is tax return season. I'm glad the Phenom II is competitive; we all win when AMD puts out a nice chip. I was about to jump on the i7 bandwagon but decided to just grab a Q6600 and save my coins for now. Hopefully this will end some of the endless flame wars going on in the forums.
  • Aquineas - Tuesday, February 3, 2009

    First of all, thanks for the hard work you put into testing. Many folks are getting hung up on 5-10 percent performance differences and making a big deal out of them. I think the most important part of the article is the part where it says, repeatedly (paraphrased):

    "We couldn't perceive a difference in gaming performance between platforms."

    That being said, I think 18 months from now we'll see more games where the CPU differential matters more, which is right around the time I'll be doing my next system build.
  • myterrybear - Tuesday, February 3, 2009

    I agree with this as well. Great job on the article, and it shows the point I have ALWAYS made: when it comes down to it, would you even notice the difference between the two if you just sat down and started doing stuff on it?

    Yeah, exactly. 6 or 8GB of RAM on Phenom II would be interesting. I know I've found 4GB on Phenom I to be very nice now that I am running a full 64-bit OS (Win 7 beta) on a Phenom 9850 BE overclocked to 3GHz. I'm just waiting to see how things will be once I get my Phenom II 940 any day now. :)
  • myterrybear - Tuesday, February 3, 2009

    My thing that I am noticing with all these tests of Core i7 vs. Phenom II is the fact that the systems are not even RAM-wise. I mean, what would a Core i7 run like with 4GB of RAM, or the Phenom II platform with 6GB of RAM?

    It's a valid argument, I think.
  • Aquineas - Tuesday, February 3, 2009

    Honestly, it probably wouldn't matter much. If I were the author I'd re-run the test with 8GB on the PII, but it's probably less than a 2 percent differential.
  • BlueBlazer - Tuesday, February 3, 2009

    Love to see Intel and AMD in SLI numbers!
  • ThePooBurner - Monday, February 2, 2009

    Am I the only one who noticed that the results for the Phenom II were just about identical between resolutions? There should have been some form of difference unless the AMD platform is being artificially hard-capped for some reason. Otherwise, identical frame rates when upping the resolution make no sense at all. I suggest looking into it further.
  • ThePooBurner - Monday, February 2, 2009

    Err, Crysis Warhead is what I meant by FarCry2.
  • 7Enigma - Tuesday, February 3, 2009

    This is a perfect example of why the full data is so incredibly important in teasing out the details.

    Yes, if you look at the graphs they show a very close clustering for the single card, CF, and overclocked CF results, but if you look to the right of the names you will see the min and, more importantly, the max scale with upgraded components. Not to the level one would like, but there appear to be some really, REALLY rough sections, as the minimum frame rate is almost identical across the board (compare single card vs. CF and you see the same frame rate). That is probably due to a driver issue where both cards are not being utilized and the single card is not well optimized either.
  • ThePooBurner - Wednesday, February 4, 2009

    I think you are missing my point. When going to a higher resolution it is expected that the frame rates for a card will change: the min and the max as well as the average. In almost every single game tested, the values for the ATI cards at all resolutions are nearly identical. This smells very fishy to me and makes me think there is some sort of artificial limit being placed on the ATI hardware.
