Call of Duty: World at War


While not our favorite Call of Duty game, World at War certainly improves on the graphics quality of previous entries in the series. We play through the first few minutes of the Semper Fi level following a repeatable course and capture our performance results with FRAPS. We set the various graphics and texture options to their highest settings with AA at 2x and AF at 8x.
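
As a rough illustration of how average and minimum frame rate figures (and the percentage deltas quoted below) can be derived from captured data, here is a minimal Python sketch. It assumes each run is saved as a plain text log with one FPS sample per line; the file names are hypothetical and the real FRAPS export format differs slightly, so treat this as an illustration rather than the exact process used for this article.

    # Minimal sketch: compute average/minimum FPS for two runs and the
    # percentage delta between them. The one-sample-per-line log format
    # and the file names are assumptions, not the actual FRAPS output.

    def load_fps(path):
        """Read one FPS sample per line, skipping blank lines."""
        with open(path) as f:
            return [float(line) for line in f if line.strip()]

    def summarize(samples):
        """Return the average and minimum frame rate for a run."""
        return {"avg": sum(samples) / len(samples), "min": min(samples)}

    def pct_delta(new, old):
        """Percentage change of 'new' relative to 'old' (positive = faster)."""
        return (new - old) / old * 100.0

    if __name__ == "__main__":
        single = summarize(load_fps("720be_single.log"))        # hypothetical log file
        crossfire = summarize(load_fps("720be_crossfire.log"))  # hypothetical log file
        print("CrossFire vs. single card:")
        print("  average: %+.1f%%" % pct_delta(crossfire["avg"], single["avg"]))
        print("  minimum: %+.1f%%" % pct_delta(crossfire["min"], single["min"]))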

Call of Duty: World at War - Semper Fi (1680x1050)

This game is not particularly hard on either the GPU or the CPU, but we do hit a hard frame rate cap at 94fps. At 1680x1050 the Phenom II X3 720BE (henceforth, 720BE) is around 4% slower in average frame rates than the X4 940 in both single card and CrossFire X modes, owing to the X4 940's 200MHz clock speed advantage in this game. When overclocked, the 720BE equals the X4 940, although both trail the Intel platforms by 3% in average frame rates and 18% in minimum frame rates.

Adding a second card for CrossFire operation improves average frame rates by 11%, but minimum frame rates decrease by 14% for the 720BE. Overclocking the 720BE improves CrossFire performance by 6% in average frame rates and 35% in minimum frame rates.

Call of Duty: World at War - Semper Fi (1920x1200)

We see roughly the same results at 1920x1200 when comparing the platforms. The Phenom II X3 720BE is competitive with the Intel platforms and the X4 940 in single card and CrossFire operation, though minimum frame rates in CrossFire mode trail the other solutions by up to 17%. Once we overclock the CPUs, the minimum frame rate is about 15% lower on the Phenom II processors than on the Intel products.

Installing a second card for CrossFire operation improves average frame rates by 25% and minimum frame rates by 5% for the 720BE. Overclocking the 720BE yields a 7% improvement in average frame rates and 37% in minimum frame rates.

We did not notice any difference in gameplay quality between the platforms at either resolution after playing through several of the levels. Each platform offered a very smooth and fluid gaming experience. We thought the higher minimum frame rates on the Intel systems would be noticeable during the heavy action scenes in the jungle, but honestly, we could not tell the systems apart during testing. The 720BE provided a gaming experience equal to the other processors at our settings.

Comments (59)

  • Hrel - Saturday, March 28, 2009 - link

    Great article, good stuff to know. However, in the future, when you're doing a CPU article I'd appreciate it if you'd include results for the E8400 Core 2 Duo, as that was the processor to get for some time. I'm sure I'm not the only one who put that CPU in just about every computer built for a while. It'd be nice to see how it stacks up to the newer CPUs.

    On another note, it'd be nice if you could include performance results for the 8800GT/9800GT in future GPU articles, as that WAS the GPU to get for so long. I'm sure there are tons of people out there who still have that GPU running in their systems. No matter what the point of the article is or what level of intensity you're putting the cards through, it's always nice to have something to compare what you have now to what's just coming out.
  • 7Enigma - Tuesday, March 31, 2009 - link

    The gaming computer I built back in January uses the E8500, as at the time it was the best bang for the buck for me. I have it OC'd to 3.85GHz on stock voltage, with tremendous headroom if I ever decide to up the voltage (Xigmatek 120mm rifle CPU cooler). If I were building today I would probably go with the AMD X3, but for my gaming (currently on a 19" LCD, with plans to go to 24" probably in the next 2 years), the E8500 has more than enough grunt.
  • atlmann10 - Saturday, March 28, 2009 - link

    The percentage of people who actually use a computer to its abilities today is so minimal it's almost self-defeating. That percentage on these boards, and most hardware discussion boards, would be a good bit different, but averaged across computer users in the US it is unremarkable at any level. Don't get mad at me; I am not talking, at least specifically, about most of the people on here. Software development, much less released product, is so far behind the hardware market that it is also unremarkable. Yes, maybe somewhere between 3-10% of newly released high-end games put a computer to a decent percentage of usage, but that percentage is unremarkable at best. Next month AMD releases the rest of the Phenom IIs, and in September Intel releases its newest processors, which will make this even more so. I think when I upgrade (probably the middle of this summer, Julyish), I will most likely go for the 920. But some of the more ridiculous arguments in this exact forum are childish, and I imagine that super harsh comment was posted by a 13 year old. At least I would hope so; if not, you (that commenter specifically) need to truly evaluate your mental sense of operation. For childish comments like that you need to go to the Nickelodeon website, if they have a discussion of this type of equipment. Anyway, the main point of my post is this: software developers need to get off their A77 and make some products that use the hardware to at least a 50% level more than 5% of the time.
  • Beno - Saturday, March 28, 2009 - link

    This just shows that those games aren't written for quad core processors.
  • Summer - Saturday, March 28, 2009 - link

    The article did a good job of showing that one can build a good gaming machine without spending too much. Emphasizing real game performance is a plus, especially for the average consumer who just wants a decent system to play today's popular games. Hopefully the average AnandTech reader won't overthink the article and turn it into another AMD versus Intel e-penis thread.

    SIDENOTE: I'll definitely be looking forward to the Northbridge article.


  • nubie - Saturday, March 28, 2009 - link

    What about performance of the x2 7750 that you can buy on ebay for $49 (free shipping) and Newegg for ~$65 (+tax and shipping)?

    I haven't even seen a review of this processor that I can remember, is it Phenom I or II? Is it built from a quad-core die like the x3?

    They seem to overclock well, would they do for a real budget gaming system, say ~$300-400 for the entire system including a HD4xxx or 9800 series card?

    I appreciate your "mainstream" bias, but some people have no money and just want to run the games or keep their system usable without laying out more than $100 for an upgrade.
  • Roland00 - Saturday, March 28, 2009 - link

    The X2 7750 is a Phenom I chip with two cores disabled. I haven't seen reviews here, but there have been several at other sites. At stock it is comparable to an E5200 from Intel (the X2 7750 is slightly faster, but not by much).

    Once you start OCing either chip, the E5200 is the better choice, as it has far more OC headroom. In addition, the E5200 is comparable in price to the X2 7750.

    Eventually we are going to get Phenom II dual cores but that is going to be several months from now.
  • nubie - Saturday, March 28, 2009 - link

    Yeah, I found an E5200 for $59 on eBay; as soon as I can get my P6N RMA'd by MSI to support 45nm processors I am going to try for 4GHz (my Scythe Infinity should be up to the task ;) )

    If AMD and DFI hadn't dropped support for my Infinity-AM2 I might have stayed with AMD.

    Thanks for the info on the Phenom 'x2'. I wonder when we will get a Phenom II tech processor for under $100 (preferably closer to $50).
  • iamezza - Tuesday, March 31, 2009 - link

    The X2 7750 actually does really well in gaming benchmarks compared to the E5200, but gives up a bit in application performance.
    It uses a lot more power than the Intel chips, though, and has less overclocking headroom.
    It does have a potentially better upgrade path, with the AM2+ socket able to accept future AM3 CPUs from AMD, whereas Intel's Socket 775 won't be getting any new CPUs.
  • buzznut - Saturday, March 28, 2009 - link

    So I didn't see too many comments that were actually about the article, go figure.
    I think it's awesome that someone is writing articles for "the rest of us": people who cannot afford the latest and greatest and have to make compromises to build themselves a new system, and who don't receive free products to test so they can build the most ridiculous, benchmark-busting, $4000, top-of-the-top-end behemoth.

    If I had seen this article a month ago, I might not have bought the PII X4 940, and yet I am still glad I did. I do a lot more than just gaming on my PC.

    I would find it even more interesting to take the Intel processors, clock them to 3.8GHz (or the same speed as the AMD processor), and run the same battery of tests; yes, I know that is not the point of this article.

    And I see why you would want to see the max performance available from each chip.

    About a year ago, or even as recently as 6 months ago, everyone was saying what a stupid move it was for AMD to acquire ATI and was basically counting AMD out as far as ever competing again. Right now, AMD doesn't look too stupid to me. It seems to me they are doing quite well with developing the "platform" as their strategy for getting their share of the market.

    Look at the way Nvidia and Intel are fighting right now. I think AMD has the right idea and is moving in a sound direction. I think they have compelling products, certainly from a budget PC user's standpoint. I know others will not agree, judging from AMD's bottom line in the recent past and even currently, but they appear to be moving in a positive direction.

    I think AMD's graphics division is firing on all cylinders now. And as "bad" as the original Phenom was, they have become competitive again with Phenom II. I am pretty impressed with the turnaround.
