Call of Duty: World at War Performance

Call of Duty 5 is a good-looking game, though we much prefer the previous title in the series as far as the gaming experience goes. This test, newly added to our suite, seems to favor NVIDIA hardware. We test a section near the beginning of the game with a straight-line FRAPS run through the woods.
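
For readers unfamiliar with how a FRAPS run gets reduced to the numbers in these charts, here is a minimal sketch of the arithmetic, assuming the usual FRAPS frametimes CSV layout (one header row, then frame index and cumulative milliseconds). The filename is hypothetical and this is not our actual benchmark harness.

    # Minimal sketch: average and worst-case FPS from a FRAPS frametimes log.
    # Assumes rows of "frame number, cumulative time in ms" after one header row;
    # adjust the parsing if your FRAPS version writes something different.
    import csv

    def fps_stats(path):
        times = []
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader)                      # skip the "Frame, Time (ms)" header
            for row in reader:
                times.append(float(row[1]))   # cumulative milliseconds

        # per-frame durations in milliseconds
        deltas = [b - a for a, b in zip(times, times[1:])]
        avg_fps = 1000.0 * len(deltas) / (times[-1] - times[0])
        min_fps = 1000.0 / max(deltas)        # slowest single frame
        return avg_fps, min_fps

    if __name__ == "__main__":
        avg, low = fps_stats("codwaw_frametimes.csv")   # hypothetical filename
        print(f"avg: {avg:.1f} fps, worst frame: {low:.1f} fps")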

[Chart: Call of Duty: World at War, 2560x1600]

We see the GeForce GTX 285 come out well ahead of the Radeon HD 4870 1GB, but the 4870 X2 does best the newly released part by about 18%, which is more than the roughly 13% price premium the fastest single-card AMD solution carries over the GTX 285. As for the improvement the overclocked GTX 285 brings, it's more than 12% at 2560x1600, which is quite good.
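
To put that price/performance comparison in concrete terms, the back-of-the-envelope math works out as follows; the values below are just the ratios quoted above, not actual street prices.

    # Illustrative price/performance check using the ratios from this test.
    gtx285_price, x2_price = 1.00, 1.13      # 4870 X2 carries ~13% price premium
    gtx285_fps,   x2_fps   = 1.00, 1.18      # 4870 X2 is ~18% faster here

    gain = (x2_fps / x2_price) / (gtx285_fps / gtx285_price) - 1
    print(f"4870 X2 delivers about {gain:.1%} more FPS per dollar")
    # -> roughly 4.4% more FPS per dollar in this one game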


[Chart: performance across resolutions]

Looking at the rest of the tests shows a bit of a performance barrier near 60 FPS, despite the fact that we set the maximum framerate to 1000 and disabled vsync everywhere we could. Framerates below 2560x1600 are fairly compressed, with a little separation at 1920x1200 that still shows the NVIDIA advantage, though not as clearly.
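
If you want to check whether a plateau like this is an engine-side cap rather than a GPU limit, one rough test is to see how tightly the per-frame times cluster around the cap interval (about 16.7 ms for 60 FPS). A minimal sketch, using made-up frametime lists rather than data from our runs:

    # Flag a suspected framerate cap by counting per-frame durations that land
    # within a small tolerance of the cap interval.
    def looks_capped(frame_ms, cap_fps=60.0, tolerance_ms=0.5, threshold=0.8):
        """frame_ms: list of per-frame durations in milliseconds."""
        cap_interval = 1000.0 / cap_fps                  # ~16.7 ms for 60 FPS
        near_cap = sum(abs(d - cap_interval) < tolerance_ms for d in frame_ms)
        return near_cap / len(frame_ms) >= threshold

    # 80%+ of frames sitting right at ~16.7 ms suggests a cap, not a GPU that
    # just happens to average 60 FPS.
    print(looks_capped([16.7] * 95 + [22.0] * 5))              # True
    print(looks_capped([12.0, 14.5, 16.0, 18.3, 21.0] * 20))   # False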

Comments

  • Gorghor - Tuesday, January 27, 2009 - link

    Actually, this is more of a rhetorical question than anything else. Poor sales and the lack of dual-DVI support in Hybrid Power mode are the reasons I've heard about. I still don't understand why they don't push these Hybrid technologies.

    I mean, in a day and age where everybody's talking about saving our planet, it just seems idiotic to keep pushing ever more power-hungry graphics cards that eat as much as a 600 liter marine aquarium. What a damned waste, not to mention that electricity is far from cheap (any of you tried living in Germany?). The worst part is that the technology exists and works (from both ATI and NVIDIA) in laptops, so it can't be all that complicated to make a decent version for the desktop. It's just that no one seems to care...

    Well I for one can't stand the idea of wasting power on an idle graphics card that could just as well be disabled when I'm not gaming (read: 99% of the time). And I wish more people would think the same.

    Sorry 'bout the rant, just makes me angry!
  • veryhumid - Monday, January 26, 2009 - link

    One thing I'd like to see is some older cards in the mix... just a couple. Maybe a GX2 or 8800 GT... recently popular cards. I'm curious to see how big the performance gap gets, and how quickly, over a fairly short period of time.
  • MadBoris - Monday, January 19, 2009 - link

    A couple things...

    Here we have a die shrink, and you're apparently more interested in whether you can save a few watts than in how high the OC can go?

    I think most people looking at this card would be more interested in whether it can do a 10% overclock than in 10% power savings.

    How high does it OC, did I miss it somewhere?

    People aren't buying brute-force top-end video cards for their power consumption, just like a Ferrari isn't purchased for its gas mileage. I'm not saying power consumption doesn't have a place, but it's a distant second to the performance a die shrink offers in stable OC potential.

    I also have to echo the statements that 2560x1600 as the standard chart may need some rethinking. I get that those resolutions are where these cards shine and start leaving weaker cards behind, BUT it's a very small percentage of the readership that has 30" monitors. I'm at 24" with 1920x1200, and even that is probably not common. It would seem to make the most sense to target the resolutions people are actually likely to be buying for, probably 1680x1050 or 1920x1200. Cheaper video cards do much better in comparison at smaller resolutions, which are the resolutions most of us are really using. I get that the chart below shows the different resolutions, but that is where 2560 belongs; it shouldn't be the de facto standard. It reminds me of using 3DMark and how the numbers don't reflect reality: of course these cards look good at 2560, but that isn't what we have.
    ~My 2 cents.

  • SiliconDoc - Monday, January 19, 2009 - link

    Yes, the power-savings freakism to save the whales and the polar bears is out of control. After all that, the same people scream that you need a 750 watt or 1000 watt PSU or you're in trouble, and that you should park the 400-500 watt unit you've been using (or the 300 watt one, in many cases, no doubt). Talk about blubbering hogwash... and ATI loses the power consumption war too, and even that gets covered up; gosh, it couldn't be with that "new tech, smaller core"...
    Now of course when NVIDIA has a top-end card for high resolutions, it's all CRAP, but we NEVER heard that whine about the 4870, even though it only excelled at the higher and highest resolutions with AA and AF. That's how they got into the RUT of 2560x: it showed off the 4870 in all its flavors for six months, and they can't help themselves doing it, because otherwise the 4870 SUCKS, and SUCKS WIND BADLY, and gets spanked quite often.
    The high-res showcases and reviews are there for the red team's benefit, and now, suddenly, after six months of endless red raving about 2560x, all the eggheads realize they don't have that resolution sitting in front of them, because guess who BLABBERED like mad about it.
    :-) YEP
    I was the only one to point it out on the 4870 X2 and the like, and boy was I crushed... gosh, what a fanboy...
    But now it's all the rage, so long as it's directed at NVIDIA.
    AND the lower-resolution benchies make the top ATI cards look CRAPPY in comparison.
    Oh well, I'm sure the reds know it; maybe they love the possibility of higher fps if they ever pop for a 30".
    _______________________________________________________

    All I can say is thank you, NVIDIA, for finally straightening out the DERANGED MINDS of the 4870 loverboys.

    Thank you, NVIDIA; you may have cured thousands this time.
  • MadBoris - Monday, January 19, 2009 - link

    All the green and red aside...I didn't mean to bash AT or Derek.
    I'm just wondering what happened to the reviews of years back, when we had more than just one chart to gawk at and make determinations about a product. With more charts we could analyze things more easily, or maybe there was a summary. The paper-launch review was just that, and maybe this limited review is kind of a message to NVIDIA not to paper launch, but... we lose.

    Even though this is just a refresh with a die shrink, I still think it's worth revisiting and restating some of what may be obvious to people who breathe GPUs, but which I don't remember from six months ago.

    Like... the whole landscape for today's purchasing decisions.
    Effects of CPU scaling, the AA and AF penalties for a card, which resolutions make sense for a high-end card with a GB of memory.
    You don't have to test it and graph it, but give us a reminder: 4x AA is nearly free with this card and not with that one, CPU scaling doesn't mean as much as it once did, this card doesn't make sense below 1680x1050, this amount of memory isn't really needed these days unless you are at 2560, etc. A reminder is good. I was surprised not to see a mention of the highest OC and what percentage of performance that gets you in an article about a die shrink; things have changed. OC isn't the black art it once was, it's often supported right in the manufacturer's drivers, so it shouldn't be passed up. I always run my hardware at a comfortable OC (several notches back from the highest stable setting); why wouldn't I, if it's rock solid?

    I like 1GB of memory, but there are only a few games that can break my 640MB GTS barrier (Oblivion with tweaks, Crysis with tweaks, maybe others I don't have). I'd like 1GB more if Windows 7 weren't still supporting 32-bit, so we could see devs start using bigger footprints. But with multiplatform development and one-size-fits-all, lowest-common-denominator console hardware as the cookie cutter, 1GB of memory means less than it probably ever has in the last couple of years, since games won't be utilizing it.
  • SiliconDoc - Tuesday, January 20, 2009 - link

    I like all your suggestions there. As for the reviewer (Derek), I can't imagine the various pressures involved in putting together even one review, so I don't expect everything every time, and I don't expect a reviewer not to have a favorite, or even to avoid pushing for it in their pieces, consciously or otherwise. The reader needs to allow for and understand that.
    I think what we do get should be honestly interpreted and discussed (isn't that really the whole point, besides the interest and entertainment value?), so I'm more about straightening out the longstanding hogwash the overclocking masses repeat.
    You made some fine points and I hope Derek sees them. I imagine they are crushed for time of late; perhaps the economic cutbacks have occurred at AnandTech as well. But a bit of what you pointed out should be easy enough, and maybe they will roll it in.
  • MadBoris - Monday, January 19, 2009 - link

    In all fairness, Derek does summarize some of what one can draw from the numbers, at least for the 2560 graphs and how the card compares to the GTX 280 in price/performance. But as my subject line says, you can't please everyone; I guess I would just like to see more, but maybe that's just me.
  • Muzzy - Saturday, January 17, 2009 - link

    The box said "HDMI", but I don't see any HDMI port on the card. What am I missing?
