Doom 3 Performance

While the Doom 3 frenzy isn't nearly as bad as it was a month ago, performance under id's latest engine is still quite important, as a number of games currently in development use the Doom 3 engine. We have two sets of benchmarks to look at: playable-resolution benchmarks, as well as a chart of performance vs. resolution to see how the cards compare at sometimes not-so-playable resolutions.

Since we're dealing with relatively entry-level cards, we found that the perfect balance between frame rate and image quality lands at 800x600, and thus, that's where our first graph comes from.

Here, we see that the GeForce 6600, despite its lower fill rate and lower memory bandwidth, is still able to outperform the regular X700 by about 8%. It's not a huge margin, but impressive considering that the card is underpowered compared to the X700. The explanation as to "why" is more of an architectural discussion, as we've seen that NVIDIA's GeForce 6 series of GPUs are much better suited for the Doom 3 engine than ATI's.

The GeForce 6200 comes in a valiant third, clearly outperforming the 4-pipe competitors from ATI and nipping away at the heels of the slightly more expensive X700. Here's the tricky part though. Depending on what price the 6200 and X700 are actually available for when they hit the streets, the recommendation could go either way. At the same price, the X700 is the clear winner here, but at a higher price, the decision becomes more of a question of budget rather than which one to pick.
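The percentage margins used in this article (and in the reader comments) are simple relative frame-rate comparisons. As a minimal sketch of that arithmetic, here is the calculation applied to the Doom 3 figures quoted in the comments section (X600 Pro at 39.3 fps vs. GeForce 6200 at 60.1 fps):

```python
def percent_diff(fps_a, fps_b):
    """Relative performance of card A vs. card B, as a percentage.

    Positive means card A is faster; negative means it trails card B.
    """
    return (fps_a / fps_b - 1.0) * 100.0

# Doom 3 figures quoted in the comments: X600 Pro 39.3 fps, GeForce 6200 60.1 fps.
print(round(percent_diff(39.3, 60.1)))  # prints -35: the X600 Pro trails by ~35%
```

The same formula reproduces the ~8% figure above when the 6600's frame rate is divided by the X700's.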

Doom 3 - Demo1

Next, we have the resolution scaling chart to see how all of these cards fare in the grander scheme of things. Here, we see that none of the cards are particularly CPU limited under Doom 3, and all of them experience a serious drop in performance as you increase the resolution. Doom 3 is clearly taxing enough for even the fastest of contenders here.



What about playability? We took some notes during our testing and will share our gaming experiences with each of the cards in a section we like to call "Notes from the Lab".

ATI X300: The card is clearly slower than the 6200. The added memory bandwidth gives it a performance advantage over the 64-bit SE, but it's nowhere near the same league as the 6200. ATI desperately needs an X800-derived part for their low end, much like they have in the mid-range with the X700.

ATI X300SE: The game plays "OK" at 640x480, though it's definitely sluggish in certain areas. Aliasing is particularly bad at 640x480, so the resolution only really works if you have a small monitor, or if the person playing isn't much of a gamer and has never been introduced to the fact that you can get rid of aliasing. At 800x600, things get too slow for comfort, and anything beyond that is basically unplayable.

ATI X600 Pro: You can't notice any visual quality differences between ATI and NVIDIA in Doom 3, not to mention that the game is frankly too dark to notice any differences in texture filtering to begin with. 640x480 and 800x600 play quite well on the X600 Pro, despite the fact that the frame rate is clearly lower than on the two NVIDIA cards. Unfortunately, anything above 800x600 is a bit too slow on the X600 Pro. It's "playable", but honestly, just frustratingly slow compared to the other cards.

ATI X700: The X700 performs noticeably better than the X600 Pro and comes close to the 6600, but the 6600 is clearly faster in actual gameplay.

NVIDIA GeForce 6200: 800x600 seems to be the sweet spot between image quality and performance for the 6200. The game played very smoothly with no noticeable image quality issues. 1024x768 looked better, but started to get a little slow for our tastes. 1280x1024 was far too slow, although it looked great. If you want to go up to 1280, you're going to want at least a 6600.

NVIDIA GeForce 6600: At 800x600, the 6600 completely blows away the 6200; it makes the 6200 feel like a slow card. 1024x768 is still sluggish in places, but overall, much better than on the 6200. 1280x1024 is fine when just walking around, but once you get enemies on the screen and they start attacking you, things slow down. It may take the 6600GT to be truly smooth at this resolution. That being said, it continues to amaze us how good lower resolutions look in Doom 3.

Intel Integrated Graphics: Surprisingly enough, Intel's integrated graphics will actually run Doom 3, but it is basically unplayable at medium quality at 640x480 - not to mention that we couldn't get it to complete a single benchmark run (the driver kept crashing).

Comments

  • PrinceGaz - Tuesday, October 12, 2004 - link

    I'm assuming the 6200 you tested was a 128-bit version? You don't seem to mention it at all in the review, but I doubt nVidia would send you a 64-bit model unless they wanted to do badly in the benchmarks :)

    I don't think the X700 has appeared on an AT review before, only the X700 XT. Did you underclock your XT, or have you got hold of a standard X700? I trust those X700 results aren't from the X700 XT at full speed! :)

    As #11 and #12 mentioned, with the exception of Doom 3, the X600 Pro is faster than the 6200:

    Doom 3 - 39.3 60.1 (-35%)
    HL2 Stress Test - 91 76 (+20%)
    SW Battlefront - 45 33 (+36%)
    Sims 2 - 33.9 32.2 (+5%)
    UT2004 (1024x768) - 46.3 37 (+25%) [they were CPU limited at lower resolutions]
    BF Vietnam - 81 77 (+5%)
    Halo - 45.2 44 (+3%)
    Far Cry - 74.7 60.6 (+23%)

    So the X600 Pro is slower than the 6200 (128-bit) in Doom 3 by a significant amount, but it's marginally faster than it in three games, and it's significantly faster than the 6200 in the other three games and also the HL2 Stress Test. So that makes the X600 Pro the better card.

    The X700 absolutely thrashed even the 6600, let alone the 6200, in every game except of course Doom 3 where the 6600 was faster, and Halo where the X700 was a bit faster than the 6600 but not by such a large amount.

    Given the prices of the ATI cards, X300SE ($75), X300 ($100), X600 Pro ($130), X700 (MSRP $149); the 6600 is going to have to be priced under its MSRP of $149 because of the far superior X700 at the same price point. Let's say a maximum of $130 for the 6600.

    If that's the case, I can't see how the 6200 could have a street price of $149 (128-bit) and $129 (64-bit). How can the 6200 (128-bit) even have the same price as the faster 6600 anyway? It's also outperformed by the $130 X600 Pro, which makes a $149 price ridiculous. I think the 6200 will have to be priced more like the X300 and X300SE-- $100 and $75 for the 128-bit and 64-bit versions respectively, if they are to be successful.

    Maybe most 6200s will end up being cheap 64-bit cards that are sold to people who aren't really bothered about gaming, or who mistakenly believe the amount of memory is the most important factor. You just have to look at how many 64-bit FX5200s are sold.
  • Shinei - Tuesday, October 12, 2004 - link

    The PT Barnum theory, wilburpan. There's a sucker born every minute, and if they're willing to drop $60 for a 64-bit version of a card when they could have had a 128-bit version, so much the better for profits. The FX5200 continues to be one of the best selling AGP cards on the market, despite the fact that it's worse than a Ti4200 at playing games, let alone DX9 games.
  • wilburpan - Tuesday, October 12, 2004 - link

    "The first thing to notice here is that the 6200 supports either a 64-bit or 128-bit memory bus, and as far as NVIDIA is concerned, they are not going to be distinguishing cards equipped with either a 64-bit or 128-bit memory configuration."

    This really bothers me a lot. If I knew there were two versions of this card, I definitely would want to know which version I was buying.

    What would be the rationale for such a policy?
  • nserra - Tuesday, October 12, 2004 - link

    Why do you all keep talking about the GeForce 6600 cards (buying them) when the X700 was the clear winner?
    You all want to buy the worse card (less performing)? I don't understand.

    Why doesn't AnandTech use 3DMark05?

    No doubt that my 9700 was a magnificent buy almost 2 years ago. What a cheat the GeForce FX line of cards was....
    Why didn't they use one (a 5600/5700) just to see...

    Even 4-pipeline ATI cards can keep up with 8-pipe NVIDIA ones, gee what a mess... old tech, yeah right.
  • coldpower27 - Tuesday, October 12, 2004 - link

    I am very happy you included Sims 2 in your benchmark suite :)

    I think this game likes the number of vertex processors on the X700, plus its advantage in fill rate and memory bandwidth. Could you please test Sims 2 on the high-end cards from both vendors when you can? :P
  • jediknight - Tuesday, October 12, 2004 - link

    What I'm wondering is.. how do previous generation top-of-the-line cards stack up to current gen mainstream cards?
  • AnonymouseUser - Tuesday, October 12, 2004 - link

    Saist, you are an idiot.

    "OpenGl was never really big on ATi's list of supported API's... However, adding in Doom3, and the requirement of OGL on non-Windows-based systems, and OGL is at least as important to ATi now as DirectX."

    Quake 3, RtCW, HL, CS, CoD, SW:KotOR, Serious Sam (1&2), Painkiller, etc, etc, etc, etc, are OpenGL games. Why would they ONLY NOW want to optimize for OpenGL?
  • Avalon - Monday, October 11, 2004 - link

    Nice review on the budget sector. It's good to see a review from you again, Anand :)
  • Bonesdad - Monday, October 11, 2004 - link

    Affordable gaming??? Not until the 6600GT AGPs come out... affordable is not replacing your mobo, CPU, and video card...
