Doom 3 Performance

Under Doom 3, we see quite a performance hit. Because id Software makes heavy use of the stencil buffer for its shadowing, there is more pressure on bandwidth per pixel produced than in other shader-heavy games. In Half-Life 2, for instance, we see less of a performance impact when moving from the old 128-bit 6200 to the new version.
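To illustrate why stencil shadowing leans on fill rate and memory bandwidth rather than shader throughput, here is a minimal sketch of a generic depth-fail stencil shadow pass in OpenGL. This is not Doom 3's actual renderer; the helper functions (draw_scene_depth, draw_shadow_volumes, draw_scene_lit) are hypothetical placeholders, and GL_INCR_WRAP/GL_DECR_WRAP assume an OpenGL 1.4+ context.

```c
/* Minimal sketch of depth-fail stencil shadow volumes (not Doom 3's code). */
#include <GL/gl.h>

void draw_scene_depth(void);     /* assumed helper: geometry only, no shading */
void draw_shadow_volumes(void);  /* assumed helper: extruded volume hulls     */
void draw_scene_lit(void);       /* assumed helper: full lighting pass        */

void render_with_stencil_shadows(void)
{
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);

    /* Pass 1: fill the depth buffer only (one depth write per covered pixel). */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_TRUE);
    draw_scene_depth();

    /* Pass 2: rasterize the shadow volumes into the stencil buffer.
     * Every volume face reads depth and reads/writes stencil for every pixel
     * it covers, so the cost scales with fill rate and memory bandwidth,
     * not with shader power. */
    glEnable(GL_STENCIL_TEST);
    glDepthMask(GL_FALSE);
    glStencilFunc(GL_ALWAYS, 0, ~0u);

    glCullFace(GL_FRONT);                         /* back faces only   */
    glStencilOp(GL_KEEP, GL_INCR_WRAP, GL_KEEP);  /* ++ on depth fail  */
    draw_shadow_volumes();

    glCullFace(GL_BACK);                          /* front faces only  */
    glStencilOp(GL_KEEP, GL_DECR_WRAP, GL_KEEP);  /* -- on depth fail  */
    draw_shadow_volumes();

    /* Pass 3: shade only pixels whose stencil count is zero (unshadowed). */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_EQUAL, 0, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    draw_scene_lit();

    glDisable(GL_STENCIL_TEST);
    glDepthMask(GL_TRUE);
}
```

Because the stencil passes repeat for every shadow-casting light, a TurboCache card that has to fetch part of its framebuffer over the PCI Express bus feels this workload more than a shader-limited one.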

[Graph: Doom 3 Performance]
[Graph: Doom 3 Resolution Scaling]

Our resolution scaling graph shows that the performance gap between the X300 and 6200 parts closes slightly as resolution increases. Running High Quality looks great even at 640x480, but we would suggest Medium Quality at 800x600 for playing the game.

43 Comments

  • Cybercat - Wednesday, December 15, 2004 - link

    Basically, this is saying that this generation's $90 part is no better than last generation's $90 part. That's sad. I was hoping the performance leap of this generation would be felt through all segments of the market.
  • mczak - Wednesday, December 15, 2004 - link

    #12, IGP would indeed be interesting. In fact, TurboCache seems quite similar to ATI's Hypermemory/Sideport in their IGP.
  • Cygni - Wednesday, December 15, 2004 - link

    In other news, Nforce4 (2 months ago) and Xpress 200 (1 month ago) STILL aren't on the market. Good lord. Talk about paper launches from ATI and Nvidia...
  • ViRGE - Wednesday, December 15, 2004 - link

    Ok, I have to admit I'm a bit confused here. Which cards exactly did you test: the 6200/16MB (32-bit) and the 6200/32MB (64-bit), or what? And what about the 6200/64MB: will it be a 64-bit card, or a full 128-bit card?
  • Cybercat - Wednesday, December 15, 2004 - link

    What does 2 ROP stand for? :P *blush*
  • PrinceGaz - Wednesday, December 15, 2004 - link

    #15- I've got a Ti4200 but I'd never call it nVidia's best card. It is still the best card you can get in the bargain-bin price range it is now sold at (other cards at a similar price are the FX5200 and Radeon 9200), though supplies of new Ti4200s are very limited these days.

    #12- Thanks Derek for answering my question about higher resolutions. Since only the front buffer needs to be in onboard memory (it's absolutely critical that the memory accessed to send the signal to the display is always available without any unpredictable delay), even the 16MB 6200 can run at any resolution, even 2560x1600 in theory, though performance would probably be terrible as everything else would need to be in system memory. (A rough sanity check of the numbers follows after the comments.)
  • housecat - Wednesday, December 15, 2004 - link

    Another Nvidia innovation done right.
  • MAValpha - Wednesday, December 15, 2004 - link

    I would expect the 6200 to blow the Ti4200 out of the water, because the FX5700/Ultra is considered comparable to the GF4Ti. By comparison, many places are pitting the 6200 against the higher-end FX5900, and it holds its own.
    Even with the slower TurboCache, it should still be on par with a 4600, if not a little bit faster. Notice how the more powerful version beats the X300, a card derived from the 9600 series, across the board?
  • DigitalDivine - Wednesday, December 15, 2004 - link

    how about raw performance numbers pitting the 6200 against nvidia's best graphics card imo, the ti4200.
  • plk21 - Wednesday, December 15, 2004 - link

    I like seeing such an inexpensive part playing newer games, but I'd hardly call it real-world to pair a $75 video card with an Athlon64 4000+, which Newegg lists at $719 right now.

    It'd be interesting to see how these cards fare in a more realistic system, e.g. paired with a Sempron 2800+
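PrinceGaz's point above about the front buffer is easy to sanity-check with some quick arithmetic: the buffer scanned out to the display at 2560x1600 with 32-bit color is only about 15.6MB, which still fits in 16MB of local memory. The figures below are back-of-the-envelope numbers for illustration, not measurements from the card.

```c
/* Back-of-the-envelope front-buffer sizes for a few display modes,
 * assuming 32-bit color and that only the scan-out buffer must be local. */
#include <stdio.h>

int main(void)
{
    const struct { int w, h; } modes[] = {
        { 1024,  768 },
        { 1600, 1200 },
        { 2560, 1600 },
    };
    const int bytes_per_pixel = 4;  /* 32-bit color */

    for (size_t i = 0; i < sizeof(modes) / sizeof(modes[0]); i++) {
        double mb = (double)modes[i].w * modes[i].h * bytes_per_pixel
                    / (1024.0 * 1024.0);
        printf("%4dx%-4d front buffer: %5.1f MB\n",
               modes[i].w, modes[i].h, mb);
    }
    return 0;  /* 2560x1600 works out to ~15.6MB, just under 16MB */
}
```

Everything else (back buffer, depth/stencil, textures) can spill into system memory over PCI Express, which is exactly where the performance cost shows up.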
