Doom 3 Graphics Deathmatch

by Derek Wilson on August 3, 2004 8:05 AM EST

High End Tests: Tourney

In this section, we take a look at Ultra Quality performance, as well as various high resolutions and antialiasing settings. These high-end cards are the only ones that can handle 1600x1200 or antialiasing. Though jaggies aren't a big problem (the artists designed the game very well, with many low-contrast edges), eliminating these minor annoyances is a luxury afforded to those with the latest-generation powerhouses.

To clarify an issue we've noticed across the board: Ultra Quality runs perfectly fine on current-generation high-end hardware. Yes, the game recommends more than 500MB of fast graphics RAM for it, as do we if one's intention is to play through the single-player game. But it is very important to note that all the multiplayer maps we've played over the past few hours have been small enough to avoid the massive swapping that occurs when moving between parts of the world on the single-player maps.

This really means two things to us. First, when 512MB cards come around, we won't need any heavier artillery on the GPU side to tackle rendering the game. Second, deathmatch players can easily benefit from the Ultra Quality setting immediately.
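For readers who want to force the setting by hand, the quality presets map to console variables stored in DoomConfig.cfg. Here is a minimal sketch of what Ultra Quality means in cvar terms; the names are from the retail game, but treat the exact values as our reading of the preset, which chiefly disables texture compression and downsizing:

```
seta com_machineSpec "3"            // 3 = Ultra Quality preset
seta image_useCompression "0"       // uncompressed diffuse and specular maps
seta image_useNormalCompression "0" // uncompressed normal maps
seta image_downSize "0"             // never downsize textures
```

Entering `execMachineSpec` at the console after changing `com_machineSpec` re-applies the preset; restarting the game works as well.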

Not all tests went without problems, and this time we experienced some issues with our 6800 Ultra Extreme part. We noticed visual artifacts as we were running our AA tests. These kept getting worse as time went on, and not even letting the card cool down would fix the problem. Eventually, our system rebooted mid-test and wouldn't get through another benchmark run. John Carmack has spoken of possible issues when overclocking a graphics card with Doom 3, and this may or may not have an impact on factory-overclocked parts. We will absolutely keep our ears open and our test beds working to determine whether this is just an isolated, random GPU death, or whether there is some other evil at work.

Another interesting observation we've made is that if your 6800 Ultra Extreme can handle the game, Ultra Quality at 1600x1200 with 4xAA is a playable reality. But enough talk; feast on the numbers.


71 Comments


  • redscull - Friday, September 3, 2004 - link

    Thought I'd share my own benchmarks, especially so people can see the impact the CPU makes. I ran once to let the demo cache, then took the average of two following times.

    Shuttle SN85G4V2, Athlon 64 3000+, 1GB PC3200, XFX 6800 GT 256MB, Raptor drive

    1600x1200, no AA, High Quality: 58.2
    1024x768, 4xAA, High Quality: 63.0

    I'm 0.7 and 2.2 fps slower in those tests, and the only real difference between my system and the review system is about a $550 CPU upgrade =)
  • uethello - Friday, August 6, 2004 - link

    I appreciate all the hard work that went into this review, but I wish you had used a more realistic/mainstream CPU, e.g. an AMD 2500+ or a P4. Just my $.02. All in all, though, a fantastic article.
  • Phiro - Friday, August 6, 2004 - link

    Question: how much memory does each card have? Does it make a big diff with Doom 3 if you have 128MB vs. 256MB? I'm asking because there is a huge price difference between a 6800 128MB and a 6800 256MB.
  • Locutus4657 - Thursday, August 5, 2004 - link

    I just thought I would post that I am running an AMD64 3000+ with 1GB RAM on a Chaintech motherboard with an ATI Radeon 9600XT. Running 1024x768 at medium quality is no problem!
  • DerekWilson - Thursday, August 5, 2004 - link

    KrazyDawg ...

    You're right ... copied down wrong again.

    Last time I copy data while I'm trying not to fall asleep, I promise.
  • KrazyDawg - Thursday, August 5, 2004 - link

    Can the results for the High Quality Med test be fixed? It shows the Radeon 9800 Pro with a higher frame rate than the 9800 XT. Also, the 9800 XT has the same frame rate as a 9700 Pro.
  • KrazyDawg - Thursday, August 5, 2004 - link

    Will there be a noticeable difference in frame rate between a 9800 Pro 256MB and a 9800 Pro 128MB?
  • Jeff7181 - Wednesday, August 4, 2004 - link

    GREAT article guys. After playing it for a couple days, I agree 100% with what was said. Awesome game, brilliant development by Carmack and his team. There's not a game I'd rather be playing right now... some people aren't impressed by it... because they filled their heads with hype about how it would be revolutionary and a breakthrough in gaming. While it's not quite THAT amazing, it is, in my opinion, easily the best looking game you can buy today... including Far Cry.
  • mattmm - Wednesday, August 4, 2004 - link

    Maybe I'm missing something, or I'm just uneducated about PCI Express, but shouldn't PCI-E be involved in testing? If that's the way the graphics slot standard is headed, why haven't they produced high-performance cards like the 6800 for that platform? And what is everyone's feeling about being left in the dust with your $500 AGP card in a couple of months "IF" PCI-E debuts with something better? Like a majority of people, I'm faced with having to buy a whole new system for this game, but I don't want to just put something together to play this game NOW, I want it for games LATER as well. I just don't want to make the mistake of sinking the dough into a technology when, in a few months, something far greater is bound to arrive. (I know, it's the love-hate relationship with advancing technology.)
  • PrinceGaz - Wednesday, August 4, 2004 - link

    I made a slight mistake earlier when I said the console command to show the framerate while playing is "con_showfps 1"; it is actually "com_showfps 1". Sorry.

    #57 - my CPU is actually a slightly overclocked XP 1700+ which I've run at 1800+ speeds ever since I built this box. I did try overclocking my Ti4200 to Ti4400 speeds (275/550), which is well within its maximum possible overclock without any visible corruption (290/580).

    As you'd expect, a faster graphics card did next to nothing for my framerate at 640x480 as that was pretty much CPU limited. The 10% gfx overclock only raised the framerate of the 640x480 low quality mode by 1%, from 31.4 fps to 31.7fps. I doubt even a 6800 Ultra could manage more than 33fps with my CPU, mobo, and memory. So an XP 1800+ on a KT266A mobo has a roughly 33fps ceiling regardless of graphics card or mode.

    At 1024x768 medium quality, the 21.2fps at 250/500 was raised by a healthy 8% to 22.9fps with the 275/550 overclock of 10%, so a faster graphics card in my system would definitely push that a lot higher, probably close to 33fps. Increasing core speed alone had a greater impact than increasing memory speed alone: at 275/500 I measured 22.2fps, while at 250/550 I got 21.8fps. If you've got any GeForce4 Ti series card (even an overclocked Ti4600), regardless of your CPU I'd recommend running at 800x600 in Medium quality mode, or possibly High quality mode with aniso disabled, though you're unlikely to see much difference and there's always the risk of texture swapping at some point.
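PrinceGaz's reasoning above (a fixed CPU cost per frame plus a GPU cost that shrinks as the card gets faster) can be sketched as a simple two-term frame-time model. This is our own back-of-the-envelope model, not anything from id Software, and real scaling is messier:

```python
def fps_after_gpu_overclock(fps_before, cpu_bound_fps, oc_factor):
    """Estimate fps after a GPU overclock.

    Crude model: per-frame time is a fixed CPU portion (taken from a
    CPU-limited low-resolution run) plus a GPU portion assumed to scale
    inversely with the overclock factor.
    """
    t_total = 1.0 / fps_before          # seconds per frame before the overclock
    t_cpu = 1.0 / cpu_bound_fps         # CPU cost, from the 640x480 ceiling
    t_gpu = max(t_total - t_cpu, 0.0)   # remainder attributed to the GPU
    return 1.0 / (t_cpu + t_gpu / oc_factor)

# Plugging in the numbers from the comment above: 21.2 fps at 1024x768
# medium, a ~31.4 fps CPU-bound ceiling, and a 10% overclock.
print(round(fps_after_gpu_overclock(21.2, 31.4, 1.10), 1))  # prints 21.8
```

The model predicts a slightly smaller gain than the 22.9 fps actually measured, which suggests the 640x480 run still carried some GPU cost; it is mainly meant to illustrate why fps converges on the CPU ceiling no matter how fast the graphics card gets.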
