Doom 3 Performance

One of the most demanding games that we test in terms of graphics, Doom 3, shows some impressive gains. Let's take a look.

[Doom 3 benchmark charts: 1600x1200 and 2048x1536, with and without AA]

We'll start by comparing the 6800 Ultra and the 7800 GT. The most notable increase here is at 2048x1536 with AA enabled, where the 7800 GT delivers a 43% improvement in framerate. We see a slightly larger increase (48.4%) at that resolution without AA, but the AA gain matters more in practice: framerates go from an unplayable 19.3 fps to a borderline-playable 27.6 fps. At 1600x1200, both the AA and no-AA tests see only about a 14% increase.

As expected, we see even higher gains when we compare one 6800 Ultra to two in SLI mode. Without AA, the framerates at both resolutions increase by around 30 fps: a 34% gain at 16x12 and a 77.6% gain at 20x15. The gains are even more impressive with AA enabled. At 16x12 with AA, the framerate goes from 41.6 to 75.4, an increase of 81.3%; at 20x15 with AA, it goes from 19.3 to 38.8, an impressive 101% increase.
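
For readers who want to double-check these figures, the percentages are simply the relative change between the two average framerates in each comparison. Below is a minimal sketch of that arithmetic in Python, using the framerates quoted above (the helper name is ours, not part of any benchmarking tool):

    # Relative framerate gain between two benchmark results, in percent
    def percent_gain(old_fps, new_fps):
        return (new_fps - old_fps) / old_fps * 100

    # Doom 3 figures quoted above, AA enabled
    print(percent_gain(41.6, 75.4))  # 16x12: one 6800 Ultra -> 6800 Ultra SLI, ~81%
    print(percent_gain(19.3, 38.8))  # 20x15: one 6800 Ultra -> 6800 Ultra SLI, ~101%
    print(percent_gain(19.3, 27.6))  # 20x15: 6800 Ultra -> 7800 GT, ~43%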

The gains that we see with the 7800 GT will definitely make a difference in this game, but unfortunately, the GT still struggles at 20x15 with AA enabled. Two 6800 Ultras in SLI mode don't have this problem; in fact, they handle 20x15 with AA fairly well. This might not matter, however, to those who don't care about AA at high resolutions.

It's interesting to note that Doom 3 appears more dependent on GPU memory bandwidth than GPU processing speed, at least in certain scenarios. Notice how the 6800 Ultra SLI configuration actually beats the 7800 GT SLI configuration in several of the tests. The 6800 cards do seem to have more problems with the 20x15 resolution, however.
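
As a rough sanity check on that bandwidth argument, peak memory bandwidth is just the bus width multiplied by the effective memory clock. The sketch below assumes the NVIDIA reference memory clocks (1100MHz effective GDDR3 on the 6800 Ultra, 1000MHz on the 7800 GT, both on a 256-bit bus); retail boards may ship at different speeds:

    # Peak theoretical memory bandwidth in GB/s:
    # (bus width in bytes) x (effective memory clock in MHz) / 1000
    def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
        return (bus_width_bits / 8) * effective_clock_mhz / 1000

    # Assumed reference memory clocks -- check actual board specs
    print(peak_bandwidth_gb_s(256, 1100))  # GeForce 6800 Ultra: 35.2 GB/s
    print(peak_bandwidth_gb_s(256, 1000))  # GeForce 7800 GT: 32.0 GB/s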

Comments

  • Hacp - Thursday, August 11, 2005

    THANK YOU FOR MAKING ALL THE POSTS VISIBLE! IT WAS ANNOYING CLICKING ON THE DOWN BUTTON!

    sry for the caps :).
  • crimson117 - Thursday, August 11, 2005

    Page 4, first two sentences:

    "One of the most demanding games we test in terms of graphics, Doom 3 shows some impressive gains here. First lets compare the 6800 Ultra and the 7800 GT."

    should be

    "One of the most demanding games we test in terms of graphics, Doom 3, shows some impressive gains here. First let's compare the 6800 Ultra and the 7800 GT."

    note: the comma after Doom 3 makes "One" the subject of the sentence. Alternatively, it could be:

    "Doom 3 is one of the most demanding games we test in terms of graphics, and it shows some impressive gains here."
  • pio!pio! - Thursday, August 11, 2005

    Can the sample card you obtained be overclocked to anywhere near GTX speed? (core and memory)

    Also I know it's early, but any thoughts on softmodding it to enable more pipelines?
  • JNo - Thursday, August 11, 2005

    Good question. xbitlabs.com state in their review that, "The GeForce 7800 GT reference graphics card we had in our lab proved quite overclockable. We managed to increase its working frequencies from 400MHz for the chip and 1000MHz for the memory to 470MHz for the chip and 1200MHz for the memory, which should ensure a significant performance increase during our tests." Unfortunately they did not include overclocked results in their graphs.

    Also, xbitlabs noted that the PCB is shorter than for the GTX, that the cooler is shorter and also commented on noise levels, given that unlike the GTX, the GT is apparently unable to regulate its fan speed. It is a shame that anandtech missed out on such details.
  • mmp121 - Thursday, August 11, 2005

    I'm thinking this is a typo when you re-made the charts for BF2 to include the NV 6800 Ultra. The ATI Radeon X800 XT is really the ATI Radeon X850 XT PE right? Maybe I am wrong. Just wanted to point out the fluke so you guys can fix it. Good read so far!
  • DerekWilson - Friday, August 12, 2005

    The BF2 numbers use the X800, not the X850.
  • Phantronius - Thursday, August 11, 2005

    Waaaah!!! Why no 6800GT comparison and why the hell didn't you guys benchmark Far Cry with HDR Mode???????
  • AtaStrumf - Thursday, August 11, 2005

    I really resent this comment on page 1:

    "Until performance is increased beyond the 7800 GTX, it will be hard for us to see a reason for a new desktop 7 series part."

    So you rich boys got your flashy new toys and now you see no need for a new desktop 7 part, because you're too busy thinking about how you're gonna be playing games on the go (i.e. your new laptop GPU).

    What about those of us stuck at the 6600 GT level? Don't we deserve an upgrade??? Like a 12-16 pipe, 256-bit 7600 GT? I guess that we are just fine, since we're stuck with 1280x1024 res monitors anyway, right? WRONG! We've been cheated out of a significant upgrade long enough. Until the 6600 GT, there was only BS in the mainstream for about 2.5 years (R9500/9600/PRO/XT/GF5600/Ultra,...) and we're not going back there, no sir!

    Same sh!t when people with broadband post large uncompressed images on the web and forget about all those on dial-up, even though they themselves left that sorry bunch not too long ago. The world is a wee bit bigger than your own back yard, and someone writing for a site as big as AT should really know that.
  • DerekWilson - Friday, August 12, 2005

    I just want to add my two cents as well ...

    I will agree with you that the 6600 GT was the first real solid mainstream option in a while. It's a good card.

    I'll argue that the next mainstream card you'll want to look at is the 6800 GT. There are 128MB parts and 256MB parts all with 256-bit busses and 16 pixel pipes at good clock speeds.

    As others have said, a G70 part released with the same specs as the 6800 GT would deliver the same performance as well.

    We tried to explain in the article that the 6 Series comprises the rest of the lineup going forward. There are no performance gaps that the G70 needs to fill in. The only reason NVIDIA would want to release slower G70 parts would be to phase out the 6 Series part at that same speed grade.

    It also doesn't make sense for NVIDIA to immediately release a slower G70 part. The lower transistor count and more mature process used on NV4x chips will likely make it easier for NVIDIA to sell the parts at a lower price and higher profit than equivalently performing G70 parts. The economics of this are very complicated and depend quite a bit on NVIDIA and TSMC and the cost per IC for NV4x and G70 chips.

    It would almost make sense for NVIDIA to take less functional G70 chips and sell them as 6 Series parts. But maybe I'm wrong. Perhaps NVIDIA will think it can trick people into thinking a part with the same performance as a 6800 or 6800 GT and a 7 Series name is worth more money.

    It really shouldn't matter to anyone whether 6 Series parts keep falling in price or other 7 Series parts come out. I stand behind my statement. We'll see lower performing G70 parts come out if and when it becomes more financially viable for NVIDIA to sell those parts than NV4x parts. There really isn't any other factor that matters.

    Transparency AA may be an interesting thing, but moving towards the budget end of the spectrum will tend to make the performance impact of Transparency AA too high to matter. Other little tweaks and features have already been covered and aren't that compelling over the 6 Series, and seeing a G70 perform the same as an NV4x for the same price really wouldn't be that exciting no matter what my budget looks like.

    Derek Wilson
  • coldpower27 - Thursday, August 11, 2005

    It would be interesting to see if the 90nm-based 7600 part has a 256-bit memory interface, as it seems that 256-bit cards usually have a minimum die size of around 200mm², just out of range of a high-volume mainstream-level card.

    I would certainly want a 7600 GT if it had a pipeline configuration similar to the 6800 or 6800 GT, because G7x does have some improvements I like, most notably the improved HDR performance and Transparency AA. Also, what about WGF 1.0 support? For the most part, it's usually better to have a new mainstream card based on newer tech than to just shift older technology into lower price points, as those cards aren't meant for those price points and are usually more expensive to produce.

    Not all of us can even afford the current $399 US price point of the 7800 GT.
