Day of Defeat: Source Performance

Rather than test Half-Life 2 once again, we decided to use a game that pushes the engine further. The reworked Day of Defeat: Source takes four levels from the original game and adds not only the same brush-up given to Counter-Strike: Source, but HDR capabilities as well. The new Source engine supports games with and without HDR; whether a game can use HDR depends on whether the art assets needed to do the job are available.

In order to push this test to the max, we ran at the highest settings possible, including the "reflect all" water option and full HDR. Our first test was run with no antialiasing and trilinear filtering; our second set of numbers was generated with 4x antialiasing and 8x anisotropic filtering enabled.

From the looks of our graph, we hit a fairly hard CPU barrier near 80 frames per second. The 7800 GTX stays closest to this limit the longest, but even it falls off when pushing two to three megapixel resolutions. The 7800 GT manages to outperform the X1800 XT, but more interestingly, the X850 XT and X1800 XL put up nearly identical numbers in this test. The X1300 Pro and the X1600 XT parallel each other as well, with the X1600 XT giving almost the same numbers as the X1300 Pro one resolution higher. The 6600 GT (and thus the 6800 GT as well) outperforms the X1600 XT by a healthy margin.
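
For reference, the "megapixel" figures above are just raw pixel counts. A quick sketch of the arithmetic (our own back-of-the-envelope math, not benchmark data):

    # Raw pixel counts for the highest resolutions tested
    # (our own arithmetic, not measured results).
    resolutions = [(1600, 1200), (1920, 1200), (2048, 1536)]

    for width, height in resolutions:
        megapixels = width * height / 1_000_000
        print(f"{width}x{height}: {megapixels:.2f} MP")

    # 1600x1200: 1.92 MP
    # 1920x1200: 2.30 MP
    # 2048x1536: 3.15 MP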



Antialiasing changes the performance trends a bit, but at lower resolutions, we still see our higher-end parts reaching toward that 80 fps CPU limit. The biggest difference here is that the X1000 series parts are able to take the lead in some cases as resolution increases with AA enabled. The X1800 XT takes the lead from the 7800 GTX at 1920x1200 and 2048x1536. The 7800 GT manages to hold its lead over the X1800 XL, but neither is playable at the highest resolution with 4xAA/8xAF enabled. The 6600 GT and X1600 XT stop being playable at anything over 1024x768 with AA enabled; on those parts, gamers are better off simply increasing the resolution to 1280x960 with no AA.



The 7800 series parts come close to matching the scaling of the X1800 series at 1600x1200 and below when enabling AA. The X850 XT scales worse than the rest of the high end, but this time, the X1600 XT is able to show that it handles the move to AA better than either of the X8xx parts tested, and better than the 6600 GT as well. The 6800 GT takes the hit from antialiasing better than the X1600 XT, though.
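
When we say a part "takes the hit" from antialiasing well, we mean it retains a larger fraction of its no-AA frame rate with 4xAA/8xAF enabled at the same resolution. A minimal sketch of that calculation (the frame rates here are placeholders, not measured results):

    def aa_hit_percent(fps_no_aa: float, fps_aa: float) -> float:
        """Percentage of performance lost when AA/AF is enabled."""
        return (1.0 - fps_aa / fps_no_aa) * 100.0

    # Hypothetical card dropping from 80 fps to 60 fps with AA enabled:
    print(f"{aa_hit_percent(80.0, 60.0):.1f}% slower")  # prints "25.0% slower"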



Doom 3 showed NVIDIA hardware leading at every step of the way in our previous tests. Does anything change when looking at numbers with and without AA enabled?

Comments (93)

  • nserra - Friday, October 7, 2005 - link

    I agree.

    When doing an article, the site should say whether it is a preview, review, or overview.
  • Questar - Friday, October 7, 2005 - link

    "High quality anisotropic filtering is definitely something that we have begged of NVIDIA and ATI for a long time and we are glad to see it, but the benefits just aren't that visible in first-person shooters and the like."

    So you like all the texture shimmering on a 7800?!?
  • DerekWilson - Friday, October 7, 2005 - link

    We will absolutely be looking at the shimmering issue in more depth.

    But texture shimmering and the impact of ATI's new High Quality AF option aren't the same problem. Certainly, angle-independent AF will help games where both ATI and NV have shimmering issues, but those instances occur less often, mostly in things like space and flight games.

    I don't like shimmering, and I do like the option for High Quality AF. But I simply wanted to say that the option for High Quality AF is not worth the price difference.
  • PrinceGaz - Friday, October 7, 2005 - link

    We're not talking about ATI's new angle-independent HQ AF option. It's nVidia's over-aggressive trilinear-filtering optimisations that all 7800 series cards are doing, almost to the point of it being bilinear filtering. They did that a couple of years ago and are doing it again now, but only on the 7800 series cards (the 6800 and under get normal filtering).

    If you want an example of this, just look at the transitions between mipmaps on the 7800 in the first review of the new ATI cards. I'm not talking about spikes on certain angles, but how the 7800 almost immediately jumps from one mipmap to the next, whereas ATI blends the transition far better. In fact, that is the main thing that struck me about those AF patterns in the review.

    Over-aggressive trilinear optimisation is a problem even on 6800 series cards after supposedly disabling it in the drivers (doing so only reduces its impact). I just wish it could be turned off entirely, as some games need full, true trilinear filtering to avoid shimmering.
  • DerekWilson - Saturday, October 8, 2005 - link

    I know what you are talking about.

    The issue is that *I* was talking about the new HQ AF option in ATI hardware in the sentence Questar quoted in the original post in this thread.

    He either thought I was talking about good AF in general or that the HQ AF has something to do with why ATI doesn't have a texture shimmering problem.

    I just wanted to clear that up.

    Also, the real problem with NVIDIA hardware is the combination of trilinear and anisotropic optimizations alongside the "rose" style angle-dependent AF. Their "brilinear" method of waiting until near the mipmap transition to blend textures is a perfectly fine solution if just using trilinear filtering (the only point of which is to blur the transition lines between mipmaps anyway).
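
    To illustrate the distinction, here is a toy model of the blend weight between two adjacent mip levels (my own sketch in Python, not actual driver code; the width of the blend band is a made-up parameter):

        def trilinear_weight(lod: float) -> float:
            """Full trilinear: blend smoothly across the entire
            span between two adjacent mip levels."""
            return lod % 1.0

        def brilinear_weight(lod: float, band: float = 0.25) -> float:
            """'Brilinear': sample a single mip level most of the time
            and blend only in a narrow band around the transition."""
            frac = lod % 1.0
            lo, hi = 0.5 - band / 2, 0.5 + band / 2
            if frac < lo:
                return 0.0              # pure lower mip level (bilinear)
            if frac > hi:
                return 1.0              # pure upper mip level (bilinear)
            return (frac - lo) / band   # quick blend near the transition

    On its own, narrowing the blend band like this is the harmless shortcut described above; the trouble starts when it is stacked on top of the anisotropic optimizations.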
  • TheInvincibleMustard - Friday, October 7, 2005 - link

    Hard|OCP did some image quality comparisons between the 7800GT and the X1800XL in their "X1000" launch article, and there was a noticeable difference between ATi's HQAF and nVidia's AF, and in an FPS, no less. Add in the fact that they pretty much said that you could enable HQAF for hardly any performance drop, and that's a pretty nice point in ATi's favor.

    I think that AnandTech should look at an IQ comparison again, if they're not seeing any difference.

    -TIM
  • nserra - Friday, October 7, 2005 - link

    I agree. New image quality tests must be done.

    Or maybe NVIDIA cards with 2x the performance of ATI, but with XGI/SiS image quality, would be OK? I don't think so.

    S3 and XGI have been plagued by their texture quality (image quality), but no one seems to care when those problems come from an NVIDIA card.

    The X8xx was supposed to offer lower image quality than the R3xx, but no one has really shown that.
  • bob661 - Friday, October 7, 2005 - link

    I've never experienced image quality issues on NVidia or ATI cards. They both look the same to me. YMMV.
  • ChrisSwede - Friday, October 7, 2005 - link

    I was wondering which card available now compares to my 9800 PRO, i.e., which card should I look for in reviews and equate to mine?

    Maybe none? :)

    Thanks
