Day of Defeat: Source Performance

Rather than test Half-Life 2 once again, we decided to use a game that pushes the engine further. The reworked Day of Defeat: Source takes four levels from the original game and adds not only the same visual brush-up given to Counter-Strike: Source, but HDR capabilities as well. The new Source engine supports games with and without HDR; whether a game can use HDR depends on whether the necessary art assets are available.

In order to push this test to its limit, we ran the highest settings possible, including the "reflect all" water and full HDR options. Our first test was run with no antialiasing and trilinear filtering; our second set of numbers was generated with 4x antialiasing and 8x anisotropic filtering enabled.
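For readers who want to approximate these settings themselves, the in-game options map to Source engine console variables. The sketch below generates a config file for each test pass; the cvar mappings (mat_hdr_level 2 for full HDR, r_waterforceexpensive and r_waterforcereflectentities for "reflect all") are our assumptions about the public Source console, not a confirmed description of the test methodology.

```python
# Sketch: generate autoexec-style config files approximating the two test
# passes. The cvar names below are assumptions based on the public Source
# engine console, not the reviewers' confirmed methodology.

SETTINGS_COMMON = {
    "mat_hdr_level": 2,               # 2 = full HDR (assumed mapping)
    "r_waterforceexpensive": 1,       # "reflect all" water (assumed mapping)
    "r_waterforcereflectentities": 1,
}

TEST_PASSES = {
    "noaa_trilinear": {"mat_antialias": 0, "mat_trilinear": 1},
    "4xaa_8xaf": {"mat_antialias": 4, "mat_forceaniso": 8},
}

def write_cfg(path: str, pass_name: str) -> None:
    """Write one test pass's settings as Source console commands."""
    cvars = {**SETTINGS_COMMON, **TEST_PASSES[pass_name]}
    with open(path, "w") as f:
        for name, value in cvars.items():
            f.write(f"{name} {value}\n")

write_cfg("autoexec_noaa.cfg", "noaa_trilinear")
write_cfg("autoexec_aa.cfg", "4xaa_8xaf")
```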

From the looks of our graph, we hit a fairly hard CPU barrier near 80 frames per second. The 7800 GTX stays closest to this limit the longest, but even it falls off once resolutions reach the two-to-three-megapixel range. The 7800 GT manages to outperform the X1800 XT, but more interestingly, the X850 XT and X1800 XL put up nearly identical numbers in this test. The X1300 Pro and the X1600 XT parallel each other as well, with the X1600 XT delivering almost the same numbers as the X1300 Pro one resolution higher. The 6600 GT (and thus the 6800 GT as well) outperforms the X1600 XT by a healthy margin.
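To put the "two to three megapixel" remark in concrete terms, here is the per-frame pixel count at each resolution in the test range (a quick illustrative calculation, not review data):

```python
# Pixels rendered per frame at the resolutions in this test range.
resolutions = [(1024, 768), (1280, 960), (1600, 1200), (1920, 1200), (2048, 1536)]

for w, h in resolutions:
    print(f"{w}x{h}: {w * h / 1_000_000:.2f} MP")

# 1920x1200 is ~2.3 MP and 2048x1536 is ~3.1 MP -- the two-to-three
# megapixel range where even the 7800 GTX falls off the ~80 fps CPU limit.
```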



Antialiasing changes the performance trends a bit, but at lower resolutions we still see our higher-end parts reaching toward that 80 fps CPU limit. The biggest difference here is that the X1000 series parts are able to take the lead in some cases as resolution increases with AA enabled: the X1800 XT passes the 7800 GTX at 1920x1200 and at 2048x1536. The 7800 GT manages to hold its lead over the X1800 XL, but neither is playable at the highest resolution with 4xAA/8xAF enabled. The 6600 GT and X1600 XT stop being playable at anything over 1024x768 with AA enabled; on those parts, gamers are better off simply increasing the resolution to 1280x960 with no AA.
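The advice at the end of that paragraph is easier to see with some rough arithmetic: 4x multisampling stores four color/z samples per pixel, so 1024x768 with 4xAA touches more raw samples than 1280x960 without AA. A minimal sketch, ignoring texture and shader costs (which scale per pixel, not per sample):

```python
# Compare the raw sample workload of 1024x768 with 4x multisample AA
# against 1280x960 with no AA. This counts color/z samples only; treat
# it as a rough bandwidth argument rather than a full cost model.

def samples(width: int, height: int, msaa: int = 1) -> int:
    return width * height * msaa

aa_case = samples(1024, 768, msaa=4)   # 3,145,728 samples
noaa_case = samples(1280, 960)         # 1,228,800 samples

print(f"1024x768 4xAA : {aa_case:,} samples")
print(f"1280x960 no AA: {noaa_case:,} samples")
print(f"ratio: {aa_case / noaa_case:.2f}x")  # ~2.56x more samples with AA
```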



At 1600x1200 and below, the 7800 series parts appear to scale as well as the X1800 series when enabling AA. The X850 XT scales worse than the rest of the high end, but this time the X1600 XT shows that it handles the move to AA better than either of the X8xx parts tested, and better than the 6600 GT as well. The 6800 GT takes the hit from antialiasing better than the X1600 XT, though.
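"Scaling" here means the fraction of no-AA performance a card retains once 4xAA/8xAF is enabled. A minimal sketch of the metric, using placeholder numbers rather than figures from this review:

```python
# "AA scaling": the fraction of no-AA performance a card keeps with AA on.

def aa_scaling(fps_noaa: float, fps_aa: float) -> float:
    """Return retained performance as a percentage."""
    return 100.0 * fps_aa / fps_noaa

# Hypothetical placeholder numbers, not measurements from this review:
# a card dropping from 80 fps to 60 fps retains 75% of its performance.
print(f"{aa_scaling(80.0, 60.0):.0f}%")
```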



In our previous tests, Doom 3 showed NVIDIA hardware leading every step of the way. Does anything change when we look at the numbers with and without AA enabled?

Comments

  • bob661 - Friday, October 7, 2005 - link

    1280x960 is actually in keeping with the 4:3 aspect ratio. 1280x1024 stretches the height of your display, although it's a little hard to tell the difference.
  • TheInvincibleMustard - Friday, October 7, 2005 - link

    The actual physical dimensions of a 1280x1024 screen are larger than those of a 1280x960 screen if the pixel size is the same -- there's no "stretching" of anything. 5:4 is just more square than 4:3, and you've got more pixels to cover that extra squareness.

    -TIM
  • DerekWilson - Friday, October 7, 2005 - link

    It would be more of a squishing if you ran 1280x1024 on a monitor built for 4:3 with a game that didn't correctly manage the aspect ratio mapping.

    The performance of 1280x1024 and 1280x960 is very similar and it's not worth testing both.
  • TheInvincibleMustard - Friday, October 7, 2005 - link

    True enough, but most 17" and 19" LCD monitors (the monitors in question in this line of posts) are native 1280x1024, and therefore no squishing is performed.

    I do agree with you that it is redundant to perform testing at both 1280x1024 and 1280x960, as those extra ~82,000 pixels don't mean a whole lot in the big picture.

    -TIM
  • JarredWalton - Saturday, October 8, 2005 - link

    Interesting... I had always assumed that 17" and 19" LCDs were still 4:3 aspect ratio screens. I just measured a 17" display that I have, and it's 13.25" x 10.75" (give or take), nearly an exact 5:4 ratio. So 1280x1024 is good for 17" and 19" LCDs, but 1280x960 would be preferred on CRTs.
  • TheInvincibleMustard - Saturday, October 8, 2005 - link

    By Jove, I think he's got it! :D

    -TIM
  • bob661 - Friday, October 7, 2005 - link

    That might explain why I can't tell the difference. Thanks much for the info.
  • intellon - Friday, October 7, 2005 - link

    Bang on with the graphs in this article... top notch. I guess the difference in performance between these cards makes them less congested.
    On another note, I was wondering: would it be too much hassle to set up ONE more computer with a mass-market CPU (say, a 3200+) and value RAM, and just run a couple of the different game engines on it, and post how the new cards perform? You don't have to run this "old" setup with every card... just the new launches. It would be very helpful to common people who won't buy the FX-55.
    I, for one, make estimates about how much slower the cards would run on my computer, but those estimates could be much better with a slower processor.
    I understand that the point of the review is to let the GPU run free and keep the CPU from holding it back, but testing with a common setup is helpful for someone with limited imagination (about how the card will run on their system) or not-so-deep pockets.
    Of course, you can just go right ahead and ignore this post and I won't complain again, but if you do add such a system in the next review (it just has to be run with the new cards), I'll be the one who'll thank you deeply.
  • Sunrise089 - Friday, October 7, 2005 - link

    2nd, even if only for a few tests
  • LoneWolf15 - Friday, October 7, 2005 - link

    One other factor in making a choice is that there are no ATI X1000 series cards available at this point. Once again, every review site covered a paper launch, despite railing against the practice in the past. No one is willing to risk being scooped by saying, "We won't review a product that you can't buy."

    I have an ATI card myself (it replaced an nVidia card a year ago, so I've had both), but right now I'm pretty sick of announcements for cards that aren't available. This smacks of ATI trying to boost its earnings or its standing in the eyes of its shareholders, and ignoring its customers in the process. It's going to be a long time before I buy a graphics card again, but if I had to choose a vendor based on the past two years, both companies' reputations fall far short of the customer service I'd hope for.
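As a footnote to the aspect-ratio thread above, the arithmetic checks out; a quick sketch (the screen measurements are the ones JarredWalton posted):

```python
from math import gcd

def aspect(w: int, h: int) -> str:
    """Reduce a resolution to its simplest aspect ratio."""
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

print(aspect(1280, 960))    # 4:3 -- matches a 4:3 CRT
print(aspect(1280, 1024))   # 5:4 -- native on most 17"/19" LCD panels

# The "~82,000 extra pixels" figure from the thread:
print(1280 * 1024 - 1280 * 960)               # 81920

# JarredWalton's measured 17" panel, 13.25" x 10.75":
print(f"{13.25 / 10.75:.3f} vs {5 / 4:.3f}")  # 1.233 vs 1.250 -- near 5:4
```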
