Final Words

We've seen that the HDR effects have a significant impact on performance in Day of Defeat: Source, and in some cases, that impact was surprising. On cards like the X800 and the 6600 GT, HDR demands the kind of resources that can effectively cut your framerate in half, which is striking given the general subtlety of the lighting effects.

We've also shown that ATI seems to handle Valve's implementation of HDR better than NVIDIA, and had we been able to test one of ATI's next-generation cards, it would very likely have beaten out the 7800 GTX for the highest framerates. Still, most of the cards that we tested were able to absorb the performance hit from the HDR settings. Unfortunately, if you have a less powerful card than those we've tested, you will probably have to either turn down your resolution or forgo the HDR.

While the HDR effects in the game are subtle, we should mention that after a bit of play testing, we found that our eyes tended to adapt to the auto-exposure and bloom effects, and everything blended together in a way that added a lot to the gameplay. In fact, with the HDR settings turned off, the game looks surprisingly flat by comparison. We're impressed by how Valve was able to enhance the Source engine in such a major way while keeping everything subtle enough that you sometimes forget it's there. Much like the Matrix, it's hard to understand until you experience it for yourself.
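
For readers curious about what the auto-exposure half of the effect is doing, the sketch below illustrates one common approach to eye adaptation in an HDR pipeline: measure the average scene luminance each frame and ease an exposure multiplier toward a mid-grey target over time. This is only a minimal illustration of the general technique, not Valve's implementation; the struct, function, and constant names here are hypothetical.

```cpp
// Minimal auto-exposure (eye adaptation) sketch in C++. Hypothetical names
// and constants; not Source engine code.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct ExposureState {
    float exposure = 1.0f;  // multiplier applied to scene colors before tone mapping
};

// avgLuminance: average scene luminance this frame (e.g. read back from a
// downsampled luminance buffer). dt: frame time in seconds.
void UpdateAutoExposure(ExposureState& state, float avgLuminance, float dt)
{
    const float targetLuminance = 0.5f;   // aim for a mid-grey average
    const float adaptationRate  = 1.5f;   // larger = faster "eye" adaptation
    const float minExposure     = 0.25f;  // clamp so the scene never over- or under-adapts
    const float maxExposure     = 4.0f;

    // Exposure that would map the current average luminance onto the target.
    float desired = targetLuminance / std::max(avgLuminance, 1e-4f);
    desired = std::clamp(desired, minExposure, maxExposure);

    // Ease toward the desired exposure instead of snapping to it, so stepping
    // from a dark interior into sunlight briefly blows out before the view
    // settles, which is the adaptation effect described above.
    float blend = 1.0f - std::exp(-adaptationRate * dt);
    state.exposure += (desired - state.exposure) * blend;
}

int main()
{
    ExposureState state;
    // Simulate two seconds at 60 fps: one second in a dark room (luminance 0.1),
    // then one second in bright sunlight (luminance 2.0).
    for (int frame = 0; frame < 120; ++frame) {
        float sceneLuminance = (frame < 60) ? 0.1f : 2.0f;
        UpdateAutoExposure(state, sceneLuminance, 1.0f / 60.0f);
    }
    std::printf("exposure after adaptation: %.3f\n", state.exposure);
    return 0;
}
```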

If you played much Day of Defeat before the upgrade, the Source version will no doubt make you very happy, just as Counter-Strike: Source did when it first came out. However, by now, the Half-Life 2 engine isn't quite as new and exciting as it was when Counter-Strike: Source arrived, and in spite of the new HDR effects, the "wow" factor isn't quite as pronounced. Still, there is no denying that the new lighting effects add a kind of sparkle to the Half-Life 2 graphics, which, while still excellent, had seemed to lose a bit of their luster over time. These graphical enhancements are certainly a step forward, and it will be very interesting to see how these effects are used in games in the near future.

We will also be excited to see whether Valve can stick to its guns and continue to enhance Source on the way to its next major project. This approach certainly makes more sense to us: Valve's engine customers will have access to a better-quality engine, and gamers will reap the benefits of new technology faster. We look forward to the surprises Valve has in store for us in the future.
Image Comparison
Comments

  • Frackal - Friday, September 30, 2005 - link

    I actually meant to ask about the resolution too.

    In my own bench, starting from the spawn point on Anzio and then running to the other end of the map while doing some shooting with the bazooka, at:

    1680x1050
    4AA/16AF All High/Reflect All
    MultiSampling AA
    Forced Trilinear mipmaps

    I get just over 70FPS


    4400+ @2.65
    BFG GTX @ 480/1360
    2gigs mushkin at 241 1:1 2/3/3/8
  • ashay - Friday, September 30, 2005 - link

    Is it just me? In EVERY screenshot of HDR vs !HDR that I've seen, I've thought the !HDR looks better. Maybe I need to play and see for myself.

  • wanderer27 - Friday, September 30, 2005 - link

    I have yet to see where either Bloom or HDR makes things look better.

    AOE III - HDR/Bloom looks worse.

    Day of Defeat - HDR/Bloom looks worse.

    Oblivion - Bloom effect looks bad, haven't seen a shot of non-Bloom on this game yet.

    Maybe it will do something for darkly lit games, but so far they all look too glowy (AOE), or washed out (DOD).

    So far this looks like a useless technology they're trying to shove down our throats. Thankfully, in AOE you have the option to turn this crap off.

    My advice to the devs: stop wasting time on this and find something that'll actually make things look or play better.

  • overclockingoodness - Friday, September 30, 2005 - link

    The image with HDR is smoother.
  • Frackal - Friday, September 30, 2005 - link

    Frankly, I was blown away by the graphics in DOD-S, and I've played COD2, FEAR, BF2, etc... If you're running the right rez with all details turned up, it's like being in a photograph much of the time.

    Valve should have gotten way more props for this

    I hope I don't have to see you guys exclaiming how good FEAR's crappy graphics are if you ever review that game...

    Anyway I love AT but I thought this really downplayed the impressive graphics here.
  • Gigahertz19 - Friday, September 30, 2005 - link

    The very bottom image looks the best to me. Compare it to the top, which has a lot of the jaggies, but with the HDR the jaggies are missing; it looks a lot smoother. Wish I had a better GPU than a 9700 Pro.
  • Bonesdad - Friday, September 30, 2005 - link

    All the jaggies are definitely still there; the entire image is just more washed out... I think the top image has richer colors. Not really impressed, personally.
  • toyota - Friday, September 30, 2005 - link

    All the jaggies are still there in the bottom pic; they're just a little washed out. It's not any smoother.
  • DerekWilson - Friday, September 30, 2005 - link

    We did top end and upper midrange ... this was really just a taste though -- believe me, we'll have more benchmarks with this game soon :-)
  • PrinceGaz - Friday, September 30, 2005 - link

    Interesting article, though like many others I was distinctly unimpressed by the static screenshots showing the "benefits" of HDR. Maybe it works a lot better while actually playing the game...

    The choice of cards tested seemed a bit strange to me, though. Either a 7800GT or GTX would have been enough for top-end nVidia performance, as would a single good card from the X800/X850 line-up to show how ATI compares with their current generation (ideally, figures from an X1800 would be thrown in, but NDAs currently prevent that). The omission of a 6800GT or similar was the main problem with the benchmarks, though, as many of us have one of them and would like to know how well they fare.

    Along with the 6600GT for current mid-range performance, ideally you'd also include an FX5900/5950 series card and a 9800Pro, as not everyone buys a new card when a new generation of hardware is released. The 9800Pro is still very capable and should be included in all reviews, and an FX5900/5950 should be included too for reference, even if it does suffer badly with modern pixel-shader-intensive games, so that people can decide if an upgrade is worthwhile. Anything less than those cards would probably be a waste of time for this review, though, as they'd be too slow.

    In fact I'd say a 9800Pro and FX5900/5950 should be included in *all* graphics-card / game-performance reviews, in addition to the usual 7800, 6800, 6600, X800/850. You must have them lying around somewhere ready to drop in a suitable box I'm sure :)

    I'm looking forward to the updated/follow-up article with additional benchmarks; I understand that if time was pressing, you could only test on a limited number of cards.
