Introduction

Now is an interesting time for PC gaming. With the release of NVIDIA's 7800 series and the upcoming ATI X1000 series graphics cards, the potential for graphics in games is only just starting to be realized. Games like F.E.A.R., Call of Duty 2, and Age of Empires 3 promise to take PC gaming to a new level graphically, and frankly, we couldn't be happier about it. The situation is similar to when ATI's RADEON 9700 series launched and the new hardware gave game developers the freedom to experiment with new ideas, creating a new generation of games in the process. One graphics engine that has had an important impact on developers lately is Half-Life 2's Source engine, and though it has been around for a while now, Valve has recently decided to give it a bit of a face-lift, metaphorically speaking.

That's right, Valve has updated the Source engine to support something called High Dynamic Range (HDR) rendering, and the first two applications to implement it are Day of Defeat: Source (based on the popular Half-Life mod) and the upcoming new level for Half-Life 2, Lost Coast. High dynamic range is essentially a more realistic way to implement lighting in a three-dimensional world. With HDR, light sources appear brighter, and other effects like bloom become possible. HDR, along with related techniques like auto-exposure, takes lighting to a new level, further enhancing the realism of a virtual world. To give you a better idea of the concept behind HDR, here is a quote from Paul Debevec:
"The 'dynamic range' of a scene is the contrast ratio between its brightest and darkest parts. A plate of evenly-lit mashed potatoes outside on a cloudy day is low-dynamic range. The interior of an ornate cathedral with light streaming in through its stained-glass windows is high dynamic range. In fact, any scene in which the light sources can be seen directly is high dynamic range."
Obviously, one of the first things that we were concerned about regarding this upgrade was how it would affect performance. We weren't quite sure what to expect, but we did some testing on multiple ATI and NVIDIA graphics cards, and we'll take a look at the results later on. First, let's go a bit more in-depth into the technology.

Comments (47)

  • Frackal - Friday, September 30, 2005 - link

    I actually meant to ask about the resolution too.

In my own bench, starting from the spawn point on Anzio and then running to the other end of the map while doing some shooting with the bazooka, at:

    1680x1050
    4AA/16AF All High/Reflect All
    MultiSampling AA
    Forced Trilinear mipmaps

    I get just over 70FPS


    4400+ @2.65
    BFG GTX @ 480/1360
    2gigs mushkin at 241 1:1 2/3/3/8
  • ashay - Friday, September 30, 2005 - link

Is it just me? In EVERY screenshot of HDR vs. !HDR that I've seen, I've thought the !HDR looks better. Maybe I need to play and see for myself.

  • wanderer27 - Friday, September 30, 2005 - link

    I have yet to see where either Bloom or HDR makes things look better.

    AOE III - HDR/Bloom looks worse.

    Day of Defeat - HDR/Bloom looks worse.

    Oblivion - Bloom effect looks bad, haven't seen a shot of non-Bloom on this game yet.

    Maybe it will do something for darkly lit games, but so far they all look too glowy (AOE), or washed out (DOD).

    So far this looks like a useless technology they're trying to shove down our throats. Thankfully, in AOE you have the option to turn this crap off.

    My advice to the Devs, stop wasting time on this and find something that'll actually make things look or play better.

  • overclockingoodness - Friday, September 30, 2005 - link

    The image with HDR is smoother.
  • Frackal - Friday, September 30, 2005 - link

Frankly I was blown away by the graphics in DOD-S, and I've played COD2, FEAR, BF2, etc... If you're running the right rez with all details turned up, it's like being in a photograph much of the time.

    Valve should have gotten way more props for this

I hope I don't have to see you guys exclaiming how good FEAR's crappy graphics are if you ever review that game...

    Anyway I love AT but I thought this really downplayed the impressive graphics here.
  • Gigahertz19 - Friday, September 30, 2005 - link

The very bottom image looks the best to me. Compare it to the top, which has a lot of jaggies, but with the HDR the jaggies are missing; it looks a lot smoother. Wish I had a better GPU than a 9700 Pro.
  • Bonesdad - Friday, September 30, 2005 - link

All the jaggies are definitely still there, the entire image is just more washed out... I think the top image has richer colors. Not really impressed, personally.
  • toyota - Friday, September 30, 2005 - link

All the jaggies are still there in the bottom pic. They are just a little washed out. It's not any smoother.
  • DerekWilson - Friday, September 30, 2005 - link

    We did top end and upper midrange ... this was really just a taste though -- believe me, we'll have more benchmarks with this game soon :-)
  • PrinceGaz - Friday, September 30, 2005 - link

Interesting article, though like many others I was distinctly unimpressed by the static screenshots showing the "benefits" of HDR. Maybe it works a lot better while actually playing the game...

The choice of cards tested seemed a bit strange to me though. Either a 7800GT or GTX would have been enough for top-end nVidia performance, as would a single good card from the X800/X850 line-up to show how ATI compares with their current generation (ideally figures from an X1800 would be thrown in, but NDAs currently prevent that). The omission of a 6800GT or similar was the main problem with the benchmarks though, as many of us have one of them and would like to know how well they fare.

Along with the 6600GT for current mid-range performance, ideally you'd also include an FX5900/5950 series and a 9800Pro, as not everyone buys a new card when a new generation of hardware is released. The 9800Pro is still very capable and should be included in all reviews, and an FX5900/5950 should be included too for reference, even if it does suffer badly with modern pixel-shader intensive games, so that people can decide if an upgrade is worthwhile. Anything less than those cards would probably be a waste of time for this review though, as they'd be too slow.

    In fact I'd say a 9800Pro and FX5900/5950 should be included in *all* graphics-card / game-performance reviews, in addition to the usual 7800, 6800, 6600, X800/850. You must have them lying around somewhere ready to drop in a suitable box I'm sure :)

I'm looking forward to the updated/follow-up article with additional benchmarks; I understand that if time was pressing, you could only test a limited number of cards.
