Valve’s HDR Source Implementation

We recently had the opportunity to head up to Washington to visit Valve and talk about the new additions to the Source engine. After the issues and delays getting Half-Life 2 out the door, Valve's philosophy on game development has shifted. Rather than setting a single long-term goal and taking one project from start to finish, Gabe and the team will be setting shorter-term release goals. The idea is that five one-year projects on the road to a five-year destination can help keep them on track. We can also look forward to incremental updates to the Source engine, allowing other game developers to benefit constantly from new technology and bringing us nice little treats like HDR.

As a result of the past year of development, Valve has met their goal to incorporate HDR into their Source engine and now we get to reap the benefits. But before we look at performance numbers and image quality, we will take a look behind the scenes to find out what is going on. At first glance, it is clear that Valve has added the usual blooming features that we would expect from HDR rendering, but there are a couple of new features that Valve has added to keep it interesting.

As we have said, HDR generally refers to the range of contrast that can be represented in a scene. Developers handle this by storing brightness data beyond the capabilities of the display (for instance, the sun is much brighter than a light bulb, but both could end up the same color with traditional rendering). That isn't to say that a game or any other HDR application can make your monitor brighter than it physically can be; rather, light sources and objects inside the application can be represented at brightness levels well beyond what can be displayed. The final rendered image is then (in current incarnations) tonemapped down to a standard 8-bit display colorspace.
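
To make that concrete, here is a minimal sketch of how an HDR luminance value might be tonemapped down to an 8-bit display value. It uses a generic Reinhard-style curve chosen purely for illustration; this is not Valve's shader code, and the exposure and gamma constants are assumptions:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Minimal sketch: compress an HDR luminance value into an 8-bit display value.
// Generic Reinhard-style operator for illustration only; the exposure and
// gamma constants are assumptions, not Valve's numbers.
uint8_t TonemapToDisplay(float hdrLuminance, float exposure = 1.0f)
{
    float scaled = hdrLuminance * exposure;
    float mapped = scaled / (1.0f + scaled);       // maps [0, inf) into [0, 1)
    float gammad = std::pow(mapped, 1.0f / 2.2f);  // approximate display gamma
    return static_cast<uint8_t>(std::clamp(gammad * 255.0f, 0.0f, 255.0f));
}
```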

This allows objects that are only partially reflective to still reflect enough light to reach the full brightness of the display. For instance, a rock in a game may be 20% reflective. Rendered traditionally, even if a bright light source is perfectly reflected toward the camera, the rock can only reach 20% of the monitor's maximum brightness, because the light itself is capped at that maximum. With HDR, if the light shining on the rock is stored at five times the display's brightness, the rock's reflection can still reach 100% of what the monitor can show.
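
The arithmetic behind that example is simple enough to show directly; the values below are the hypothetical ones from the paragraph above:

```cpp
#include <algorithm>
#include <cstdio>

// The rock example in numbers (hypothetical values from the text above).
int main()
{
    const float reflectance = 0.2f;  // the rock reflects 20% of incoming light

    // Traditional rendering: the light source is already capped at 1.0 (display
    // maximum), so the reflection can never exceed 20% of full brightness.
    float ldrReflection = reflectance * 1.0f;                  // 0.2

    // HDR rendering: the light can be stored at 5x display brightness, so the
    // reflection reaches full monitor brightness once clamped for display.
    float hdrReflection = std::min(reflectance * 5.0f, 1.0f);  // 1.0

    std::printf("LDR reflection: %.2f, HDR reflection: %.2f\n",
                ldrReflection, hdrReflection);
    return 0;
}
```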

In addition to this feature, very bright lights make it more difficult for our eyes to clearly see objects occluding (or nearly occluding) the light source. The effect game developers use to portray this, called "bloom", blurs the light onto the foreground object. The high dynamic range data allows bright light sources to be identified and handled properly.
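
A typical bloom pass works roughly as sketched below: extract the overbright portion of the frame, blur it, and add it back on top. This is a generic outline, not Valve's implementation; the threshold, intensity, and blur function are all placeholders:

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <vector>

// Single-channel image for brevity; a real pipeline works on RGB render targets.
struct Image { int width = 0, height = 0; std::vector<float> luminance; };

// Generic bloom outline (not Valve's code). 'blur' stands in for any low-pass
// filter, e.g. a separable Gaussian, and is passed in to keep the sketch
// self-contained.
Image ApplyBloom(const Image& hdrFrame, float threshold, float intensity,
                 const std::function<Image(const Image&)>& blur)
{
    // 1. Keep only the overbright part of each pixel.
    Image bright = hdrFrame;
    for (float& l : bright.luminance)
        l = std::max(l - threshold, 0.0f);

    // 2. Blur it so the light bleeds past the silhouettes of occluding objects.
    Image blurred = blur(bright);

    // 3. Add the blurred highlights back onto the original frame.
    Image result = hdrFrame;
    for (std::size_t i = 0; i < result.luminance.size(); ++i)
        result.luminance[i] += intensity * blurred.luminance[i];
    return result;
}
```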

One of the easiest ways to implement HDR from scratch is to use a floating point format throughout, with all art assets designed around HDR. Unfortunately, current hardware can't process full floating point data as quickly as other methods, and no hardware currently on the market can run MSAA on a floating point render target. The artwork needed to make this work is also huge, and floating point assets cannot currently be compressed using built-in hardware texture compression. On top of this, Valve is working with an existing engine designed around Half-Life 2, so this method would also require a redesign of the art assets and how they are used. These problems and others add up to make this approach difficult to incorporate into the Source engine.

So, rather than carry HDR data through the entire pipeline and all art assets, Valve made a different choice that strikes a good balance between performance and HDR characteristics. Data is represented in fp16 or integer 4.12 linear space in light sources, cube maps, and static lighting data. This method is unable to store overbright information directly, but Valve is still able to add a blooming shader. Our understanding is that this method eliminates the possibility of transmissive or refracted overbright data (we won't see a bloom through a stained glass window or from sand under water through which light has passed), but blooming light sources and directly reflected light are still possible and put to good use in the Source engine.
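
As a rough illustration of the integer 4.12 format mentioned above: 4 integer bits and 12 fractional bits give a linear range of 0 to just under 16, which leaves headroom for values several times brighter than the display. The encode/decode below is our sketch of that idea, not Valve's actual storage code:

```cpp
#include <algorithm>
#include <cstdint>

// Illustrative 4.12 fixed-point encode/decode (our sketch, not Valve's code):
// 4 integer bits + 12 fractional bits cover a linear range of [0, 16).
uint16_t EncodeFixed412(float linearLight)
{
    float clamped = std::clamp(linearLight, 0.0f, 16.0f - 1.0f / 4096.0f);
    return static_cast<uint16_t>(clamped * 4096.0f + 0.5f);  // 2^12 = 4096
}

float DecodeFixed412(uint16_t stored)
{
    return static_cast<float>(stored) / 4096.0f;
}
```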

On top of blooming, Source also allows for dynamic tonemapping that works something like an auto exposure or a human pupil. This helps maintain a high dynamic range effect without overbright data by letting the natural lighting of a scene dictate the exposure of the rendered image. In a dark room, the tonemapping scale adjusts to (essentially) make the brightest parts of the darkness bright enough to see by the available light. The mapping isn't linear, so very dark pixels are brightened less than lighter ones. In a bright room, the opposite happens, but in both cases the definition of HDR is fulfilled: the contrast ratio between bright and dark areas of the same image is greatly increased.
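
The sketch below shows roughly how that kind of exposure adaptation behaves: measure the frame's average luminance, ease the exposure toward a target, and run the result through a non-linear curve. It is a generic auto-exposure outline, not the Source engine's implementation, and the "middle grey" key and adaptation rate are assumptions:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Generic auto-exposure outline (not the Source engine's implementation).
// Dark scenes drive the exposure up, bright scenes drive it down, and the
// change is eased over time so it behaves like a pupil rather than a switch.
float AdaptExposure(const std::vector<float>& frameLuminance,
                    float currentExposure, float adaptRate /* 0..1 per frame */)
{
    // Log-average luminance is less skewed by a few very hot pixels.
    float sumLog = 0.0f;
    for (float l : frameLuminance)
        sumLog += std::log(1e-4f + l);
    float avg = std::exp(sumLog / static_cast<float>(frameLuminance.size()));

    const float key = 0.18f;  // assumed "middle grey" target
    float targetExposure = key / std::max(avg, 1e-4f);
    return currentExposure + (targetExposure - currentExposure) * adaptRate;
}

// Non-linear mapping: very dark pixels are lifted less than mid-tones,
// matching the behavior described above.
float Tonemap(float luminance, float exposure)
{
    float v = luminance * exposure;
    return v / (1.0f + v);
}
```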

To handle the tonemapping, Valve artists create three versions of each skybox at three different exposures. HDR light maps are built from the environment using a global illumination model with radiosity that runs eight bounces. Generating the HDR light maps takes a while, but (together with the HDR cube maps built from the light maps and the skybox) it allows Valve to represent the lighting of the world correctly. Entire maps can be lit with only the sun as a light source. Normally, small point lights are placed to brighten hallways and dark areas away from static lights; these are no longer necessary and can actually make the scene look worse. Not to worry, though: level designers can build HDR lighting information into the same BSP as non-HDR lighting data, and the engine will select the right set depending on the mode in which the game is running.
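
To give a feel for what "radiosity with eight bounces" means, here is a toy gathering-style radiosity loop: each pass lets light take one more bounce between surface patches. The patch structure and form factors are placeholders; real HDR light map generation is far more involved than this sketch:

```cpp
#include <cstddef>
#include <vector>

// Toy patch-based radiosity loop (a sketch only; real light map builds are far
// more involved). Each iteration propagates light one more bounce, so running
// it eight times corresponds to the eight bounces mentioned above.
struct Patch { float emission, reflectance, radiosity; };

void SolveRadiosity(std::vector<Patch>& patches,
                    const std::vector<std::vector<float>>& formFactor, int bounces)
{
    for (Patch& p : patches)
        p.radiosity = p.emission;  // bounce 0: light emitted directly

    std::vector<float> next(patches.size());
    for (int b = 0; b < bounces; ++b) {
        for (std::size_t i = 0; i < patches.size(); ++i) {
            float incoming = 0.0f;
            for (std::size_t j = 0; j < patches.size(); ++j)
                incoming += formFactor[i][j] * patches[j].radiosity;
            next[i] = patches[i].emission + patches[i].reflectance * incoming;
        }
        for (std::size_t i = 0; i < patches.size(); ++i)
            patches[i].radiosity = next[i];
    }
}
```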

The HDR SDK for Source will ship when the Lost Coast level is released. It will allow modders to implement HDR levels in their games by adding the three-exposure sky maps, building both HDR and non-HDR lighting into their levels, and setting bloom and exposure ranges per area if desired. While bloom and exposure can be handled automatically, it is nice to give developers control over them.

And now that we've seen how it's done, let's take a look at the end result in performance and image quality.

Comments

  • OvErHeAtInG - Saturday, October 1, 2005 - link

    The move to PCIe does make everything harder for them, though, as they would have to build a second box which would be as identical as possible except for a different motherboard, introducing a few potential inconsistencies. Other than that, my thoughts exactly.

    Although, frankly, this preview told me what I wanted to know. Great job, guys!!!
  • Hi - Friday, September 30, 2005 - link

    IMO, bloom looks the best of all three screenshots
  • overclockingoodness - Friday, September 30, 2005 - link

    I personally like full HDR; to me, it's smoother.
  • ksherman - Friday, September 30, 2005 - link

    agreed
  • bob661 - Friday, September 30, 2005 - link

    I agree with the comments above. Looks too washed out. Not natural.
  • Araemo - Friday, September 30, 2005 - link

    Humorously enough, if you go to a bright beach on a bright day, the sand will look 'washed out'. Especially if you're viewing it through a TV camera (which limits the dynamic range of the image in a similar way that your monitor limits the dynamic range of the rendered scene).

    Plus, this is generation 1 real-time HDR (sorta), don't be TOO hard on them. ;P Anti-aliasing was poo-poo'd early on because it 'made everything blurry'. I can't live without it in most games (as long as I'm playing at 1024x768 or above).
  • pol II - Friday, September 30, 2005 - link

    ...screenshots anyway. Just looks too washed out to me. Good to see that the technology is moving forward though.
