What's Gamma Correct AA?

Gamma correction is a technique used to map linearly increasing brightness data to a display device in a way that conveys linearly increasing intensity. Because displays are nonlinear devices, gamma correction requires a nonlinear adjustment to be applied to brightness values before they are sent to the display. Ideally, gamma corrected linear steps in the brightness of a pixel will result in linear steps in perceived intensity. The application to antialiasing is that high contrast edges can appear under-antialiased if the brightness of a blended edge pixel isn't raised enough for the viewer to perceive an increase in intensity once the monitor has displayed it.
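
To make the math concrete, below is a minimal sketch in Python of the difference between a naive AA resolve, which averages the gamma-encoded subsample values directly, and a gamma correct resolve, which blends in linear light and then re-encodes the result for the display. The gamma value of 2.2 and the example edge pixel are illustrative assumptions, not a description of how any particular GPU implements the feature:

    GAMMA = 2.2  # assumed display gamma; real monitors vary, which is part of the problem

    def decode(v):
        # gamma-encoded [0, 1] value -> linear light
        return v ** GAMMA

    def encode(v):
        # linear light -> gamma-encoded value sent to the display
        return v ** (1.0 / GAMMA)

    def naive_resolve(subsamples):
        # average the gamma-encoded subsamples as-is
        return sum(subsamples) / len(subsamples)

    def gamma_correct_resolve(subsamples):
        # convert to linear light, average, then re-encode for the display
        linear = [decode(s) for s in subsamples]
        return encode(sum(linear) / len(linear))

    # 4xAA edge pixel: a white object over a black background with 50% coverage
    edge = [1.0, 1.0, 0.0, 0.0]
    print(naive_resolve(edge))          # 0.5  -> display emits only ~22% of full white, darker than intended
    print(gamma_correct_resolve(edge))  # ~0.73 -> display emits ~50% of full white, matching the coverage

The gamma correct resolve stores a higher value so that, after the display's nonlinear response, the edge pixel actually emits the halfway step of light that 50% coverage calls for.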

Unfortunately, gamma correct AA isn't always desirable. Different CRTs, LCDs, and TVs have different gamma characteristics that make any single gamma correction scheme more or less effective from one device to the next. Gamma correction can also give brighter colored subsamples a heavier influence on the final color of a pixel than darker subsamples, which causes problems for things like thin lines.

To illustrate the difference, we'll look at images of Half-Life taken on G80 with and without gamma correction enabled.



[Image comparison: 16XQ No Gamma vs. 16XQ Gamma]

[Image comparison: 16XQ No Gamma vs. 16XQ Gamma]

We can see the antenna decrease in clarity because each of the brighter subsamples has a disproportionately higher weight than the darker subsamples. As for the roof line, our options are either seeing the roof blur out into the sky or watching the sky cut into the roof.
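
As a quick back-of-the-envelope example of why thin geometry suffers, here is a short continuation of the earlier sketch (same assumed 2.2 gamma, made-up subsample values) for a pixel along a dark wire against a bright sky:

    GAMMA = 2.2  # same assumed display gamma as in the earlier sketch

    # Pixel along the antenna: the dark wire covers 3 of 4 subsamples, bright sky covers 1
    subsamples = [0.0, 0.0, 0.0, 1.0]

    naive = sum(subsamples) / len(subsamples)
    gamma_correct = (sum(s ** GAMMA for s in subsamples) / len(subsamples)) ** (1.0 / GAMMA)

    print(naive)          # 0.25 -> the wire still reads as mostly dark
    print(gamma_correct)  # ~0.53 -> the single bright sky sample pulls the pixel much brighter, thinning the wire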

Really, edge AA with and without gamma correction is six of one, half a dozen of the other. Combine this with the fact that the effect varies depending on the monitor being used, and with the degraded visibility of thin lines, and we feel that gamma correct AA isn't a feature that improves image quality so much as it simply changes it.

While we are happy that NVIDIA has given us the choice to enable or disable gamma correct AA as we see fit, with G80 the default state has changed to enabled. While this doesn't have an impact on performance, we prefer rendering without gamma correct AA enabled and will do so in our performance tests. We hope that ATI will add a feature to disable gamma correct AA in the future as well. For now, let's take a look at R580 and G80 compared with gamma correction enabled.



[Image comparison: G80 4X Gamma vs. ATI 4X Gamma]

[Image comparison: G80 4X Gamma vs. ATI 4X Gamma]

At 4xAA with gamma correction enabled, it looks like ATI is able to produce a better quality image. Some of the wires and antennas on NVIDIA hardware are a little more ragged looking, while ATI's images are smoothed better.

Comments

  • haris - Thursday, November 9, 2006 - link

    You must have missed the article they published the very next day (http://www.theinquirer.net/default.aspx?article=35...) saying they goofed.
  • Araemo - Thursday, November 9, 2006 - link

    Yes I did - thanks.

    I wish they would have updated the original post to note the mistake, as it is still easily accessible via google. ;) (And the 'we goofed' post is only shown when you drill down for more results)
  • Araemo - Thursday, November 9, 2006 - link

    In all the AA comparison photos of the power lines, with the dome in the background - why does the dome look washed out in the G80 images? Is that a driver glitch? I'm only on page 12, so if you explain it after that.. well, I'll get it eventually.. ;) But is that just a driver glitch, or is it an IQ problem with the G80 implementation of AA?
  • bobsmith1492 - Thursday, November 9, 2006 - link

    Gamma-correcting AA sucks.
  • Araemo - Thursday, November 9, 2006 - link

    That glitch exists whether gamma-correcting AA is enabled or disabled, so that isn't it.
  • iwodo - Thursday, November 9, 2006 - link

    I want to know if these power hungry monsters have any power saving features?
    I mean, what happens if I am using Windows only most of the time? After all, CPUs have much better power management when they are idle or doing little work. Will I have to pay an extra electricity bill simply because I am a casual gamer with a power-hungry GPU?

    Another question that popped into my mind: with CUDA, would it now be possible for a third party to program an H.264 decoder running on the GPU? Sounds good to me :D
  • DerekWilson - Thursday, November 9, 2006 - link

    oh man ... I can't believe I didn't think about that ... video decoder would be very cool.
  • Pirks - Friday, November 10, 2006 - link

    A decoder is not that interesting, but an MPEG-4 ASP/AVC ENCODER on the G80 GPU... man, I can't imagine AVC or ASP encoding IN REAL TIME... wow, just wooowww
    I'm holding my breath here
  • Igi - Thursday, November 9, 2006 - link

    Great article. The only thing I would like to see in a follow-up article is a performance comparison in CAD/CAM applications (SolidWorks, Pro/ENGINEER, ...).

    BTW, how noisy are the new cards in comparison to the 7900 GTX and others (at idle and under load)?
  • JarredWalton - Thursday, November 9, 2006 - link

    I thought it was stated somewhere that they are as loud (or quiet if you prefer) as the 7900 GTX. So really not bad at all, considering the performance offered.
