What's Gamma Correct AA?

Gamma correction is a technique for mapping linearly increasing brightness data to a display device in a way that conveys linearly increasing intensity. Because displays are nonlinear devices, gamma correction requires a nonlinear adjustment to be applied to brightness values before they are sent to the display. Ideally, gamma corrected linear steps in the brightness of a pixel will result in linear steps in perceived intensity. The application to antialiasing is that high contrast edges can appear under-antialiased if the brightness of a pixel isn't raised enough for humans to perceive an increase in intensity once the monitor displays it.
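
To make the encoding step concrete, here is a minimal sketch assuming an idealized display whose light output follows a pure power law with a gamma of 2.2 (real CRTs and LCDs only approximate this, and the function and variable names are our own illustration):

    #include <math.h>
    #include <stdio.h>

    /* Idealized model: the display raises whatever value it receives to the
       2.2 power, so we encode with the inverse curve before sending it. */
    static double gamma_encode(double linear_intensity)
    {
        return pow(linear_intensity, 1.0 / 2.2);
    }

    int main(void)
    {
        /* Linear steps in desired light output require nonlinear
           framebuffer values to survive the display's response. */
        for (int i = 0; i <= 4; i++) {
            double intensity = i / 4.0;                 /* desired light output */
            double encoded   = gamma_encode(intensity); /* value actually sent  */
            printf("intensity %.2f -> framebuffer %.3f (8-bit: %3.0f)\n",
                   intensity, encoded, encoded * 255.0);
        }
        return 0;
    }

Without this encoding, the display's power-law response would render the lower steps far too dark and compress them together.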

Unfortunately, gamma correct AA isn't always desirable. Different CRTs, LCDs, and TVs have different gamma characteristics, which makes any single gamma correction scheme more or less effective from one device to the next. Gamma correct AA can also give brighter colored sub-samples a heavier influence on the color of a pixel than darker sub-samples, which causes problems for things like thin lines (see the sketch below).
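
The weighting issue falls directly out of the resolve math. The sketch below (again assuming a gamma of 2.2; this is our own illustration of the principle, not a description of how G80 or R580 actually implement their resolve, and the subsample values are made up) compares a plain average of gamma-encoded subsamples against a gamma-correct resolve that averages in linear light:

    #include <math.h>
    #include <stdio.h>

    static double decode(double v) { return pow(v, 2.2); }       /* gamma -> linear */
    static double encode(double v) { return pow(v, 1.0 / 2.2); } /* linear -> gamma */

    int main(void)
    {
        /* A pixel half covered by a dark antenna (0.1) against bright sky
           (0.9); the four subsample colors are stored gamma-encoded. */
        const double sub[4] = { 0.1, 0.1, 0.9, 0.9 };

        /* Non-gamma-correct resolve: average the encoded values directly. */
        double plain = (sub[0] + sub[1] + sub[2] + sub[3]) / 4.0;

        /* Gamma-correct resolve: decode to linear light, average, re-encode. */
        double linear_avg = (decode(sub[0]) + decode(sub[1]) +
                             decode(sub[2]) + decode(sub[3])) / 4.0;
        double correct = encode(linear_avg);

        printf("plain resolve:         %.3f\n", plain);   /* 0.500  */
        printf("gamma-correct resolve: %.3f\n", correct); /* ~0.659 */
        return 0;
    }

With 50% coverage of a dark antenna (0.1) over a bright sky (0.9), the plain average lands at 0.500, while the gamma-correct resolve lands at roughly 0.659: much closer to the sky color than the coverage alone suggests, which is why thin dark details tend to wash out.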

To illustrate the difference, we'll look at images of Half-Life taken on G80 with and without gamma correction enabled.



[Screenshot comparison: 16XQ No Gamma vs. 16XQ Gamma]



[Screenshot comparison: 16XQ No Gamma vs. 16XQ Gamma]

We can see the antenna lose clarity because each of the brighter subsamples carries a disproportionately higher weight than the darker subsamples. As far as the roof line is concerned, our options are either to see the roof blur out into the sky or to watch the sky cut into the roof.

Really, edge AA with and without gamma correction is six of one, half a dozen of the other. Combine this with the fact that the effect varies from monitor to monitor, and with the degraded visibility of thin lines, and we feel that gamma correct AA isn't a feature that improves image quality so much as it simply changes it.

While we are happy that NVIDIA has given us the choice to enable or disable gamma correct AA as we see fit, with G80 the default state has changed to enabled. This doesn't have an impact on performance, but we prefer rendering without gamma correct AA and will do so in our performance tests. We hope that ATI will add the ability to disable gamma correct AA in the future as well. For now, let's take a look at R580 and G80 compared with gamma correction enabled.



[Screenshot comparison: G80 4X Gamma vs. ATI 4X Gamma]



[Screenshot comparison: G80 4X Gamma vs. ATI 4X Gamma]

At 4xAA with gamma correction enabled, it looks like ATI is able to produce a better quality image. Some of the wires and antennas on NVIDIA hardware are a little more ragged looking, while ATI's images are smoothed better.


111 Comments


  • Sunrise089 - Thursday, November 9, 2006 - link

    Then I suppose he's in the market to part with an ugly old high-end CRT. I'd love to buy it from him. Seriously.
  • JarredWalton - Thursday, November 9, 2006 - link

    You want an older 21" Cornerstone CRT? It's a beast, but you can have it for the cost of shipping (which unfortunately would probably be ~$50). I'd also sell my Samsung 997DF 19" CRT for about $50, and maybe an NEC FE991-SB for $50 (which unfortunately has a scratch from my daughter in the anti-glare coating). If anyone lives in the Olympia, WA area, you know how to contact me (I hope). I'd rather someone come by to pick up any of these CRTs rather than shipping, as I don't think I have the original boxes.
  • DerekWilson - Thursday, November 9, 2006 - link

    lol next thing you know links to ebay auctions are gonna start showing up in our articles :-)
  • yyrkoon - Thursday, November 9, 2006 - link

    lol, I've got a 21" Techtronics I'll sell for $200 USD, plus shipping ;) Hasn't been used since I purchased my ViewSonic VA1912wb (well, been used very little).
  • imaheadcase - Wednesday, November 8, 2006 - link

    can't stand AA benchmarks myself :)

    Question: Do you have any info on what kind of card NVIDIA is releasing this Feb? Is it something in between these two cards or something even lower?

    I'm looking for a $300ish G80! :D
  • flexy - Wednesday, November 8, 2006 - link

    If ANYTHING counts, it's how those high-end cards perform WITH their various AA settings... the power of those cards (and the money spent on them :) translated RIGHT into ---> IMAGE QUALITY/PERFORMANCE.

    Please don't tell me you would get a G80 but do NOT care about AA; that does NOT make any sense... sorry...

    I am especially impressed reading that transparency AA has such a SMALL performance impact. What game engine did you test this on?

    On the older ATI cards (and am I right that transparency AA is the same as "adaptive antialiasing"?), this feature is an FPS killer depending on the game engine... e.g. with games like Oblivion (WHERE ARE THE GOTHIC 3 BENCHIES BTW? :) and other vegetation-heavy engines.

    Enable transparency AA and see all those trees, grass etc. without jaggies.

  • imaheadcase - Thursday, November 9, 2006 - link

    Well, lots of people don't care for AA. Even if I had this card I would not use it. I visually see NO difference with it on or off. It's personal taste. I don't even see "jaggies" on my older 9700 PRO card.
  • flexy - Thursday, November 9, 2006 - link

    You sure are talking about ANTIALIASING???

    What resolutions do you run? Not that my CRT can even handle more than 1600x1200... but even at 1600 I get VERY prominent jaggies if I don't run AA.

    I made it a habit to run at least 4xAA in ANY game, and some engines (HL2's Source engine, etc.) run extremely well with 4xAA; even 6xAA is very playable, at least with HL2.

    The very recent games, namely NWN2 and G3, don't support AA; playing at 1280x1024, it looks utterly horrible! If you say you don't see jaggies at ANY resolution under 1600... that's very hard to believe.
  • imaheadcase - Thursday, November 9, 2006 - link

    Yes, I'm talking about antialiasing. I normally play BF2 and Oblivion at 1024x768 (9700 Pro, remember).

    Fact is, most people won't see them unless someone points them out. The brain is still better at rendering stuff the way you want to see it than hardware is :)
  • flexy - Thursday, November 9, 2006 - link

    OK... but then it's also a performance problem. If it doesn't bother you, well, OK.
    I also have to settle for the fact that many RECENT games can't even do AA... however, I wish they would.

    But once I get an 8800 I will do &&&& to get the most out of IQ, AA, AF, transparency/texture AF, you name it. ALONE for the reason that I would need a super-high-end monitor first to even run resolutions like 2000xsomething... and as long as I have a lame 19" CRT and CANNOT even go over 1600 (99.99% of games running everything at 1280x or 1360x), I will use all that power to get the best possible IQ at those low resolutions.

    Also... looking at the benchmarks... it's NOT like you lose any real gaming experience, since THOSE monster cards are made for exactly this... e.g. running Oblivion with all settings at MAX, AA on, and HDR... and you are still in VERY reasonable FPS ranges.
