What's Transparency AA?

Transparency AA is NVIDIA's method of applying antialiasing to transparent textures. Because MSAA performs only one texture lookup per pixel per polygon, computing extra coverage samples only where polygon edges cross a pixel, it cannot smooth the jagged edges that occur inside alpha-tested transparent textures. To combat this, NVIDIA applies supersample AA to transparent textures. Their multisample transparency AA option really doesn't do much for visual quality, so we will be ignoring it today: it only applies multisampling to the geometry visible through transparent areas of a texture, not to the texture itself.

Supersample AA performs a texture lookup at each sub-pixel sample to determine how much of the pixel falls on a transparent area of the texture and how much falls on an opaque area. The analog on ATI hardware is called Adaptive AA, and it does basically the same thing. This generally has a very large performance impact in 3D scenes with many transparent textures (fences, bushes, leaves, and the like).
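
To make the difference concrete, here's a minimal sketch of the two sampling strategies in Python. This is an illustration rather than driver or hardware code: the `alpha_at` helper, the 4x4 sub-pixel grid, and the 0.5 alpha test threshold are all assumptions made for the example.

```python
# Illustrative sketch: binary alpha-test coverage (what MSAA effectively
# sees) versus supersampled alpha-test coverage (transparency/adaptive AA).

def alpha_at(texture, u, v):
    """Nearest-neighbor alpha lookup; `texture` is a 2D list of alpha values
    indexed as texture[row][column], with u and v in [0, 1)."""
    h, w = len(texture), len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

def msaa_coverage(texture, u, v, alpha_ref=0.5):
    # MSAA performs one texture lookup per pixel per polygon, so an
    # alpha-tested pixel ends up either fully opaque or fully transparent:
    # the edge inside the texture stays jagged.
    return 1.0 if alpha_at(texture, u, v) >= alpha_ref else 0.0

def transparency_aa_coverage(texture, u, v, pixel_size, grid=4, alpha_ref=0.5):
    # Supersample transparency AA: look up alpha at a grid of sub-pixel
    # positions and use the fraction that pass the alpha test as the
    # pixel's coverage, smoothing the edge inside the texture.
    passed = 0
    for j in range(grid):
        for i in range(grid):
            su = u + (i + 0.5) / grid * pixel_size
            sv = v + (j + 0.5) / grid * pixel_size
            if alpha_at(texture, su, sv) >= alpha_ref:
                passed += 1
    return passed / (grid * grid)
```

With a 4x4 grid, that is 16 alpha lookups per pixel instead of one, which is exactly why foliage-heavy scenes take such a large performance hit.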

All screenshots on this page are 400% zooms of the highlighted portion of the following Half-Life 2 screenshot:

Here's a look at Half-Life 2 with and without Transparency AA. We can clearly see how the leaves of the trees get smoothed out and look much better.



[Image comparison: G80 No Transparency AA vs. G80 Transparency AA]

When comparing G70, G80, and R580, we have to remember that we've disabled gamma correct AA on the NVIDIA hardware. It isn't possible to do this on ATI hardware, so the R580 shots also give us a look at gamma correct AA applied to transparent textures.



[Image comparison: G70 4X No Gamma vs. G80 4X No Gamma vs. ATI 4X Gamma]

G70 and G80 don't look that different, but the R580 gives the trees a kind of mushy look around the edges. This is another side effect of gamma correct AA and its potential negative impact on image quality. Worse examples include wire mesh or fences built with transparent textures: gamma correct AA can end up making parts of a fence disappear. Ideally, if we could apply gamma correct AA only to high contrast edges and disable it for everything else, we'd see an image quality improvement. As it stands, the downsides keep piling up, with both thin lines and transparent textures causing problems for gamma correction.
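
To see why a fence can vanish, consider a single pixel whose samples are half covered by a dark fence wire against a bright sky. The sketch below compares averaging the stored gamma-encoded values directly against averaging in linear light the way gamma correct AA does; the sRGB transfer function is standard, while the 50% coverage scenario is an assumption chosen for illustration.

```python
# Why gamma correct AA can wash out thin dark features such as fence wires.

def srgb_to_linear(s):
    # Standard sRGB decode: stored value -> linear light.
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    # Standard sRGB encode: linear light -> stored value.
    return l * 12.92 if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

sky, wire = 1.0, 0.0   # stored sRGB values: bright sky, dark fence wire
coverage = 0.5         # the wire covers half of the pixel's samples

# Non gamma correct AA: average the stored (gamma-encoded) values directly.
naive = coverage * wire + (1 - coverage) * sky                      # 0.50

# Gamma correct AA: average in linear light, then re-encode for display.
lin = coverage * srgb_to_linear(wire) + (1 - coverage) * srgb_to_linear(sky)
correct = linear_to_srgb(lin)                                       # ~0.74

print(naive, correct)
```

The gamma correct result (~0.74) lands much closer to the sky value than the naive result (0.50), so the dark wire fades toward the background, and at sub-pixel widths it can effectively disappear.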

While transparency AA does enhance image quality a good deal, we do need to consider the performance impact. We'll revisit our antialiasing scaling graph from our CSAA page with Transparency and Adaptive AA enabled.

With G80, we see great performance at high resolution with high levels of AA while Transparency AA is enabled. With this level of performance, as long as R600 is able to keep up, we would love to check the Transparency AA box every time we test with AA. For now, the performance degradation on R580 is just too high to justify at resolutions over 1600x1200 in most cases; increasing the resolution to a comparable performance level nets a larger image quality gain than enabling Adaptive AA.


111 Comments


  • JarredWalton - Wednesday, November 8, 2006

    Page 17:

    "The dual SLI connectors are for future applications, such as daisy chaining three G80 based GPUs, much like ATI's latest CrossFire offerings."

    Using a third GPU for physics processing is another possibility, once NVIDIA begins accelerating physics on their GPUs (something that has apparently been in the works for a year or so now).
  • Missing Ghost - Wednesday, November 8, 2006

    So it seems that by subtracting the highest 8800 GTX SLI power usage result from the single-card 8800 GTX result, we can conclude that the card can use as much as 205W. Does anybody know if this number could increase when the card is used in DX10 mode?
  • JarredWalton - Wednesday, November 8, 2006

    Without DX10 games and an OS, we can't test it yet. Sorry.
  • JarredWalton - Wednesday, November 8, 2006

    Incidentally, I would expect the added power draw in SLI comes from more than just the GPU. The CPU, RAM, and other components are likely pushed to higher demand with SLI/CF than when running a single card. Look at FEAR as an example; here are the power differences (dual card minus single card) for the various cards. (Oblivion doesn't have X1950 CF numbers, unfortunately.)

    X1950 XTX: 91.3W
    7900 GTX: 102.7W
    7950 GX2: 121.0W
    8800 GTX: 164.8W

    Notice how in this case the X1950 XTX appears to use less power than the other cards, but that's clearly not the case in single GPU configurations, where it draws more than everything besides the 8800 GTX. Here are the Prey results as well:

    X1950 XTX: 111.4W
    7900 GTX: 115.6W
    7950 GX2: 70.9W
    8800 GTX: 192.4W

    So there, GX2 looks like it is more power efficient, mostly because QSLI isn't doing any good. Anyway, simple subtraction relative to dual GPUs isn't enough to determine the actual power draw of any card. That's why we presented the power data without a lot of commentary - we need to do further research before we come to any final conclusions.
  • IntelUser2000 - Wednesday, November 8, 2006

    It looks like adding a second card for SLI uses about 170W more power. You can see how significant the video card is in terms of power consumption: it blows the Pentium D away by a couple of times.
  • JoKeRr - Wednesday, November 8, 2006

    Well, keep in mind PSU inefficiency: efficiency is generally around 80%, so as overall power draw increases, the absolute power lost in the supply increases a lot as well. If you multiply the 170W by 0.8, it gives about 136W actually delivered to the components. I assume the power draw was measured at the wall.
  • DerekWilson - Thursday, November 9, 2006

    max TDP of G80 is at most 185W -- NVIDIA revised this to something in the 170W range, but we know it won't get over 185 in any case.

    But games generally don't enable a card to draw max power ... 3dmark on the other hand ...
  • photoguy99 - Wednesday, November 8, 2006

    Isn't 1920x1440 a resolution that almost no one uses in real life?

    Wouldn't 1920x1200 apply to many more people?

    It seems almost all 23" and 24" monitors, and many high end laptops, have 1920x1200.

    Yes, we could interpolate benchmarks, but why bother when no one uses 1440 vertical?

  • Frallan - Saturday, November 11, 2006

    Well, I have one more suggestion for a resolution. Full HD is 1920x1080, which is sure to be found in a lot of homes in the future (after X-mas, anyone? ;0)) on large LCDs. I believe it would be a good idea to throw that in there as well, especially right now, since loads of people will have to decide how to spend their money. The 37" Full HD set is a given, but on what system will I be gaming: PS3, Xbox, or PC? Please advise.
  • JarredWalton - Wednesday, November 8, 2006

    This should be the last time we use that resolution. We're moving to LCD resolutions, but Derek still did a lot of testing (all the lower resolutions) on his trusty old CRT. LOL
