Our DX9FSAAViewer won't show us the exact sample patterns for CSAA, but we can take a look at where ATI and NVIDIA are getting their color sample points:

[Sample pattern images: ATI, G70, G80, and G80 with gamma-correct AA disabled]

As we can see, NVIDIA's 8x color sample AA modes use a much better pseudo-random sample pattern, rather than the combination of two rotated-grid 4xAA patterns used in G70's 8xSAA.

While it is interesting to talk about the internal differences between MSAA and CSAA, the real test is pitting NVIDIA's new highest quality mode against ATI's highest quality mode.



[Image quality comparison: G70 4X vs. G80 16XQ vs. ATI 6X]

Stacking up the best modes shows the power of NVIDIA's CSAA: with 16 sample points and 8 color/z values, G80 looks much smoother than ATI's 6xAA. Compared to G70, both ATI and G80 look much better. Now let's take a look at the performance impact of CSAA. This graph may require a little explanation, but it is quite interesting and worth a close look.

As we move from lower to higher quality AA modes, performance generally goes down. The exception is G80's 16x mode, which performs only slightly worse than 8x. This is because both modes use 4 color samples and differ only in the number of coverage samples. We can see the cost of carrying more coverage samples than color samples by looking at the performance drop from 4x to 8x on G80; there is another drop when the coverage sample count goes from 8x to 16x, but it is almost nil. With the higher number of multisamples in 8xQ, algorithms that require z/stencil data per sub-pixel may look better, but 16x definitely does a great job with the common edge case at a much lower performance cost. Enabling 16xQ shows us the performance impact of adding more coverage samples on top of 8 multisamples.
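To make the coverage/color split more concrete, here is a minimal sketch (ours, not from the article) of how an application could request such a surface through NVIDIA's OpenGL extension NV_framebuffer_multisample_coverage, which accepts separate coverage and color sample counts. The extension and function names are real; the helper function, its parameters, and the use of GLEW for entry points are illustrative assumptions.

    /* Hypothetical sketch -- not from the article. Requests a CSAA render
     * target with separate coverage and color sample counts. Assumes a
     * valid GL context and GLEW for the extension entry points. */
    #include <GL/glew.h>

    GLuint create_csaa_framebuffer(GLsizei width, GLsizei height)
    {
        GLuint fbo, color_rb, depth_rb;

        glGenFramebuffersEXT(1, &fbo);
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

        /* Color buffer: 16 coverage samples, 4 color samples (NVIDIA's 16x
         * mode). Passing 16 and 8 instead would correspond to 16xQ. */
        glGenRenderbuffersEXT(1, &color_rb);
        glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, color_rb);
        glRenderbufferStorageMultisampleCoverageNV(GL_RENDERBUFFER_EXT,
                                                   16, 4, GL_RGBA8,
                                                   width, height);
        glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                     GL_RENDERBUFFER_EXT, color_rb);

        /* Depth buffer with a matching coverage/color configuration. */
        glGenRenderbuffersEXT(1, &depth_rb);
        glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depth_rb);
        glRenderbufferStorageMultisampleCoverageNV(GL_RENDERBUFFER_EXT,
                                                   16, 4, GL_DEPTH_COMPONENT24,
                                                   width, height);
        glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                                     GL_RENDERBUFFER_EXT, depth_rb);

        return fbo;
    }

The point relevant to the performance discussion above is that only the coverage sample count grows when moving from 4x to 8x to 16x; the color/z storage, and therefore most of the bandwidth cost, stays at four samples.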

It is conceivable that a CSAA mode using 32 sample points and 8 color points could be enabled to further improve coverage data at nearly the same performance impact as 16xQ (similar to the performance difference we see between 8x and 16x). Whatever the reason this wasn't done in G80, the potential is there for future revisions of the hardware to offer a 32x mode with the performance impact of 8x. Whether the quality improvement would be there is another issue entirely.

Comments

  • dwalton - Thursday, November 9, 2006 - link

    When using older cards, sacrificing IQ for performance is typically acceptable. Who needs AA when running F.E.A.R. on a 9700 Pro?

    However, on a just-launched high-end card, why would anyone feel the need to sacrifice IQ for performance? Some may say resolution over AA, but I find it hard to believe that there are a lot of gaming enthusiasts with deep pockets who play at insane resolutions with no AA.
  • JarredWalton - Thursday, November 9, 2006 - link

    If I look for jaggies, I see them. On most games, however, they don't bother me much at all. Running at native resolution on LCDs or at a really high resolution on CRTs, I'd take that over a lower res with 4xAA. If you have the power to enable 4xAA, great, but I'm certainly not one to suggest it's required. I'd rather be able to enable vsync without a massive performance hit (i.e. stay above 60 FPS) than worry about jaggies. Personal preference.
  • munim - Wednesday, November 8, 2006 - link

    "With the latest 1.09 patch, F.E.A.R. has gained multi-core support,"

    Where is this?
  • JarredWalton - Wednesday, November 8, 2006 - link

    I wrote that, but it may be incorrect. I'm trying to get in contact with Gary to find out if I'm just being delusional about Quad Core support. Maybe it's NDA still? Hmmm.... nothing to see here!
  • JarredWalton - Wednesday, November 8, 2006 - link

    Okay, it's the 1.08 patch, and that is what was tested. Since we didn't use a quad core CPU I don't know if it will actually help or not -- something to look at in the future.
  • Nelsieus - Wednesday, November 8, 2006 - link

    I haven't even finished reading it yet, but so far, this is the most comprehensive, in-depth review I've seen on G80 and I just wanted to mention that beforehand.

    :)
  • GhandiInstinct - Wednesday, November 8, 2006 - link

    What upcoming games will be the first to be built fully on DX10? And does the G80 have full support for DX10?
  • timmiser - Thursday, November 9, 2006 - link

    Microsoft Flight Simulator X will be DX10 compliant via a planned patch once Vista comes out.
  • JarredWalton - Wednesday, November 8, 2006 - link

    All DX10 hardware will be full DX10 (see pages 2-4). As for games that will be DX10 ready, Halo 2 for Vista will be for sure. Beyond that... I don't know for sure. As we've explained a bit, DX10 will require Vista, so anything launching before Vista will likely not be DX10 compliant.
  • shabby - Wednesday, November 8, 2006 - link

    They're re-doing a dx8 game in dx10? You gotta be kidding me, what's the point? You can't polish a turd.
