Adaptive AA

Antialiasing is becoming more and more important as graphics cards get faster. With the 7800 GTX (and the X1800 XT when it arrives), there are very few monitors that can really stress the fill rate potential of these cards. Not everyone has a 30" Apple Cinema Display, and even at 1600x1200 we see CPU limitations start to become a factor. With so many 1600x1200 flat panels on the market, this display bottleneck will only get worse as graphics cards continue to improve at a significantly faster rate than display technology. Quality-enhancing features will get more and more attention in the meantime, as the leftover power can be spent on enhancing a game rather than just rendering it. Thus, more focus has recently been put on antialiasing algorithms.

Multisample AA (MSAA) has been the predominant method of antialiasing for quite a few years now, but it is not perfect. MSAA works only on polygon edges, smoothing the jagged lines that appear when the area covered by a single pixel falls across multiple triangles. Supersample AA (SSAA) oversamples everything at every pixel and is traditionally implemented by rendering the scene at a larger resolution and then downsampling the image to fit the display. SSAA wastes a lot of power in areas covered by a single color, so MSAA wins out in performance while sacrificing a bit of quality.
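
As a rough illustration (not how any GPU actually implements the resolve in hardware), the traditional SSAA approach amounts to box-filtering a higher-resolution render down to display resolution; the values below are made up for the example:

```python
# Illustrative sketch of an SSAA resolve: average each 2x2 block of a
# double-resolution render into one display pixel (a simple box filter).
def downsample_2x(hires):
    """Reduce a 2Nx2N grayscale image to NxN by averaging 2x2 blocks."""
    n = len(hires) // 2
    out = [[0.0] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            block = (hires[2*y][2*x] + hires[2*y][2*x+1] +
                     hires[2*y+1][2*x] + hires[2*y+1][2*x+1])
            out[y][x] = block / 4.0
    return out

# A hard edge at the higher resolution becomes a fractional value:
hires = [[1.0, 1.0, 0.0, 0.0],
         [1.0, 1.0, 0.0, 0.0],
         [1.0, 0.0, 0.0, 0.0],
         [1.0, 1.0, 0.0, 0.0]]
print(downsample_2x(hires))  # → [[1.0, 0.0], [0.75, 0.0]]
```

The wasted work is visible here too: three of the four blocks were uniform, so their sixteen samples produced nothing a single sample wouldn't have.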

One of the major downsides of MSAA is its inability to antialias the interior of polygons mapped with a texture that includes transparency. Things like wires, fences, and foliage are often rendered with a few large triangles and transparent texture maps. Since MSAA works only on polygon edges, the boundaries between transparent and opaque areas inside these large polygons can look very jagged. The only way to combat this is to take multiple texture samples per pixel within the same polygon. This can be done by performing multiple texture lookups per pixel rather than simply rendering the scene at a huge resolution.
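
A toy one-dimensional sketch makes the problem concrete (the alpha function and sample spacing here are invented for illustration; they are not ATI's or NVIDIA's actual sampling scheme):

```python
# With one texture lookup per pixel, an alpha-tested edge is all-or-nothing;
# averaging several lookups across the pixel yields fractional coverage.
def alpha_at(u):
    # Hypothetical 1-D alpha channel: opaque below u = 0.35, transparent after.
    return 1.0 if u < 0.35 else 0.0

def coverage(u_center, samples):
    """Average the alpha test over `samples` sub-pixel texture lookups."""
    step = 1.0 / samples
    offsets = [(i + 0.5) * step - 0.5 for i in range(samples)]
    # The 0.1 factor is an assumed texel footprint for the pixel.
    return sum(alpha_at(u_center + o * 0.1) for o in offsets) / samples

print(coverage(0.35, 1))  # 0.0 -- the pixel snaps fully transparent (jaggy)
print(coverage(0.35, 4))  # 0.5 -- partial coverage, a smoothed edge
```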

ATI is including a feature called Adaptive Antialiasing in the Catalyst release that accompanies the X1000 series. Adaptive AA is functionally similar to NVIDIA's Transparency AA. Rather than performing multisampling alone, ATI is able to adaptively take multiple texture samples per pixel in areas that would benefit from them, essentially resulting in a combination of MSAA and SSAA where needed. Depending on where ATI determines extra texture samples are necessary, poorly designed or easily aliased textures can benefit in addition to those that include transparency.

Unlike NVIDIA's Transparency AA, ATI's Adaptive AA will be available to all ATI hardware owners. How's that for value-add! This could be a very nice thing for X800/X850 series owners stuck with 1280x1024 panels. Apparently, ATI has been tweaking this technology for a few years now but held off on introducing it until this launch. Use of the feature on most older hardware won't be widespread, as the performance hit will be too large; in those cases, increasing resolution is almost always more effective than increasing AA quality. Here's a look at Adaptive AA as compared to Transparency AA:

[Image comparison: NVIDIA 7800 GTX 4xAA]

ATI has also improved their hardware to the point where they can support MSAA on multiple render targets (MRTs) and fp16 render targets. No other hardware out now can perform MSAA in games that use these techniques. ATI is touting AA on fp16 targets as the ability to perform AA in HDR-enabled games. While it is true that keeping textures and render targets in fp16 from front to back is a very high quality way of doing HDR, it is also very memory bandwidth intensive and requires a lot of GPU horsepower (especially since no texture compression techniques work on fp16 textures). Support for MSAA on floating point and MRT output is certainly a welcome addition to the feature set, but we don't currently have a good way to test its performance or quality, as no suitable applications are available yet.

Continuing down the path to high quality AA, ATI has improved the sub-pixel accuracy of their antialiasing hardware. Rather than being limited to selecting samples from an 8x8 grid, ATI is now able to select samples from a 16x16 grid. Moving up from 64 to 256 potential sub-pixel positions per pixel improves the accuracy of the AA algorithm. The improvement may not be directly noticeable, but it will also raise the quality of dense AA methods like CrossFire's SuperAA technology. Workstation users will benefit as well, as this should translate to improved point and line antialiasing quality.
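
The effect of the finer grid can be seen with a trivial quantization sketch (the target coordinate here is invented for illustration; these are not ATI's actual sample positions):

```python
# A sample position within a pixel can only sit on a grid x grid lattice;
# a finer lattice lets the hardware place it closer to the ideal spot.
def snap(pos, grid):
    """Quantize a sub-pixel coordinate in [0,1) to the nearest grid point."""
    return round(pos * grid) / grid

ideal = 0.33  # hypothetical ideal sample coordinate
print(snap(ideal, 8))    # 0.375  -- nearest point on an 8x8 grid
print(snap(ideal, 16))   # 0.3125 -- nearest point on a 16x16 grid, closer
```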

The one thing I would ask of ATI is the ability to turn off "gamma-correct" AA. Such methods only shift inaccuracies between overly dark and overly bright pixels; consistent results would only be possible if all displays responded identically. Since they don't, it's really a six of one, half a dozen of the other choice. Putting the decision of what looks better in the user's hands is always our favored suggestion.
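
What the setting actually changes can be sketched with a quick calculation. The power-law gamma of 2.2 below is an assumption for illustration; real display response curves vary, which is exactly why neither blend is right for everyone:

```python
# Two ways to resolve a 50/50 AA blend of gamma-encoded color values.
GAMMA = 2.2  # assumed display response; not any vendor's exact curve

def blend_naive(a, b):
    """Average gamma-encoded values as if they were linear."""
    return (a + b) / 2.0

def blend_gamma_correct(a, b):
    """Decode to linear light, average, then re-encode."""
    lin = (a ** GAMMA + b ** GAMMA) / 2.0
    return lin ** (1.0 / GAMMA)

# A 50/50 edge pixel between black (0.0) and white (1.0):
print(blend_naive(0.0, 1.0))          # 0.5
print(blend_gamma_correct(0.0, 1.0))  # ≈ 0.73, a visually brighter edge
```

Which result looks like true half coverage depends entirely on the display the image ends up on, hence the argument for a user toggle.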

As if all of these enhancements weren't enough to top off ATI's already industry-leading antialiasing (NVIDIA's grid-aligned sample patterns just can't touch ATI's fully programmable sample patterns in quality), ATI has also vastly improved antialiasing performance with the X1000 generation of hardware. Neither NVIDIA nor previous-generation ATI hardware can match the minimal performance hit the X1000 series incurs when enabling standard AA. The performance we see is likely due to a combination of improvements to the AA hardware itself alongside enhancements to the memory architecture that allow for higher bandwidth and prioritization of data moving on the ring bus.
103 Comments

  • DerekWilson - Friday, October 7, 2005 - link

    Hello,

    Rather than update this article with the tables as we had planned, we decided to go all out and collect enough data to build something really interesting.

    http://anandtech.com/video/showdoc.aspx?i=2556

    Our extended performance analysis should be enough to better show the strengths and weaknesses of the X1x00 hardware in all the games we tested in this article plus Battlefield 2.

    I would like to apologize for not getting more data together in time for this article, but I hope the extended performance tests will help make up for what was lacking here.

    And we've got more to come as well -- we will be doing an in-depth follow up on new feature performance and quality as well.

    Thanks,
    Derek Wilson
  • MiLLeRBoY - Thursday, October 6, 2005 - link

    If NVIDIA puts out a 7800XT with a bigger cooler, which makes the video card dual slot instead of just one slot, they could increase the speeds of the RAM and GPU. And if they increase it to 512MB of RAM, they will knock ATI’s X1800XT off the map completely.
  • MiLLeRBoY - Thursday, October 6, 2005 - link

    oops, 7800 GTX, I mean, lol.
  • stephenbrooks - Thursday, October 6, 2005 - link

    Maybe a solution for all the complaints about review quality would be for AnandTech to put its reviews through "beta"? :p
  • waldo - Thursday, October 6, 2005 - link

    So, I am back, and as always confused!

    Where are we now? We have at THG the same card beating the 7800GTX hands down in several instances....and here at Anand, we have the ATI version barely holding its head above water.....talk about weird inconsistencies....someone is tweaking the numbers or the machines....one or the other.

    Part of me would like to give the nod to THG because they have a history of doing more accurate, more complete video card reviews, but this is just crazy....can someone at Anand please explain, cause well, I know THG won't.
  • tomoyo - Thursday, October 6, 2005 - link

    In terms of pricing, I think Nvidia has Ati beaten in every category of card currently.

    I think the competition that ATI is marketing each card against is as follows (even if the prices have a huge disparity currently):
    X1800XT vs 7800GTX
    X1800XL vs 7800GT
    X1600XT vs 6800/6600GT
    X1600Pro vs 6600GT/6600
    x1300Pro vs 6600
    x1300 vs 6200

    From what I've seen of the reviews from anandtech, techreport, and a couple other sources, it looks like the X1800XT/XL are pretty competitive with their competition; however, I really dislike the extra power consumption and of course the cost of the card. I think the 7800 is a far better solution in most categories except a few minor features like having HDR/AA at the same time. It looks like it's possible the X1800 might have some gains in future games because of the better memory controller and threaded pixel shader, but it seems rather useless for now.

    The x1600 looks like the biggest disappointment by far. It's nowhere near the league of the 6800 cards and barely outperforms the 6600gt, which has a huge price advantage. The x800gto2 looks like a far better card than the x1600 here. Personally I'm hoping nvidia does what's expected and puts out a 90nm 7600 that has a decent performance gain over the 6600gt. That might be one of the best silent computing cards around when it comes out. (I'm hoping to replace my 6600 with this now that the x1600 is no upgrade for me)

    The x1300 actually looks like the most promising chip to me. It's obviously not worthwhile for gamers, but I think it might turn out to be a pretty good drop-in card for non-gaming systems. It's all dependent on whether it can hit the price point for the under $100(or is that under $70) market well. It certainly looks like it'll outperform the 6200 and x300 and be the new standard for entry level systems... until nvidia's next entry card. Not to mention most of the x1x00 generation features are still included with the x1300 card.
  • AtaStrumf - Thursday, October 6, 2005 - link

    Totally disappointed in both ATi and AT.

    As for the X1300, don't forget this is the best version of the X1300 family, and I can't help but remember the FX 5200 Ultra, which looked great but was never really available because they could not produce it at a low enough price point. I think the same will happen here.
  • bob661 - Thursday, October 6, 2005 - link

    Very nice summary.
  • andyc - Wednesday, October 5, 2005 - link

    So what card is the "real" competitor to the 7800GT? Because frankly, I'm totally confused about which card ATI is trying to use to compete against it.
  • Pete - Wednesday, October 5, 2005 - link

    X1800XL.
