The Return of Supersample AA

Over the years, the methods used to implement anti-aliasing on video cards have bounced back and forth. The earliest generation of cards, such as the 3dfx Voodoo 4/5 and ATI’s and NVIDIA’s DirectX 7 parts, implemented supersampling, which involved rendering a scene at a higher resolution and scaling it down for display. Supersampling did a great job of removing aliasing while also slightly improving overall image quality, because everything in the scene was sampled at a higher resolution.
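The downscaling step can be sketched in a few lines. This is an illustrative toy, not any vendor’s implementation: render at 2x the width and height, then average each 2x2 block down to one display pixel.

```python
# Toy 2x2 supersample resolve: average each 2x2 block of a high-resolution
# render into one output pixel. Real hardware filters similarly, if with
# more sophisticated sample placement and filtering.

def downsample_2x2(hi_res):
    """Average each 2x2 block of the high-res image into one output pixel."""
    h, w = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# A hard black/white edge at high resolution becomes an intermediate shade:
hi = [[1.0, 1.0, 0.0, 0.0],
      [1.0, 1.0, 0.0, 0.0],
      [1.0, 0.0, 0.0, 0.0],
      [1.0, 0.0, 0.0, 0.0]]
print(downsample_2x2(hi))  # [[1.0, 0.0], [0.5, 0.0]]
```

Because the averaging runs over every pixel, not just polygon edges, textures and shader output get smoothed along with geometry.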

But supersampling was expensive, particularly on those early cards. So the next generation implemented multisampling, which, instead of rendering the scene at a higher resolution, rendered it at the desired resolution and then took extra samples along polygon edges to find and remove aliasing. The overall quality wasn’t quite as good as supersampling, but it was much faster, and that performance gap only grew as MSAA implementations became more refined.
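The cost difference comes down to shader invocations. The toy model below is a sketch under stated assumptions: the `inside()` coverage test and `shade()` function are hypothetical stand-ins, and the sample offsets are illustrative, not any card’s actual pattern. Both paths take four coverage samples per pixel, but MSAA runs the expensive pixel shader once, at the pixel center, while SSAA shades every sample.

```python
# Toy model of MSAA vs. SSAA for a single pixel. inside(x, y) is a
# hypothetical coverage test, shade(x, y) a hypothetical pixel shader
# returning a scalar color; sample offsets are illustrative only.

SAMPLES = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def pixel_msaa(inside, shade, px, py):
    covered = [inside(px + sx, py + sy) for sx, sy in SAMPLES]
    color = shade(px + 0.5, py + 0.5)          # ONE shader invocation
    return sum(color if c else 0.0 for c in covered) / len(SAMPLES)

def pixel_ssaa(inside, shade, px, py):
    # one shader invocation PER SAMPLE: anti-aliases shader output too
    vals = [shade(px + sx, py + sy) if inside(px + sx, py + sy) else 0.0
            for sx, sy in SAMPLES]
    return sum(vals) / len(SAMPLES)

# A high-frequency shader pattern fully inside a polygon: MSAA shades only
# the pixel center and misses it entirely; SSAA averages it correctly.
inside = lambda x, y: True
shade = lambda x, y: 1.0 if x < 0.5 else 0.0
print(pixel_msaa(inside, shade, 0, 0))  # 0.0 (shader aliasing survives)
print(pixel_ssaa(inside, shade, 0, 0))  # 0.5 (shader output anti-aliased)
```

On a polygon edge both modes produce the same smoothed coverage, which is why MSAA was considered a good-enough trade at a quarter of the shading cost.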

Lately we have seen a slow bounce back in the other direction, as MSAA’s imperfections became more noticeable and in need of correction. Supersampling saw a limited reintroduction, with AMD and NVIDIA using it on certain parts of a frame as part of their Adaptive Anti-Aliasing (AAA) and Supersample Transparency Anti-Aliasing (SSTr) schemes respectively. In these schemes SSAA is used to smooth out semi-transparent textures, where the texture itself is the aliasing artifact and MSAA cannot help since the artifact is not a polygon edge. This still didn’t completely resolve MSAA’s shortcomings compared to SSAA, but it solved the transparent texture problem. With these technologies the difference between MSAA and SSAA was reduced to MSAA being unable to anti-alias shader output, and MSAA not having the advantage of sampling textures at a higher resolution.

With the 5800 series, things have finally come full circle for AMD. Based upon their SSAA implementation for Adaptive Anti-Aliasing, they have re-implemented SSAA as a full screen anti-aliasing mode. Now gamers can once again access the higher quality anti-aliasing offered by a pure SSAA mode, instead of being limited to the best of what MSAA + AAA could do.

Ultimately the inclusion of this feature on the 5870 comes down to two matters: the card has lots and lots of processing power to throw around, and shader aliasing was the last obstacle that MSAA + AAA could not solve. With the reintroduction of SSAA, AMD is not dropping or downplaying their existing MSAA modes; rather it’s offered as another option, particularly one geared towards use on older games.

“Older games” is an important phrase here, as there is a catch to AMD’s SSAA implementation: it only works under OpenGL and DirectX 9. As we found out in our testing and after much head-scratching, it does not work on DX10 or DX11 games. Attempting to use it there will result in the game falling back to MSAA.

When we asked AMD about this, they cited the fact that DX10 and later give developers much greater control over anti-aliasing patterns, and that forcing SSAA alongside those controls may create incompatibility problems. Furthermore, the games that can best afford SSAA’s performance cost are older titles, making SSAA a more reasonable choice for older games than for newer ones. We’re told that AMD will “continue to investigate” implementing a proper version of SSAA for DX10+, but it’s not something we’re expecting any time soon.

Unfortunately, in our testing of AMD’s SSAA mode, there are clearly a few kinks to work out. Our first AA image quality test was going to be the railroad bridge at the beginning of Half Life 2: Episode 2, a scene full of aliased metal bars, cars, and trees. However, as the screenshots below show, while AMD’s SSAA mode eliminated the aliasing, it also gave the entire image a smooth makeover: too smooth. SSAA isn’t supposed to blur anything; it should only smooth the image by removing aliasing in geometry, shaders, and textures alike.


8x MSAA   8x SSAA

As it turns out, this is a freshly discovered bug in their SSAA implementation that affects newer Source-engine games. Presumably we’d see something similar in the rest of The Orange Box, and possibly other HL2 games. This is an unfortunate engine to have a bug in, since Source-engine games tend to be heavily CPU-limited anyhow, making them perfect candidates for SSAA. AMD is hoping to have a fix out for this bug soon.

“But wait!” you say. “Doesn’t NVIDIA have SSAA modes too? How would those do?” And indeed you would be right. While NVIDIA dropped official support for SSAA a number of years ago, it has remained as an unofficial feature that can be enabled in Direct3D games, using tools such as nHancer to set the AA mode.

Unfortunately NVIDIA’s SSAA mode isn’t even in the running here, and we’ll show you why.


5870 SSAA


GTX 280 MSAA


GTX 280 SSAA

At the top we have the view from the DX9 FSAA Viewer of ATI’s 4x SSAA mode. Notice that it’s a rotated grid with 4 geometry samples (red) and 4 texture samples. Below that we have NVIDIA’s 4x MSAA mode, a rotated grid with 4 geometry samples and a single texture sample. Finally we have NVIDIA’s 4x SSAA mode, an ordered grid with 4 geometry samples and 4 texture samples. For reasons that we won’t delve into here, rotated grids deliver better quality than ordered grids. This is why early AA implementations using ordered grids were dropped in favor of rotated grids, and why no one uses ordered grids for MSAA these days.
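The intuition can be shown numerically. The offsets below are illustrative sample positions, not the exact patterns either vendor uses: an ordered 4x grid reuses the same two x and two y coordinates, while a rotated grid spreads its four samples across four distinct coordinates on each axis, so a nearly horizontal or nearly vertical edge sweeping through a pixel passes more coverage thresholds.

```python
# Illustrative 4x sample patterns (sub-pixel offsets in [0, 1)).
# Ordered grid: a plain 2x2 lattice. Rotated grid: the same lattice
# rotated so every sample lands on a unique row and column.

ordered_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
rotated_4x = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def distinct_coords(samples):
    """Count distinct x and y coordinates covered by a sample pattern."""
    xs = {x for x, _ in samples}
    ys = {y for _, y in samples}
    return len(xs), len(ys)

print(distinct_coords(ordered_4x))  # (2, 2)
print(distinct_coords(rotated_4x))  # (4, 4)
```

Against a near-horizontal edge, the ordered grid can only report coverage of 0, 2, or 4 samples as the edge moves down the pixel, while the rotated grid steps through every value from 0 to 4, producing smoother gradients on the edge types the eye notices most.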

Furthermore, when actually using NVIDIA's SSAA mode, we ran into some definite quality issues with HL2: Ep2. We're not sure if these are related to the use of an ordered grid or not, but it's a possibility we can't ignore.


4x MSAA   4x SSAA

If you compare the two shots, with MSAA 4x the scene is almost perfectly anti-aliased, except for some trouble along the bottom/side edge of the railcar. If we switch to SSAA 4x that aliasing is solved, but we have a new problem: all of a sudden a number of fine tree branches have gone missing. While MSAA properly anti-aliased them, SSAA anti-aliased them right out of existence.

For this reason we will not be taking a look at NVIDIA’s SSAA modes. Besides the fact that they’re unofficial in the first place, the use of an ordered grid and the problems in HL2 cement the fact that they’re not suitable for general use.


  • mapesdhs - Saturday, September 26, 2009 - link


    > That is quite all right, you fellas make sure to read it all, ...

    But that's the thing S.D., I pretty much don't read any of it. :D (does
    anyone?) First sentence only, then move on.

    Ian.

  • SiliconDoc - Monday, September 28, 2009 - link

    Oh, ha ha, another lowlife smart aleck.

    One has to wonder if you do as you say, and only read the first sentence, and move on, why you would care what I've typed, since you cannot imagine anyone does anything different. Heck you shouldn't even notice this, right, liar?

    Yes, another liar, not amazing, not at all.

    No need to modify or delete the sentence prior to this JaredWalton, smarty pants insulter won't read it, but I'm sure you can't resist, for "convenience's" sake of course.

    Oh, I don't have to bring anything up on topic at all, because neither did lowlife skum not reading, he just got his nose awfully browner.
  • JarredWalton - Friday, September 25, 2009 - link

    Very happy to have everyone here convinced you don't know what you're talking about? That's the only "truth" you've brought to this party. Marketing generally wants reputable people to promote a product - the "every man" approach. Funny that we don't see crazy people espousing products on TV (well, excepting stuff like Sham Wow!)

    Being crazy like you are in this thread only cements your status as someone who doesn't have a firm grip on reality - someone that can't be trusted. Thanks again for clearing that up so thoroughly.

    I am very happy about it as well! :-D
  • erple2 - Tuesday, September 29, 2009 - link

    Yeah, but that "Sham Wow" product works like a freakin' charm...

    http://www.popularmechanics.com/blogs/home_journal...
  • SiliconDoc - Wednesday, September 30, 2009 - link

    I'm sure you spend your time drooling in front of a TV after you spank your joystick for fps, so know all about wacky commercials you have memorized, and besides, it's a pathetic, all-you-have-left insult, off topic, who cares, pure hatred, no real response, and the 5870 double epic fail IS THE HOTTEST ATI CARD OF ALL TIME!
  • erple2 - Wednesday, September 30, 2009 - link

    What's with the personal attacks? Does that mean that you concede defeat?

    Meh, you've no more credibility. Chill out.
  • SiliconDoc - Friday, September 25, 2009 - link

    Yeah, now down to insults, since you lost everything else.

    Let's have your claimed specialty outlined here in context, let's have you come clean on LAPTOP GRAPHICS, and spread the truth about how NVIDIA is so far ahead and has been for quite some time, that it's a JOKE to buy a gaming laptop with ATI graphics on board.
    Come on mmister!
    --
    Now that is REALLY FUNNY ! You grabbed your arrogant unscrupulous self and proclaimed your fairness, but picked a spot where ati is completely EPIC FAIL, and NVIDIA is 1000% the only way to go, PERIOD, and left that MAJOR slap in the face high and dry.
    --
    Great job, yeah, you're the "sane one".
    LOL
  • dieselcat18 - Saturday, October 3, 2009 - link

    Nvidia fan-boy, troll, loser....take your GeForce cards and go home...we can now all see how terrible ATi is thanks to you ...so I really don't understand why people are beating down their doors for the 5800 series, just like people did for the 4800 and 3800 cards. I guess Nvidia fan-boy trolls like you have only one thing left to do and that's complain and cry like the itty-bitty babies that some of you are about the competition that's beating you like a drum.....so you just wait for your 300 series cards to be released (can't wait to see how many of those are available) so you can pay the overpriced premiums that Nvidia will be charging AGAIN !...hahaha...just like all that re-badging BS they pulled with the 9800 and 200 cards...what a joke !.. Oh my, I must say you have me in a mood and the ironic thing is I do like Nvidia as much as ATi, I currently own and use both. I just can't stand fools like you who spout nothing but mindless crap while waving your team flag (my card is better than yours..WhaaWhaaWhaa)...just take yourself along with your worthless opinions and slide back under that slimy rock you came from.
  • JarredWalton - Friday, September 25, 2009 - link

    You've been insulting in this whole thread, so don't go crying to mamma about someone pointing that out. I did go and delete the posts from the person calling you gay and suggesting you should die in various ways, because as bad as you've been you haven't stooped quite that low (yet).

    Laptop issues with ATI... you mean like this: http://www.anandtech.com/mobile/showdoc.aspx?i=356... Granted, I gave them a chance to address the issues. They failed and my full article on the various Clevo high-end notebooks will make it quite clear how far ahead NVIDIA is in the mobile sector right now.

    "Fair" is treating both sides objectively. ATI has major problems with getting updated graphics drivers out on mobile products, and that's horrible. On the desktop, they don't have such issues for the most part. Yeah, you might have to wait a month or so for a driver update to fix the latest hot release and add CrossFire support... but you have to do the exact same thing for NVIDIA with about the same frequency. Only SLI and CF setups really need the regular driver updates, and in many cases the latest 18x and 19x NVIDIA drivers are slower than 16x and 17x on games that are older than six months.

    Fair is also looking at these results and saying, "gee, I can get a 5870 for $400 (or $360 if you wait a few weeks for supply to bolster up), and that same card has no CrossFire or SLI wonkiness and costs less than the GTX 295 and 4870X2. Okay, 4870X2 and GTX 295 beat it in raw performance in some cases, but I don't think there's a single game where you can say one HD 5870 offers less than acceptable performance at 2560x1600, and I can guarantee there are titles that still have issues with SLI and CrossFire. (Yeah, you need to turn down some details in Crysis to get acceptable performance, but that's true of anything other than the top SLI and CF configs.) I would be more than happy to give up a bit of performance to avoid dealing with the whole multi-GPU ordeal. Why don't you tell us how innovative and awesome tri-SLI and quad-SLI are while you're at it?

    At present, you have contributed more than 20% of the comments to this article, and not a single one has been anything but trolling. Screaming and yelling, insulting others, lying and making stuff up, all in support of a company that is just like any other big company. We don't ban accounts often, but you've more than warranted such action.
  • SiliconDoc - Friday, September 25, 2009 - link

    I think it is more than absolutely clear, that in fact, I said my piece, my first post, and was absolutely attacked. I didn't attack, I got attacked, and in fact you have done plenty of attacking as well.
    I have also provided links, to back up my assertions and counter arguments, added the text for easy viewing, and pointed out in very specific detail what issues with bias I had and why.
    --
    Now you've claimed "all I've done is post FUD".
    It is nothing short of amazing for you to even posit that, however I can certainly understand anyone pointing out the obvious bias problems (in the article no less) is "on thin ice", and after getting attacked, is solely blamed for "no facts".
    ---
    I certainly won't disagree that the 5870 is a good value, especially if you don't like to deal with 2 cards or 2 cores.
    But my posts never claimed otherwise. I first claimed it was not as good as wanted, was disappointing, and therefore was not the end of what ati had in store.
    Since I have posted on the 5890, which will in fact be 512 bit.
    Now, you don't like losing your points, or someone adept enough, smart enough, and accurate enough to counter them.
    Sorry about that, and sorry that I won't just lay down, as more heaps are shovelled my way.
    You skip my actual points, and go some other tangent.
    1. PhysX is an advantage and best implementation so far.
    your response: "It sucks because only 2 games are available"
    ---
    Is that correct for you to do ? Is it not the very best so far ? Yes, it is in fact.
    I have remained factual and reasonable, and glad enough to throw back when I'm attacked.
    But the fact remains, I have made absolutely solid 100% points no matter how many times you claim "lying and making stuff up."
    ---
    Yet of course, what I just said about NVIDIA and laptop chips, you agreed with. So according to your own characterization (quite unfair), all you do is scream and lie, too.
    Just wonderful.
    The GT300 is going to blow this 5870 away - the stats themselves show it, and betting otherwise is a bad joke, and if sense is still about, you know it as well.
