Another New Anti-Aliasing Mode: Enhanced Quality AA

With the 6800 series, AMD introduced Morphological Anti-Aliasing (MLAA), a low-complexity post-processing anti-aliasing filter. As a post-processing filter it worked with a wide variety of games and APIs, and in most cases the performance overhead was not very severe. However, it’s not the only new anti-aliasing mode AMD has been working on.

New with the 6900 series is a mode AMD is calling Enhanced Quality Anti-Aliasing. If you recall NVIDIA’s Coverage Sample Anti-Aliasing (CSAA) introduced with the GeForce 8800GTX, then all of this should sound quite familiar – in fact it’s basically the same thing.

Under traditional MSAA, for a pixel covered by 2 or more triangles/fragments, 2, 4, or 8 subpixel samples are taken to determine what the final pixel should be. In the process both the color and the Z/depth of each triangle are sampled and stored, and at the end of the process the results are blended together to determine the final pixel value. This works well for resolving aliasing along polygon edges at a fraction of the cost of true super sampling, but it’s still expensive: collecting and storing the Z and color values requires extra memory to hold them and extra memory bandwidth to work with them. Ultimately, while we need enough samples to determine the colors of the triangles involved, we do not always need very many of them. With a few color/Z samples we have all of the color data we need in most cases; the “hard” part of anti-aliasing then becomes determining the proper blend of those color values.


1 Pixel Covered by 2 Triangles/Fragments
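
To put that cost in perspective, here is a quick back-of-the-envelope sketch. The resolution and buffer formats are our own assumptions for illustration, not figures from AMD; the point is simply how quickly full color/Z sample storage adds up compared to coverage-only data.

#include <cstdio>
#include <initializer_list>

// Rough illustration (assumed formats, not AMD's numbers) of why full color/Z
// samples are expensive: every extra MSAA sample adds another color value and
// another depth value that must be stored and moved across the memory bus.
int main()
{
    const long long width = 1920, height = 1200;    // example resolution
    const long long bytesPerColor = 4;               // e.g. RGBA8
    const long long bytesPerDepth = 4;               // e.g. D24S8

    for (int samples : {1, 2, 4, 8}) {
        long long bytes = width * height * samples * (bytesPerColor + bytesPerDepth);
        std::printf("%dx: %lld MB of color+Z sample storage\n",
                    samples, bytes / (1024 * 1024));
    }

    // A coverage-only sample needs roughly a bit of coverage information per
    // fragment rather than 8 bytes of color+Z, which is why the extra samples
    // EQAA takes are nearly free by comparison.
    return 0;
}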

Thus we have EQAA, a compromise on the idea. Color/Z samples are expensive, but just checking if a triangle covers part of a subpixel is very cheap. If we have enough color/Z samples to get the necessary color information, then just doing additional simple subpixel coverage checks would allow us better determine what percentage of a pixel is covered by a given polygon, which we can then use to blend colors in a more accurate fashion. For example with 4x MSAA we can only determine if a pixel is 0/25/50/75/100 percent covered by a triangle, but with 4x EQAA where we take 4 color samples and then 4 additional coverage-only samples, we can determine blending values down to 0/12/25/37/50/62/75/87/100 percent coverage, the same amount of accuracy as using 8x MSAA. Thus in the right situation we can have quality similar to 8x MSAA for only a little over 4x MSAA’s cost.
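
As a concrete illustration of the blending math described above, here is a minimal sketch. It is our own hypothetical resolve code, not AMD's or NVIDIA's actual hardware logic: a pixel straddled by two triangles stores one color per color sample, while coverage is known at every sample position, including the coverage-only ones.

#include <cstdio>

// Hypothetical coverage-weighted resolve for a pixel straddled by two
// triangles. Only a subset of the sample positions carry full color/Z data;
// all of them contribute a coverage bit.
struct Color { float r, g, b; };

Color ResolvePixel(Color fragA, Color fragB, int samplesCoveredByA, int totalSamples)
{
    // Coverage granularity is 1/totalSamples: 4 color + 4 coverage-only
    // samples gives the same 1/8-step blend weights as 8x MSAA.
    float weightA = float(samplesCoveredByA) / float(totalSamples);
    float weightB = 1.0f - weightA;
    return { fragA.r * weightA + fragB.r * weightB,
             fragA.g * weightA + fragB.g * weightB,
             fragA.b * weightA + fragB.b * weightB };
}

int main()
{
    Color red  = {1.0f, 0.0f, 0.0f};
    Color blue = {0.0f, 0.0f, 1.0f};

    // 4x MSAA: blend weights only come in 25% steps.
    Color msaa = ResolvePixel(red, blue, 1, 4);
    // 4x EQAA (4 color + 4 coverage samples): 12.5% steps.
    Color eqaa = ResolvePixel(red, blue, 3, 8);

    std::printf("4x MSAA blend: %.3f %.3f %.3f\n", msaa.r, msaa.g, msaa.b);
    std::printf("4x EQAA blend: %.3f %.3f %.3f\n", eqaa.r, eqaa.g, eqaa.b);
    return 0;
}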


MSAA & EQAA Sample Patterns

In reality, of course, this doesn’t always work out so well. The best case is that the additional coverage samples are almost as good as having additional color/Z samples, while the worst case is that they are practically worthless; which of these applies varies on a game-by-game, if not pixel-by-pixel, basis. In practice, additional coverage samples are a way to slightly improve MSAA quality at a very, very low cost.

While NVIDIA has been able to take separate coverage samples since G80, AMD has not had this capability until now; with the 6900 series hardware, AMD’s ROPs finally gain it.

Beyond the naming conventions, AMD’s and NVIDIA’s implementations are nearly identical. Both can take a number of coverage samples independent of the color/Z samples, based on the setting used; the only other notable difference we’re aware of is that, like AMD’s other AA modes, EQAA can be programmed to use a custom sample pattern.

As is the case with NVIDIA’s CSAA, AMD’s EQAA mode is available to DirectX applications or can be forced through the drivers. DirectX applications can set it through the Multisample Quality attribute, which is usually abstracted so that a game’s UI lists the vendor’s name for the mode. Otherwise it can be forced via the Catalyst Control Center, either by forcing an AA mode outright or, as with NVIDIA, by enhancing the application’s AA settings: the game sets an AA mode while the driver overrides it with a different Multisample Quality attribute. This “enhance application settings” AA mode is new to AMD with the 6900 series.
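
For reference, here is a minimal Direct3D 11 sketch of the Multisample Quality attribute in question. Exactly which nonzero Quality value maps to a given EQAA or CSAA mode is a vendor/driver detail, normally hidden behind a game's UI or the vendor's control panel, so treat this only as an illustration of where the attribute lives, not of any specific vendor's mode numbering.

#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL fl;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, &fl, nullptr)))
        return 1;

    // Ask how many quality levels the driver exposes for a 4-sample surface.
    UINT qualityLevels = 0;
    device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM, 4,
                                          &qualityLevels);
    std::printf("4-sample quality levels reported: %u\n", qualityLevels);

    // The render target description carries both the sample count and the
    // chosen quality level. Quality = 0 is plain 4x MSAA; higher values, when
    // reported, are driver-defined and are how vendor-specific modes can be
    // exposed to applications.
    DXGI_SAMPLE_DESC sampleDesc = {};
    sampleDesc.Count   = 4;
    sampleDesc.Quality = (qualityLevels > 1) ? qualityLevels - 1 : 0;

    D3D11_TEXTURE2D_DESC rtDesc = {};
    rtDesc.Width      = 1920;
    rtDesc.Height     = 1080;
    rtDesc.MipLevels  = 1;
    rtDesc.ArraySize  = 1;
    rtDesc.Format     = DXGI_FORMAT_R8G8B8A8_UNORM;
    rtDesc.SampleDesc = sampleDesc;
    rtDesc.Usage      = D3D11_USAGE_DEFAULT;
    rtDesc.BindFlags  = D3D11_BIND_RENDER_TARGET;

    ID3D11Texture2D* rt = nullptr;
    HRESULT hr = device->CreateTexture2D(&rtDesc, nullptr, &rt);
    std::printf("CreateTexture2D with Quality=%u: %s\n",
                sampleDesc.Quality, SUCCEEDED(hr) ? "ok" : "failed");

    if (rt) rt->Release();
    device->Release();
    return 0;
}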

To be honest, we’re a bit ruffled by the naming choice. True, NVIDIA did go and pick daft names for their CSAA modes (when is 8x not 8-sample MSAA?), but ultimately CSAA and EQAA are virtually identical. NVIDIA has a 4-year lead on AMD here, and we’d just as soon use NVIDIA’s naming conventions for consistency. Instead we have the following.

Coverage Sampling Modes: CSAA vs. EQAA

NVIDIA Mode    Color + Coverage Samples    AMD Mode
2x             2 + 0                       2x
N/A            2 + 2                       2xEQ
4x             4 + 0                       4x
8x             4 + 4                       4xEQ
16x            4 + 12                      N/A
8xQ            8 + 0                       8x
16xQ           8 + 8                       8xEQ
32x            8 + 24                      N/A

AMD ends up with one mode NVIDIA doesn’t have, 2xEQ (2x MSAA + 2 coverage samples); meanwhile NVIDIA has two modes AMD doesn’t: 16x (4x MSAA + 12 coverage samples) and 32x (8x MSAA + 24 coverage samples). Finally, as we’ll see, additional coverage samples are just as cheap for AMD as they are for NVIDIA.

Comments

  • MeanBruce - Wednesday, December 15, 2010 - link

    TechPowerUp.com shows the 6850 as 95 percent faster, or almost double the performance of the 4850, and 100 percent more efficient than the 4850 at 1920x1200. I'm also upgrading an old 4850; as for the 6950, check their charts when they go up later today.
  • mapesdhs - Monday, December 20, 2010 - link


    Today I will have completed my benchmark pages comparing the 4890, 8800GT and GTX 460 1GB (800 and 850 core speeds), in both single and CF/SLI configurations, for a range of tests. You should be able to extrapolate between known 4850/4890 differences, the data I've accumulated, and known GTX 460 vs. 68xx/69xx differences (bearing in mind I'm testing with 460s at much higher core clocks than the 675 reference speed used in this article). Email me at mapesdhs@yahoo.com and I'll send you the URL once the data is up. I'm testing with 3DMark06, Unigine (Heaven, Tropics and Sanctuary), X3TC, Stalker COP, Cinebench, Viewperf and PT Boats. Later I'll also test with Vantage, 3DMark11 and AvP.

    Ian.
  • ZoSo - Wednesday, December 15, 2010 - link

    Helluva 'Bang for the Buck' that's for sure! Currently I'm running a 5850, but I have been toying with the idea of SLI or CF. For a $300 difference, CF is the way to go at this point.
    I'm in no rush, I'm going to wait at least a month or two before I pull any triggers ;)
  • RaistlinZ - Wednesday, December 15, 2010 - link

    I'm a bit underwhelmed from a performance standpoint. I see nothing that will make me want to upgrade from my trusty 5870.

    I would like to see a 2x6950 vs 2x570 comparison though.
  • fausto412 - Wednesday, December 15, 2010 - link

    exactly my feelings.

    it's like thinking Miss Universe is about to screw you and then you find out it's her mom....who's probably still hot...but def not miss universe
  • Paladin1211 - Wednesday, December 15, 2010 - link

    CF scaling is truly amazing now; I'm glad nVidia has something to catch up on in terms of drivers. Meanwhile, the ATI wrong-refresh-rate issue is still not fixed: it's stuck at 60Hz where the monitor can do 75Hz. "Refresh force", "refresh lock", "ATI refresh fix", disabling/enabling EDID, manually setting monitor attributes in CCC, EDID hacks... nothing works. Even the "HUGE" 10.12 driver can't get my friend's old Samsung SyncMaster 920NW to work at its native 1440x900@75Hz, in either XP 32-bit or Win 7 64-bit. My next monitor will be a 120Hz one for sure, and I don't want to risk ruining my investment, AMD.
  • mapesdhs - Monday, December 20, 2010 - link


    I'm not sure if this will help fix the refresh issue (I do the following to fix max res limits), but try downloading the drivers for the monitor and modifying the data file before installing them. Check to ensure it has the correct genuine max res and/or max refresh.

    I've been using various models of CRT which have the same Sony tube that can do 2048x1536, but every single vendor that sells models based on this tube ships drivers that limit the max res to 1800x1440 by default, so I edit the file to enable 2048x1536 and then it works fine, eg. the HP P1130.

    Bit daft that drivers for a monitor do not by default allow one to exploit the monitor to its maximum potential.

    Anyway, good luck!!

    Ian.
  • techworm - Wednesday, December 15, 2010 - link

    Future DX11 games will stress the GPU and video RAM more and more, and that's when the 6970 will shine. So the 6970 is a better, more future-proof purchase than the GTX 570, which will be frame-buffer limited in near-future games.
  • Nickel020 - Wednesday, December 15, 2010 - link

    In the table about whether PowerTune affects an application or not there's a yes for 3DMark, and in the text you mention two applications saw throttling (with 3DMark it would be three). Is this an error?

    Also, you should maybe mention that you're measuring whole-system power in the PowerTune tables; it might be confusing for people who don't read your reviews very often to see that the power draw you measured is way higher than the PowerTune level.

    Reading the rest now :)
  • stangflyer - Wednesday, December 15, 2010 - link

    Sold my 5970 while waiting for the 6990. With my 5970, playing games at 5040x1050, I would always have a 4th extended monitor hooked up to a Tritton UVE-150 USB-to-VGA adapter. This would let me game while having the fourth monitor display my TeamSpeak, Afterburner, and various other things.
    My question is this: can I use the new 6950/6970 with triple monitors and also use a 4th screen extended at the same time? I have 3 matching Dell native DisplayPort monitors and a fourth with VGA/DVI. Can I use the 2 DPs and the 2 DVIs on the 6970 at the same time? I've been looking for the answer for hours and can't find it. Thanks for the help.
