
Further Image Quality Improvements: SSAA LOD Bias and MLAA 2.0

The Southern Islands launch has been a bit atypical in that AMD has been continuing to introduce new AA features well after the hardware itself has shipped. The first major update to the 7900 series drivers brought with it super sample anti-aliasing (SSAA) support for DX10+, and starting with the Catalyst 12.3 beta later this month AMD is turning their eye towards further improvements for both SSAA and Morphological AA (MLAA).

On the SSAA side of things, since Catalyst 9.11 AMD has implemented an automatic negative Level Of Detail (LOD) bias in their drivers that gets triggered when using SSAA. As SSAA oversamples every aspect of a scene – including textures – it can filter out high frequency details in the process. By using a negative LOD bias, you can in turn cause the renderer to use higher resolution textures closer to the viewer, which is how AMD combats this effect.
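To make the mechanism concrete, here is a minimal sketch of how mip level selection responds to an LOD bias. This is a simplified model of what GPU texture units do, not AMD's driver code, and the bias value used in the example is illustrative (AMD does not publish the exact figure):

```python
from math import log2

def select_mip_level(texel_footprint, lod_bias=0.0, max_mip=10):
    """Pick a mipmap level from the screen-space texel footprint.

    A footprint of 1.0 means one texel covers one pixel (mip 0);
    each doubling of the footprint moves one mip level down. A
    negative lod_bias shifts selection toward higher-resolution
    (lower-numbered) mip levels, which is how the driver counters
    the texture softening introduced by supersampling.
    """
    lod = log2(texel_footprint) + lod_bias
    return max(0.0, min(float(max_mip), lod))

# Without a bias, a 4-texel footprint selects mip 2; an
# illustrative -0.5 bias pulls selection toward a sharper level.
print(select_mip_level(4.0))        # -> 2.0
print(select_mip_level(4.0, -0.5))  # -> 1.5
```

The same knob is exposed to applications in the graphics APIs themselves (e.g. a sampler's mip LOD bias in Direct3D); the point of AMD's change is that the driver now applies it automatically when DX10+ SSAA is forced on.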

With their initial release of DX10+ SSAA support for the 7900 series, AMD enabled SSAA in DX10+ games but did not port over every aspect of their DX9 SSAA implementation. In particular, while a negative LOD bias was applied for DX9, no such bias was in place for DX10+. Starting with Catalyst 12.3, AMD’s drivers apply a similar negative LOD bias for DX10+ SSAA, bringing it fully on par with their DX9 SSAA implementation.

As far as performance and image quality go, the impact on both is generally minimal. The negative LOD bias slightly increases the use of higher resolution textures, and thereby the number of texels to be fetched, but in our tests the performance difference was non-existent. For that matter, image quality didn’t significantly change due to the LOD bias in our tests. It definitely makes textures a bit sharper, but it’s a very subtle effect.


Screenshot comparison: 4x SSAA vs. 4x SSAA w/ LOD bias (original uncropped screenshots available)

Moving on, AMD’s other AA change is to Morphological AA, their post-process pseudo-AA method. AMD first introduced MLAA back in 2010 with the 6800 series, and while they were breaking ground in the PC space with a post-process AA filter, game developers quickly took the initiative in 2011 to implement post-process AA directly into their games, which allowed it to be applied before HUD elements were drawn and avoided blurring those elements.

Since then AMD has been refining their MLAA implementation; the result replaces MLAA 1.0 and is being launched as MLAA 2.0. In short, MLAA 2.0 is supposed to be both faster and better-looking than MLAA 1.0, reflecting the very rapid pace of post-process AA development over the last year and a half.

As far as performance goes, the claims are definitely true. We ran a quick selection of our benchmarks with MLAA 1.0 and MLAA 2.0, and the performance difference between the two is staggering at times. Whereas MLAA 1.0 had a significant (20%+) performance hit in all 3 games we tested, MLAA 2.0 has virtually no performance hit (<5%) in 2 of the 3 games, and in the 3rd game (Portal 2) the hit is still somewhat reduced. This largely mirrors what we’ve seen from games that implement their own post-process AA methods: post-process AA is nearly free in most games.

Radeon HD 7970 MLAA Performance (fps)

                    4x MSAA    4x MSAA + MLAA 1.0    4x MSAA + MLAA 2.0
Crysis: Warhead     54.7       43.5                  53.2
DiRT 3              85.9       49.5                  78.5
Portal 2            113.1      88.3                  92.0
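As a quick back-of-the-envelope check, the performance-hit percentages quoted above follow directly from the table (the `perf_hit` helper is ours, not part of any benchmark tool):

```python
def perf_hit(base_fps, aa_fps):
    """Return the performance cost of enabling MLAA, in percent,
    relative to the 4x MSAA baseline."""
    return (1.0 - aa_fps / base_fps) * 100.0

# (4x MSAA, + MLAA 1.0, + MLAA 2.0) fps from the Radeon HD 7970 table
results = {
    "Crysis: Warhead": (54.7, 43.5, 53.2),
    "DiRT 3":          (85.9, 49.5, 78.5),
    "Portal 2":        (113.1, 88.3, 92.0),
}

for game, (msaa, mlaa1, mlaa2) in results.items():
    print(f"{game}: MLAA 1.0 costs {perf_hit(msaa, mlaa1):.1f}%, "
          f"MLAA 2.0 costs {perf_hit(msaa, mlaa2):.1f}%")
```

Running the numbers shows why Portal 2 stands out: its MLAA 2.0 hit remains well above the other two games even though it improves on MLAA 1.0.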

As for image quality, that’s not quite as straightforward. Since MLAA does not have access to any depth data and operates solely on the rendered image, it’s effectively a smart blur filter. Consequently, like any post-process AA method, there is a need to balance the blurring of aliased edges against the unintentional blurring of textures and other objects, so quality is largely a product of how much blurring you’re willing to put up with for any given amount of de-aliasing. In other words, it’s largely subjective.
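As a rough illustration of why this trade-off exists, here is a toy single-pass filter in the same spirit. This is emphatically not AMD's MLAA shader (real MLAA classifies edge shapes and computes coverage-based blend weights); it only shows the core problem of operating on final pixel colors:

```python
def postprocess_aa(image, threshold=0.25, blend=0.5):
    """Toy post-process AA on a 2D list of grayscale values in [0, 1].

    Pixels whose horizontal neighbor differs by more than `threshold`
    are blended toward that neighbor, softening the transition. With
    no depth or geometry information, an aliased polygon edge and a
    high-contrast texture detail are indistinguishable, so both get
    blurred -- the balancing act described above.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w - 1):
            a, b = image[y][x], image[y][x + 1]
            if abs(a - b) > threshold:            # "edge" detected
                out[y][x]     = a + blend * (b - a) * 0.5
                out[y][x + 1] = b + blend * (a - b) * 0.5
    return out

# A hard black/white edge is softened; a flat region is untouched.
print(postprocess_aa([[0.0, 1.0]]))       # -> [[0.25, 0.75]]
print(postprocess_aa([[0.5, 0.5, 0.5]]))  # -> [[0.5, 0.5, 0.5]]
```

Raising `threshold` or lowering `blend` trades less texture blurring for more residual aliasing, which is essentially the tuning difference we observe between MLAA 1.0 and 2.0 below.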


Screenshot comparisons: MLAA 1.0 vs. MLAA 2.0 in Batman: Arkham City (scenes #1 & #2), Crysis: Warhead, and Portal 2 (original uncropped screenshots available)

From our tests, the one thing that MLAA 2.0 is clearly better at is identifying HUD elements in order to avoid blurring them – Portal 2 in particular showcases this well. Otherwise it’s a tossup; overall MLAA 2.0 appears to be less overbearing, but looking at Portal 2 again, it ends up leaving aliasing that MLAA 1.0 resolved. Again this is purely subjective, but MLAA 2.0 appears to cause less image blurring at the cost of less de-aliasing of obvious aliasing artifacts. Whether that’s an improvement or not is left as an exercise to the reader.


173 Comments


  • mak360 - Monday, March 05, 2012 - link

    Enjoy, now go and buy
  • ImSpartacus - Monday, March 05, 2012 - link

    Yeah, I'm trying to figure out if a 7850 could go in an Alienware X51. It looks like it uses a 6 pin power connector and puts out 150W of heat.

    While we would lose Optimus, would it work?
  • taltamir - Monday, March 05, 2012 - link

    optimus is laptops only. You do not have optimus with your desktop.
  • ImSpartacus - Monday, March 05, 2012 - link

    The X51 has desktop Optimus.

    "The icing on the graphics cake is that the X51 is the first instance of desktop Optimus we've seen. That's right: you can actually plug your monitor into the IGP's HDMI port and the tower will power down the GPU when it's not in use. This implementation functions just like the notebook version does, and it's a welcome addition."

    http://www.anandtech.com/show/5543/alienware-x51-t...

    In reality, if I owned an X51, I would wait so I could shove the biggest 150W Kepler GPU in there for some real gaming.

    But I'm sure the X51 will be updated for Kepler and Ivy Bridge, so now wouldn't be the best time to get an X51.

    Waiting games are lame...
  • scook9 - Monday, March 05, 2012 - link

    Wrong. Read a review.....The bigger issue will be the orientation of the PCIe Power Connector I expect. I have a tightly spaced HTPC that currently uses a GTX 570 HD from EVGA because it was the best card I could fit in the Antec Fusion enclosure. If the PCIe power plugs were facing out the side of the card and not the back I would have not been able to use it. I expect the same consideration will apply to the even smaller X51
  • kungfujedis - Monday, March 05, 2012 - link

    he does. x51 is a desktop with optimus.

    http://www.theverge.com/2012/2/3/2768359/alienware...
  • Samus - Monday, March 05, 2012 - link

    EA really screwed AMD over with Battlefield 3. There's basically no reason to consider a Radeon card if you plan on heavily playing BF3, especially since most other games like Skyrim, Star Wars, Rage, etc, all run excellent on any $200+ card, with anything $300+ being simply overkill.

    The obvious best card for Battlefield 3 is a Geforce GTX 560 TI 448 Cores for $250-$280, basically identical in performance to the GTX570 in BF3. Even those on a budget would be better served with a low-end GTX560 series card unless you run resolutions above 1920x1200.

    If I were AMD, I'd concentrate on increasing Battlefield 3 performance with driver tweaks, because it's obvious their architecture is superior to nVidia's, but these 'exclusive' titles are tainted.
  • kn00tcn - Monday, March 05, 2012 - link

    screwed how? only the 7850 is slightly lagging behind, & historically BC2 was consistently a little faster on nv

    also BF3 has a large consistent boost since feb14 drivers (there was another boost sometime in december, benchmark3d should have the info for both)
  • chizow - Tuesday, March 06, 2012 - link

    @ Samus

    BF3 isn't an Nvidia "exclusive", they made sure to remain vendor agnostic and participate in both IHV's vendor programs. No pointing the finger and crying foul on this game, it just runs better on Nvidia hardware but I do agree it should be running better than it does on this new gen of AMD hardware.

    http://www.amd4u.com/freebattlefield3/
    http://sites.amd.com/us/game/games/Pages/battlefie...
  • CeriseCogburn - Monday, March 26, 2012 - link

    In the reviews here SHOGUN 2 total war is said to be the very hardest on hardware, and Nvidia wins that - all the way to the top.
    --
    So on the most difficult game, Nvidia wins.
    Certainly other factors are at play on these amd favored games like C1 and M2033 and other amd optimized games.
    --
    Once again, on the MOST DIFFICULT to render Nvidia has won.
