Further Image Quality Improvements: SSAA LOD Bias and MLAA 2.0

The Southern Islands launch has been a bit atypical in that AMD has been continuing to introduce new AA features well after the hardware itself has shipped. The first major update to the 7900 series drivers brought with it super sample anti-aliasing (SSAA) support for DX10+, and starting with the Catalyst 12.3 beta later this month AMD is turning their eye towards further improvements for both SSAA and Morphological AA (MLAA).

On the SSAA side of things, since Catalyst 9.11 AMD's drivers have implemented an automatic negative Level Of Detail (LOD) bias that gets triggered when using SSAA. As SSAA oversamples every aspect of a scene – including textures – it can filter out high frequency details in the process. A negative LOD bias in turn causes the renderer to select higher resolution mip levels closer to the viewer, which is how AMD combats this effect.
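As a rough sketch of the mechanism: supersampling shrinks the effective pixel footprint, and a negative bias shifts mip selection back toward sharper levels. The half-log2 rule below is a common rule of thumb for ordered-grid SSAA, not AMD's confirmed formula, and the footprint numbers are purely illustrative.

```python
import math

def ssaa_lod_bias(sample_count):
    # Rule of thumb: each doubling of the per-axis sampling rate
    # justifies one sharper mip level, i.e. bias = -0.5 * log2(N).
    # For 4x SSAA this works out to -1.0.
    return -0.5 * math.log2(sample_count)

def mip_level(texels_per_pixel, lod_bias=0.0):
    # Standard isotropic mip selection: level = log2(texel footprint)
    # plus any bias, clamped so we never go past the base level.
    return max(0.0, math.log2(texels_per_pixel) + lod_bias)

# A surface whose texel footprint would normally select mip 3
# selects the sharper mip 2 under a 4x SSAA bias.
print(mip_level(8.0))                    # 3.0
print(mip_level(8.0, ssaa_lod_bias(4)))  # 2.0
```

The lower the selected mip level, the sharper (and more texel-fetch-heavy) the texture, which is exactly the tradeoff discussed below.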

With AMD’s initial release of DX10+ SSAA support for the 7900 series they enabled SSAA DX10+ games, but they did not completely port over every aspect of their DX9 SSAA implementation. In this case while there was a negative LOD bias for DX9 there was no such bias in place for DX10+. Starting with Catalyst 12.3 AMD’s drivers have a similar negative LOD bias for DX10+ SSAA, which will bring it fully on par with their DX9 SSAA implementation.

As far as performance and image quality go, the impact on both is generally minimal. The negative LOD bias slightly increases the use of higher resolution textures, and thereby the number of texels to be fetched, but in our tests the performance difference was non-existent. For that matter, image quality didn't significantly change due to the LOD bias either. It definitely makes textures a bit sharper, but it's a very subtle effect.


[Screenshots: 4x SSAA vs. 4x SSAA w/LOD Bias – original uncropped screenshots]

Moving on, AMD's other AA change is to Morphological AA, their post-process pseudo-AA method. AMD first introduced MLAA back in 2010 with the 6800 series, and while they were breaking ground in the PC space with a post-process AA filter, game developers quickly took the initiative in 2011 to implement post-process AA directly into their games, allowing it to be applied before HUD elements were drawn and thereby avoiding the blurring of those elements.

Since then AMD has been working on refining their MLAA implementation; the result is being launched as MLAA 2.0, which will replace MLAA 1.0. In short, MLAA 2.0 is supposed to be faster and have better image quality than MLAA 1.0, reflecting the very rapid pace of development for post-process AA over the last year and a half.

As far as performance goes, the claims are definitely true. We ran a quick selection of our benchmarks with MLAA 1.0 and MLAA 2.0, and the performance difference between the two is staggering at times. Whereas MLAA 1.0 had a significant (20%+) performance hit in all 3 games we tested, MLAA 2.0 has virtually no performance hit in 2 of the 3 games we tested, and in the 3rd game (Portal 2) the performance hit is still noticeably reduced. This largely reflects the advancements we've seen with games that implement their own post-process AA methods, which is that post-process AA is nearly free in most games.

Radeon HD 7970 MLAA Performance (fps)

                  4x MSAA    4x MSAA + MLAA 1.0    4x MSAA + MLAA 2.0
Crysis: Warhead     54.7           43.5                  53.2
DiRT 3              85.9           49.5                  78.5
Portal 2           113.1           88.3                  92.0
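For reference, the relative cost of each MLAA version can be computed directly from the frame rates in the table above; this is just a quick sketch of that arithmetic, using the table's numbers.

```python
# Frame rates from the table above: (4x MSAA, + MLAA 1.0, + MLAA 2.0)
fps = {
    "Crysis: Warhead": (54.7, 43.5, 53.2),
    "DiRT 3": (85.9, 49.5, 78.5),
    "Portal 2": (113.1, 88.3, 92.0),
}

def hit_pct(base, with_mlaa):
    # Percentage of frame rate lost by enabling MLAA on top of 4x MSAA.
    return (1.0 - with_mlaa / base) * 100.0

for game, (base, v1, v2) in fps.items():
    print(f"{game}: MLAA 1.0 costs {hit_pct(base, v1):.1f}%, "
          f"MLAA 2.0 costs {hit_pct(base, v2):.1f}%")
```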

As for image quality, that's not quite as straightforward. Since MLAA does not have access to any depth data and operates solely on the rendered image, it's effectively a smart blur filter. Consequently, like any post-process AA method, there is a need to balance the blurring of aliased edges with the unintentional blurring of textures and other objects, so quality is largely a product of how much blurring you're willing to put up with for any given amount of de-aliasing. In other words, it's largely subjective.
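To illustrate why, here is a highly simplified sketch of the first stage of an MLAA-style filter: luminance-based edge detection on the final image. Real implementations also classify edge shapes and compute coverage-based blend weights; the Rec. 601 weights are a common choice and the threshold here is an arbitrary illustration, not AMD's.

```python
def luma(rgb):
    # Rec. 601 luminance weights, a common choice for post-process AA
    # edge detection on the final rendered image.
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def horizontal_edges(row, threshold=0.1):
    # Flag boundaries between adjacent pixels whose luminance differs
    # by more than the threshold. With no depth or geometry data, a
    # sharp texture transition is indistinguishable from a polygon
    # edge, which is why the later blend stage can blur textures too.
    return [abs(luma(a) - luma(b)) > threshold
            for a, b in zip(row, row[1:])]

row = [(0.1, 0.1, 0.1), (0.12, 0.12, 0.12), (0.9, 0.9, 0.9)]
print(horizontal_edges(row))  # [False, True]
```

Raising the threshold blurs fewer textures but also leaves more real edges aliased, which is exactly the subjective tradeoff described above.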


[Screenshots: MLAA 1.0 vs. MLAA 2.0 – Batman AC #1, Batman AC #2, Crysis: Warhead, Portal 2 – original uncropped screenshots]

From our tests, the one thing that MLAA 2.0 is clearly better at is identifying HUD elements in order to avoid blurring them – Portal 2 in particular showcases this well. Otherwise it’s a tossup; overall MLAA 2.0 appears to be less overbearing, but looking at Portal 2 again it ends up leaving aliasing that MLAA 1.0 resolved. Again this is purely subjective, but MLAA 2.0 appears to cause less image blurring at a cost of less de-aliasing of obvious aliasing artifacts. Whether that’s an improvement or not is left as an exercise to the reader.

Comments

  • Kaboose - Monday, March 5, 2012 - link

    The 7870 beats the GTX 570 and is about even with the GTX 580, uses 150W less power at load, is quieter, runs cooler, and draws over 23W less at idle. How is this a disappointment? The only disappointment I see is the price, which is the result of no competition from Nvidia.
  • Kiste - Monday, March 5, 2012 - link

    A new generation of GPUs used to give us a whole hell of a lot more performance at any given price point. The current AMD stuff does not and that is a disappointment.

    Case in point: you even have to talk these things up by basically saying "oh, well, at least they draw less power".
  • Kaboose - Monday, March 5, 2012 - link

    Dropping power consumption by over 50% is something of a gimmick? Dropping load temps by 14C compared to the GTX 570 is not significant? 14C is a fairly large difference, which leaves more headroom for overclocking as well as a cooler system overall. When Nvidia releases Kepler and we have both companies on 28nm then we can (hopefully) see some competition in price. In my opinion the 7870 at $325 would be a great card right now. Once Kepler is out, $285-300 I think would be nice. I agree it is overpriced right now however.

    If Nvidia releases Kepler and gives us a LOT more performance over last generation then I will concede that the 7xxx series is a failure. However from the way AMD is behaving it doesn't appear Kepler is going to do much in terms of raw performance either.
  • Kiste - Monday, March 5, 2012 - link

    While reducing power consumption might not be a gimmick, it is the result of the new process node and thus in itself not particularly impressive, especially when you more or less keep the performance the same as with the previous generation.

    I'm still not impressed, sorry. Price/performance plain and simply sucks ass with these cards, barely beating the stuff that's on the market right now in that regard.

    And even with the high-end SI cards there's barely much of a performance boost compared to what's already been on the market for months.

    Sure, less power draw is nice. I won't complain about it, but if a brand new generation of GPUs comes out and I am not even one little bit compelled to upgrade from my aging, heavily overclocked GTX570, then something is clearly wrong here.
  • Exodite - Monday, March 5, 2012 - link

    We'll just have to wait and see what Nvidia provides when they finally decide to put competition on the market, won't we?

    I'd happily agree to finding the 7900-series not as high-performing as I'd like, and the 7700-series too expensive.

    From the reviews I've read so far the 7800-series, the 7850 especially, is pretty much the perfect card ATM.

    Low power, low noise, cool, 2GB VRAM and runs between a 560 Ti and 570 in performance.

    It's definitely the card I'd recommend to anyone at this point, especially given the fact that we'll see better coolers than AMD's atrocities once we get release versions.
  • Iketh - Monday, March 5, 2012 - link

    You must live in a cold climate. You're happy with a heavily overclocked 570?? I live in FL, and that card increases my power bill $30-$70 each month over my 6870 during 3 of the 4 seasons, and I'm talking from experience. Do you have any idea how hard an A/C has to work in a small 2 bedroom house to counter the blast of heat from an overclocked gaming rig??

    If you live in a hot climate, test it for yourself.

    You don't compare just the power draw of the cards themselves....
  • Kiste - Monday, March 5, 2012 - link

    I'm not quite sure if you're actually expecting a serious answer to that kind of hyperbolic drivel.
  • Jamahl - Monday, March 5, 2012 - link

    You really don't get it do you? These cards REALLY DO heat up rooms. Where do you think the heat goes? Ever heard of the law of conservation of energy?
  • londiste - Monday, March 5, 2012 - link

    oh damn, i need to get two of those, maybe they'll reduce my heating bill at winter :)
  • Kiste - Monday, March 5, 2012 - link

    Spelling "really do" in capital letters doesn't make it any less ridiculous a statement.

    My whole PC (GPU, OCed CPU, 4 HDDs) draws slightly more than 300W under typical gaming loads. You can't "heat up a room" with that, much less with just the GPU.
