When NVIDIA initially briefed us on their new GM204 GPUs (aka Maxwell 2.0), several new features were discussed. Most of these are now publicly available – Dynamic Super Resolution (DSR) was initially limited to GM204 GPUs but was later enabled on Kepler and even Fermi GPUs. Other features require changes to the hardware; Voxel Global Illumination (VXGI), for example, cannot be implemented on older architectures – at least not without a severe performance hit. A third feature, Multi-Frame Anti-Aliasing (MFAA), was also discussed and demonstrated, and with NVIDIA's latest 344.75 driver MFAA is finally publicly available. Like VXGI, MFAA apparently requires new hardware to function, which means it will only be enabled on GM204 GPUs – specifically, you need a GTX 980, GTX 970, GTX 980M, or GTX 970M.

Where things get a bit interesting is when we discuss exactly what it is that MFAA is doing. Anti-aliasing is used to smooth out edges of polygons in order to eliminate jaggies – the stair step edges that are created when a line is rendered on the pixel grid of a modern display. Many people despise jaggies, but there's one big issue with anti-aliasing: it can exact a pretty heavy performance penalty. Depending on your hardware and tolerance for jaggies, it's often necessary to balance the use of anti-aliasing with other aspects of image quality.

It's worth noting at this point that there are many ways of accomplishing anti-aliasing. From an algorithmic point of view, the easiest approach is to simply render a scene at a higher resolution and then scale that image down with a high quality filter (e.g. bicubic), and in many ways this also produces the best overall result as everything on the screen gets anti-aliased. This method of anti-aliasing is called super-sampling (SSAA), and while it looks good it also tends to be the most costly in terms of performance. At the other end of the spectrum is multi-sample anti-aliasing (MSAA), which focuses only on polygon edges and samples multiple points for each pixel to determine coverage. This tends to be the least costly in terms of performance, but it doesn't always produce the best result unless you use at least four samples.
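To make the super-sampling idea concrete, here's a toy sketch (not any driver's actual implementation): render at twice the target resolution in each dimension, then average each 2x2 block of samples down to one pixel. A real implementation would use a higher-quality filter such as bicubic, as noted above; a box filter keeps the example short.

```python
# Toy illustration of 2x2 super-sampling (SSAA): render at double
# resolution, then downsample by averaging each 2x2 block of samples.
# This is a box filter; production code would use a better filter.

def downsample_2x2(image):
    """Average each 2x2 block of a high-res grayscale image (list of rows)."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# A hard black/white diagonal edge at 2x resolution...
hi_res = [
    [0, 0, 0, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
]
# ...becomes a softened gradient at the target resolution,
# which is exactly what hides the stair-step jaggies.
print(downsample_2x2(hi_res))  # [[0.0, 0.75], [0.75, 1.0]]
```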

While NVIDIA's DSR is in many ways just a new take on SSAA, MFAA is a modification to MSAA plus a bit of extra "secret sauce". The sauce in this case consists of one hardware change – the sample patterns are now programmable and stored in RAM instead of ROM, so they can be varied between frames or even within a frame – along with some form of inter-frame filtering. ATI (now AMD) did something similar a while back, though not necessarily in the same way NVIDIA is doing MFAA. The idea also resembles Temporal AA, which swaps sampling patterns every other frame in order to approximate 4xMSAA while only doing 2xMSAA, but MFAA adds some image synthesis across frames to help detect and remove jaggies.

NVIDIA's goal with MFAA is to approach the quality of 4xMSAA with 4xMFAA (or 8xMSAA with 8xMFAA) at a performance cost closer to 2xMSAA/4xMSAA, respectively. At a technical level, 4xMFAA uses alternating 2xMSAA sample patterns combined with inter-frame filtering, while 8xMFAA uses alternating 4xMSAA sample patterns with the same filtering. Finally, let's clear up how MFAA differs from TXAA. MFAA can be programmed to implement MSAA or TXAA, but it has additional flexibility as well, so effectively it's a superset of both. The key difference is that the sample patterns can now be altered rather than being hard-coded into ROM. NVIDIA has also posted a video overview of MFAA and anti-aliasing in general that's worth watching if you're not familiar with some of the concepts.
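The alternating-pattern idea described above can be sketched in a few lines. This is purely a toy model – NVIDIA hasn't published its actual sample offsets or filter, so the two patterns below are hypothetical – but it shows how two 2-sample frames, blended, can cover the same four sample positions as a single 4-sample frame in a static scene.

```python
# Toy model of the MFAA principle (the real filter is NVIDIA's secret
# sauce): alternate between two 2-sample patterns on successive frames,
# then blend the frames so the pair covers four distinct sample points.

PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]  # hypothetical 2-sample offsets
PATTERN_B = [(0.75, 0.25), (0.25, 0.75)]  # complementary offsets, next frame

def shade(sample_offsets, coverage_fn):
    """Average polygon coverage over this frame's sample points in a pixel."""
    return sum(coverage_fn(x, y) for x, y in sample_offsets) / len(sample_offsets)

def mfaa_pixel(coverage_fn):
    """Blend two consecutive frames rendered with alternating patterns."""
    frame_a = shade(PATTERN_A, coverage_fn)
    frame_b = shade(PATTERN_B, coverage_fn)
    # For a static scene this equals a 4-sample (4xMSAA-style) average.
    return 0.5 * (frame_a + frame_b)

# Polygon edge covering the left 40% of the pixel (x < 0.4 is inside):
edge = lambda x, y: 1.0 if x < 0.4 else 0.0
print(mfaa_pixel(edge))  # 0.5: two of the four combined sample points are inside
```

The hard part, of course, is that real scenes move between frames, which is why MFAA needs the inter-frame filtering rather than a naive 50/50 blend.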

Investigating MFAA Performance

While the discussion of how MFAA works is interesting, ultimately it comes down to two questions: what is the performance hit of MFAA compared to MSAA, and how does the image quality compare? To test this, we have used two games from NVIDIA's list of currently supported titles, Assassin's Creed: Unity and Crysis 3. The complete list of supported games (at the time of writing – more should be added over time) can be seen in the following table.

I've actually got a separate Benchmarked piece with further investigations of Assassin's Creed: Unity (ACU), which I'll try to finish up later today. Suffice it to say that ACU is extremely demanding when it comes to GPU performance, particularly at the highest quality settings. For the testing of MFAA, I have used just a single GTX 970 and GTX 980 GPU, and I tested at 1080p with High and Ultra settings in ACU, while for Crysis 3 I used Very High textures with High and Very High machine spec. I then tested both with and without MFAA enabled (this is a new setting in the NVIDIA Control Panel). Here's how things look in terms of performance, with the minimum frame rates being an average of the lowest 1% of frames.
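For reference, the "lowest 1% of frames" minimum shown in the charts can be computed from per-frame times along these lines (a sketch of the method described above, not our exact benchmarking script):

```python
# Compute "minimum FPS" as the average FPS over the slowest 1% of frames,
# the metric used for the minimum-FPS charts below.

def one_percent_low_fps(frame_times_ms):
    """Average FPS over the slowest 1% of frames (at least one frame)."""
    slowest = sorted(frame_times_ms, reverse=True)
    count = max(1, len(slowest) // 100)
    avg_ms = sum(slowest[:count]) / count
    return 1000.0 / avg_ms

# Example: 99 smooth frames at ~60 FPS (16.7 ms) plus one 50 ms hitch.
times = [16.7] * 99 + [50.0]
print(one_percent_low_fps(times))  # 20.0 – the hitch dominates the metric
```

This is why 1% lows are a better smoothness indicator than the raw single worst frame, which can be an outlier.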

Assassin's Creed: Unity Average FPS

Assassin's Creed: Unity Minimum FPS

Crysis 3 Average FPS

Crysis 3 Minimum FPS

There does appear to be a slight performance hit from enabling MFAA, though it's small enough that in most cases it wouldn't be noticeable without running benchmarks. Using 4xMFAA to deliver the equivalent of 4xMSAA looks to be viable, at least from the performance figures in our testing. The other aspect of course is image quality, so let's look at that as well.

Here's where things get a bit interesting. It's hard to compare the modes in the games we're using, as you have to restart the game to enable/disable MFAA, so we can't just use a static shot (though the first image from ACU is basically a static location). I also suspect MFAA has a better chance to apply its temporal filter when the game isn't in motion, so I consider the "moving" sequences to be more important in some ways. Of course, when the game is in motion, spotting jaggies sometimes becomes more difficult (depending on the game and scene), but that's another topic. Regardless, there are clearly differences between MSAA and MFAA in terms of rendering.

2xMSAA is easy to pick out as having the most jaggies still present, but picking a "best" image from the 4xMFAA, 4xMSAA, and 8xMFAA options would be hard. Then toss in the rain from the sequence in Crysis 3 and it becomes very difficult to see what's going on. One cool aspect of MFAA is that similar to TXAA, it has the ability to help remove jaggies on transparent textures and not just on polygon edges. You can see this on some of the elements in the above images (e.g. look at the fence in the second set of images for Assassin's Creed), though it's not always perfect.

Anti-Aliasing Videos

Still images don't often do anti-aliasing justice, so I went ahead and created videos of a scene in Paris with all eight supported anti-aliasing modes. Note that the process of transcoding and uploading to YouTube degrades the fidelity quite a bit and makes it difficult to compare AA modes, so this is at best a rough demonstration of the AA modes. The game natively supports no anti-aliasing, FXAA, 2xMSAA, 4xMSAA, 8xMSAA, and TXAA; with the updated drivers and MFAA enabled, 4xMFAA and 8xMFAA are also available.

The most visible aliasing in the videos is on the facade of the building, and you can see some shimmer as there's a slight shift in the camera perspective over time. The top-right of the video has some dark wood highlighted against the sky and also shows some clear aliasing, as does the front of Arno's outfit.

Of the various modes, FXAA seems to show the least aliasing (depending on where you look), but it also looks a little blurry overall. 8xMSAA and TXAA look good, with the performance hit on 8xMSAA being quite heavy, and 4xMSAA and 4xMFAA are pretty close in quality as well. 2xMSAA meanwhile almost looks like the game doesn't have anti-aliasing enabled...until you actually see what the game looks like with no AA. I've seen at least one video of Assassin's Creed: Unity with MFAA enabled that showed some severe issues but I never experienced any such problems. If MFAA is rendering poorly on your system, drop me a note as I'd love to try and reproduce the problem.

Game Ready

Wrapping things up, there are a few other items to note with the 344.75 drivers we used for testing. Along with enabling MFAA for GM204 graphics cards, 344.75 is NVIDIA's latest "Game Ready" WHQL release, with newly added support for Far Cry 4, Dragon Age: Inquisition, and The Crew. The first two of those officially launched today, November 18, while The Crew is currently slated for release on December 2. We're working to get those games for additional testing as part of our Benchmarked series of articles, so stay tuned.

Closing Thoughts

Getting rid of jaggies is always a good thing in my book, and if you can do it better/faster then that's great. While there's definitely going to be some debate on whether 4xMFAA actually looks as good as 4xMSAA, for the most part I think it's close enough that it's the way to go unless you're already hitting sufficiently high frame rates. The performance aspect on the other hand is a clear win, at least in the games we tested. Assassin's Creed: Unity and Crysis 3 can absolutely punish even high-end GPUs, but other games can be equally demanding at higher resolutions.

Perhaps the biggest issue isn't whether or not 4xMFAA can produce a good approximation of 4xMSAA with less "work", but it's simply a question of support for MFAA. The current list of 20 games is hardly all-encompassing, and several of the games on the list aren't even known to be all that demanding (I'd put about half of the games on the list into that category, including DiRT 3, DiRT Showdown, F1 2013, F1 2014, Just Cause 2, and Saints Row IV). What we need now is more titles where MFAA will work, and hopefully that will come sooner rather than later.

One nice thing about MFAA is that it ties into games' existing MSAA support, so there's no extra programming work required of developers (unlike TXAA). Of course there are drawbacks to MSAA itself, notably that it doesn't play well with deferred rendering techniques, which is why some games only support FXAA or SSAA for anti-aliasing. MFAA doesn't do anything for such titles, but that's no surprise. Considering the performance benefit of MFAA over MSAA, and the fact that it can be enabled in the control panel and automatically works in supported games, I don't see much problem with most users simply leaving it enabled.

Source: NVIDIA

Comments

  • darth415 - Thursday, November 20, 2014 - link

    Why are you using Watch Dogs as an example when that game's engine is unoptimized and an idiot in general, and why are you comparing the 290x to the 980, when the 970 is its actual competition? 900 series cards scale at least 90%, and usually 95-100% in sli, which Amd cards cannot approach, and as someone intamately familiar with things like frame time, unless something rather drastic has happened in the few weeks since the release of the 900 series, Nvidia is still the king of multi-gpu smoothness, though since then I haven't really been paying any attention to it. I don't hate Amd cards, but Crossfire is simply inferior, and pricing/stability/performance wise, the 970 is king in multi-gpu configurations. That said, I personally wouldn't put any modern 4 gb card in cf/sli, because you are going to run out of vram in some situations before you can even use your potential gpu performance!
  • FlushedBubblyJock - Thursday, November 20, 2014 - link

    Why even buy the hot as heck AMD patched and wanting junk ? Drivers are a pain, compatibility is a pain, and they always have ongoing game issues, and a severe lack of features.

    I suppose the only reason possible is being a simpleton and feeling like 1 or 2 or 10 percent of an already sufficient frame rate, and frame rate only - minus all the other issues, or a frame rate that is unplayable yet produces a 4k "win", would be the reason.

    So for the foolish and easily manipulated and boring gamer who only stares blankly at his fraps counter ( while purportedly paying attention to the game ), it might be proper and just punishment.
    Otherwise, forget it.
    They are so far behind the nVidia curve and new technologies AMD is no longer desirable at all.
    AMD's latest falter, the 285, is a 3 year old sidestep failure and completely unacceptable.
    Their cpu's also show the same lack of skill and engineering and software mastery.
    Plus they lose money nearly every quarter, going on the better part of a decade.

    Forget the failure and endlessly promoting like a verdetrol. It will only work with the weak minded and desperate.
  • FlushedBubblyJock - Wednesday, November 26, 2014 - link

    Yes, whatever, after 2 years of horrible CF for $579+ release cards, it's about time.
    However everyone likes a really good deal so maybe now is the time, since AMD has stretched the 290's life... so yeah I check Newegg and yep, well it's a 20 rebate so 270 and some paperwork and waiting - but 4 games too ? Who wouldn't want that deal for an upgrade?
    So I check and it's the space gold games 3 + 1 Civ .. ok forget the Civ but 3 others I can choose great - so Newegg has the AMD pic of gamers gold but you can't read all the games - I mean what the heck are they thinking ?
    So I google to AMD's list and guess what - the same tiny, tiny little dvd box pics - WITH NO GAMES LIST VISIBLE...
    https://www.amd4u.com/radeonrewards/
    Man I'm telling you - I was ready to give AMD another chance but the frustration they cause with their immense stupidity is not only undesirable it's scary... it makes me wonder what other massive oversights I'll be running into and how many times their idiocy (or covering up the cruddy games by not listing them and making their pics so small blowing the browser to 400% view still results in hits ad misses) will cause me anger...
    Just forget it AMD ! No polish, common sense not present... the games you give away - well -- hard to tell what they might be --- wow what great marketing skills !
    OMG - I AM NOT BUYING NOW - IT WAS SO CLOSE BUT AMD MADE SURE TO BLOW IT AGAIN !
  • Daniel Egger - Wednesday, November 19, 2014 - link

    Am I the only one who finds it more than just plain annoying that NVIDIA doesn't distinguish between Maxwell (v2) and Maxwell (v1)? For christ sake, every f'ing thing they offered in the last two weeks says "You need Maxwell to run this" and all users think: Great, that's what I have, let's try it out. When you actually try (like the Apollo demo which only took 6(!) hours to download -- no thanks to NVIDIAs damn slow servers) you're greeted with "You need Maxwell to run this application". I mean WTF?!?!?!?
  • Lerianis - Monday, November 24, 2014 - link

    Yeah, they should have something like Intel has where it can scan your computers and tell you "You have the proper equipment to run this demo or tester!"
    Of course, many of those things use JAVA (which is verboten somewhat unjustly with a lot of people) so......
  • SeanJ76 - Monday, November 24, 2014 - link

    344.75 driver sucks!!! Crysis 3 performs way better with 344.65 driver.
  • Lerianis - Saturday, November 29, 2014 - link

    Good drivers for those who have a graphics card that NVidia still supports, but I am a little twerked that they no longer support 9800 series graphics cards - 300 series graphics cards.
    These drivers are only for the 400 series or newer and there is no way to 'fool' the computer I am on into installing the drivers. Even a manual attempted installation fails.
  • Revdarian - Tuesday, December 2, 2014 - link

    Can i be a bit of an ass and point out that this "new technology" is actually ~10y old and pioneered on the Radeon x800 pro? Sure it has been refined a bit (vsync isnt a requirement nowadays) but the main issues that existed back then exist right now, that it does have a minimum performance required to actually appreciate the effect & it doesn't really work on most games.
    http://techreport.com/review/6672/ati-radeon-x800-...

    IMHO shader based AA is actually where they should be doing the lion's share of their research going forward.
