The Increasing Complexity of AA

As we mentioned, the first major difficulty with antialiasing is compatibility. Generally, the burden is on the game developer to assess the capabilities of the hardware the game is running on and make options available to users accordingly. Problems arise because game developers aren't able to look into the future and program for hardware that doesn't yet exist. For example, X1000 Series and GeForce 8 Series hardware can run Oblivion with both HDR and AA enabled, but GeForce 8 hardware, at least, wasn't available when Oblivion launched, so the developers didn't test for the ability to antialias floating point surfaces and simply disabled the AA option when HDR is enabled.

When games that could benefit from AA on current hardware don't offer the option, we have no choice but to look to the driver for support. Of course, we have bigger problems on the horizon. Some developers are currently choosing techniques such as deferred rendering for their games. Current implementations make use of multiple render targets (MRTs) to render objects or effects that are later combined to form a final image. MSAA does not play well with this approach, as one of its basic requirements is knowing which surfaces overlap a single pixel on the screen. Forcing AA on in the driver can cause problems in games where MSAA simply will not work. Current examples can be seen in these games:

  • Ghost Recon: Advanced Warfighter
  • Rainbow Six: Vegas
  • S.T.A.L.K.E.R.
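To make the conflict concrete, here is a minimal sketch of the MSAA resolve step (illustrative Python, not any driver's actual code): each pixel carries several geometry subsamples that are averaged into the final color, and it is exactly this per-pixel coverage information that a deferred renderer has already discarded by the time its render targets are combined.

```python
def resolve_msaa(subsamples):
    """Box-resolve one pixel: average its N subsample colors (RGB tuples).

    MSAA can do this because the rasterizer recorded which surface covered
    each subsample; a deferred renderer combining flat MRTs cannot.
    """
    n = len(subsamples)
    return tuple(sum(color[i] for color in subsamples) / n for i in range(3))

# An edge pixel half-covered by a white triangle over a black background:
edge_pixel = [(1, 1, 1), (1, 1, 1), (0, 0, 0), (0, 0, 0)]
```

Resolving `edge_pixel` blends the edge toward grey, which is precisely the smoothing that is lost when the driver cannot apply MSAA to a deferred renderer.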

With the functionality of driver enabled AA dependent on the game being run, graphics hardware makers are not able to guarantee that the user will get the results he or she desires. This means that the driver setting is more like asking the hardware to enable AA if possible. This uncertainty as to the behavior of the feature can cause problems for end users of both AMD and NVIDIA hardware.

As for NVIDIA specifically, its new CSAA (Coverage Sample Antialiasing) technology adds another layer of complexity to the antialiasing settings in the driver. Now, rather than just selecting a desired level of antialiasing, we need to decide to what degree we want to either enhance or override the application's own setting. Enhancing only works when AA is also enabled in-game, and override won't override games that use technology incompatible with MSAA. While these features function as they should, even some hardcore gamers may not know what they are getting when they enable AA in the control panel.
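That distinction can be summarized as simple decision logic. This is a hedged sketch; the function and its arguments are our own illustration, not NVIDIA's control panel code:

```python
def effective_aa(driver_mode, driver_level, in_game_level, msaa_compatible=True):
    """Approximate the AA level a user actually gets from a driver setting.

    'enhance' only upgrades AA the game itself has enabled, and 'override'
    cannot force MSAA onto engines that are incompatible with it.
    """
    if driver_mode == "enhance":
        return driver_level if in_game_level > 0 else 0
    if driver_mode == "override":
        return driver_level if msaa_compatible else in_game_level
    return in_game_level  # application-controlled
```

For example, selecting "enhance" while AA is disabled in-game yields no antialiasing at all, which is one of the surprises that can catch even experienced users.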

At AnandTech, we have avoided using driver AA settings as much as possible since the initial release of Far Cry, which produced inconsistent results between graphics hardware manufacturers when set through their respective control panels. These specific problems were worked out in later driver and game updates, but we find it more effective to rely on the game developer for consistency between common hardware features. Where there is an in-game setting, we use it. For us, other than disabling vsync, driver settings are a last resort.

It is safe to say that AMD and NVIDIA feel the same way. The only way they currently have to inform their users about the lack of support for AA in specific games is through their release notes. No one wants the end user to have a bad experience through glitchy performance.

One of the best ways to make sure gamers stick with in-game settings is to make sure developers offer clearly defined, well documented, and complete settings for features such as AA. In order to better enable this, NVIDIA has been working with Microsoft to enable CSAA through DirectX. With the in-game option for CSAA, users won't have to wade through the driver options and can directly select the type and degree of AA they want applied to their game.

In DirectX 9 and 10, requesting AA on a surface involves specifying the level (number of subsamples) of AA and the quality of AA. Most games just set quality to 0, as previous hardware didn't really do anything with this field. To set CSAA in-game, developers set the AA level to either 4 or 8 and then set the quality to 8 or 16 (2 or 4 in DX9, where quality levels are limited to 7 or less). This functionality is exposed in NVIDIA's 100 series driver.
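Restating that encoding as code (a sketch; the helper below is our own illustration, not the Direct3D API itself):

```python
def csaa_request(color_samples, coverage_samples, dx9=False):
    """Return the (level, quality) pair a developer would pass when
    creating a surface to request CSAA, per the encoding above.

    In DX9 the quality field is limited to 7 or less, so the coverage
    sample count is divided by 4 (16 -> 4, 8 -> 2).
    """
    assert color_samples in (4, 8) and coverage_samples in (8, 16)
    quality = coverage_samples // 4 if dx9 else coverage_samples
    return (color_samples, quality)
```

So a request for 4 color samples with 16 coverage samples becomes level 4, quality 16 under DX10, but level 4, quality 4 under DX9.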

This has had some unwanted side effects, though. In the past it didn't matter, but some developers detect the highest quality setting available and select it when enabling AA in-game. Paired with NVIDIA's 100 series driver, these games will inadvertently enable 16x CSAA when 4x AA is selected. Currently the games that exhibit this behavior are:

  • Battlefield 2
  • Battlefield 2142
  • Sin Episodes
  • Half-Life 2
  • Half-Life 2: Lost Coast
  • Dark Messiah of Might and Magic

This is only an issue on Vista for now, but 100 series drivers will be coming to XP soon. It isn't that either NVIDIA or these game developers are doing anything wrong; it's just how things ended up working out. In our minds, the ability to enable CSAA in games outweighs these minor issues. We hope to see this number rise, but currently only two games support enabling CSAA in-game:

  • Half-Life 2: Episode 1
  • Supreme Commander

So with the 100 series driver, future games will be able to enable all of NVIDIA's AA modes in-game. Setting AA levels in-game is safer than using the hardware makers' driver overrides and more convenient than game specific profiles. Aside from heavily encouraging developers to enable in-game AA settings when possible, NVIDIA is exploring other options to make the gamer aware of the caveats associated with driver settings and encourage the use of override AA as a last resort.



Comments

  • mino - Saturday, March 17, 2007 - link

    Yeah, you can do that. Except that setting works
    a) non-proportionally
    b) only on some fonts at some locations

    The result being it is clearly inferior to big dot size.

    The reason is simple: Windows was inherently built for a certain dpi and all those "big-font" options were just added later.

    Also, for doing that you would need to have all desktop objects vectorized - fonts, icons, images...
  • Optimummind - Thursday, March 15, 2007 - link

    This is my ideal scenario for how AA and AF should be handled for games.

    The game developers should include options for AA and AF in their games if it's possible with the way their game engines render graphics. If not, the game developers should inform the GPU manufacturers of these AA/AF limitations so the GPU manufacturers can use the information to find workarounds or use the info to inform the consumer about such limitations.

    Perhaps, both nVIDIA and ATI can have an applet within their respective driver control panels that is dedicated to a list of games that don't support AA/AF in an alphabetical list with a search option box.

    This way, if I can't find AA or AF settings in the game I'm playing, I can exit the game, open up the driver control panel, click on the applet with the list of games not supporting AA/AF, find my game on the list, then read about the situation. The information would be deemed informational and useful by me if it talked about such things as:

    (a) Due to the way the game is rendered by its engine, it's not compatible with the AA and/or AF modes supported by our GPU. This issue may be resolved if the game developer decides to patch the game engine.

    (b) The game is compatible with our GPU hardware but due to driver bugs, it's currently not working as intended. The fix is coming from us (GPU company) in the next revision.

    (c) Although the game doesn't include AA/AF options, forcing AA and AF via our driver control panel will work, as it has been tested and QA'd by us. Enjoy!!

    I think such a list via an optional applet within a driver control panel would be more visible and very informational. Sure, it would take some time to compile such a long list, but once you have the list, it's simply a matter of adding newer games to it and it can keep growing. It would also be doing the consumer a huge favor by taking the guesswork out of why AA/AF doesn't work and whose actions are needed before it can be made to work. This will save trips to forums where we might not even find the answer to our questions.

    As for the other concepts about game profiles and override settings, I think those two should be kept intact. The profiles are useful b/c I can simply launch a game on my desktop shortcut and the settings I've determined through experiments to work can be automatically launched. But I think they will be more powerful with a list of games I've mentioned earlier. If this "list" idea is too hard to integrate into the driver, then perhaps ATI/nvidia should have a link within the driver control panel to a website that has this list.

  • crimson117 - Thursday, March 15, 2007 - link

    I'm confused by having AA settings both in game and in the driver. Which gets precedence? Last set? Always the driver? Always the game? If I set it in the driver, then load the game, does the game override the driver?

    I'm sufficiently confused by this that I don't think I turn it on at all. I wish it was easier to understand.
  • Imnotrichey - Friday, March 16, 2007 - link

    I never even knew there was an AA setting outside the game. I always did it through the game. Since I could change it within the game, I guess that means my CCC is set to off.
  • DigitalFreak - Thursday, March 15, 2007 - link

    As far as I know, the driver will always override the in game settings.
  • sc0rpi0 - Thursday, March 15, 2007 - link

    Why should I have to tell the game/driver what <xxx> settings to use? I only care about the level of play (easily expressed in FPS) of the game I'm currently in. All I should have to specify (in the game) is the min and max FPS settings. That's it! The game/driver should then go off and figure out how to give me my acceptable level of gaming "experience." Sheesh!
  • gmallen - Thursday, March 15, 2007 - link

    I find that graphics horsepower for the games I play (i.e., Oblivion) is better used for textures and resolution. I can't stand the poor color and lousy visual quality of LCDs, so I use CRTs (I have squirreled away a couple of good ones for the future); thus I have no need for odd resolutions. Drivers still seem to be easy to use, but the UI (Catalyst for my ATI) needs to either be tossed or completely redesigned. I cannot believe that anyone with any UI design experience came near the Catalyst project.
  • VooDooAddict - Thursday, March 15, 2007 - link

    Personally, I like the idea of out of game profiles to set rez/AA/quality settings. The profiles should be managed by a separate app that can keep track of what games allow what settings.
  • Araemo - Thursday, March 15, 2007 - link

    Why won't someone seriously push super-sampling AA? If I'm not mistaken, SSAA doesn't require any fancy tricks, and will work correctly with any rendering technique as long as it renders to a 2d raster image (IE, the framebuffer) at the end of the pipeline.

    SSAA just requires you render at 4x the resolution (2x vert and 2x horizontal) for 4x anti-aliasing, internally to the card, and then down-sample it for output, which cleans up jaggies everywhere. With the extra power of modern video cards, I'd think ATI would be hyping this quite a bit if they had some way to do this with less performance hit than it takes on nvidia hardware(Or vice versa if nvidia has the hardware advantage), as a way to sell higher end cards.

    I'd love to go back to SSAA. My Voodoo5 spoiled me with image quality back in the day, even if driver compatibility was horrible. I generally don't have a 24" or 30" screen, so I don't need a card capable of rendering 2560x2048... unless you're going to let me use SSAA at that res, and downsample it for my 1280x1024 LCD. THAT would make me a happy, paying customer.
  • Ver Greeneyes - Friday, March 16, 2007 - link

    I second this. A friend of mine and I are both going to buy GeForce 8800s soon, but I know they'll be too powerful for the games we play :P My old CRT monitor can do 1600x1200, but resolutions even that high still cost a lot of money on LCDs.. so why not spice things up with some super-sampling? On LCDs you don't generally get or need refresh rates higher than 60Hz, and games tend to feel smooth as low as 30FPS, so all the horsepower the newest cards have is just going to waste right now. (Sure, you can tweak the Oblivion .ini file 'till you're cross-eyed, but I think SSAA would give better image quality painlessly.)
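The super-sampled resolve described in the comments above — render at twice the width and height, then box-filter down to the display resolution — can be sketched in a few lines (illustrative only, operating on a grayscale image stored as nested lists):

```python
def ssaa_downsample(img):
    """4x SSAA resolve: average each 2x2 block of a (2H x 2W) image,
    producing the H x W frame that would actually be displayed."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x + 1] +
              img[2*y + 1][2*x] + img[2*y + 1][2*x + 1]) / 4
             for x in range(w)]
            for y in range(h)]
```

Because the filter runs on the finished framebuffer, it is agnostic to how the frame was rendered — which is exactly why SSAA sidesteps the deferred-rendering problems MSAA has, at the cost of shading four times as many pixels.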
