The Increasing Complexity of AA

As we mentioned, the first major difficulty with antialiasing is compatibility. The burden generally falls on the game developer to assess the capabilities of the hardware the game is running on and expose options to users based on what is available. The problem is that game developers can't look into the future and program for hardware that doesn't exist yet. For example, Radeon X1000 Series and GeForce 8 Series hardware can run Oblivion with both HDR and AA enabled, but GeForce 8 hardware, at the very least, wasn't available when Oblivion launched, so the developers don't test for the ability to antialias floating point surfaces and simply disable the AA option whenever HDR is enabled.
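
The check itself is easy to make at runtime. As a rough illustration (not Oblivion's actual code), a Direct3D 9 title can ask the driver whether a floating point render target format can be multisampled instead of assuming that it can't:

    #include <d3d9.h>

    // Returns true if the adapter can multisample a 64-bit floating point
    // (FP16 per channel) render target -- the kind of surface HDR rendering
    // writes to.
    bool SupportsHdrMsaa(IDirect3D9* d3d, UINT adapter, BOOL windowed)
    {
        DWORD qualityLevels = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            adapter,
            D3DDEVTYPE_HAL,
            D3DFMT_A16B16G16R16F,      // FP16 render target format
            windowed,
            D3DMULTISAMPLE_4_SAMPLES,  // ask about 4x MSAA
            &qualityLevels);
        return SUCCEEDED(hr) && qualityLevels > 0;
    }

A game that performs a check like this can leave its AA option enabled alongside HDR whenever the driver reports support, even on hardware released after the game shipped.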

When games that could benefit from AA on current hardware don't offer the option, we have no choice but to look to the driver for support. Of course, we do have bigger problems on the horizon. Some developers are now choosing techniques such as deferred rendering for their games. Current implementations use multiple render targets (MRTs) to render objects or effects that are later combined to form the final image. MSAA does not play well with this approach, because resolving the samples correctly requires knowing which surfaces overlap a single pixel on the screen, and that information is gone by the time the intermediate targets are combined. Forcing AA on in the driver can cause problems in games where MSAA simply will not work. Current examples can be seen in these games (a sketch of the render target setup follows the list):

  • Ghost Recon: Advanced Warfighter
  • Rainbow Six: Vegas
  • S.T.A.L.K.E.R.
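
For context, here is a minimal Direct3D 9 sketch of the kind of G-buffer setup a deferred renderer uses (the formats and target count are placeholder choices, not taken from any of the games above). Each intermediate target stores one value per pixel, so the subpixel coverage information MSAA depends on never reaches the pass that combines them:

    #include <d3d9.h>

    // Create the intermediate "G-buffer" textures a deferred renderer fills
    // during its geometry pass. Render target textures in D3D9 cannot be
    // multisampled, and each texel holds a single per-pixel value, so there
    // is no subpixel coverage left for MSAA to resolve later.
    void CreateGBuffer(IDirect3DDevice9* device, UINT width, UINT height,
                       IDirect3DTexture9* textures[3],
                       IDirect3DSurface9* surfaces[3])
    {
        const D3DFORMAT formats[3] = {
            D3DFMT_A8R8G8B8,        // albedo + material ID (placeholder)
            D3DFMT_A16B16G16R16F,   // normals
            D3DFMT_R32F             // linear depth
        };

        for (int i = 0; i < 3; ++i)
        {
            device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                                  formats[i], D3DPOOL_DEFAULT,
                                  &textures[i], NULL);
            textures[i]->GetSurfaceLevel(0, &surfaces[i]);
        }
    }

    // Geometry pass: bind all targets at once (MRT) and draw the scene.
    // The lighting pass later samples these textures to build the final
    // image; by that point the polygon edges MSAA needs are long gone.
    void BindGBuffer(IDirect3DDevice9* device, IDirect3DSurface9* surfaces[3])
    {
        for (DWORD i = 0; i < 3; ++i)
            device->SetRenderTarget(i, surfaces[i]);
    }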

With the functionality of driver-enabled AA dependent on the game being run, graphics hardware makers cannot guarantee that the user will get the results he or she desires. The driver setting amounts to asking the hardware to enable AA if possible. This uncertainty about the feature's behavior can cause problems for end users of both AMD and NVIDIA hardware.

As for NVIDIA specifically, its new CSAA (Coverage Sample Antialiasing) technology adds another layer to the complexity of antialiasing settings in the driver. Now, rather than just selecting a desired level of antialiasing, we need to decide to what degree we want to either enhance or override the application. Enhancing only works when AA is also enabled in-game, and the override setting won't force AA in games built on technology that is incompatible with MSAA. While these features function as they should, even some hardcore gamers may not know what they are getting when they enable AA in the control panel.

At AnandTech, we have avoided driver AA settings as much as possible ever since the initial release of Far Cry, where forcing AA through the two graphics hardware makers' control panels produced inconsistent results. Those specific problems were worked out in later driver and game updates, but we find it more reliable to depend on the game developer for consistency between common hardware features. Where there is an in-game setting, we use it. For us, other than disabling vsync, driver settings are a last resort.

It is safe to say that AMD and NVIDIA feel the same way. The only way they currently have to inform users about the lack of AA support in specific games is through their release notes. No one wants the end user to have a bad experience through glitchy performance.

One of the best ways to keep gamers using in-game settings is for developers to offer clearly defined, well documented, and complete options for features such as AA. To help make this possible, NVIDIA has been working with Microsoft to expose CSAA through DirectX. With an in-game option for CSAA, users won't have to wade through the driver options and can directly select the type and degree of AA they want applied to their game.

In DirectX 9 and 10, requesting AA on a surface involves specifying the level (number of subsamples) of AA and a quality value. Most games simply set quality to 0, as previous hardware didn't really do anything with this field. To request CSAA in-game, developers set the AA level to either 4 or 8 and then set the quality to 8 or 16 (2 or 4 in DX9, where quality levels are limited to 7 or less). This functionality is exposed in NVIDIA's 100 series driver.
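
As a rough sketch of what that request looks like in Direct3D 10 terms, assuming (per the count/quality pairing described above) that a sample count of 4 with quality 16 maps to the 16x CSAA mode; the function and names here are illustrative, not taken from a shipping title:

    #include <d3d10.h>

    // Request a 16x CSAA render target on a Direct3D 10 device: 4 real
    // color/depth samples (Count) paired with a quality value of 16.
    HRESULT CreateCsaaTarget(ID3D10Device* device, UINT width, UINT height,
                             ID3D10Texture2D** target)
    {
        const UINT sampleCount = 4;    // AA "level": number of real samples
        const UINT csaaQuality = 16;   // quality value requesting 16x CSAA

        // Make sure the driver actually exposes this quality level.
        UINT numQualityLevels = 0;
        device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM,
                                              sampleCount, &numQualityLevels);
        if (csaaQuality >= numQualityLevels)
            return E_FAIL;             // fall back to plain 4x MSAA instead

        D3D10_TEXTURE2D_DESC desc = {};
        desc.Width              = width;
        desc.Height             = height;
        desc.MipLevels          = 1;
        desc.ArraySize          = 1;
        desc.Format             = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.SampleDesc.Count   = sampleCount;
        desc.SampleDesc.Quality = csaaQuality;
        desc.Usage              = D3D10_USAGE_DEFAULT;
        desc.BindFlags          = D3D10_BIND_RENDER_TARGET;

        return device->CreateTexture2D(&desc, NULL, target);
    }

On hardware or drivers that don't expose the extra quality levels, the capability check simply fails and the game can fall back to standard MSAA.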

This has had some unwanted side effects, though. In the past it didn't matter, but some developers detect the highest quality setting available and select it when enabling AA in-game. When paired with NVIDIA's 100 series driver, these games inadvertently enable 16x CSAA when 4x AA is selected (a sketch of the pattern follows the list). Currently the games that exhibit this behavior are:

  • Battlefield 2
  • Battlefield 2142
  • Sin Episodes
  • Half-Life 2
  • Half-Life 2: Lost Coast
  • Dark Messiah of Might and Magic
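
The pattern behind this is simple: at startup the game asks the driver how many quality levels it exposes for a given multisample type and then requests the highest one, which under the new scheme selects a CSAA mode rather than plain MSAA. A hypothetical Direct3D 9 sketch of that logic (not code from any of the titles above):

    #include <d3d9.h>

    // The "pick the best quality the driver offers" pattern. Before CSAA was
    // exposed through the quality field, the top quality level for 4 samples
    // was still plain 4x MSAA; with the 100 series drivers it is not.
    void SelectMaxQuality4xAA(IDirect3D9* d3d, UINT adapter, BOOL windowed,
                              D3DPRESENT_PARAMETERS* pp)
    {
        DWORD qualityLevels = 0;
        if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(
                adapter, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, windowed,
                D3DMULTISAMPLE_4_SAMPLES, &qualityLevels))
            && qualityLevels > 0)
        {
            pp->MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
            // Highest valid quality value -- no longer "just 4x AA".
            pp->MultiSampleQuality = qualityLevels - 1;
        }
        else
        {
            pp->MultiSampleType    = D3DMULTISAMPLE_NONE;
            pp->MultiSampleQuality = 0;
        }
    }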

This is only an issue on Vista for now, but 100 series drivers will be coming to XP soon. It isn't that either NVIDIA or these game developers are doing anything wrong; it's just how things ended up working out. In our minds, the ability to enable CSAA in games does outweigh these minor issues. We hope to see the number grow, but currently only two games support enabling CSAA in-game:

  • Half-Life 2: Episode 1
  • Supreme Commander

So with the 100 series driver, future games will be able to enable all of NVIDIA's AA modes in-game. Setting AA levels in-game is safer than using the hardware makers' driver overrides and more convenient than game-specific profiles. Aside from heavily encouraging developers to expose in-game AA settings whenever possible, NVIDIA is exploring other options to make gamers aware of the caveats associated with driver settings and to encourage the use of override AA only as a last resort.

Comments

  • Trisped - Thursday, March 15, 2007 - link

    Why not write a specification that allows developers to create a generic hardware option system, then let the drivers decide what is and is not available in game?

    The developer would be responsible for providing the basic interface components including button, slider, and box graphics, as well as fonts, backgrounds, and whatever else is natively visual in operation.

    The drivers would provide a list of options that can be configured for the hardware. This list would include the display name of the option, and what user interface it would use.

    The game would then compile the list and generate the control panel for the user, just like a web page takes a list of HTML instructions and blends them with pictures to generate the page we view.

    Since the page is dynamic, old games could take advantage of new hardware abilities. Also, the driver could specify which settings were not compatible with others; for example, if AA is not compatible with HDR, the driver would state this in the list. The list would then be used to generate the control panel, where if the gamer selected AA, HDR would be grayed out, or if the gamer selected HDR, AA would be grayed out.

    This is great for gamers, as it gives them complete control, and great for developers as it allows them to use hardware not yet available. Hardware manufacturers also win as they don't have to wait for new games to come out which will utilize the new abilities of their cards.
  • Scrogneugneu - Thursday, March 15, 2007 - link

    That's exactly what I was thinking about :)

    A generic (programming) interface, allowing games to register themselves and specify which features they do NOT support whatsoever, which features they do support, and which features they allow even if unsupported. "Feature" is regarded as a kind of feature, not as a level of feature (meaning, the application can support AA, but can't specify that it supports AA only up to 8x).

    Just make sure to include a "required computation power" system. Some kind of measurement where the application can say that to perform well under X settings, it would require a fill rate about that high, that many shaders, and so on to get around X FPS on average.

    The application can then just define what it believes is a "Fluid playing experience", a "Decent playing experience" and a "Beautiful playing experience" in terms of FPS (which is bound to differ between game styles, as a shooter will require a higher FPS than an RTS) and send whatever the user decides to choose, along with the performance evaluation ratings. Mister Joe Schmoe then just has to choose "Fluid playing experience", and the application will give the raw computing power requirements as well as the target FPS average to the drivers.

    The driver then shoots back the settings to adopt in order to achieve that level of gameplay. Since the driver knows what card the application will run on, the driver also knows what the capabilities of the card are. The task of the driver is then to evaluate what settings the card can support, considering the amount of computing power the application requires (as defined by the application) and considering the target performance (defined by the application too).


    In the end, the application developers have to implement an interface of their own on top of the graphics programming interface offered by the drivers, as well as perform several tests to pinpoint the real computational power required to run the application. Having this in hand, they blindly shoot that at the driver, which chooses the best settings. The driver only has to know the specific distribution of computing capacity of every card it supports, which will allow it to simply generate the best settings for the requesting application. Should the developers tweak their application, they can rerun their tests to get a more accurate picture of its requirements, and the settings will adjust themselves.

    As drivers and graphics cards evolve, so does their computational power. But since the driver decides itself what settings it should use (on a per-application basis), these new advances can and will be used by older games. Say we create AAx64 in a year. If a game implemented the interface right, it will have to ask the driver what features are supported, and at what generic level. Any card supporting AA will answer that it supports AA, and this new super card will let the application know that it supports it up to the "64" level. Therefore, the AAx64 mode will be available right away. An all new feature, already in use everywhere.

    Application developers are happy since their products get the advantages of new technologies without them even moving a finger. Driver developers are happy since their new technology is being used very fast, removing that annoying "it's great, but no game uses it yet" period. Average users are happy because they get way more out of their money. And as long as there is that magic "Custom" button available, enabling anyone to choose what settings they want, gamers will be happy too.

    The only drawback: this has to be implemented by every vendor. So NVidia and AMD should work together to produce the basis of that interface (which has to be XML-like, as in extensible so a new option by a vendor won't change the interface all over again, and generic, as in the application should never have to handle special cases). After that, I promise, you can tweak the hell out of your own drivers to get the best match for your card's capabilities :)
  • waika - Thursday, March 15, 2007 - link

    This would be an excellent opportunity for NVIDIA to overhaul the entire driver interface; the current approach, which offers no fewer than three separate interfaces (tray menu, legacy control panel, and new interface), is just a confusing mess to the novice user, and is cumbersome, poorly organized, and of dubious utility to the more experienced and technically savvy fan of the technology.

    None of NVIDIA's current driver interface presentations offers a clear, well organized, and consistent paradigm for driver feature control -- and adding new features to any of the current interfaces is sure to just make matters worse. It's a very sad state of affairs when many third party driver control interfaces developed by amateurs offer better interface design that is easier to understand for end users at all levels.

    This isn't a very good venue for addressing better approaches to interface design, as that requires illustration with mock-ups and working examples of good interface design; but I'd certainly welcome any opportunity to apply some of my skills and knowledge to NVIDIA products if such a venue avails itself or if they contact me through these forums.

    I have a strong affection for NVIDIA products and hope they'll be addressing the issues of driver interface design and feature control in future iterations of their driver products as this is one place NVIDIA products have not been improving.
  • CrashMaster - Thursday, March 15, 2007 - link

    One thing that could help is to have an overlay on the screen (like Fraps) at least for a few seconds after a game loads (or loads a map), that would tell the user what res and what AA/AF mode the card is rendering.

    One other thing that would help: if the setting is being overridden by something, note that on the overlay as well (e.g. the game is forcing 4x AA when you have it set to 8x AA in the control panel).
  • BrightCandle - Thursday, March 15, 2007 - link

    Vista comes with a performance testing tool to determine what OS features should be enabled. They completely missed a trick, though, when it comes to DX10. DX10 lacks optional features, so all the cards are guaranteed to have them all, so why not have a more detailed performance test that sets all these settings up for you by default?

    Humour me for a minute and consider if games came with just a few measures of their usage of CPU, hard drive and graphics power. They could include information on the settings they use and the corresponding number of shaders needed to achieve the effect. A simple developer tool could be used to capture the numbers across the entire usage (the debug version of DirectX, which all developers use anyway).

    The OS (Vista) has also captured a set of performance measurements for your hardware against these same metrics, so it knows how your PC compares to the performance usage of the game. It can combine that with the known native resolution of your monitor and choose the appropriate sound mode for your speakers.

    The OS can compare its own stats against the game's, set up the graphics, sound, etc., and only return the information the game needs to choose any remaining settings (such as the sound mode). This completely removes the need for graphics options in the game at all; indeed sound could go as well, as could the "game settings". All that really needs to be there in the future is for Vista to support user-created gaming profiles, so that if one game did perform badly (or you wanted to try eye candy that the PeeCee says you don't have the hardware to run but you think you know better) you could override it. Maybe after you have installed/launched a game it shows you what settings it chose and you could override them.

    To me it is a very telling sign of bad design that every game has to implement its own menu of audio/video settings. If you've ever tried to program DirectX games you'll know there is a lot of boilerplate code and global variables for all this information, and it's all there because no one has tried to pull it into the OS yet. It is the best place for it; users shouldn't need to know about these things to just play a game.

    I think this also opens another avenue where you could ask the system - "OK, what would it take to be able to use soft shadows?" - and it could come back and say you need 128 shaders running with a performance of blah. Graphics card companies publish those stats, it becomes easier to shift hardware, and people can easily know what they need to do to play games as they want with decent performance.
  • kevjay - Thursday, March 15, 2007 - link

    I agree with the guy who said SSAA is the way to go. It's pathetic that nothing has approached 3dfx's level, 7 long years later, and that people are so happy with crap like MSAA and CSAA.
    Here is exactly what they should do:
    Resurrect 3dfx's FSAA, which is 4x RGSS
    make a double mode for transparent textures.
    give the option to disable color compression!
    don't force gamma correction.
  • DerekWilson - Thursday, March 15, 2007 - link

    I used to be a huge fan of things like SRAM for system memory and SSAA all the time, but more and more I see the need to maintain a balance of acceptable quality and performance.

    OK, so I'm still for SRAM as system memory and SSAA, but I've learned not to expect these options :-)

    And really, there is a better option than SSAA -- higher resolution, smaller pixel size monitors.

    As pixel size gets smaller, aliasing artifacts also get smaller and antialiasing becomes less useful. AA is a stop gap until we can reach sufficiently small pixel sizes.

    Really, MSAA and SSAA are just forms of edge blurring. This decreases detail even where we may not want detail decreased. MSAA sticks to polygon edges, but SSAA blurs all color edges.

    4xSSAA is manageable, but going further than that is incredibly memory intensive and impractical unless the target resolution is very small (even rendering a 640x480 image with 8xSSAA would result in more pixels than a 30" 2560x1600 monitor).

    I'm all for options though, and it might be fun for graphics hardware makers to include full SSAA options just to play with even if performance suffers greatly.
  • mongo lloyd - Saturday, March 17, 2007 - link

    Hate to break it to you, but pixel pitch is getting LARGER.
    People are retards and buy larger monitors with the same resolution as their 2-4" smaller old monitors.

    There's still no LCD out which has a comparable pixel pitch to the dot pitch of my 2002 iiyama CRT... and the trend is actually going backwards.
  • Ver Greeneyes - Friday, March 16, 2007 - link

    While I'm in full agreement with you that SSAA would just be a stopgap solution, with LCD TVs reaching sizes of 108" with -the same resolution- as 30" ones I don't think we should expect higher resolutions anytime soon. There just doesn't seem to be any demand for it right now.

    One thing that I think should be addressed on an operating system level is how small things get on the desktop at higher resolutions. I mean, what's stopping OS developers from using bigger fonts? It'll look better and stop people complaining that high resolutions make things tiny, something that should not be an argument.
  • Trisped - Friday, March 16, 2007 - link

    You can increase the font sizes for the desktop etc. in Windows. Just right click on the desktop, select Properties, Appearance tab, Advanced, and select Icon from the drop-down menu. You can change a lot of things in there. Other OSes probably have the same thing; you just need to find it.
