The Driver Enabled AA Question

Currently, driver AA settings are very complex, especially for NVIDIA hardware. In-game settings are more compatible with each game on an individual basis, and future games will be able to support AA with floating point textures and enable CSAA in-game on NVIDIA hardware. So wouldn't the world be a better place if we could just throw out driver AA settings and rely on games?

Such a theory might work well for future games, but the reality is that driver AA settings are a necessity for enabling the functionality in older games or in current games that neglected to include AA support in favor of floating point surfaces. In order to allow gamers to continue to benefit from these features while avoiding the compatibility issues inherent in current and future games, NVIDIA is considering altering the presentation of AA in its driver.

We would like to make it clear that NVIDIA hasn't taken any steps in any particular direction at this point. In fact, any feedback we can get from our readers on these options would be of particular interest to its driver team. If any of these ideas stands out as a winner, please let us know in this article's comments.

One direction NVIDIA may go is to remove the override AA options from the general settings while keeping the enhance options on the main screen. This would allow gamers to enable CSAA in games that don't support the option in-game yet, while encouraging the use of in-game settings for MSAA. For applications that don't have in-game settings, NVIDIA would still allow override AA to be set in the game profiles. This would allow older applications to benefit from AA, and NVIDIA could disable the option in profiles for games that are fundamentally incompatible with MSAA.

We aren't sold on the idea of profiles, but it was also suggested that a Coolbits-like feature could be used to expose override AA modes. This would allow gamers who really know what they want to retain access to the feature in a traditional way after setting a specific registry key.

Another, less complicated approach being considered is the addition of a warning box that pops up when AA settings are changed. This would be an informational approach to explaining the possible complications of enabling override AA in an application that doesn't support it. It would have an added benefit if NVIDIA included a list of games known not to support MSAA in this warning box (as these games should already be noted in its release notes).

The bottom line is that NVIDIA wants to provide the "... best default settings for the broadest set of users for the most likely scenarios." We certainly know what we as reviewers would like to see, but we would love to hear from our readers on the subject.

Comments

  • Trisped - Thursday, March 15, 2007 - link

    Why not write a specification that allows developers to create a generic hardware option system, then let the drivers decide what is and is not available in game?

    The developer would be responsible for providing the basic interface components, including button, slider, and box graphics, as well as fonts, backgrounds, and whatever else is visual in nature.

    The drivers would provide a list of options that can be configured for the hardware. This list would include the display name of each option and which user interface element it would use.

    The game would then compile the list and generate the control panel for the user, just as a web page takes a list of HTML instructions and blends them with pictures to generate the page we view.

    Since the page is dynamic, old games could take advantage of new hardware abilities. Also, the driver could specify which settings were not compatible with others; for example, if AA is not compatible with HDR, the driver would state this in the list. The list would then be used to generate the control panel, where if the gamer selected AA, HDR would be grayed out, or if the gamer selected HDR, AA would be grayed out.

    This is great for gamers, as it gives them complete control, and great for developers, as it allows them to use hardware not yet available. Hardware manufacturers also win, as they don't have to wait for new games to come out that utilize the new abilities of their cards.
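
    To make it concrete, here's a rough sketch (purely hypothetical types and names) of the sort of list a driver could hand over and how a game might turn it into a panel:

    // Hypothetical sketch: the driver describes its options, the game builds the UI from that list.
    #include <iostream>
    #include <string>
    #include <vector>

    enum class ControlType { Checkbox, Slider, Dropdown };

    struct DriverOption {
        std::string name;                          // display name supplied by the driver
        ControlType control;                       // which UI element the game should draw
        std::vector<std::string> values;           // legal values for the control
        std::vector<std::string> incompatibleWith; // options that must be grayed out if this one is active
    };

    // The game never hard-codes features: it just walks whatever list the driver
    // returns (faked here with static data) and generates its settings panel from it.
    std::vector<DriverOption> QueryDriverOptions() {
        return {
            {"Antialiasing", ControlType::Dropdown, {"Off", "2x", "4x", "8x", "16x CSAA"}, {"HDR"}},
            {"HDR",          ControlType::Checkbox, {"Off", "On"},                         {"Antialiasing"}},
            {"Anisotropic",  ControlType::Slider,   {"1x", "2x", "4x", "8x", "16x"},       {}},
        };
    }

    void BuildControlPanel(const std::vector<DriverOption>& options,
                           const std::vector<std::string>& enabled) {
        for (const auto& opt : options) {
            bool grayedOut = false;
            for (const auto& conflict : opt.incompatibleWith)
                for (const auto& on : enabled)
                    if (conflict == on) grayedOut = true;
            std::cout << (grayedOut ? "[grayed out] " : "[available]  ") << opt.name << ": ";
            for (const auto& v : opt.values) std::cout << v << " ";
            std::cout << "\n";
        }
    }

    int main() {
        // The user has turned HDR on, so the panel grays out AA automatically.
        BuildControlPanel(QueryDriverOptions(), {"HDR"});
    }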
  • Scrogneugneu - Thursday, March 15, 2007 - link

    That's exactly what I was thinking about :)

    A generic (programming) interface, allowing games to register themselves and specify which features they do NOT support at all, which features they do support, and which features they allow even if unsupported. "Feature" here means a kind of feature, not a level of it (meaning the application can declare that it supports AA, but can't specify that it supports AA only up to 8x).

    Just make sure to include a "required computation power" system. Some kind of measurement where the application can say that, to perform well under X settings, it would require roughly such-and-such a fill rate, that many shaders, and so on to get around X FPS on average.

    The application can then just define what it believes is a "Fluid playing experience", a "Decent playing experience" and a "Beautiful playing experience" in terms of FPS (which is bound to differ between game styles, as a shooter will require a higher FPS than an RTS) and shoot whatever the user chooses, along with the performance evaluation ratings, at the driver. Mister Joe Schmoe then just has to choose "Fluid playing experience", and the application will give the raw computing power requirements as well as the target average FPS to the driver.

    The driver then shoots back the settings to adopt in order to achieve that level of gameplay. Since the driver knows what card the application will run on, the driver also knows what the capabilities of the card are. The task of the driver is then to evaluate what settings the card can support, considering the amount of computing power the application requires (as defined by the application) and considering the target performance (defined by the application too).
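
    A rough sketch of that handshake, with completely made-up numbers and names, just to show its shape:

    // Hypothetical sketch: the game hands the driver its measured requirements plus a
    // target frame rate, and the driver picks the settings that fit the card.
    #include <iostream>

    struct AppRequirements {
        double fillRateGPixels; // per-frame fill rate the developer measured in testing
        int    shaderLoad;      // arbitrary "shader work" units, also measured
        int    targetFps;       // what "Fluid"/"Decent"/"Beautiful" means for this game
    };

    struct DriverSettings {
        int  aaLevel;
        int  afLevel;
        bool hdr;
    };

    // The driver knows which card it is driving, so it scales quality until the
    // budget fits the requested frame rate (grossly simplified, of course).
    DriverSettings NegotiateSettings(const AppRequirements& req, double cardGPixelsPerSec) {
        double headroom = cardGPixelsPerSec / (req.fillRateGPixels * req.targetFps);
        if (headroom > 4.0) return {8, 16, true};
        if (headroom > 2.0) return {4, 8, true};
        if (headroom > 1.0) return {2, 4, false};
        return {0, 1, false};
    }

    int main() {
        AppRequirements fluid{0.05, 100, 60};               // "Fluid playing experience" as defined by the game
        DriverSettings s = NegotiateSettings(fluid, 10.0);  // pretend the card pushes 10 GPixels/s
        std::cout << "AA " << s.aaLevel << "x, AF " << s.afLevel << "x, HDR " << (s.hdr ? "on" : "off") << "\n";
    }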


    In the end, the application developers have to implement an interface of their own on top of the graphics programming interface offered by the drivers, as well as perform several tests to pinpoint the real computational power required to run the application. Having this in hand, they blindly shoot that at the driver, which chooses the best settings. The driver only has to know the specific distribution of computing capacity of every card it supports, which will allow it to simply generate the best settings for the application in question. Should the developers tweak their application, they can rerun their tests to get a more accurate picture of its requirements, and the settings will adjust themselves.

    As drivers and graphics cards evolve, so does their computational power. But since the driver decides for itself what settings to use (on a per-application basis), these new advances can and will be used by older games. Say we create AAx64 in a year. If a game implemented the interface right, it will ask the driver what features are supported, and at what generic level. Any card supporting AA will answer that it supports AA, and this new super card will let the application know that it supports it up to the "64" level. Therefore, the AAx64 mode will be available right away. An all-new feature, already in use everywhere.

    Application developers are happy since their products get the advantages of new technologies without them even lifting a finger. Driver developers are happy since their new technology gets used very quickly, removing that annoying "it's great, but no game uses it yet" period. Average users are happy because they get way more out of their money. And as long as there is that magic "Custom" button available, enabling anyone to choose whatever settings they want, gamers will be happy too.

    The only drawback: this has to be implemented by every vendor. So NVIDIA and AMD should work together to produce the basis of that interface (which has to be XML-like, as in extensible, so that a new option from one vendor won't change the whole interface again, and generic, as in the application should never have to handle special cases). After that, I promise, you can tweak the hell out of your own drivers to get the best match for your card's capabilities :)
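
    And a tiny sketch of the capability-query side (again with invented names), showing how a new "AAx64" level would surface in an old game without touching its code:

    // Hypothetical sketch: the game asks for generic feature levels instead of
    // hard-coding modes, so a future "AAx64" shows up in an old game automatically.
    #include <iostream>
    #include <string>
    #include <vector>

    struct FeatureCaps {
        std::string feature; // e.g. "AA"
        int maxLevel;        // e.g. 16 today, 64 on next year's card
    };

    // Stand-in for the vendor-neutral query every driver would have to implement.
    std::vector<FeatureCaps> QueryDriverCaps() {
        return {{"AA", 64}, {"AF", 16}, {"HDR", 1}};
    }

    int main() {
        // The game only declares which *kinds* of features it supports, never the levels.
        std::vector<std::string> supportedByGame = {"AA", "AF"};

        for (const auto& cap : QueryDriverCaps())
            for (const auto& f : supportedByGame)
                if (cap.feature == f)
                    std::cout << f << " offered up to " << cap.maxLevel << "x\n";
    }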
  • waika - Thursday, March 15, 2007 - link

    This would be an excellent occasion for NVIDIA to overhaul the entire driver interface; the current approach, which offers no fewer than three separate interfaces (tray menu, legacy control panel, and new interface), is a confusing mess to the novice user, and is cumbersome, poorly organized, and of dubious utility to the more experienced and technically savvy fan of the technology.

    None of NVIDIA's current driver interface presentations offers a clear, well-organized, and consistent paradigm for driver feature control -- and adding new features to any of the current interfaces is sure to just make matters worse. It's a very sad state of affairs when many third-party driver control interfaces developed by amateurs offer better interface design that is easier to understand for end users at all levels.

    This isn't a very good venue for addressing better approaches to interface design, as that requires illustration with mock-ups and working examples of good interface design; but I'd certainly welcome any opportunity to apply some of my skills and knowledge to NVIDIA products if such a venue presents itself or if they contact me through these forums.

    I have a strong affection for NVIDIA products and hope they'll be addressing the issues of driver interface design and feature control in future iterations of their driver products, as this is one area where NVIDIA products have not been improving.
  • CrashMaster - Thursday, March 15, 2007 - link

    One thing that could help is to have an overlay on the screen (like Fraps), at least for a few seconds after a game loads (or loads a map), that tells the user what resolution and what AA/AF mode the card is rendering at.

    Another thing that would help: if a setting is being overridden by something, have a note of that on the overlay (e.g. the game is forcing 4x AA when you have it set to 8x AA in the control panel).
  • BrightCandle - Thursday, March 15, 2007 - link

    Vista comes with a performance testing tool to determine which OS features should be enabled. They completely missed a trick, though, when it comes to DX10. DX10 lacks optional features, so all the cards are guaranteed to support them all; why not have a more detailed performance test that sets all these settings up for you by default?

    Humour me for a minute and consider if games came with just a few measures of their usage of CPU, hard drive, and graphics power. They could include information on the settings they use and the corresponding number of shaders each would then use to achieve the effect. A simple developer tool could be used to capture the numbers across the entire range of usage (a debug version of DirectX, which all developers use anyway).

    The OS (Vista) has also captured a set of performance numbers against these same metrics, so it knows how your PC compares to what the game demands. It can combine that with the known native resolution of your monitor and choose the appropriate sound mode for your speakers.

    The OS can compare its own stats against the game's, set up the graphics, sound, etc., and only return the information the game actually needs to choose any remaining settings (such as the sound mode). This completely removes the need for graphics options in the game at all; indeed, sound could go as well, as could the "game settings". All that really needs to happen in the future is for Vista to support user-defined gaming profiles, so that if one game did perform badly (or you wanted to try eye candy that the PeeCee says you don't have the hardware to run, but you think you know better) you could override it. Maybe after you have installed or launched a game it would show you what settings it chose, and you could override them.

    To me it is a very telling sign of bad design that every game has to implement its own menu of audio/video settings. If you've ever tried to program DirectX games you'll know there is a lot of boilerplate code and global variables for all this information, and it's all there because no one has tried to pull it into the OS yet. The OS is the best place for it; users shouldn't need to know about these things just to play a game.

    I think this also opens another avenue where you could ask the system, "OK, what would it take to be able to use soft shadows?" and it could come back and say you need 128 shaders running with a performance of blah. Graphics card companies publish those stats, they get to shift more hardware, and people can easily know what they need to do to play games the way they want with decent performance.
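
    Something like this (all numbers and names invented) is the kind of query I mean:

    // Hypothetical sketch: the game or the user asks what hardware a given effect
    // needs, and the OS compares that against what its performance test measured.
    #include <iostream>
    #include <string>

    struct HardwareProfile {
        int    shaderUnits;
        double gflops;
    };

    struct EffectCost {
        std::string     effect;
        HardwareProfile required; // published by the vendor / game developer
    };

    int main() {
        HardwareProfile myCard{96, 350.0};                     // captured by the OS's own performance test
        EffectCost softShadows{"Soft shadows", {128, 500.0}};  // what the effect is said to need

        bool enough = myCard.shaderUnits >= softShadows.required.shaderUnits &&
                      myCard.gflops      >= softShadows.required.gflops;

        std::cout << softShadows.effect << ": "
                  << (enough ? "should run fine"
                             : "needs at least " + std::to_string(softShadows.required.shaderUnits)
                               + " shader units") << "\n";
    }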
  • kevjay - Thursday, March 15, 2007 - link

    I agree with the guy who said SSAA is the way to go. It's pathetic that nothing has approached 3dfx's level, 7 long years later, and that people are so happy with crap like MSAA and CSAA.
    Here is exactly what they should do:
    Resurrect 3dfx's FSAA, which is 4x RGSS.
    Make a double mode for transparent textures.
    Give the option to disable color compression!
    Don't force gamma correction.
  • DerekWilson - Thursday, March 15, 2007 - link

    I used to be a huge fan of things like SRAM for system memory and SSAA all the time, but more and more I see the need to maintain a balance of acceptable quality and performance.

    OK, so I'm still for SRAM as system memory and SSAA, but I've learned not to expect these options :-)

    And really, there is a better option than SSAA -- higher resolution, smaller pixel size monitors.

    As pixel size gets smaller, aliasing artifacts also get smaller and antialiasing becomes less useful. AA is a stop gap until we can reach sufficiently small pixel sizes.

    Really, MSAA and SSAA are just forms of edge blurring. This decreases detail even where we may not want detail decreased. MSAA sticks to polygon edges, but SSAA blurs all color edges.

    4xSSAA is manageable, but going further than that is incredibly memory intensive and impractical unless the target resolution is very small (even rendering a 640x480 image with 8xSSAA would result in more pixels than a 30" 2560x1600 monitor).
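
    Rough numbers for that comparison, reading 8xSSAA as an 8x8 supersampled grid (that reading is my assumption): 640 x 480 x 64 = 19,660,800 subsamples rendered, while a 2560x1600 panel is 2560 x 1600 = 4,096,000 pixels.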

    I'm all for options though, and it might be fun for graphics hardware makers to include full SSAA options just to play with even if performance suffers greatly.
  • mongo lloyd - Saturday, March 17, 2007 - link

    Hate to break it to you, but pixel pitch is getting LARGER.
    People are retards and buy larger monitors with the same resolution as their 2-4" smaller old monitors.

    There's still no LCD out which has a comparable pixel pitch to the dot pitch of my 2002 iiyama CRT... and the trend is actually going backwards.
  • Ver Greeneyes - Friday, March 16, 2007 - link

    While I'm in full agreement with you that SSAA would just be a stopgap solution, with LCD TVs reaching sizes of 108" with -the same resolution- as 30" ones, I don't think we should expect higher resolutions anytime soon. There just doesn't seem to be any demand for it right now.

    One thing that I think should be addressed on an operating system level is how small things get on the desktop at higher resolutions. I mean, what's stopping OS developers from using bigger fonts? It'll look better and stop people complaining that high resolutions make things tiny, something that should not be an argument.
  • Trisped - Friday, March 16, 2007 - link

    You can increase the font sizes for the desktop, etc., in Windows. Just right-click on the desktop, select Properties, go to the Appearance tab, click Advanced, and select Icon from the drop-down menu. You can change a lot of things in there. Other OSes probably have something similar; you just need to find it.
