The Driver Enabled AA Question

Currently driver AA settings are very complex, especially for NVIDIA hardware. In-game settings can be tailored to each game individually, and future games will be able to support AA with floating point textures and enable CSAA in-game on NVIDIA hardware. So wouldn't the world be a better place if we could just throw out driver AA settings and rely on games?

Such a theory might work well for future games, but the reality is that driver AA settings are a necessity for enabling the functionality in older games, or in current games that left out AA support in favor of floating point surfaces. In order to allow gamers to continue to benefit from these features while avoiding the compatibility issues inherent in current and future games, NVIDIA is considering altering the presentation of AA in its driver.

We would like to make it clear that NVIDIA hasn't taken any steps in any particular direction at this point. In fact, any feedback we can get from our readers on these options would be of particular interest to NVIDIA's driver team. If any of these ideas stands out as a winner, please let us know in this article's comments.

One direction NVIDIA may take is to remove the override AA options from the general settings while keeping the enhance options on the main screen. This would allow gamers to enable CSAA in games that don't yet support the option in-game, while encouraging the use of in-game settings for MSAA. For applications that don't have in-game settings, NVIDIA would still allow override AA to be set in the game profiles. This would let older applications benefit from AA, and NVIDIA could disable the option in profiles for games that are fundamentally incompatible with MSAA.
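The override/enhance distinction can be sketched as a simple precedence rule. This is purely an illustration of the concept as described above; the mode names and the way "enhance" combines the two sample counts are our assumptions, not NVIDIA's actual driver logic:

```python
from enum import Enum

class DriverAAMode(Enum):
    APP_CONTROLLED = "application-controlled"  # use whatever the game asks for
    ENHANCE = "enhance"                        # upgrade AA the game already enabled
    OVERRIDE = "override"                      # force the driver setting regardless

def effective_aa(driver_mode: DriverAAMode, driver_samples: int, game_samples: int) -> int:
    """Return the AA sample count actually applied to the frame (hypothetical rule)."""
    if driver_mode is DriverAAMode.OVERRIDE:
        return driver_samples                  # driver wins unconditionally
    if driver_mode is DriverAAMode.ENHANCE:
        # enhance only kicks in when the game has turned AA on itself
        return max(driver_samples, game_samples) if game_samples > 0 else 0
    return game_samples                        # application-controlled
```

Under this sketch, "enhance" is harmless for games without AA support (nothing is forced on them), while "override" is the setting that can break games fundamentally incompatible with MSAA.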

We aren't sold on the idea of profiles, but it was also suggested that a Coolbits-like feature could be used to expose override AA modes. This would allow gamers who really know what they want to retain access to the feature in a traditional way after setting a specific registry key.

Another, less complicated approach being considered is the addition of a warning box that pops up when AA settings are changed. This informational approach would explain the possible complications of enabling override AA on an application that doesn't support it. The warning would have added benefit if NVIDIA included in it a list of games known not to support MSAA (these games should already be noted in NVIDIA's release notes).

The bottom line is that NVIDIA wants to provide the "... best default settings for the broadest set of users for the most likely scenarios." We certainly know what we as reviewers would like to see, but we would love to hear from our readers on the subject.

Comments

  • mino - Saturday, March 17, 2007 - link

    Yeah, you can do that. Except that setting works
    a) non-proportionally
    b) only on some fonts at some locations

    The result is that it's clearly inferior to a big dot size.

    The reason is simple: Windows was inherently built for a certain DPI, and all those "big-font" options were just added later.

    Also, doing that would require all desktop objects to be vectorized - fonts, icons, images...
  • Optimummind - Thursday, March 15, 2007 - link

    This is my ideal scenario for how AA and AF should be handled for games.

    The game developers should include options for AA and AF in their games if it's possible with the way their game engines render graphics. If not, the game developers should inform the GPU manufacturers of these AA/AF limitations so the GPU manufacturers can use the information to find workarounds or use the info to inform the consumer about such limitations.

    Perhaps, both nVIDIA and ATI can have an applet within their respective driver control panels that is dedicated to a list of games that don't support AA/AF in an alphabetical list with a search option box.

    This way, if I can't find AA or AF settings in the game I'm playing, I can exit the game, open up the driver control panel, click on the applet with the list of games not supporting AA/AF, find my game on the list, then read about the situation. I would find the information useful if it covered things such as:

    (a) Due to the way the game is rendered by its engine, it's not compatible with the AA and/or AF modes supported by our GPU. This issue may be resolved if the game developer decides to patch the game engine.

    (b) The game is compatible with our GPU hardware but due to driver bugs, it's currently not working as intended. The fix is coming from us (GPU company) in the next revision.

    (c) Although the game doesn't include AA/AF options, forcing AA and AF via our driver control panel will work, as it has been tested and QA'd by us. Enjoy!!

    I think such a list via an optional applet within a driver control panel would be more visible and very informative. Sure, it would take some time to compile such a long list, but once you have the list, it's simply a matter of adding newer games to it, and it can keep growing. It would also be doing the consumer a huge favor by taking the guesswork out of why AA/AF doesn't work and whose actions are needed before it can be made to work. This would save trips to forums where we might not even find the answer to our questions.

    As for the other concepts about game profiles and override settings, I think those two should be kept intact. The profiles are useful b/c I can simply launch a game on my desktop shortcut and the settings I've determined through experiments to work can be automatically launched. But I think they will be more powerful with a list of games I've mentioned earlier. If this "list" idea is too hard to integrate into the driver, then perhaps ATI/nvidia should have a link within the driver control panel to a website that has this list.

    -optimummind
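The searchable, alphabetical compatibility list Optimummind describes boils down to a sorted map from game title to a compatibility note. A minimal sketch of the lookup, with entirely made-up titles and verdicts used only for illustration:

```python
import bisect

# Hypothetical compatibility notes keyed by game title; the titles and
# verdicts below are invented for the demo, not real driver data.
AA_COMPAT = {
    "Oblivion": "MSAA conflicts with HDR on some hardware; see release notes.",
    "Rainbow Six: Vegas": "Engine is incompatible with driver-forced MSAA; wait for a patch.",
    "Half-Life 2": "In-game AA works; driver override also tested.",
}

# Keep titles sorted so a control-panel applet could show an alphabetical list.
TITLES = sorted(AA_COMPAT)

def lookup(prefix: str) -> list[str]:
    """Return all titles starting with `prefix` (case-insensitive),
    using binary search on the sorted list rather than a full scan."""
    prefix = prefix.lower()
    keys = [t.lower() for t in TITLES]
    lo = bisect.bisect_left(keys, prefix)
    hits = []
    while lo < len(keys) and keys[lo].startswith(prefix):
        hits.append(TITLES[lo])
        lo += 1
    return hits
```

As the comment notes, maintaining the data is the hard part; the search-box mechanics themselves are trivial.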
  • crimson117 - Thursday, March 15, 2007 - link

    I'm confused by having AA settings both in game and in the driver. Which gets precedence? Last set? Always the driver? Always the game? If I set it in the driver, then load the game, does the game override the driver?

    I'm sufficiently confused by this that I don't think I turn it on at all. I wish it was easier to understand.
  • Imnotrichey - Friday, March 16, 2007 - link

    I never even knew there was an AA setting outside the game. I always did it through the game. Since I could change it within the game, I guess that means my CCC is set to off.
  • DigitalFreak - Thursday, March 15, 2007 - link

    As far as I know, the driver will always override the in-game settings.
  • sc0rpi0 - Thursday, March 15, 2007 - link

    Why should I have to tell the game/driver what <xxx> settings to use? I only care about the level of play (easily expressed in FPS) of the game I'm currently in. All I should have to specify (in the game) is the min and max FPS settings. That's it! The game/driver should then go off and figure out how to give me my acceptable level of gaming "experience." Sheesh!
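What sc0rpi0 is asking for is essentially a feedback loop: the user states an acceptable FPS window and the driver walks the quality settings up or down to stay inside it. A toy sketch of one such controller, where the AA quality ladder and the single-step adjustment policy are invented for the demo:

```python
# Quality ladder, cheapest to most expensive (hypothetical sample counts).
AA_LEVELS = [0, 2, 4, 8, 16]

def adjust_aa(level_idx: int, measured_fps: float, fps_min: float, fps_max: float) -> int:
    """Return a new index into AA_LEVELS after one frame-rate measurement."""
    if measured_fps < fps_min and level_idx > 0:
        return level_idx - 1          # too slow: drop one quality step
    if measured_fps > fps_max and level_idx < len(AA_LEVELS) - 1:
        return level_idx + 1          # headroom to spare: raise quality
    return level_idx                  # inside the acceptable window: hold
```

A real implementation would have to juggle many more knobs than AA (resolution, AF, shader detail) and damp oscillation, which is part of why neither vendor ships this today.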
  • gmallen - Thursday, March 15, 2007 - link

    I find that graphics horsepower for the games I play (i.e., Oblivion) is better used for textures and resolution. I can't stand the poor color and lousy visual quality of LCD so I use CRTs (I have squirreled away a couple of good ones for the future.); thus I have no need for odd resolutions. Drivers still seem to be easy to use but the UI (Catalyst for my ATI) needs to either be tossed or completely redesigned. I cannot believe that anyone with any UI design experience came near the Catalyst project.
  • VooDooAddict - Thursday, March 15, 2007 - link

    Personally, I like the idea of out of game profiles to set rez/AA/quality settings. The profiles should be managed by a separate app that can keep track of what games allow what settings.
  • Araemo - Thursday, March 15, 2007 - link

    Seriously, why won't someone push super-sampling AA? If I'm not mistaken, SSAA doesn't require any fancy tricks and will work correctly with any rendering technique, as long as it renders to a 2D raster image (i.e., the framebuffer) at the end of the pipeline.

    SSAA just requires you to render at 4x the resolution (2x vertical and 2x horizontal) for 4x anti-aliasing, internally to the card, and then downsample it for output, which cleans up jaggies everywhere. With the extra power of modern video cards, I'd think ATI would be hyping this quite a bit if they had some way to do it with less performance hit than on NVIDIA hardware (or vice versa if NVIDIA has the hardware advantage), as a way to sell higher end cards.

    I'd love to go back to SSAA. My Voodoo5 spoiled me with image quality back in the day, even if driver compatibility was horrible. I generally don't have a 24" or 30" screen, so I don't need a card capable of rendering 2560x2048... unless you're going to let me use SSAA at that res, and downsample it for my 1280x1024 LCD. THAT would make me a happy, paying customer.
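The downsample step Araemo describes is just a box filter: average each 2x2 block of the oversized internal render into one output pixel. A minimal sketch for a 4x (2x2 ordered-grid) pattern, using a grayscale image for brevity; real hardware works per RGB channel and may use rotated-grid sample positions instead:

```python
def downsample_2x2(pixels: list[list[float]]) -> list[list[float]]:
    """Box-filter a supersampled framebuffer: average each 2x2 block into
    one output pixel. `pixels` must have even width and height."""
    h, w = len(pixels), len(pixels[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (pixels[y][x] + pixels[y][x + 1] +
                     pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out
```

Because the filter runs on the finished framebuffer, it is indifferent to how the pixels were produced, which is exactly why SSAA sidesteps the engine-compatibility problems that plague MSAA, at roughly 4x the rendering cost.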
  • Ver Greeneyes - Friday, March 16, 2007 - link

    I second this. A friend of mine and I are both going to buy GeForce 8800s soon, but I know they'll be too powerful for the games we play :P My old CRT monitor can do 1600x1200, but resolutions even that high still cost a lot of money on LCDs.. so why not spice things up with some super-sampling? On LCDs you don't generally get or need refresh rates higher than 60Hz, and games tend to feel smooth as low as 30FPS, so all the horsepower the newest cards have is just going to waste right now. (Sure, you can tweak the Oblivion .ini file 'till you're cross-eyed, but I think SSAA would give better image quality painlessly.)
