Performance and compatibility are always at the top of the list of topics addressed in each new graphics driver release, and our articles usually focus on performance. Over the past couple of years we've seen the user interface change quite a bit as well. Drivers have been growing in size and complexity for a while now, and sometimes it is worthwhile to stop and take a look at where things are headed.

We haven't been hugely impressed with the UI direction either AMD or NVIDIA has taken. We've had quite a few conversations at AnandTech lamenting the loss of the simple driver interface embedded in the advanced display properties panel. It is desirable for novice or less technically inclined users to be able to understand and benefit from driver settings, but decisions on how best to enhance the graphics driver experience can't be taken lightly. It is in this spirit that NVIDIA contacted us about some options it is considering for the future direction of its driver.

For the past few years, the driver setting that has had the single heaviest impact on performance has been antialiasing. In the beginning, games didn't include AA settings, but users were still able to benefit from the feature by enabling it through the driver control panel. More recently, games have let users set their desired level of antialiasing through in-game graphics settings. But sometimes a game will include a feature that prevents traditional MSAA (multi-sample antialiasing) from working properly. We saw this very early in Halo, one of the first DX9 games. Later, titles that made use of floating point surfaces (often useful in HDR rendering) also excluded the option for MSAA. Today, while both NVIDIA and AMD ship hardware that can support MSAA on floating point surfaces, some developers are taking entirely different approaches to rendering that get in the way of the very concept of MSAA.
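To make the hardware dependence concrete, here is a minimal sketch (ours, not from any shipping game) of the capability check a DX9-era HDR title might perform before exposing an MSAA option. The API calls are standard Direct3D 9; the probing pattern itself is our assumption:

    // Probe whether the default adapter supports 4x MSAA on an FP16
    // (A16B16G16R16F) render target -- the surface format commonly used
    // for HDR rendering. Sketch only; error handling kept minimal.
    #include <d3d9.h>
    #include <cstdio>

    int main()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        DWORD qualityLevels = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_A16B16G16R16F,        // FP16 surface
            TRUE,                        // windowed mode
            D3DMULTISAMPLE_4_SAMPLES,
            &qualityLevels);

        if (SUCCEEDED(hr))
            std::printf("4x MSAA on FP16: supported, %lu quality levels\n",
                        static_cast<unsigned long>(qualityLevels));
        else
            std::printf("4x MSAA on FP16: not supported on this hardware\n");

        d3d->Release();
        return 0;
    }

On hardware that predates FP16 multisampling, the call simply fails, which is why those games excluded the MSAA option.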

The questions we are going to ask are: how do (and should) we set AA, how should game developers handle AA, and how should graphics hardware makers address AA in their drivers? Before we get there, let's take a deeper look at some of the complexity associated with AA, particularly on NVIDIA hardware.

The Increasing Complexity of AA
Comments

  • mino - Saturday, March 17, 2007 - link

    Yeah, you can do that. Except that setting works
    a) non-proportionally
    b) only on some fonts in some locations

    The result is that it is clearly inferior to a larger dot size.

    The reason is simple: Windows was inherently built for a certain DPI, and all those "big-font" options were just added later.

    Also, doing that would require every desktop object to be vectorized - fonts, icons, images...
  • Optimummind - Thursday, March 15, 2007 - link

    This is my ideal scenario for how AA and AF should be handled for games.

    Game developers should include options for AA and AF in their games if the way their engines render graphics allows it. If not, they should inform the GPU manufacturers of these AA/AF limitations so the manufacturers can look for workarounds or, failing that, inform consumers about the limitations.

    Perhaps both nVIDIA and ATI could have an applet within their respective driver control panels dedicated to an alphabetical, searchable list of games that don't support AA/AF.

    This way, if I can't find AA or AF settings in the game I'm playing, I can exit the game, open up the driver control panel, click on the applet with the list of games not supporting AA/AF, find my game on the list, then read about the situation. I would find the information useful if it covered such things as:

    (a) The way the game is rendered by its engine is not compatible with the AA and/or AF modes supported by our GPU. This issue may be resolved if the game developer decides to patch the game engine.

    (b) The game is compatible with our GPU hardware, but due to driver bugs it's currently not working as intended. A fix is coming from us (the GPU company) in the next driver revision.

    (c) Although the game doesn't include AA/AF options, forcing AA and AF via our driver control panel will work, as it has been tested and QA'd by us. Enjoy!!

    I think such a list, via an optional applet within the driver control panel, would be highly visible and very informative. Sure, it would take some time to compile such a long list, but once you have it, it's simply a matter of adding newer games, and it can keep growing. It would also do the consumer a huge favor by taking the guesswork out of why AA/AF doesn't work and whose actions are needed before it can be made to work. This would save trips to forums where we might not even find the answer to our questions.

    As for the other concepts about game profiles and override settings, I think those two should be kept intact. The profiles are useful because I can simply launch a game from its desktop shortcut and the settings I've determined through experimentation to work are applied automatically. But I think they would be even more powerful combined with the list of games I mentioned earlier. If this "list" idea is too hard to integrate into the driver, then perhaps ATI/NVIDIA should have a link within the driver control panel to a website that hosts the list.

    -optimummind
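The three outcomes Optimummind lists map naturally onto a small lookup table. Here is a hypothetical sketch of how such an applet might store its game list (every name below is illustrative, not a real driver API; the Halo entry reflects the engine limitation mentioned earlier in the article):

    // Hypothetical per-game AA/AF status table; entries are illustrative.
    #include <iostream>
    #include <map>
    #include <string>

    enum class AAStatus {
        EngineIncompatible,  // (a) rendering approach blocks MSAA; needs a game patch
        DriverBug,           // (b) hardware is capable; fix due in a driver update
        ForceWorks           // (c) no in-game option, but the driver override is QA'd
    };

    int main()
    {
        const std::map<std::string, AAStatus> gameList = {
            {"Halo",          AAStatus::EngineIncompatible},
            {"Example Title", AAStatus::ForceWorks},
        };

        auto it = gameList.find("Halo");
        if (it != gameList.end() && it->second == AAStatus::EngineIncompatible)
            std::cout << "MSAA unavailable: engine limitation, awaiting patch\n";
        return 0;
    }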
  • crimson117 - Thursday, March 15, 2007 - link

    I'm confused by having AA settings both in the game and in the driver. Which takes precedence? The last one set? Always the driver? Always the game? If I set it in the driver, then load the game, does the game override the driver?

    I'm sufficiently confused by this that I don't turn it on at all. I wish it were easier to understand.
  • Imnotrichey - Friday, March 16, 2007 - link

    I never even knew there was an AA setting outside the game. I always did it through the game. Since I could change it within the game, I guess that means my CCC is set to off.
  • DigitalFreak - Thursday, March 15, 2007 - link

    As far as I know, the driver will always override the in-game settings.
  • sc0rpi0 - Thursday, March 15, 2007 - link

    Why should I have to tell the game/driver what <xxx> settings to use? I only care about the level of play (easily expressed in FPS) of the game I'm currently in. All I should have to specify (in the game) is the min and max FPS settings. That's it! The game/driver should then go off and figure out how to give me my acceptable level of gaming "experience." Sheesh!
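What sc0rpi0 is asking for amounts to a feedback loop: watch frame time, and step quality up or down to stay inside a user-chosen FPS window. A minimal sketch of the idea (all names hypothetical; a real engine would adjust more than just AA):

    // Hypothetical controller that keeps FPS inside [minFps, maxFps]
    // by stepping the MSAA sample count through 0 -> 2 -> 4 -> 8.
    #include <algorithm>

    struct QualityController {
        double minFps = 30.0, maxFps = 60.0;
        int aaLevel = 4;   // current MSAA sample count

        void update(double frameSeconds) {
            const double fps = 1.0 / frameSeconds;
            if (fps < minFps)                 // too slow: step AA down
                aaLevel = (aaLevel >= 4) ? aaLevel / 2 : 0;
            else if (fps > maxFps)            // headroom: step AA up
                aaLevel = (aaLevel == 0) ? 2 : std::min(8, aaLevel * 2);
        }
    };

    int main() {
        QualityController qc;
        qc.update(1.0 / 25.0);   // one 25 FPS frame: controller drops 4x to 2x
        return (qc.aaLevel == 2) ? 0 : 1;
    }

In practice a controller like this would need damping (averaging over many frames) to avoid oscillating between settings.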
  • gmallen - Thursday, March 15, 2007 - link

    I find that graphics horsepower for the games I play (i.e., Oblivion) is better spent on textures and resolution. I can't stand the poor color and lousy visual quality of LCDs, so I use CRTs (I have squirreled away a couple of good ones for the future); thus I have no need for odd resolutions. Drivers still seem easy enough to use, but the UI (Catalyst for my ATI) needs to either be tossed or completely redesigned. I cannot believe that anyone with any UI design experience came near the Catalyst project.
  • VooDooAddict - Thursday, March 15, 2007 - link

    Personally, I like the idea of out-of-game profiles to set res/AA/quality settings. The profiles should be managed by a separate app that can keep track of which games allow which settings.
  • Araemo - Thursday, March 15, 2007 - link

    Seriously, why won't someone push super-sampling AA? If I'm not mistaken, SSAA doesn't require any fancy tricks, and will work correctly with any rendering technique as long as it renders to a 2D raster image (i.e., the framebuffer) at the end of the pipeline.

    SSAA just requires you to render at 4x the resolution (2x vertical and 2x horizontal) for 4x anti-aliasing, internally to the card, and then down-sample for output, which cleans up jaggies everywhere. With the extra power of modern video cards, I'd think ATI would be hyping this quite a bit if it could do it with less of a performance hit than NVIDIA hardware takes (or vice versa, if NVIDIA has the hardware advantage), as a way to sell higher end cards.

    I'd love to go back to SSAA. My Voodoo5 spoiled me with image quality back in the day, even if driver compatibility was horrible. I don't have a 24" or 30" screen, so I don't need a card capable of rendering at 2560x2048... unless you're going to let me use SSAA at that res and downsample it for my 1280x1024 LCD. THAT would make me a happy, paying customer.
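Araemo's description is easy to make concrete: for 4x SSAA the card renders internally at 2x width and 2x height, then box-filters each 2x2 block of samples down to one output pixel. A minimal sketch over a grayscale buffer (our illustration; real hardware resolves each RGBA channel in dedicated logic):

    // Downsample a 2x-oversized buffer to the output resolution by
    // averaging each 2x2 block (the 4x SSAA "resolve" step).
    #include <cstdint>
    #include <vector>

    std::vector<uint8_t> resolve4xSSAA(const std::vector<uint8_t>& hiRes,
                                       int outW, int outH)
    {
        const int inW = outW * 2;   // supersampled buffer is 2x in each axis
        std::vector<uint8_t> out(outW * outH);
        for (int y = 0; y < outH; ++y) {
            for (int x = 0; x < outW; ++x) {
                const int sum = hiRes[(2*y)     * inW + 2*x]
                              + hiRes[(2*y)     * inW + 2*x + 1]
                              + hiRes[(2*y + 1) * inW + 2*x]
                              + hiRes[(2*y + 1) * inW + 2*x + 1];
                out[y * outW + x] = static_cast<uint8_t>(sum / 4);  // box filter
            }
        }
        return out;
    }

    int main() {
        std::vector<uint8_t> hi(4 * 4, 255);                // 4x4 supersampled buffer
        std::vector<uint8_t> lo = resolve4xSSAA(hi, 2, 2);  // 2x2 output
        return (lo[0] == 255) ? 0 : 1;
    }

The simplicity is the point: nothing in the resolve cares how the high-resolution image was produced, which is why SSAA keeps working where MSAA's geometry-edge approach breaks down.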
  • Ver Greeneyes - Friday, March 16, 2007 - link

    I second this. A friend of mine and I are both going to buy GeForce 8800s soon, but I know they'll be too powerful for the games we play :P My old CRT monitor can do 1600x1200, but resolutions even that high still cost a lot of money on LCDs... so why not spice things up with some super-sampling? On LCDs you don't generally get or need refresh rates higher than 60Hz, and games tend to feel smooth at as low as 30 FPS, so all the horsepower the newest cards have is just going to waste right now. (Sure, you can tweak the Oblivion .ini file 'till you're cross-eyed, but I think SSAA would give better image quality painlessly.)
