Final Words

Having tested a few of the games that ATI now detects under Catalyst AI, we can evaluate what the drivers are doing. ATI says that it tries to keep image quality nearly the same, and in the games that we've looked at, it does. Of course, the only games in which we saw any "real" improvement were Doom 3 and (only with AA/AF enabled) UT2K4. Maybe there is an issue with this version of the Catalyst AI functionality (this is a beta driver), or maybe there really is no performance difference in the other games that we tested. Higher performance hardware usually accentuates small tweaks and performance enhancements, and these games shouldn't be limited by other aspects of the system. We are definitely pleased with ATI's move away from dropping to bilinear filtering.

NVIDIA first started applying tight restrictions to its optimizations last year after its issue with Futuremark. We are glad to see ATI likewise moving to enhance the user experience for specific games where possible while tightening the reins on its own quality control. We see this as a very positive step and hope to see it continue.

We also like the fact that we have the ability to turn Catalyst AI functionality on or off. In our testing, we will be using the default setting, but for those who want the choice, it is theirs to make. Of course, right now it doesn't look like there is much reason to have Catalyst AI on except in Doom 3 and UT2K4. As the package matures, we will likely see more games affected and more reason to enable Catalyst AI. But ATI can be assured that we and others in the community will be doing image quality spot checks between runs with and without Catalyst AI, and the same goes for any optimization over which NVIDIA allows us control.
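
For readers who want to do the same kind of spot check themselves, the rough sketch below shows one way to go about it; it is not a tool used in this article, and the file names and tolerance are placeholder assumptions. It compares two screenshots captured with Catalyst AI on and off, reports how many pixels differ beyond a small per-channel tolerance, and saves an exaggerated difference map so subtle filtering changes stand out.

```python
# Sketch of an image quality spot check between two screenshots.
# Assumes Pillow and NumPy are installed; file names are placeholders.
from PIL import Image
import numpy as np


def diff_report(path_on: str, path_off: str, tolerance: int = 4) -> None:
    on = np.asarray(Image.open(path_on).convert("RGB"), dtype=np.int16)
    off = np.asarray(Image.open(path_off).convert("RGB"), dtype=np.int16)
    if on.shape != off.shape:
        raise ValueError("Screenshots must be the same resolution")

    # Per-pixel maximum difference across the R, G, B channels
    delta = np.abs(on - off).max(axis=2)
    changed = np.count_nonzero(delta > tolerance)
    total = delta.size
    print(f"Pixels differing by more than {tolerance}/255: "
          f"{changed} of {total} ({100.0 * changed / total:.3f}%)")

    # Amplify the difference map so small changes are visible to the eye
    Image.fromarray(np.clip(delta * 8, 0, 255).astype(np.uint8)).save("diff_map.png")


# Hypothetical screenshot names for a Doom 3 comparison
diff_report("doom3_catalyst_ai_on.png", "doom3_catalyst_ai_off.png")
```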

Over the past year, we have had ample opportunity to speak with the development community about optimizations and their general take on the situation. Developers tend to agree that, as long as the end result is very nearly the same, they appreciate any performance enhancement they can get. Since NVIDIA and ATI hardware cannot physically produce the same mathematical output, the same image will never appear on systems with different vendors' cards in them. But just as these two different results are equally valid under the constraints of the API used and the developer who implemented them, so too are optimized results that render faster but are not perceptibly different to the human eye.
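
As a concrete illustration of "numerically different but perceptually the same," the short sketch below computes the peak signal-to-noise ratio (PSNR) between two renders. The file names and the roughly 40 dB "visually indistinguishable" rule of thumb are our own assumptions for the example, not figures from this review.

```python
# Sketch of quantifying how close two renders are, using PSNR.
# Assumes Pillow and NumPy are installed; file names are placeholders.
from PIL import Image
import numpy as np


def psnr(path_a: str, path_b: str) -> float:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float("inf")  # bit-identical images
    # PSNR for 8-bit channels: 10 * log10(MAX^2 / MSE), MAX = 255
    return 10.0 * np.log10((255.0 ** 2) / mse)


value = psnr("render_vendor_a.png", "render_vendor_b.png")
# As a rough rule of thumb, values above ~40 dB are generally hard to
# distinguish by eye even though the pixel values are not identical.
print(f"PSNR: {value:.2f} dB")
```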

Both ATI and NVIDIA want to maintain acceptable image quality because they know that they'll be held accountable for not doing so. If it's not the API architects (who can take away the marketing tool of feature set support), then it's game developers. If it's not the game developers, then it's the end users. And the more we know about what we are seeing, the better we are able to help ATI and NVIDIA give us what we want. As the mantra of "optimizations are bad, mmkay" lifts, we should encourage both companies to focus on balancing mathematically accurate output against pure speed in a way that matches our own perception.

Comments

  • 861 - Tuesday, September 28, 2004 - link

    a great tool!
  • dvinnen - Tuesday, September 28, 2004 - link

    That's good. The only difference I saw was a little lighting difference on the floor, and even then very slight. Good for an extra 10 frames.
  • gordon151 - Monday, September 27, 2004 - link

    They did make some general OpenGL improvements in recent drivers that helped the 8500 in Doom 3 and some other OpenGL games, but that's pretty much it for that class.
  • ksherman - Sunday, September 26, 2004 - link

    It's too bad all the new driver enhancements have abandoned the Radeon 8500... I could really use some of the improvements in games... :(
  • KillaKilla - Sunday, September 26, 2004 - link

    Here it is: it's the third image down. I don't know if the whole .NET Framework could make this impossible, but I don't see how it could have an effect...

    http://www.anandtech.com/video/showdoc.aspx?i=1931...
  • KillaKilla - Sunday, September 26, 2004 - link

    Would it be possible to implement the old, but very, very effective trick for seeing the difference between AI settings? I'm talking about the thing where you hover your mouse over an image and it changes to the other image. I'll look for a link in a minute, if you don't see what I'm talking about.
  • acemcmac - Sunday, September 26, 2004 - link

    My question remains: can I finally leave Cat 3.8 and MMC 8.7 behind if I want MMC and multi-monitor support simultaneously?
  • coldpower27 - Sunday, September 26, 2004 - link

    I am glad ATI is giving us this suite, allowing users to choose whether to use optimized drivers and giving them the ability to disable the optimizations as well. Good for ATI; now they are on par with NVIDIA in this respect.
