Introduction

Coinciding with the launch of the X700 line of graphics cards, ATI slipped a little something extra into its driver. The latest beta version of Catalyst that we got our hands on includes a feature called Catalyst AI. Essentially, ATI took all of its optimizations, added a few extra goodies, and rolled everything together into one package.

Optimization has been a touchy subject for quite some time in the world of consumer 3D graphics hardware. Over the past year, we have seen the industry take quite a few steps toward putting what developers and users want above pure performance numbers (which is where its loyalty should have been all along). The backlash from the community over optimizations perceived as questionable seems to have outweighed whatever benefit companies saw in implementing such features in their drivers. After all, everything in this industry is driven by the bottom line, and the bottom line rests on public opinion.

There are plenty of difficulties in getting a high-quality, real-time 3D representation of a scene drawn something like every 25 thousandths of a second. The hardware that pushes thousands of vertices and textures into millions of pixels every frame needs to be both fast and wide. Drivers and games alike need to focus on doing the absolute minimum necessary to produce the desired image in order to keep frame rates playable. The fight for better graphics in video games isn't due to a lack of knowledge about how to render a 3D scene; faster graphics come as we learn how to approximate a desired effect more quickly. Many of the tricks, features, bells, and whistles that we see in graphics hardware have worked their way down to "close enough" approximations of the complex, accurate algorithms likely used in professional rendering packages.
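
To make that concrete, consider one famously "close enough" shortcut from real-time graphics: the fast inverse square root used to normalize vectors for lighting. The sketch below is purely illustrative (it is not anything from ATI's or NVIDIA's drivers); it shows how a bit-level guess plus a single refinement step can stand in for an exact 1/sqrt(x) at a fraction of the cost, with an error far too small to see in a rendered frame.

```c
#include <stdio.h>
#include <string.h>
#include <stdint.h>
#include <math.h>

/* Fast inverse square root (popularized by Quake III Arena): an
 * illustrative "close enough" approximation, not vendor driver code.
 * A bit-level trick produces a good first guess, and one
 * Newton-Raphson step refines it to within a fraction of a percent. */
static float fast_rsqrt(float x)
{
    float half = 0.5f * x;
    uint32_t i;
    memcpy(&i, &x, sizeof i);         /* reinterpret the float's bits */
    i = 0x5f3759df - (i >> 1);        /* magic constant: initial guess */
    memcpy(&x, &i, sizeof x);
    return x * (1.5f - half * x * x); /* one Newton-Raphson iteration */
}

int main(void)
{
    /* Normalize a vector the exact way and the approximate way. */
    float v[3] = { 3.0f, 4.0f, 12.0f };
    float len2 = v[0] * v[0] + v[1] * v[1] + v[2] * v[2];

    float exact  = 1.0f / sqrtf(len2);
    float approx = fast_rsqrt(len2);

    printf("exact: %f  approx: %f  error: %.4f%%\n",
           exact, approx, 100.0f * fabsf(approx - exact) / exact);
    return 0;
}
```

Shaving cycles off a routine that runs for every vertex or pixel is exactly the sort of trade-off that real-time rendering is built on; the question is always how much accuracy can be given up before someone notices.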

Determining just what accuracy is acceptable is a very tough job. The best measure that we have right now is this: the image produced by a video card and driver should look the way the developer intended it to look. Game developers know going in that they have to make trade-offs, and they should be the ones to make the choices.
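
When comparing screenshots, a simple objective screen can flag images that have drifted from the reference before the eyeball pass. The sketch below is our own illustration, not a tool either vendor ships: it computes the peak signal-to-noise ratio (PSNR) between two raw RGB buffers, where bit-identical images come back as infinite and a low number warrants a closer look. The tiny 2x2 buffers here are hypothetical stand-ins for real screenshots.

```c
#include <stdio.h>
#include <math.h>

/* Compare two same-sized raw 8-bit RGB framebuffers and report the
 * peak signal-to-noise ratio. A rough, objective first pass for
 * spotting when an "optimized" image has drifted from the reference. */
static double psnr(const unsigned char *ref, const unsigned char *test,
                   size_t n_bytes)
{
    double mse = 0.0;
    for (size_t i = 0; i < n_bytes; i++) {
        double d = (double)ref[i] - (double)test[i];
        mse += d * d;
    }
    mse /= (double)n_bytes;
    if (mse == 0.0)
        return INFINITY;  /* images are bit-identical */
    return 10.0 * log10(255.0 * 255.0 / mse);
}

int main(void)
{
    /* Toy 2x2 RGB images standing in for real screenshots. */
    unsigned char reference[12] = {200,10,10, 10,200,10, 10,10,200, 128,128,128};
    unsigned char optimized[12] = {198,12,10, 10,199,11, 12,10,200, 128,127,129};

    printf("PSNR: %.2f dB\n", psnr(reference, optimized, sizeof reference));
    return 0;
}
```

A metric like this only flags that two images differ; whether the difference is acceptable still comes down to whether the result looks the way the developer intended.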

So, what makes a proper optimization and what doesn't? What are ATI and NVIDIA actually doing with respect to optimization and user control? Is application detection here to stay? Let's find out.

Comments

  • 861 - Tuesday, September 28, 2004

    a great tool!
  • dvinnen - Tuesday, September 28, 2004

    That's good. The only difference I saw was a little lighting difference on the floor, and even then very slight. Good for an extra 10 frames.
  • gordon151 - Monday, September 27, 2004

    They did make some general OpenGL improvements in recent drivers that helped the 8500 in Doom 3 and some other OpenGL games, but that's pretty much it for that class.
  • ksherman - Sunday, September 26, 2004

    It's too bad all the new driver enhancements have abandoned the Radeon 8500... I could really use some of the improvements in games... :(
  • KillaKilla - Sunday, September 26, 2004

    Here it is: it's the third image down. Don't know if the whole .NET framework could make this impossible, but I don't see how it would have an effect...

    http://www.anandtech.com/video/showdoc.aspx?i=1931...
  • KillaKilla - Sunday, September 26, 2004

    Would it be possible to implement the old but very, very effective trick for seeing the difference between AI settings? I'm talking about the thing where you hover your mouse over an image and it changes to the other image. I'll look for a link in a minute if you don't see what I'm talking about.
  • acemcmac - Sunday, September 26, 2004

    My question remains... can I finally leave Cat 3.8 and MMC 8.7 if I want MMC and multi-monitor support simultaneously???
  • coldpower27 - Sunday, September 26, 2004

    I am glad ATI is giving us this suite, letting users choose whether to use optimized drivers and giving them the ability to disable the optimizations as well. Good for ATI; now they are on par with NVIDIA in this respect.
