Original Link: http://www.anandtech.com/show/1892

Looking Back: ATI’s Catalyst Drivers Exposed

It’s no secret in the hardware world that good software is often just as important as good hardware. The best processor, the best TV tuner, and even the best sound card can only be as good as the software and drivers backing them up. Even a small change in one critical piece of code can make a massive difference in both the performance and the sales of a piece of hardware.

Above all, however, this concept is embodied in the realm of video cards, where over the years, we have been spoiled by promises of “A performance improvement between 17 and 24% is noticed in Jedi Knight II” and “up to 25% performance improvement in popular consumer and professional applications”. These days, it’s not just common to see GPU makers find ways to squeeze more performance out of their parts - it’s expected. Finishing the design of a GPU and launching it are just the first steps of a much longer process of maximizing the performance of a part, a process that can quite literally take years.

The flexible nature of software, however, has caused a significant shift in the marketing strategies of GPU makers: the war is not over at launch time, but continues throughout the entire product cycle and into the next one as new optimizations and bug fixes are worked into drivers, keeping the performance landscape in constant motion. Just because a side did not win the battle at launch doesn’t mean that it can’t still take the lead later, and just because a side is winning now doesn’t mean that it will keep its win.

We have seen on more than one occasion that our benchmarks have been turned upside down and inside out, with cases such as ATI’s Catalyst 5.11 drivers suddenly giving ATI a decisive win in OpenGL games, when they were being soundly defeated just a driver version before. However, we have also seen this pressure to win drive all sides to various levels of dishonesty, hoping to capture the lead with driver optimizations that make a product look faster on a benchmark table, but literally look worse on a monitor. The Quake3 and 3DMark 2003 incidents, among others, have shown that there is a fine line between optimizing and cheating, and that as a cost of the flexibility of software, we may sometimes see that line crossed.

That said, when the optimizations, the tweaks, the bug fixes, and the cheats are all said and done, just how much faster has all of this work made a product? Are these driver improvements really all that substantial, or is much of the excitement an over-exuberant distraction over only minor issues? Do we have any way of predicting what future drivers for new products will do?

Today, we set out to answer these questions by taking a look back at a piece of hardware whose time has come and is nearly gone: ATI’s R300 GPU and the Radeon 9700 Pro.

R300 & The Test

As the patriarch of nearly 3 years' worth of technology from ATI, it's no mistake that we start out with one of the most influential GPUs ever made. The R300 was not only the primary architecture of ATI's entire 9500-9800 line of video cards starting in late 2002, but also the father of many of the design elements that went into ATI's R4xx GPUs, and it was only finally replaced by the R5xx series in the latter part of 2005, a testament to the strength of the R300's design.

For these reasons, not to mention the strong sales of both R3xx- and R4xx-based video cards, the R300 and its host video card, the Radeon 9700 Pro, are a prime example of what developments in device drivers can mean for a product. ATI has now had over 3 years to make the most of the software that drives the 9700 Pro, allowing us to see just how much more performance ATI could get out of the card with later drivers than was evident at the card's launch.

To identify and analyze these improvements, we have taken a 9700 Pro and run it through a modified version of one of our earlier benchmark suites, testing a slew of games and benchmarks in a regression test against a dozen versions of ATI's drivers, taking a quarterly snapshot of performance and image quality. Unfortunately, in spite of the R300 hardware supporting Shader Model 2.0 from the start, ATI did not offer such support in their official drivers until some months after the card shipped, so for our testing purposes, we had to start with the first drivers that offered such support, Catalyst 3.0, which are a couple of versions newer than the first drivers for the 9700 Pro. Still, as we will see, excluding the first few months of the 9700 Pro's life does not mean missing the performance improvements that ATI's Catalyst team was able to work out.
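The core of a regression test like this is simply comparing each driver's results against the earliest driver as a baseline. As a rough sketch of how the numbers throughout this article are derived (the frame rates below are placeholders for illustration, not our actual results):

```python
# Sketch of summarizing a driver regression test: compute each driver's
# frame rate change relative to the earliest (baseline) driver tested.
# The frame rates here are placeholder values, not real benchmark data.

def improvement_over_baseline(results):
    """Given an ordered list of (driver, avg_fps) pairs, return
    (driver, avg_fps, pct_change_vs_baseline) tuples."""
    baseline = results[0][1]
    return [(drv, fps, 100.0 * (fps - baseline) / baseline)
            for drv, fps in results]

catalyst_runs = [          # hypothetical averages for one game
    ("Catalyst 3.0", 80.0),
    ("Catalyst 3.4", 88.0),
    ("Catalyst 5.11", 90.4),
]

for driver, fps, pct in improvement_over_baseline(catalyst_runs):
    print(f"{driver}: {fps:.1f} fps ({pct:+.1f}% vs baseline)")
```

The same relative-change calculation applies whether the metric is a timedemo's average frame rate or a synthetic benchmark score.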

The specific games/benchmarks tested were:
  • D3DAFTester
  • Unreal Tournament 2004
  • Jedi Knight: Jedi Academy
  • Warcraft 3: The Frozen Throne
  • Halo
  • 3dMark 2003
In order to minimize any potential bottlenecks outside of the video card, we did not pair the 9700 Pro with an equally dated system; instead, we put it on the fastest AGP hardware that we had. As such, the test was done on the following:

AMD Athlon 64 3400+(S754)
Abit KV8-MAX3 motherboard
2GB DDR400 RAM 2:2:2
120GB Maxtor DiamondMax Plus 9 Hard Drive
Antec TruePower 430W Power Supply

All tests were done at 1280x1024 unless otherwise noted.


D3DAFTester

Although it’s not a benchmark in and of itself, D3DAFTester is a great diagnostic tool to see if there has been any sort of global change in anisotropic filtering quality, and if so, what kind of impact that change offers. Before we settle down with the benchmarks, it’s best to see if there have been any global changes made to the drivers that would help or harm overall IQ.

D3DAFTester, Tunnel Mode, 8x AF

D3DAFTester, Plane Mode @ 15x75, 8x AF

After going through all our selected Catalyst drivers at all AF quality levels, we really could have picked any screenshot to show for any given quality level. There is absolutely no difference in AF quality in any Catalyst driver, so ATI gets a clean bill of health here with regard to later performance changes.
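A check like this can be automated in principle with a simple per-pixel diff of the captured screenshots. A minimal sketch, assuming each capture is available as a raw RGB byte string of identical dimensions (the two tiny synthetic "frames" below stand in for real captures):

```python
# Sketch of a per-pixel screenshot diff, the kind of check used to confirm
# that AF quality did not change between driver versions. Real captures
# would be raw RGB framebuffer dumps; these synthetic frames are stand-ins.

def max_channel_diff(frame_a: bytes, frame_b: bytes) -> int:
    """Largest per-channel difference between two equally sized RGB frames.
    0 means the images are pixel-identical."""
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must be the same size")
    return max((abs(a - b) for a, b in zip(frame_a, frame_b)), default=0)

identical = bytes([10, 20, 30] * 4)                    # 4 identical RGB pixels
slightly_off = bytes([10, 20, 30] * 3 + [10, 22, 30])  # one channel nudged

print(max_channel_diff(identical, identical))      # 0: no IQ change
print(max_channel_diff(identical, slightly_off))   # 2: a filtering change
```

In practice, a small nonzero difference can also come from dithering or capture noise, so a threshold rather than strict equality is the safer test.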

Jedi Knight: Jedi Academy

Released in 2003, Jedi Academy represents the pinnacle of what the Quake3 engine could offer. With massive levels, dynamic glow, and lightsabers abound, it's one of the most punishing Quake3 engine games ever made, and a good representation of the vast number of games made in the early 2000's with this engine. As our only OpenGL title in this roundup, it's also our gauge to see if ATI's OpenGL performance changed at all over the 3-year period. That said, even with ATI's traditionally poor OpenGL performance, we still had to increase our testing resolution to 1600x1200 in order to put a sizable dent into our test setup; otherwise, we would continuously hit the 100fps frame rate cap.

Jedi Academy

Jedi Academy HQ

As the Quake 3 engine was already 3+ years old at the time of the earliest drivers, it should come as no surprise that there is not much variation to speak of here, either with or without AA/AF. Even so, ATI still managed to work in one significant performance improvement between the Catalyst 3.00 and 3.04 driver sets, with a 10% frame rate increase. The numbers are a bit more mixed with AA/AF enabled, but even here, the peak performance difference is a very noticeable 14%.

Looking at the screen captures, however, we see a very interesting story that the benchmarks do not show, and it's not all performance related.

Catalyst 3.09 versus 3.06 (mouse over to see 3.06)

Unlike the earlier comparison, there is a very noticeable IQ difference between the 3.06 and 3.09 drivers, yet looking at our charts, there is no corresponding difference in performance. This is a prime example of how drivers aren't just about performance improvements, as the IQ difference is the result of ATI fixing a bug related to dynamic glow. On drivers prior to 3.09, the JA team had to use a hack to get around a bug in ATI's drivers, causing the inferior image quality seen above. This hack is not used with drivers 3.09 and later, and as we can see, ATI was able to fix the bug without a performance hit. There was no further change to IQ after the 3.09 drivers.

Overall, however, Jedi Academy shows that other than early improvements and a bug fix, there was little change in performance in this game with the 9700 Pro.

Unreal Tournament 2004

As UT2003 and UT2004 are near-perfect substitutes for each other, we went with Epic's latest version of their best-selling multiplayer FPS in order to put the 9700 Pro up against 2004's more refined engine. UT2004 is a good example of a near-modern game, utilizing some SM 1.x features, and its engine was the engine of choice for many more games, including America's Army. With the number of games built on Unreal Engine 2.x, UT2004 represents an important engine to optimize for, given the era.

UT2004
UT2004 HQ

Here, we see almost no performance difference among the Catalyst drivers without AA/AF, with performance actually dropping just a bit between the 3.00 drivers and later drivers. Enabling AA/AF, however, paints a more positive picture, with a near-20% performance improvement between the 3.00 and 3.04 drivers, and a little more of a pickup after that.

Catalyst 5.11 versus 3.00 (mouse over to see 3.00)

Looking at the screenshots, you'd be hard pressed to find a difference in image quality between the 3.00 and 5.11 drivers, and it's the same story for all of the other Catalyst versions.

Other than improving AA/AF performance, it seems that ATI had little need to optimize for UT2004. Without a performance or IQ difference, there is little to say about the 9700 Pro with regards to UT2004.

Warcraft 3: The Frozen Throne

Leaving the realm of FPS's for a bit, we take a look at Blizzard's massively popular RTS Warcraft 3 and its expansion pack, The Frozen Throne. As we have mentioned in the past, even with its superb image quality, WC3 is not a terribly performance-intensive game, so we aren't expecting any surprises here. Because WC3 does not have a benchmarking mode, all frame rates are approximations taken with a custom replay and FRAPS.
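For reference, an average frame rate from a FRAPS-style capture is just the number of frames rendered divided by the elapsed time. A quick sketch (the timestamps below are illustrative values, not our recorded data):

```python
# Sketch of deriving an average frame rate from a FRAPS-style log of
# per-frame timestamps (milliseconds since the capture started).
# The timestamp values below are illustrative, not recorded data.

def average_fps(frame_times_ms):
    """Average FPS over a capture: frames rendered / elapsed seconds."""
    if len(frame_times_ms) < 2:
        raise ValueError("need at least two timestamps")
    elapsed_s = (frame_times_ms[-1] - frame_times_ms[0]) / 1000.0
    frames = len(frame_times_ms) - 1   # intervals between timestamps
    return frames / elapsed_s

# e.g. 5 timestamps spanning 80 ms -> 4 frames / 0.08 s = 50 fps
print(average_fps([0, 20, 40, 60, 80]))
```

Because the replay is re-simulated for each run, two captures are never frame-for-frame identical, which is why we call these numbers approximate.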

Warcraft 3

Warcraft 3 HQ

With only a marginal blip with the 3.04 drivers, you could practically set a watch by the 9700 Pro's performance in WC3 without AA/AF. Turning on these features causes a little more variance in the results, but even here, there is ultimately no significant change in performance. We are curious, though, about the overall lower frame rates compared to the high mark of 58.9fps with the 3.06 drivers.

Catalyst 5.11 versus 3.00 (mouse over to see 3.00)

Like with UT2004, there's not much of a story here with image quality. Although the nature of how we benchmarked and captured screenshots of WC3 means that a run is never exactly the same twice, there is no appreciable difference in IQ between the first and the last drivers, or anything in between.

Even more than with UT2004, Warcraft 3 is a no-story. ATI did not make any driver changes that significantly impacted either IQ or performance.


Halo

Halo represents both a curse and a blessing among possible titles to benchmark. As one of the first FPSs to make good use of SM 2.0 and a very popular title on both the PC and Xbox, it's an important title for seeing what ATI could do with the newest feature of the 9700 Pro; at the same time, it was a console port that was in many ways mediocre. While we would have liked to test Gearbox's "Custom Edition", which implements properly optimized shaders, the lack of single-player support in that version limited us to the less optimized original version of the game. The lack of AA support also limited us to benchmarking Halo without any advanced features turned on.

Halo
First of all, the 0 next to the Catalyst 3.00 drivers is not a typo; with the 3.00 drivers, Halo suffered from a massive stuttering problem that caused the timedemo to play at unequal speeds and otherwise return senseless benchmark results, so those drivers were excluded. Beginning with the 3.04 drivers, then, we see that Halo received a significant performance boost late into the life of the 9700 Pro, rather than at the beginning, where we'd expect it. ATI has attributed this to z-buffer optimizations in their driver for this game; the optimization was good for a whopping near-25% improvement in performance, and it's a shame that ATI didn't make it earlier. Otherwise, we see a drop off between the 3.04 and 3.06 drivers, and then a slow increase of 10% through the 5.05 drivers; the drop off, at less than 11%, is not significant, but it's still noteworthy that the game actually got a bit slower around the time of its release, near the 3.06 drivers.

Catalyst 4.02 versus 3.09 (mouse over to see 3.09)

Looking at the screenshots, there is a very noticeable IQ difference, but not one that correlates with any of the performance changes. The above is a comparison between the Catalyst 3.09 drivers and the 4.02 drivers, with a focus on the flashlight. With the 3.09 drivers, the light has a very noticeable hard edge, while this becomes a soft edge under the 4.02 drivers. This is something that we mentioned previously in our Fall 2003 IQ Analysis, where, with the 3.09 drivers, our 9800 Pro was rendering hard edges while NVIDIA's 5900 was doing the proper and more complex soft edges. ATI fixed this problem after the 3.09 drivers, and the flashlight has rendered correctly ever since. While we'll never know if it was a bug or an intentional way to improve frame rates (ATI never mentioned it in any of their release notes), it's good to know that they could get it right without a performance hit.

As all the other pre-3.09 shots are indistinguishable from our 3.09 shot, and all post-4.02 shots are just like our 4.02 shot, there appears to be no other IQ change other than the flashlight fix. Overall, Halo stands apart as a game that received a constant (if small) improvement in performance, and the second game to receive an IQ-related fix.

3dMark 2003

3dMark is not a benchmark that we routinely bring you here at AnandTech, as our editorial policy is to bring you benchmarks from real-world games and engines, not synthetic metrics. That said, it would be inappropriate to leave out 3dMark in this case, due to the significant cheating incidents associated with it. And, as a flashy, system-draining benchmark backed by a unique database for comparisons, it's still an important title in the eyes of many consumers, OEMs, and GPU makers looking for bragging rights.

With 3dMark, its importance in this regression is not so much the performance improvements themselves (which were almost certainly exaggerated in part by the synthetic nature of the benchmark), but rather what can happen when ATI dedicates its resources to a game or benchmark that it considers most important. We should note that ATI has admitted to "cheating" on 3dMark 2003; however, these were what we consider honest shader-replacement optimizations (same mathematical output) that ATI voluntarily removed, though they were apparently re-introduced at some point. We used the latest version of 3dMark 2003, so this "cheat" was not activated on the older drivers.

For these benchmarks, 3dMark was run at its default resolution of 1024x768.

3DMark 2003

3DMark 2003 HQ

With 3dMark, we see a very common theme that has appeared in most of our other benchmarks that worked with the Catalyst 3.00 drivers: a very significant performance improvement between them and 3.04 when AA/AF are used. Otherwise, 3dMark shows a very slow, very steady performance improvement over the life of the 9700 Pro, both with and without AA/AF.

Catalyst 5.11 versus 3.00 (mouse over to see 3.00)

As far as IQ goes, however, we may as well have just shown you the same screenshot twice - there is no difference between the 3.00 and 5.11 drivers. It's the same story for all of the other screenshots in between. It should be noted, however, that this IQ comparison highlights one of the flaws with 3dMark - its non-interactive nature means that certain cheats can be used based on the fact that all perspectives are known ahead of time. So, while we have no reason to believe that ATI is being dishonest here, we have no way of being completely sure that they aren't using any sort of perspective cheating.

Overall, then, 3dMark is much like Halo: a benchmark that received a slow but steady improvement, though without any IQ fixes.


Final Words

So, now that we have gone through 6 applications and 12 drivers, what have we learned? Not much, if we want to talk about consistency.

In general, there was one significant performance improvement across all games via the drivers: the move from the Catalyst 3.00 drivers to the 3.04 drivers. Otherwise, anyone who had been expecting multiple across-the-board improvements would be disappointed.

Breaking down the changes by game, we see an interesting trend in which games had the greatest performance improvement. Jedi Academy, UT2004, and really every non-modern game saw no significant performance improvements that we can isolate to targeted driver optimizations; there was only the one aforementioned general improvement. With our next-gen benchmarks, Halo and 3dMark, however, we saw a similar constant performance improvement between the two, unlike with the other games.

There is also a consistent performance improvement among most of the titles we used that was isolated to runs with AA/AF enabled, which is a positive sign given just how important AA/AF has become. With the latest cards now capable of running practically everything at high resolution with AA/AF, it looks like ATI made a good bet in deciding to put some of their time into these kinds of optimizations.

Getting back to the original question, then: are drivers all they're cracked up to be? Yes and no. If the 9700 Pro is an accurate indicator, then other cards certainly have the possibility of seeing performance improvements from drivers, but out of 3 years of drivers, we only saw one general performance improvement, so it seems unreasonable to expect that any given driver will offer a massive performance boost across the board, or even that most titles will be significantly faster in the future. However, if you're going to be playing next-generation games that push the latest features of your hardware to its limits, then it seems likely that you'll find higher performance as time goes on; but again, this will mostly come in small increments, not a night-and-day difference between a related set of drivers.

As for the future, the Radeon 9700 Pro is by no means a crystal ball into ATI's plans, but it does give us some places to look. We've already seen ATI squeeze out a general performance improvement for OpenGL titles on their new X1000 series, and it seems likely that their memory controller is still open enough that there could be one more of those improvements. Past that, it seems almost a given that we'll see future performance improvements in the most feature-intensive titles, likely no further game-specific changes in lighter games, and plenty of bug fixes along the way.
