Original Link: http://www.anandtech.com/show/1952

The last time that we took an in-depth look at a driver set, it was the Catalyst series on a Radeon 9700 Pro, in which we saw just how much, or how little, had changed over the two-and-a-half-year lifespan of the card. Overall, we found that ATI’s breakaway hit of a video card changed very little once it was out of its youth, but where ATI did put its biggest investments in improving performance, they paid off very well, improving anti-aliasing and anisotropic filtering performance in ways that reflect the now-ubiquitous status of those features.

However, while we had a good handle on the 9700 Pro and its R3xx lineage, we couldn’t help but wonder about the R420, the successor and very much the offspring of the R300 design. Built even more for shader power and based on an already strong design, could the R420 tell us something the R300 couldn’t? How did ATI handle the R420’s drivers in the face of real competition with NVIDIA’s 6000 series, versus the landslide over the 5000 series? To answer these questions and more, we’re back today once again putting the Catalyst drivers to the test, this time with the R420-based X800 Pro.

While this series of investigations is very much organic in nature and continuing to grow and change to fit the needs of the readers, we have made several modifications based on user feedback from our first effort. For our look at the R420 and the forthcoming NV40, using more modern video cards has allowed us to also use more modern games, something many of you requested. We still can’t use the newest games because of the slim number of drivers that fully support them, but with this selection, we’ve tried to reach a better balance on the number of modern games versus the need to use games old enough to span the entire life of the video card. As always, if you have any further suggestions to take into consideration for future video cards, we’d love to hear them in our comments section.

With that said, our overall objective in doing this has not changed. As a recap from our first article:

When the optimizations, the tweaks, the bug fixes, and the cheats are all said and done, just how much faster has all of this work made a product? Are these driver improvements really all that substantial all the time, or is much of this over-exuberance and distraction over only minor issues? Do we have any way of predicting what future drivers for new products will do?

Today, we’ll once again answer that and more on the R420.

R420 & The Test

As previously mentioned, the R420 occupies an interesting position in ATI's GPU history, as it's neither something completely new nor a complete rehash of a previous GPU. The R300 was a very strong design for ATI, so they had no need to replace it completely when it only needed to be moderately upgraded to meet ATI's needs. The end result, of course, was additional pipelines, some new features, and pixel shading abilities extended past the base Shader Model 2.0 specification. In theory then, the R420 should behave fairly similarly to what we saw with the R300.

Because the R420 launched when there was already ample and growing support in the marketplace for pixel and vertex shading, virtually all of our games this time around use shaders to some degree. The specific games/benchmarks tested this time were:

  • X2: The Threat
  • Final Fantasy XI, Benchmark 2
  • Doom 3
  • Half-Life 2
  • Far Cry
  • Battlefield 2
  • 3dMark 2005
  • D3DAFTester

Our benchmarking setup for this series remains unchanged, and is once again the following:

Benchmarking Testbed
Processor: AMD Athlon 64 3400+(S754)
Motherboard: Abit KV8-MAX3
Memory: 2GB DDR400 RAM 2:2:2
Hard Drive: 120GB Maxtor DiamondMax Plus 9
Power Supply: Antec TruePower 430W

All tests were done at 1280x1024 unless otherwise noted.


As with the R300, the first place to look is D3DAFTester, which will reveal any obvious, global changes in anisotropic filtering quality. Although it can't catch game-specific changes, it is always a good place to start.

D3DAFTester, Tunnel Mode, 8x AF

D3DAFTester, Plane Mode @ 10x75, 8x AF

There’s really not much to say here about the quality of ATI’s filtering. It was consistent throughout all Catalyst revisions.

Far Cry


Getting to our first game, Far Cry and its CryEngine 1 represent the first of the modern graphics engines that truly utilized the abilities of SM2.0+ hardware. With its lush jungles and sandy beaches, even as it’s pushing 2 years old, Far Cry is still unrivaled in presenting what a tropical paradise should look like. As a game that has traditionally performed better on ATI’s hardware than NVIDIA’s, it also gives us a chance to look at what, if anything, ATI did for performance when it already had a clear lead in a game.

Far Cry

Far Cry HQ

Without anti-aliasing or anisotropic filtering, the results aren’t too surprising. There is some performance improvement, but given ATI’s lead and the fact that the game is older than the R420 itself, minimal performance improvements are to be expected.

By enabling AA/AF, however, we see an entirely different story. With the 5.03 drivers, ATI posted a very impressive 30% performance improvement, moving the game from the realm of being fairly playable with these settings to extremely playable. ATI cites this as being due to efficiency improvements in vertex processing on the R420, which impacted this game heavily. While we can’t see this change without AA/AF, it’s very obvious here with it.

Catalyst 4.05 versus 6.01 (mouse over to see 4.05)

Although the lack of a quick-save/quick-load feature prevents us from taking perfectly matched screenshots, it doesn’t take much effort to see that there’s no difference to speak of between the earliest and latest Catalyst drivers, and this is the case throughout all of the drivers, including between the 5.01 and 5.03 drivers.

When it comes to Far Cry, there’s little to say here other than praise for being able to pull off this kind of performance improvement without touching image quality or simply fixing a bug.

Final Fantasy XI

Moving away from first-person shooters for a bit, the Final Fantasy XI benchmark is an interesting case because of its basis as an MMORPG instead of a single-player game. In general, there is much more of a focus on a large number of characters, rather than on the environment or a few highly detailed characters, making it unique among most benchmarks. It is a limited benchmark in that it doesn’t offer a resolution above 1024x768, nor does it work with anti-aliasing (and, hence, there are no HQ benchmarks), but the different perspective alone makes it worth the effort.

Final Fantasy XI

Since this game does not seem to utilize the SM2.0+ feature set, the performance numbers are not particularly surprising. The slight decrease is a bit interesting, since it’s consistent rather than being a product of normal variations in benchmark scores, but even then, there’s not much to say about a sub-5% performance decrease given the relatively high scores. We only hope that this won’t become a continuing trend for this game, given the tendency of MMORPGs to grow in graphical complexity over their lifetimes.

Catalyst 4.05 versus 6.01 (mouse over to see 4.05)

Given the lack of a change in performance, the consistency of the screenshots is exactly what we would expect in this case. Clearly, ATI hasn’t made any optimizations for this game, nor any global optimizations that significantly affect it.

X2: The Threat

Continuing the trend of non-shooter games, Egosoft’s X2 represents one of the best space simulation games, and also one of the few surviving space series. Not held back by the need for environments or elaborate backgrounds, X2 makes good use of high-quality models and the SM2.0+ feature set. It has recently been replaced by the even more advanced X3: Reunion, but this game can still bring a system to a crawl in some places when every quality setting is turned on.

X2: The Threat

X2: The Threat HQ

Given the graphical complexity of X2 in some scenes, we’re a bit surprised to see that there’s no performance change to speak of here. Between the 4.05 and 6.01 drivers, the difference in framerate is under a single frame per second, which is likely due to random variation; for all intents and purposes, there’s been no change in the game.

Catalyst 4.05 versus 6.01 (mouse over to see 4.05)

Given the lack of performance changes, there’s nothing shocking here. Image quality is completely consistent throughout all of the Catalyst drivers.

Doom 3

Moving back into first-person shooters, Doom 3 is without a doubt the most interesting game tested on the Catalyst drivers today. With its emphasis on darkness and a unified lighting system, Doom 3 presents a very different situation than most first-person shooters do, hopefully giving us a different take on performance in such a game. It’s also the OpenGL game of choice for this roundup; though, as we’ll see, just being OpenGL doesn’t mean it’s a great indicator of OpenGL performance.

Doom 3

Doom 3 HQ

For these benchmarks, we opted to start with the 4.05 drivers, even though they’re a few months older than Doom 3 itself. Doing so also helps bring attention to the large jump in performance between the 4.09 and 4.11 drivers, and what makes Doom 3 such an interesting game to work with. It’s here that ATI implemented its Catalyst AI feature, which is the cause of the performance change.

Shortly after Doom 3 was released, ATI found itself in an interesting situation with regards to what they could do to improve performance. There was a bottleneck in the game in how specular highlighting was applied, and while ATI made some efforts to optimize their drivers for the game, as seen with the 4.09 drivers, this kind of bottleneck was a fundamental issue on which ATI would have to take more serious measures if they wanted to remove it.

What ended up being the bottleneck was that John Carmack, id’s lead programmer, had decided to use a lookup method for determining what highlighting values should be used, based on referencing a specifically constructed texture map with these pre-computed values. It turned out that the R420 could actually calculate such values faster than it could look them up, so fixing the bottleneck would mean replacing the entire shader with what was only a mathematical approximation of the real values in the texture map. Given the scrutiny over optimizations, however, it was a difficult choice to make. We’ve covered the issue before, but ultimately, these optimizations are valid in most cases, and the result is the performance improvement seen above. This case, however, will always serve as a reminder of how fine a line there is between optimizing and cheating in a game.
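To make the tradeoff concrete, here is a minimal, purely illustrative Python sketch of the two approaches: fetching pre-computed highlight values from a lookup table (standing in for the texture map) versus computing them directly, as ATI's replacement shader did. This is not id's actual shader code, and the specular exponent and table size are assumed values chosen only for illustration; the point is that the computed path and the table path agree to within the table's quantization error.

```python
# Illustrative sketch only; assumed exponent and table size, not id's code.
SPECULAR_EXPONENT = 16.0
TABLE_SIZE = 256

# "Texture map" of pre-computed pow(x, n) samples over [0, 1].
lookup_table = [(i / (TABLE_SIZE - 1)) ** SPECULAR_EXPONENT
                for i in range(TABLE_SIZE)]

def specular_lookup(n_dot_h: float) -> float:
    """Fetch the highlight intensity from the pre-computed table."""
    index = round(n_dot_h * (TABLE_SIZE - 1))
    return lookup_table[index]

def specular_math(n_dot_h: float) -> float:
    """Compute the highlight intensity directly, as the replacement shader does."""
    return n_dot_h ** SPECULAR_EXPONENT

# The two paths differ only by the table's quantization error.
worst_error = max(abs(specular_lookup(x / 1000) - specular_math(x / 1000))
                  for x in range(1001))
print(f"worst-case error: {worst_error:.4f}")
```

On hardware where arithmetic is cheaper than a dependent texture fetch, the computed path wins; whether a visually indistinguishable approximation counts as a legitimate optimization is exactly the debate the article describes.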

But getting back to performance as a whole, outside of ATI’s shader replacement, there are no further changes in performance. Unfortunately, the replacement means that Doom 3 isn’t too great of an OpenGL benchmark, but as the number of OpenGL games continues to dwindle, there is little else on the market that uses OpenGL and isn’t Doom 3 (or Doom 3 engine-based) in the first place.

Catalyst 4.05 versus 6.01 (mouse over to see 4.05)

As for image quality, even though the replacement shader isn’t a perfect replacement, ATI has done a good enough job designing it that it’s nearly impossible to tell the difference between the two, even with a good screenshot. For all intents and purposes, the image quality is unchanged throughout the entire game on all Catalyst drivers, and here, there is absolutely no visible difference.

Half-Life 2

There’s little that can be said about Half-Life 2 that hasn’t already been said, other than that perhaps Valve put the game’s graphics to a point where the gritty, depressing world of Half-Life was a little too well done. It’s not a tropical paradise, and it’s not one large monster closet, but Half-Life 2 is unique among first-person shooters for what it does with putting players in the middle of a crumbling city. It is for this reason that the game offers a good take on another aspect of performance.

Half-Life 2

Half-Life 2 HQ

With the excitement over this title both before and after it launched, the performance graphs aren’t really surprising here. ATI needed to optimize for this title wherever they could, and while a gradual performance increase is a bit unusual compared to the single large jumps seen in the other shooters tested today, it still amounts to a moderate performance increase overall. Of course, since the Source engine was being used in Counter-Strike: Source a few months before Half-Life 2 was released, ATI had the chance to get a jump on optimizing for the engine before the game’s release (hence our use of the oldest drivers), so we can’t fully quantify all of the optimizations that ATI made in their drivers for this game.

Here again, we also see some more aggressive performance improvements in the case of AA/AF than without.

Catalyst 4.05 versus 6.01 (mouse over to see 4.05)

If it weren’t for the fog and water in the background, we’d never have been able to tell these screenshots apart. There is clearly no change in image quality in spite of the performance improvement seen, which is exactly what we want to find (or rather, not find).

Battlefield 2

As the newest game in our latest investigation, Battlefield 2 is almost too new to include. With only a handful of drivers released in the lifetime of the game, there’s a lack of data points from which to draw a strong conclusion, but given the number of requests to include this game, we have included it anyhow. As a primarily multiplayer game built around large battles, Battlefield 2 strikes an interesting balance between the desire for high-quality graphics and the need to render a large firefight without slowing a system to a crawl. Battlefield 2 is also unique among everything that we’ve tested in that it’s the only game here that requires pixel shading, whereas everything else merely uses the ability if it’s there.

Battlefield 2

Battlefield 2 HQ

In spite of the lack of data points, the performance graph for Battlefield 2 is surprisingly boring. Outside of a performance improvement with the 5.07 drivers, the first drivers released after the game launched, there’s a distinct lack of performance improvements. Considering how brutal this game is on hardware, we would have expected a larger and more continuous performance improvement.

Catalyst 5.05 versus 6.01 (mouse over to see 5.05)

Looking at the screenshot comparison, there’s no change in image quality, which again is what we would expect given the lack of a performance change. We are, however, still intrigued by the game’s poor, blocky shadowing for such a modern title.

3dMark 2005

As we noted in our previous article, 3dMark isn't something that we normally use in an article due to its nature as a synthetic benchmark instead of being a real game. That being said, it's an excellent diagnostic tool both for its wide customizability and the ability to render specific frames. It's also highly prone to being manipulated (both fairly and unfairly) due to the value that some groups attach to it, so while it has little worth as a good way to compare products, it's a great indicator of just what kind of performance improvements a company can wring out of a video card when given the proper motivation.

3dMark 2005

3dMark 2005 HQ

With 3dMark, we're not entirely sure what the reason is for everything that we are seeing - in particular, the large jump between the 4.09 and 4.11 drivers. The 4.09s were the first set of drivers available when 3dMark 2005 was released, so our best guess is that ATI found a way to implement some major 3dMark-specific optimizations between then and 4.11, but there is no mention of this in any release notes.

Otherwise, there is a very interesting progressive increase in 3dMark performance throughout the entire series of Catalyst drivers, which sits in stark contrast to our game tests.

Catalyst 4.05 versus 6.01 (mouse over to see 4.05)

As for image quality, there's no difference to be seen, even when you factor in that the 4.05 and 4.07 drivers have not been approved by FutureMark, since they pre-date 3dMark 2005. As always, it's good to see consistent screenshots when it comes to these matters, as there's nothing here to indicate any foul play on the part of ATI.

9700 Pro vs. X800 Pro

Now that we’ve seen what the X800 Pro has been through in its lifespan, what about the 9700 Pro? One of the biggest requests for this series has been a direct comparison between the two cards, which we have set up here. Along with the request to use newer games on the 9700 Pro, we have gone ahead and run the 9700 Pro through the same paces on some of the more promising games that we’ve run the X800 Pro through, and recorded the results as a performance factor for each card over its performance on the previous driver revision. For the sake of time, and to minimize any impact from a CPU-limited scenario, all tests were run with 4x anti-aliasing and 8x anisotropic filtering. We have also included a performance summary, showing the performance factor between the first (4.05) and latest (6.01) drivers on these games.
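As a sketch of the bookkeeping behind these charts, here is the performance-factor arithmetic in Python. The framerates below are hypothetical stand-ins, not our actual results; each driver's factor is its score relative to the previous revision, and the summary factor compares the newest driver to the oldest.

```python
# Hypothetical framerates for one card in one game (driver -> average fps).
fps_by_driver = {
    "4.05": 20.0,
    "4.09": 24.0,
    "5.03": 36.0,
    "6.01": 41.0,
}

drivers = list(fps_by_driver)  # insertion order: oldest to newest

# Performance factor of each driver over the previous revision.
stepwise = {
    drivers[i]: fps_by_driver[drivers[i]] / fps_by_driver[drivers[i - 1]]
    for i in range(1, len(drivers))
}

# Summary factor: latest driver versus the first.
overall = fps_by_driver[drivers[-1]] / fps_by_driver[drivers[0]]

print(stepwise)
print(f"overall factor: {overall:.2f}")  # with these numbers, 2.05
```

A factor of 1.0 means no change; the made-up numbers above were chosen to show how an overall factor like the 2.05 discussed below arises from a couple of large stepwise jumps.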

3dMark 2005 HQ Comparison

Far Cry HQ Comparison

Half-Life 2 HQ Comparison

Battlefield 2 HQ Comparison

Overall HQ Comparison

Looking at the numbers, what we see is not what we would have initially expected. Certainly, starting with Far Cry, a 2.05 performance factor is not a typo. The performance of the game actually more than doubled over the scope of these drivers. While, as we’ve mentioned before, it’s not unusual to see a large performance boost due to a single driver, ATI did it twice, significantly reducing the performance difference between the 9700 Pro and X800 Pro. In fact, with the exception of 3dMark 2005 - the only benchmark here specifically capable of testing the differences between the shader abilities of the R3xx and R4xx designs - it’s a similar story for all of the games used in this cross-comparison. In spite of the X800 Pro being the newer, faster card with more potential, it’s the 9700 Pro that saw the biggest performance improvements.

Of course, at around half the framerate of an X800 Pro, the 9700 Pro is measuring some of its performance changes in fractions of a frame per second, so a 17% improvement in Battlefield 2 performance may not change playability at all. Nonetheless, this is a stark reminder of the power of drivers that comes into play well after the launch of a product. Although this may be a rare scenario due to the architectural similarities between the 9700 Pro and the X800 Pro, it’s good to see that the 9700 series was not forgotten at ATI when it was replaced by the X800. Hopefully, this isn’t a trend that will be forgotten with the X800 series either, now that the X1000 series is ATI’s high-end product line.


Having now taken a look at two different series of ATI video cards over their lifetimes, we can finally pinpoint some trends with regards to ATI’s drivers that weren’t so clear having seen just one card.

First and foremost, there’s very little progressive performance increase in most games. For any given game, unless a driver contains something specifically for it, either a major performance improvement or a bug fix, upgrading drivers isn’t going to change anything. It may still be a good idea overall, since most gamers play more than one game, but on a per-game basis, there’s little reason to upgrade drivers.

Secondly, what performance boosts do come can almost be guaranteed to arrive as significant one-time improvements. Comparing any two adjacent drivers, performance may seem to have gone up or down slightly, but only in a single game on the R420, Half-Life 2, did these small changes turn out to be a meaningful improvement rather than the natural variation inherent to benchmarking.
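The distinction above, a real driver gain versus ordinary run-to-run scatter, can be sketched with a simple rule of thumb: benchmark each driver several times and only call a change meaningful if the gap between the averages clearly exceeds the spread of the individual runs. The numbers and the threshold below are hypothetical, not our actual methodology.

```python
from statistics import mean, stdev

# Hypothetical framerates from five benchmark runs per driver.
old_driver_runs = [61.8, 62.4, 62.1, 61.9, 62.3]
new_driver_runs = [65.9, 66.4, 66.1, 66.3, 65.8]

def meaningful_change(a, b, threshold=2.0):
    """Treat the change as real only if the means differ by more than
    `threshold` times the larger of the two runs' standard deviations."""
    noise = max(stdev(a), stdev(b))
    return abs(mean(b) - mean(a)) > threshold * noise

print(meaningful_change(old_driver_runs, new_driver_runs))            # a real gain
print(meaningful_change(old_driver_runs,
                        [r + 0.2 for r in old_driver_runs]))          # just noise
```

A shift of a few tenths of a frame per second fails this test, which is exactly why most driver-to-driver "changes" in the graphs above shouldn't be read as real improvements.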

Thirdly, most significant performance improvements occur either early in the life of a card or early in the life of a game depending on which is newer. Although Far Cry is an example of this occurring a bit later in life, as was Halo on the 9700 Pro, these are the exceptions rather than the rule. For any new game that’s been out for more than a couple of months, don’t expect ATI to make any further significant performance changes.

Fourthly, and this isn’t something that we were originally looking at when we started this series, after installing the Catalyst Control Center on our test bed for the R420, we’re growing increasingly worried about ATI’s direction with their driver control utilities. The Catalyst Control Center increased the boot time of our test bed by approximately 10 seconds, using the informal “how long until the hard drive stops working” method. And now that ATI has discontinued their classic control panel, this is the only first-party way of adjusting an ATI card. ATI seems to have learned little since it first launched the Catalyst Control Center over a year ago.

Lastly, ATI seems to have taken a keener interest in 3dMark lately than they did with the 9700 Pro and 3dMark 2003. For whatever reason, their 3dMark 2005 score has kept increasing while their game performance hasn’t, once again providing a practical example of how deceiving synthetic benchmarks can be versus what happens to performance in real games.

So, getting back to the primary questions at hand, how will this translate into what we can expect from ATI in the future with the R5xx series? Considering what we’ve seen with both the R300 and R420, there seems to be little reason at this moment to expect that ATI will deviate from what they’ve done with their last two generations of products. This won’t be a perfect prediction, especially since the R5xx architecture is both brand new this time around and further deviates from traditional GPU design with a heavy shift towards pixel shading, but all signs point to ATI continuing to follow the trends above.

But what about NVIDIA, you may ask? Look for the ForceWare drivers to go under the knife in the near future.
