Friday afternoon we got an email pointing us to the latest beta driver from ATI. One of the key features of this driver is a performance boost for dual core systems. Building a driver to take advantage of parallel processing is quite a task, and extracting any noticeable performance gain out of it is even more difficult. So we are here today to see just what ATI has gotten out of their efforts thus far.

Admittedly, the highest performance gains come at low resolutions without antialiasing. It stands to reason that the more CPU limited a test is, the more a game will benefit from freeing up CPU resources. If only lower resolutions see gains, the real-world benefit to end users is questionable, but every step helps. With the future of computer hardware firmly planted in parallelism, the burden of improving performance shifts a little further toward software developers. Coming up with new and interesting ways to parallelize code efficiently is going to be quite a new task for desktop software programmers to tackle. And in the end, Amdahl's Law reminds us that we are still limited in the benefit we can get from parallelism: the percentage of code that must remain sequential becomes the limiting factor. But every little bit of parallelization still helps.
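To put a number on that limit: Amdahl's Law says the best possible speedup from running a fraction p of the work in parallel on n cores is 1 / ((1 - p) + p / n). A minimal sketch of the arithmetic (the function name and the 60% figure are illustrative assumptions, not measurements from ATI's driver):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup when `parallel_fraction` of the work
    runs in parallel across `cores` cores (Amdahl's Law)."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# Even if 60% of a driver's work could be parallelized, a second
# core caps the overall speedup well below 2x:
print(round(amdahl_speedup(0.6, 2), 2))  # prints 1.43
```

The serial fraction dominates quickly: with 40% of the work stuck on one core, no number of additional cores can ever push the speedup past 2.5x.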

There are really quite a few questions to be asked about this driver. After adding up everything we wanted to do, the sheer number of tests we had laid out was enormous. In an effort to be more efficient ourselves, we decided to break up our analysis of the 5.12 driver. This article is meant as a quick look at the benefits of ATI's dual core enhancements on a few select games running on X1K series hardware. We will compare the new driver to the old one, as well as dual core performance to single core performance.

Our next look at the 5.12 driver will include a comparison to NVIDIA performance in single and dual core systems, more than one ATI card, more games, and as many other tests as we can pack in. Of course, we are open to suggestions. But for now, we'll take a look at what we've got to work with.

The Test


Comments

  • mbhame - Sunday, December 4, 2005 - link

    Who wrote this article?
  • stephenbrooks - Monday, December 5, 2005 - link

    Derek Wilson
  • PrinceGaz - Sunday, December 4, 2005 - link

    I have an X2 4400+ and, like many other people, have been forced to revert to the 7x.xx Forceware drivers because the new dual-core drivers cause certain well-known OpenGL applications (3DS Max and PaintShop Pro, for instance) to hang when trying to start them. If you haven't heard of this problem, just try googling and you'll get plenty of hits.

    I'd rather have nVidia fix bugs before adding new performance-enhancing features, but sadly it is all about getting a few extra percent over ATI in the latest games, it seems.
  • hondaman - Monday, December 5, 2005 - link

    Nvidia claims that their drivers have DC optimisations, although I haven't seen any review that shows one way or the other whether they really do.

    I personally found this "review" to be quite interesting, and hope anandtech does the same for nvidia and their newest drivers.
  • mmp121 - Sunday, December 4, 2005 - link


    Do the drivers show any improvement while using a single core CPU with HT enabled? Are they supposed to? How do they affect previous generation hardware? Are the tweaks only good for the X1000 hardware? You asked for suggestions, I gave some. Hope to see some of 'em answered.
  • stephenbrooks - Monday, December 5, 2005 - link

    ^^^ above are good questions
  • johnsonx - Sunday, December 4, 2005 - link

    Seems to me ATI had best get to the bottom of the single-core performance deficit in these 5.12 drivers before they come out of beta. All the fanbois would get their panties in a wad if the new driver hurts performance in the top-end FX-57 gaming rigs. If nothing else, they could include regular and DC-optimized versions of the key driver files and install them based on detecting 1 or 2(+) cores.

    Actually, what might be even better from a marketing point of view is if they have a 'regular' driver that works fine for all systems, and a separate 'dual-core optimized' driver. Nothing gives users the warm fuzzies like being told 'oh, for YOU we have a special, better driver.' Later on, once dual-core is almost universal in new systems, they could just unify the driver again.
  • wien - Sunday, December 4, 2005 - link

    Though a good idea, I fear the changes they have made to the driver to "parallelize" it can't be plugged in and out that easily. And if they can't, ATI would have to keep two separate code trees (single and dual core) for their drivers, and update both every time they come up with an improvement. What would probably end up happening is that the single core version would be more or less stagnant in terms of development (but with version numbers increasing, of course), while the DC version gets the actual improvements. (Or the other way around... for now at least.)
  • Pannenkoek - Sunday, December 4, 2005 - link

    The effort to optimize their dual core drivers to mitigate the single core performance loss is far less than keeping two parallel branches of their drivers in development. This is beta software; it's not as tuned as it can be. We won't know how it will perform until the driver is actually released.
  • mlittl3 - Sunday, December 4, 2005 - link

    That's a good idea.
