You’ve been living too perfect of a life if you’ve never used the phrase “it’s been a long day,” and for NVIDIA it has most definitely been a very long day. Just over two weeks ago the graphics industry was shaken by some very hard-hitting comments from Gabe Newell of Valve, primarily relating to the poor performance of NVIDIA cards under Half-Life 2. All of a sudden ATI had finally done what they had worked feverishly for years to do: seemingly overnight, they were crowned the king of graphics and, more importantly, drivers. There were no comments on Half-Life 2 day about ATI having poor drivers, compatibility problems, or anything even remotely resembling the discussions about ATI from the Radeon 8500 days.

Half-Life 2 day was quickly followed by all sorts of accusations against NVIDIA and their driver team; more and more articles were published with new discoveries, shedding light on other areas where ATI trounced NVIDIA. Everything seemed to make sense now; even 3DMark was given the credibility of being the “I told you so” benchmark that had predicted Half-Life 2 performance several months in advance of September 12, 2003. At the end of the day, and by the end of the week, NVIDIA had experienced the longest day they’ve had in recent history.

Some of the more powerful accusations went far beyond NVIDIA skimping on image quality to improve performance; they included claims that NVIDIA isn’t really capable of running DirectX 9 titles at their full potential, and, one of the more interesting ones, that NVIDIA only optimizes for benchmarks that sites like AnandTech use. Part of the explanation behind the Half-Life 2 fiasco was that even if NVIDIA improves performance through later driver revisions, the improvements are only there because the game is used as a benchmark, not as an attempt to improve the overall quality of their customers’ gaming experience. If that were true, then NVIDIA’s “the way it’s meant to be played” slogan would need some serious rethinking; “the way it’s meant to be benchmarked” comes to mind.

But rewind a little bit; quite a few of the accusations being thrown at NVIDIA are the same ones that were once thrown at ATI. I seem to remember the launch of the Radeon 9700 Pro being tainted with one accusation in particular: that ATI only made sure their drivers worked on popular benchmarking titles, with the rest of the top 20 games out there hardly working on the new R300. As new as what we’re hearing these days about NVIDIA may seem, let us not fall victim to the nearsightedness of the graphics industry; this has all happened before, with ATI and even good ol’ 3dfx.

So who are you to believe? These days the clear purchase seems to be ATI, but on what data are we basing that? I won’t try to build up suspense senselessly: the clear recommendation today is ATI (how’s that for hype-less journalism), but not because of Half-Life 2 or any of the other conspiracies we’ve seen floating around the web these days.

For entirely too long we’ve been basing GPU purchases on a small subset of tests, encouraging the hardware vendors to spend the majority of their time and resources optimizing for those games. We’re not just talking about NVIDIA; ATI does it too, and you would as well if you were running either of those two companies. We’ve complained about the lack of games with built-in benchmarks and cited that as a reason for sticking with the suite we’ve used, but honestly, doing what’s easy isn’t a principle I founded AnandTech on 6+ years ago.

So today we bring you quite a few new things; some may surprise you, some may not. ATI has released their Fall refresh product, the Radeon 9800XT, and is announcing the Radeon 9600XT. NVIDIA has counterattacked by letting us publish benchmarks from their forthcoming NV38 GPU (the successor to the NV35-based GeForce FX 5900 Ultra). But quite possibly more important than any of those announcements is the suite of benchmarks we’re testing these cards in; how does a total of 15 popular games sound? This is the first installment of a multipart series that will help you decide which video card is best for you, and hopefully it will do a better job than we ever have in the past.

The extensive benchmarking we’ve undertaken has forced us to split this into multiple parts, so expect more coverage of higher resolutions, image quality, anti-aliasing, CPU scaling, and budget card comparisons in the coming weeks. We’re working feverishly to bring it all to you as soon as possible, and I’m sure there’s some proverb about patience that I should be reciting from memory to end this sentence, but I’ll leave it at that.

Now that the long-winded introduction is done, let’s talk hardware before we dive into a whole lot of software.

The Newcomers

263 Comments


  • Anonymous User - Wednesday, October 1, 2003 - link

    I think that future reviews should include Half-Life 2... when available.
    Also, when UT2k4 comes out toward the end of the year (or is available to Anand), UT2k3 should be replaced as a benchmarking tool. It seems likely that the graphics engine will be tweaked and better looking, and UT2k4 will include very large levels.

  • Anonymous User - Wednesday, October 1, 2003 - link

    This is what I want to see used for CPU articles. Your old crap tests suck (well, Unreal 2003 is still used). This is MUCH more useful to someone trying to find out how the latest games will run on their new CPU. Why use Quake 3 in CPU articles when you can use a bunch of games like this? Do people care more about Quake 3 or the batch of games you're using here for testing vid cards? The very same games apply to picking a new CPU. NOT Q3. That game is DEAD.
  • Anonymous User - Wednesday, October 1, 2003 - link

    You guys should really indicate what API each game uses -- DX8, 8.1, 9, or OpenGL. That would help out a lot in determining whether a company optimizes for an API, a single game, or everything... not everyone follows the game industry closely enough to know which games are programmed against which graphics API....
  • Jeff7181 - Wednesday, October 1, 2003 - link

    Just so ya know... overclocking will dramatically increase the performance... check this thread I created for some overclocked GeForce FX 5900 results...

    http://forums.anandtech.com/messageview.cfm?catid=...
  • Davegod - Wednesday, October 1, 2003 - link

    "This is the first installment of a multipart series that will help you decide what video card is best for you, and hopefully it will do a better job than we have ever in the past.

    The extensive benchmarking we’ve undertaken has forced us to split this into multiple parts, so expect to see more coverage on higher resolutions, image quality, anti-aliasing, CPU scaling and budget card comparisons in the coming weeks. We’re working feverishly to bring it all to you as soon as possible and I’m sure there’s some sort of proverb about patience that I should be reciting from memory to end this sentence but I’ll leave it at that."

    Worth repeating, since at least 3/4 of the whiners seem not to have noticed it. That leaves about 1/4 complaining about the driver 'issues', which aren't mentioned but still might be (hopefully are) intended for coverage, although I'd assume that would take at least as much time as the entire rest of the roundup.

    Yeah, hopefully parts I-III will include something to give more of an indication of DX9 performance. With a bit of luck it'll be the HL2 benchmark; maybe its delay is the reason there's so little in the way of DX9?

    - DG
  • Anonymous User - Wednesday, October 1, 2003 - link

    please include every game that has been made in the past 5 years, so everyone will be happy and will shut the hell up! :)


  • Anonymous User - Wednesday, October 1, 2003 - link

    Regarding AA in Halo: disabling the alpha render targets prevents the game from turning AA off.
  • Anonymous User - Wednesday, October 1, 2003 - link

    NVIDIA can compete only in DX7, DX8, or OpenGL 1.2 games due to the wrong strategy of their CEO. The NV35 architecture has failed. Do you really think that NVIDIA can force Microsoft to include NVIDIA's custom shader language in DX9?
  • Anonymous User - Wednesday, October 1, 2003 - link

    While I can appreciate the work it took to generate all these benchmarks... what a complete and utter waste of time! Less than a 10% bump in clock speed? Zzzzzzz. I'd have sent it back to ATI and told them to call when they had something interesting.
  • Anonymous User - Wednesday, October 1, 2003 - link

    It would be much more helpful if you included an older video card for reference, like a GeForce 4200 or 4600. I am sure there are several users like myself who bought one of these cards in the past year or so and would like to see how it compares to what's new, to judge how beneficial an upgrade would be.
