You’ve been living too perfect a life if you’ve never used the phrase “it’s been a long day,” and for NVIDIA it has most definitely been a very long day. Just over two weeks ago the graphics industry was shaken by some very hard-hitting comments from Gabe Newell of Valve, primarily relating to the poor performance of NVIDIA cards under Half Life 2. All of a sudden, ATI had finally done what they had worked feverishly for years to do: they were, seemingly overnight, crowned the king of graphics and, more importantly, drivers. There were no comments on Half Life 2 day about ATI having poor drivers, compatibility problems, or anything even remotely resembling the discussions about ATI from the Radeon 8500 days.

Half Life 2 day was quickly followed by all sorts of accusations against NVIDIA and their driver team; more and more articles were published with new discoveries, shedding light on other areas where ATI trounced NVIDIA. Everything seemed to make sense now; even 3DMark was given the credibility of being the “I told you so” benchmark that had predicted Half Life 2 performance several months in advance of September 12, 2003. By the end of the day, and by the end of the week, NVIDIA had experienced the longest day they’ve had in recent history.

Some of the more powerful accusations went far beyond NVIDIA skimping on image quality to improve performance; these accusations included claims that NVIDIA isn’t really capable of running DirectX 9 titles at their full potential, and one of the more interesting ones – that NVIDIA only optimizes for benchmarks that sites like AnandTech use. Part of the explanation behind the Half Life 2 fiasco was that even if NVIDIA improves performance through later driver revisions, the improvements are only there because the game is used as a benchmark – not as an attempt to improve the overall quality of their customers’ gaming experience. If that were true, then NVIDIA’s “the way it’s meant to be played” slogan would have to undergo some serious rethinking; “the way it’s meant to be benchmarked” comes to mind.

But rewind a little bit; quite a few of the accusations being thrown at NVIDIA were the same ones once thrown at ATI. I seem to remember the launch of the Radeon 9700 Pro being tainted with one accusation in particular – that ATI only made sure their drivers worked on popular benchmarking titles, with the rest of the top 20 games out there hardly working on the new R300. As new as what we’re hearing these days about NVIDIA may seem, let us not fall victim to the nearsightedness of the graphics industry – this has all happened before, with ATI and even good ol’ 3dfx.

So who should you believe? These days it seems like the clear purchase is ATI, but on what data are we basing that? I won’t try to build up suspense senselessly: the clear recommendation today is ATI (how’s that for hype-free journalism), but not because of Half Life 2 or any of the other conspiracies we’ve seen floating around the web these days.

For entirely too long we’ve been basing GPU purchases on a small subset of tests, encouraging the hardware vendors to spend the majority of their time and resources optimizing for those games. We’re not just talking about NVIDIA; ATI does it too, and you would as well if you were running either of those two companies. We’ve complained about the lack of games with built-in benchmarks and cited that as a reason for sticking with the suite we’ve used – but honestly, doing what’s easy isn’t a principle I founded AnandTech on 6+ years ago.

So today we bring you quite a few new things; some may surprise you, some may not. ATI has released their fall refresh product – the Radeon 9800XT – and is announcing the Radeon 9600XT. NVIDIA has counterattacked by letting us publish benchmarks from their forthcoming NV38 GPU (the successor to the NV35-based GeForce FX 5900 Ultra). But quite possibly more important than any of those announcements is the suite of benchmarks we’re testing these cards in; how does a total of 15 popular games sound? This is the first installment of a multipart series that will help you decide which video card is best for you, and hopefully it will do a better job than we ever have in the past.

The extensive benchmarking we’ve undertaken has forced us to split this article into multiple parts, so expect to see more coverage of higher resolutions, image quality, anti-aliasing, CPU scaling, and budget card comparisons in the coming weeks. We’re working feverishly to bring it all to you as soon as possible, and I’m sure there’s some sort of proverb about patience that I should be reciting from memory to end this sentence, but I’ll leave it at that.

Now that the long-winded introduction is done with, let’s talk hardware before we dive into a whole lot of software.

The Newcomers

262 Comments


  • Anonymous User - Saturday, October 04, 2003 - link

    The MS flightsim tests might have v-sync enabled. That would explain the strange test results.
  • dswatski - Saturday, October 04, 2003 - link

    AND: Age of Mythology. AND: rendering with Adobe Premiere Pro, with support for a second monitor.
  • Anonymous User - Saturday, October 04, 2003 - link

    gf
  • Rogodin2 - Saturday, October 04, 2003 - link

    That was a pathetic review, because there were way too many variables; the fact that Anand stated that there were no valid premises to reach a conclusion should have been taken to heart before he decided to publish such a POS as this.

    rogo
  • Anonymous User - Friday, October 03, 2003 - link

    This info is simply unofficial, as DX doesn't want to stir up the industry more than has already been done. As some might recall, 3dfx was given the same ultimatum back in '99, yet the news wasn't even released until 2 years later after
  • Anonymous User - Friday, October 03, 2003 - link

    So by all means, Do Not Download Detonator 50 Drivers!!! Along with this, NV has been caught cheating on benchmarks, as they usually do, over at AnandTech. Notice that all of the real-world benchmarks perform better on ATi, yet all synthetic benchmarks perform better by a large margin on NV hardware. "These violations are inexcusable," said a DX employee, and I'd have to agree. So without the inside drive on DX10, NV will not even be able to optimize their cards as ATi can, and will probably fall into bankruptcy just as 3dfx did before them...
  • Anonymous User - Friday, October 03, 2003 - link

    NVIDIA out of DX10? Discuss
    There's an interesting link on Gearbox Software's forums that claims NVIDIA has been shunned by Microsoft's DirectX team for future versions of the API - Thanks SidiasX!

    Nvidia's NV38 (along with the rest of the FX series) has been dubbed a substandard card by team DX. This means that DX will not include NV in its development range for DirectX 10. Team DX made the decision "as a favor to the graphics industry". Team DX claims that NV violated their partnership agreement by changing the DX9 code with their latest set of drivers, as caught by Xbit labs recently. This violates the licensing agreement and compromises DX's quality in order to make it seem as if ATi and NV cards alike display the same image quality (which would be really bad in this case). This can only be fixed by reinstalling DX9b.

    ATI's "Development Agreement"


    it's looking bad for Nvidia..
  • Anonymous User - Friday, October 03, 2003 - link

    The CPU is not out.
    The NV38 is not out.
    The new 52.14 drivers are not out.
    And these drivers have issues, and probably IQ degradation.

    The tests should go up to at least 1600x1200; we should stress video cards, not CPUs. DX9 needs to be included in the benches.

    I know what my next card will be; ATI will be replacing my Nvidia soon.

    I want to play HL2 and TR: AoD (I love the game).

    I remember, years ago, when ATI came out with a faster card, and the next day Nvidia had a new driver that increased performance by 25%.

    I'm still disgusted, ever since the cheat drivers with bad IQ and poor DX9.
