You’ve been living too perfect a life if you’ve never used the phrase “it’s been a long day,” and for NVIDIA it has most definitely been a very long day. Just over two weeks ago the graphics industry was shaken by some very hard-hitting comments from Gabe Newell of Valve, primarily relating to the poor performance of NVIDIA cards under Half Life 2. All of a sudden, ATI had finally done what they had worked feverishly for years to do: seemingly overnight, they were crowned the king of graphics and, more importantly, of drivers. There were no comments on Half Life 2 day about ATI having poor drivers or compatibility problems, or anything even remotely resembling the discussions about ATI from the Radeon 8500 days.

Half Life 2 day was quickly followed up with all sorts of accusations against NVIDIA and their driver team; more and more articles were published with new discoveries, shedding light on other areas where ATI trounced NVIDIA. Everything seemed to make sense now; even 3DMark was given the credibility of being the “I told you so” benchmark that had predicted Half Life 2 performance several months in advance of September 12, 2003. At the end of the day, and by the end of the week, NVIDIA had experienced the longest day in their recent history.

Some of the more powerful accusations went far beyond NVIDIA skimping on image quality to improve performance; these accusations included things like NVIDIA not really being capable of running DirectX 9 titles at their full potential, and one of the more interesting ones – that NVIDIA only optimizes for the benchmarks that sites like AnandTech use. Part of the explanation behind the Half Life 2 fiasco was that even if NVIDIA improves performance through later driver revisions, the improvements are only there because the game is used as a benchmark – and not as an attempt to improve the overall quality of their customers’ gaming experience. If that were true, then NVIDIA’s “the way it’s meant to be played” slogan would have to undergo some serious rethinking; “the way it’s meant to be benchmarked” comes to mind.

But rewind a little bit; quite a few of the accusations being thrown at NVIDIA are the same ones that were once thrown at ATI. I seem to remember the launch of the Radeon 9700 Pro being tainted by one accusation in particular – that ATI only made sure their drivers worked on popular benchmarking titles, with the rest of the top 20 games out there hardly working on the new R300. As new as what we’re hearing these days about NVIDIA may seem, let us not fall victim to the nearsightedness of the graphics industry – this has all happened before, with ATI and even good ol’ 3dfx.

So who are you to believe? These days it seems like the clear purchase is ATI, but on what data are we basing that? I won’t try to build up suspense senselessly: the clear recommendation today is ATI (how’s that for hype-less journalism), but not because of Half Life 2 or any of the other conspiracy theories we’ve seen floating around the web these days.

For entirely too long we’ve been basing GPU purchases on a small subset of tests, encouraging the hardware vendors to spend the majority of their time and resources optimizing for those games. We’re not just talking about NVIDIA; ATI does it too, and you would as well if you were running either of those two companies. We’ve complained about the lack of games with built-in benchmarks and cited that as a reason for sticking with the suite that we’ve used – but honestly, doing what’s easy isn’t a principle I founded AnandTech on 6+ years ago.

So today we bring you quite a few new things; some may surprise you, some may not. ATI has released their Fall refresh product, the Radeon 9800 XT, and is announcing the Radeon 9600 XT. NVIDIA has counterattacked by letting us publish benchmarks from their forthcoming NV38 GPU (the successor to the NV35-based GeForce FX 5900 Ultra). But quite possibly more important than any of those announcements is the suite of benchmarks we’re testing these cards in; how does a total of 15 popular games sound? This is the first installment of a multipart series that will help you decide what video card is best for you, and hopefully it will do a better job than we have ever done in the past.

The extensive benchmarking we’ve undertaken has forced us to split this article into multiple parts, so expect to see more coverage of higher resolutions, image quality, anti-aliasing, CPU scaling and budget card comparisons in the coming weeks. We’re working feverishly to bring it all to you as soon as possible, and I’m sure there’s some sort of proverb about patience that I should be reciting from memory to end this sentence, but I’ll leave it at that.

Now that the long-winded introduction is out of the way, let’s talk hardware before we dive into a whole lot of software.

Comments

  • Anonymous User - Wednesday, October 1, 2003 - link

    #22,

    /me waves.

    Thanks for the personal attack though. I admit to not knowing every last detail about 3D that there is to know, but some things don't take an EE degree to figure out.

    If you want to see my detailed reasons for not liking this review and its conclusion, read the following url:

    http://www.beyond3d.com/forum/viewtopic.php?p=1743...
  • Anonymous User - Wednesday, October 1, 2003 - link

    #73 makes a good point...but at the same time I've made a few observations on that note. I've seen a lot more motherboards with a gap between the AGP slot and the PCI slots...and while some people would be led to believe it is just for Nvidia cards, this is most likely not the case. Graphics cards in general put out a lot of heat, and it's never a good idea to put a big card right next to your graphics card anyway; you're just begging for heat problems. For the most part it's just the Nvidia reference design that takes up two slots. The board makers usually use their own cooling anyway, and plenty of cards are available that only use one slot.

    What it all boils down to is that it's not the size it's how you use it. :)

    Now as far as ON topic ;) I thought the benchmarks did what they should...they showed performance in today's popular games and some signs of what is to come. For those of you crying because there are no significant DX9 entries...guess what...DX9 games aren't available in any kind of quantity and won't be any time soon. Granted there will be some, but the bulk of games released in the next 6 months will be built on DX8 with some DX9 features. By the time the publishers start churning out DX9 titles, guess what...the new chips will be ready for release, and they will run full DX9 titles better.

    Coincidence? Not at all. Does Nvidia or ATI want you to buy their 500 dollar card now and use it for the next two years? Hell no. They want you to buy bleeding edge technology now, then buy another new card in a year or less...and so on and so forth. There's a reason they release a whole line of cards at once (performance, mainstream, budget): that way they can tackle the whole market with each release. If they make a card too good now, you won't need to buy their next one...welcome to the world of trying to make money :).

    Ok I'll stop rambling...good job with benchies Anand :)
  • Anonymous User - Wednesday, October 1, 2003 - link

    Please add a benchmark for MMORPGs of some sort to your suite (Dark Age of Camelot, Everquest, etc.)
  • sorren - Wednesday, October 1, 2003 - link

    For those of us with 17"+ LCD monitors, 1280x1024 results would be more useful since that is the most common native resolution for these monitors. The game list looks great, just as long as we keep some games other than action and FPS titles. I mostly play strategy games and RPGs, so it's good to see Warcraft and NWN on the list. Keep up the great work!
  • Anonymous User - Wednesday, October 1, 2003 - link

    My guess is that because of the massive # of games that they were using for benchmarks...they didn't have time to test at more resolutions?

    also #42 makes a lot of good points.
  • Insomniac - Wednesday, October 1, 2003 - link

    #68: To be thorough:

    Command & Conquer Generals: Zero Hour
    - 5600 Ultra -> 28.3 to 32.9

    Homeworld 2: Benchmark 1
    - 5600 Ultra -> 25.1 to 38.4

    Homeworld 2: Benchmark 2
    - NV38 -> 43.8 to 44.3
    - 5600 Ultra -> 15.5 to 25

    Neverwinter Nights: Shadows of Undrentide
    - 5600 Ultra -> 26.9 to 30.5

    Simcity 4
    - 9800 Pro 256MB -> 55.7 to 56
    - 9800 XT -> 55.7 to 56
    - 9800 -> 55.4 to 56
    - 9700 Pro -> 54.6 to 56

    I included every card and benchmark I saw it on for thoroughness and to avoid being accused of being a fanboy. ;)
  • Anonymous User - Wednesday, October 1, 2003 - link

    In reply to #75:
    If that's the case, why didn't they test the cards at 1280x1024 for this PRELIMINARY review as they do with all other high performance cards? Seems sort of odd to me.
  • Anonymous User - Wednesday, October 1, 2003 - link

    Just a friendly reminder:

    NV38 is still not a 'finished' design, and by finished I mean there is still no publicly available set of drivers supporting the card. The card itself is not even publicly available, much less on the OEM market, which makes it rather difficult to fully benchmark this product. Likewise, to a certain extent the 9800 XT is not a finished design even though it's on the market, as the Overdrive (ATI-supported overclocking) feature is unavailable until the Catalyst 3.8 drivers become publicly available in the next week or so.

    The point of this rant is that the information presented here in AnandTech's review is PRELIMINARY. Regardless of AnandTech having engineering samples or final products, beta drivers or publicly available drivers, they can only work with what they have available to them at the present time, and when reading this review you HAVE to take that sort of non-explicitly-stated information into account to gauge credibility.

    Personally, I believe that given what is available at the present time, AnandTech has done a very good job of providing a sample gauge of what to expect from the newest 'refresh' video cards, which are still incomplete in terms of being able to be all that they can be (special application optimizations notwithstanding, of course). While I would like to see them gauge these cards against older cards, as someone mentioned earlier in this thread, to see if upgrading is worth it, I don't see the point of doing so until these products are fully completed (i.e., they're readily available in stores and they have publicly available drivers).

    So perhaps after the NV38 truly comes to market, that would be a better time to insist on seeing an all-out battle of the GPUs. Just my two cents on the matter; have a good day.
  • Johnbear007 - Wednesday, October 1, 2003 - link

    I would like to see Battlefield 1942 added
  • Anonymous User - Wednesday, October 1, 2003 - link

    When is Nvidia gonna work on making the card smaller so it doesn't take up a good PCI slot?
