It's almost ironic that the one industry we deal with that is directly related to entertainment has been the least exciting for the longest time. The graphics world has been littered with controversies surrounding very fickle things as of late; the majority of articles you'll see relating to graphics these days don't have anything to do with how fast the latest $500 card will run. Instead, we're left to argue about the definition of the word "cheating". We pick at pixels with hopes of differentiating two of the fiercest competitors the GPU world has ever seen, and we debate over 3DMark.

What's interesting is that all of the things we have occupied ourselves with in recent times have been present throughout history. Graphics companies have always had questionable optimizations in their drivers, they have almost always differed in how they render a scene and yes, 3DMark has been around for quite some time now (only recently has it become "cool" to take issue with it).

So why is it that, in the age of incredibly fast, absurdly powerful DirectX 9 hardware, we find it necessary to bicker about everything but the hardware? Because, for the most part, we've had absolutely nothing better to do with this hardware. Our last set of GPU reviews was focused on two cards - ATI's Radeon 9800 Pro (256MB) and NVIDIA's GeForce FX 5900 Ultra, both of which carried a hefty $499 price tag. What were we able to do with this kind of hardware? Run Unreal Tournament 2003 at 1600x1200 with 4X AA enabled and still have power to spare, or run Quake III Arena at fairytale frame rates. Both ATI and NVIDIA have spent countless millions of transistors and expensive die space, and even sacrificed current-generation game performance, in order to bring us some very powerful pixel shader units in their GPUs. Yet, we have been using these cards while letting their pixel shading muscles atrophy.

Honestly, since the Radeon 9700 Pro, we haven't needed any more performance to satisfy today's games. Take the most popular game in recent history, the Frozen Throne expansion to Warcraft III: it runs just fine on a GeForce4 MX, so a $500 GeForce FX 5900 Ultra is in no way, shape or form necessary.

The argument we heard from both GPU camps was that you were buying for the future; a card you bought today could not only run all of your current games extremely well, but was also supposed to guarantee good performance in the next generation of games. The problem with this argument was that there was no guarantee of when that "next generation" of games would arrive, and by the time it did, prices on these wonderfully expensive graphics cards may have fallen significantly. Then there's the fact that how well cards perform in today's pixel-shader-free games says nothing about how DirectX 9 games will perform. And this brought us to the joyful issue of using 3DMark as a benchmark.

If you haven't noticed, we've never relied on 3DMark as a performance tool in our 3D graphics benchmark suites. The only times we've included it, we've either used it in the context of a CPU comparison or to make sure fill rates were in line with what we were expecting. With 3DMark 03, the fine folks at Futuremark had a very ambitious goal in mind: to predict the performance of future DirectX 9 titles using their own shader code, designed to mimic what various developers were working on. The goal was admirable; however, if we're going to recommend something to millions of readers, we're not going to base that recommendation solely on one synthetic benchmark that may or may not be indicative of the performance of future games. The difference between the next generation of games and what we've seen in the past is that the performance of one game is much less indicative of the performance of the rest of the market; as you'll see, we're no longer memory bandwidth bound - performance will now be determined by each game's pixel shader programs and by how well the GPU's execution units handle them.
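To make the bandwidth-bound versus shader-bound distinction concrete, here is a rough back-of-envelope sketch in Python; every figure in it is a made-up, illustrative assumption rather than a real GPU specification, but it shows why a scene built around long pixel shader programs stops caring about memory bandwidth and starts living or dying by shader throughput.

    # Crude model: a frame is limited by whichever takes longer, moving pixel
    # data over the memory bus or executing pixel shader instructions.
    # All inputs below are hypothetical, illustrative numbers.
    def frame_time_ms(pixels, bytes_per_pixel, shader_ops_per_pixel,
                      bandwidth_gb_s, shader_gops_s):
        bandwidth_ms = pixels * bytes_per_pixel / (bandwidth_gb_s * 1e9) * 1000.0
        shader_ms = pixels * shader_ops_per_pixel / (shader_gops_s * 1e9) * 1000.0
        return max(bandwidth_ms, shader_ms), bandwidth_ms, shader_ms

    pixels = 1600 * 1200

    # "DX8-style" scene: short shaders, so memory traffic sets the frame time.
    print(frame_time_ms(pixels, 32, 8, 20.0, 8.0))

    # "DX9-style" scene: long pixel shaders dominate; extra bandwidth barely helps.
    print(frame_time_ms(pixels, 48, 100, 20.0, 8.0))

In the first case the bandwidth term dominates, so a faster memory bus helps; in the second the shader term is several times larger, which is the regime the article is describing for DirectX 9 titles.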

All of this discussion isn't for naught, as it brings us to why today is so very important. Not too long ago, we were able to benchmark Doom3 and show you a preview of its performance; but with the game being delayed until next year, we have to turn to yet another title to finally take advantage of this hardware - Half-Life 2. With the game almost done and a benchmarkable demo due out on September 30th, it isn't a surprise that we were given the opportunity to benchmark the demos shown off by Valve at E3 this year.

Unfortunately, the story here isn't as simple as how fast your card will perform under Half-Life 2; of course, given the history of the 3D graphics industry, would you really expect something like this to be without controversy?

Comments

  • atlr - Friday, September 12, 2003 - link


    Quote from http://www.anandtech.com/video/showdoc.html?i=1863...
    "The Radeon 9600 Pro manages to come within 4% of NVIDIA's flagship, not bad for a ~$100 card."

    Anyone know where a ~$100 9600 Pro is sold? I thought this was a ~$200 card.
  • Anonymous User - Friday, September 12, 2003 - link

    Time to load up on ATI stock :)
  • Anonymous User - Friday, September 12, 2003 - link

    Quoted from Nvidia.com:

    "Microsoft® DirectX® 9.0 Optimizations and Support
    Ensures the best performance and application compatibility for all DirectX 9 applications."

    Oops, not this time around...
  • Anonymous User - Friday, September 12, 2003 - link

#74 - No, D3 isn't a DX9 game, it's OGL. What it shows is that the FX series isn't bad - they just don't do so well under DX9. If you stick primarily to OpenGL games and run your DX games under the 8.1 spec, the FX should perform fine. It's the DX9 code that the FXes seem to really struggle with.
  • Anonymous User - Friday, September 12, 2003 - link

    #74: I have commonly heard this blamed on a bug in an older release of the CATALYST drivers that was used in the Doom3 benchmark. It is my understanding that if the benchmark was repeated with the 3.7 (RELEASED) drivers, the ATI card would perform much better.

    #75: I believe this goes back to prior instances where Nvidia has claimed that some new driver would increase performance dramatically to get it into a benchmark and then never released the driver for public use. If this happened, the benchmark would be unreliable as it could not be repeated by an end-user with similar results.

    Also, the Det50 drivers from Nvidia do not have a working fog system. It has been hinted that this could be intentional to improve performance. Either way, I saw a benchmark today (forgot where) that compared the Det45's to the beta Det50's. The 50's did improve performance in 3DMark03, but nowhere near the 73% performance gap seen in HL2.
  • Anonymous User - Friday, September 12, 2003 - link

    Because Gabe controls how representative the HL2 beta is of the final HL2 product, but he cannot control how representative the NVIDIA Det50 beta is of the final Det50s.

    And besides that, there have been rumours of "optimisations" in the new Det50s.
  • Anonymous User - Friday, September 12, 2003 - link

    How is it that Gabe can recommend not running benchmarks on a publicly unavailable driver or hardware, yet the game itself is unavailable? Seems a little hypocritical....
  • Anonymous User - Friday, September 12, 2003 - link

    I didn't have time to look into this, but can someone enlighten me as to why the 5900 Ultra outperformed the 9800 PRO in the Doom 3 benchmarks we saw a while back... is that not using DX9 as well? If I am way off the mark here or am even wrong on which outperformed which, go easy on the flames!

    Thanks
  • Anonymous User - Friday, September 12, 2003 - link

    "Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though."

    My bad, should have looked at ATI first. I guess I'm thinking about the 8500. Either way, I would still go 9600 Pro, especially given that it is cheaper than a 9500 non-pro.
  • Anonymous User - Friday, September 12, 2003 - link

    "The 9600 fully supports DX9 whereas the 9500 does not."

    Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though.
