It's almost ironic that the one industry we deal with that is directly related to entertainment has been the least exciting for the longest time. The graphics world has been littered with controversies over remarkably petty things as of late; the majority of articles you'll see relating to graphics these days don't have anything to do with how fast the latest $500 card will run your games. Instead, we're left to argue about the definition of the word "cheating". We pick at pixels in hopes of differentiating two of the fiercest competitors the GPU world has ever seen, and we debate over 3DMark.

What's interesting is that all of the things we have occupied ourselves with in recent times have been present throughout history. Graphics companies have always had questionable optimizations in their drivers, they have almost always differed in how they render a scene, and yes, 3DMark has been around for quite some time now (only recently has it become "cool" to take issue with it).

So why is it that, in the age of incredibly fast, absurdly powerful DirectX 9 hardware, we find it necessary to bicker about everything but the hardware? Because, for the most part, we've had absolutely nothing better to do with this hardware. Our last set of GPU reviews focused on two cards - ATI's Radeon 9800 Pro (256MB) and NVIDIA's GeForce FX 5900 Ultra - both of which carried a hefty $499 price tag. What were we able to do with this kind of hardware? Run Unreal Tournament 2003 at 1600x1200 with 4X AA enabled and still have power to spare, or run Quake III Arena at fairytale frame rates. Both ATI and NVIDIA have spent countless millions of transistors and plenty of expensive die space, and have even sacrificed current-generation game performance, in order to bring us some very powerful pixel shader units in their GPUs. Yet we have been using these GPUs while letting their pixel shading muscles atrophy.

Honestly, since the Radeon 9700 Pro, we haven't needed any more performance to satisfy today's games. Take the most popular game in recent history, the Frozen Throne expansion to Warcraft III: you can run it just fine on a GeForce4 MX - a $500 GeForce FX 5900 Ultra is in no way, shape or form necessary.

The argument we heard from both GPU camps was that you were buying for the future; a card you bought today could not only run all of your current games extremely well, but would also guarantee good performance in the next generation of games. The problem with this argument was that there was no guarantee of when that "next generation" of games would arrive, and by the time it does, prices on these wonderfully expensive graphics cards may have fallen significantly. Then there's the fact that how well cards perform in today's pixel-shaderless games says nothing about how they will perform in DirectX 9 titles. And this brought us to the joyful issue of using 3DMark as a benchmark.

If you haven't noticed, we've never relied on 3DMark as a performance tool in our 3D graphics benchmark suites. The only times we've included it, we've used it either in the context of a CPU comparison or to make sure fill rates were in line with what we were expecting. With 3DMark 03, the fine folks at Futuremark had a very ambitious goal in mind - to predict the performance of future DirectX 9 titles using their own shader code, designed to mimic what various developers were working on. The goal was admirable; however, if we're going to recommend something to millions of readers, we're not going to base that recommendation solely on one synthetic benchmark that may or may not be indicative of the performance of future games. The difference between the next generation of games and what we've seen in the past is that the performance of one game is much less indicative of the performance of the rest of the market. As you'll see, we're no longer memory bandwidth bound; performance will instead be determined by each game's pixel shader programs and how the GPU's execution units handle them.
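To put that shift in perspective, here is a rough back-of-the-envelope sketch (our own illustration; the resolution, overdraw, shader length and per-pixel traffic figures are assumptions, not measurements) of how quickly a long DX9 pixel shader piles up arithmetic work compared to the framebuffer traffic it generates:

```python
# Hypothetical illustration: why long pixel shaders shift the bottleneck
# from memory bandwidth to shader execution. Every figure below is an
# assumption chosen for the sketch, not a measurement from this article.

width, height = 1600, 1200   # render resolution
overdraw      = 2.0          # average times each pixel gets shaded (assumed)
fps           = 60           # target frame rate
shader_len    = 30           # instructions in a DX9-class pixel shader (assumed)
bytes_per_px  = 8            # 32-bit color read + write per shaded pixel (assumed)

pixels_per_sec = width * height * overdraw * fps

shader_ops_per_sec = pixels_per_sec * shader_len    # arithmetic demand
framebuffer_bw     = pixels_per_sec * bytes_per_px  # memory traffic

print(f"Shaded pixels per second: {pixels_per_sec / 1e6:.0f} million")
print(f"Shader instructions/sec:  {shader_ops_per_sec / 1e9:.1f} billion")
print(f"Framebuffer traffic:      {framebuffer_bw / 1e9:.1f} GB/s")
```

Even with these conservative numbers, the shader workload runs into billions of instructions per second, while the framebuffer traffic stays well under the 20-odd GB/s of memory bandwidth today's top cards offer; that is why shader execution, rather than bandwidth, will start deciding the winner.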

All of this discussion isn't for naught, as it brings us to why today is so very important. Not too long ago, we were able to benchmark Doom3 and show you a preview of its performance, but with that game delayed until next year, we have to turn to yet another title to finally take advantage of this hardware - Half-Life 2. With the game almost done and a benchmarkable demo due out on September 30th, it isn't a surprise that we were given the opportunity to benchmark the demos Valve showed off at E3 this year.

Unfortunately, the story here isn't as simple as how fast your card will perform under Half-Life 2; of course, given the history of the 3D graphics industry, would you really expect something like this to be without controversy?

111 Comments

  • Anonymous User - Friday, September 12, 2003 - link

    ==="full 32-bit would be required" not 24-bit. So that leaves all ATI cards out in the cold.===

    By the time full 32-bit becomes standard (probably with DX10 in 2-3 years) there will be NEW cards that make current cards look like sh!t. ATi will have DX10 cards for under $100, same as nVidia and their 5200. People have been upgrading their PCs for new games for YEARS! Only an [nv]IDIOT would attempt to use an old card for new games and software (TNT2 for Doom3? NOT!).
  • Anonymous User - Friday, September 12, 2003 - link

    Funny that you guys think nVidia will still be "plugging along" with the GFFX if the DX spec changes to 32-bit... you _do_ know what happens to the GFFX when it's forced to run at 32-bit precision, don't you? You'd get faster framerates by drawing each frame by hand on your monitor with a Sharpie.
  • Pete - Friday, September 12, 2003 - link

    #23, the second quote in the first post here may be of interest: http://www.beyond3d.com/forum/viewtopic.php?t=7839... Note the last sentence, which I surrounded by ***'s.

    "nVidia has released the response as seen in the link. Particularly interesting, however, is this part of the e-mail sent to certain nVidia employees ( this was not posted at the given link ):

    'We have been working very closely with Valve on the development of Half Life 2 and tuning for NVIDIA GPU's. And until a week ago had been in close contact with their technical team. It appears that, in preparation for ATI's Shader Days conference, they have misinterpreted bugs associated with a beta version of our release 50 driver.

    You also may have heard that Valve has closed a multi-million dollar marketing deal with ATI. Valve invited us to bid on an exclusive marketing arrangement but we felt the price tag was far too high. We elected not to participate. ***We have no evidence or reason to believe that Valve's presentation yesterday was influenced by their marketing relationship with ATI.***'"

    If this document is indeed real, nV themselves told their own employees Gabe's presentation wasn't skewed by Valve's marketing relationship with ATi.
  • Anonymous User - Friday, September 12, 2003 - link

    Link please #38
  • Anonymous User - Friday, September 12, 2003 - link

    LOL! 19, I saw that too. Looks like I'll be replacing my nVidia 'the way it's meant to be played in DX8 because our DX9 runs like ass, and we still sell it for $500+ to uninformed customers' card with an ATi Radeon. Thanks for the review Anand; it will be interesting to see the AA/AF benchmarks, but I have a pretty good idea of who will win those as well.
  • Anonymous User - Friday, September 12, 2003 - link

    >>>>>>>ANYONE ELSE CATCH THE FOLLOWING IN THE ARTICLE<<<<<<<<<<<<<<<

    ""One thing that is also worth noting is that the shader-specific workarounds for NVIDIA that were implemented by Valve, will not immediately translate to all other games that are based off of Half-Life 2's Source engine. Remember that these restructured shaders are specific to the shaders used in Half-Life 2, which won't necessarily be the shaders used in a different game based off of the same engine.""

    So I guess the nvidia fan boys won't be able to run their $500 POS cards with Counterstrike 2 since it will probably be based on the HL2 engine.

    buhahahaha

    >>>>>>>>>>>>>>>>>>>>>>>>><<<<<<<<<<<<<<<<<<<
  • Anonymous User - Friday, September 12, 2003 - link

    Valve specifically said "full 32-bit would be required" not 24-bit. So that leaves all ATI cards out in the cold.
  • Pete - Friday, September 12, 2003 - link

    #23, I believe you're inferring far too much from ATi's HL2 bundling. Check TechReport's article on Gabe's presentation, in which Gabe is noted as saying Valve chose ATi (in the bidding war to bundle HL2) because their cards quite obviously performed so much better (and looked better doing it - keep in mind, as Anand said, all those nVidia mixed modes look worse than pure DX9).

    In short, Valve doesn't need to do much to please others, as they're the one being chased for the potentially huge-selling Half-Life 2. Everyone will be sucking up to them, not the other way around. And it wouldn't do for Valve to offer nV the bundle exclusive, have consumers expect brilliant performance from the bundled FX cards, and get 12fps in DX9 on their DX9 FX card or 30fps on their $400+ 5900U. That would result in a lot of angry customers for Valve, which is a decidedly bad business move.

    People will buy HL2 regardless. Valve's bundling of HL2 with new cards is just an extra source of income for them, and not vital to the success of HL2 in any way. Bundling HL2 will be a big coup for an IHV like ATi, which requires boundary-pushing games like HL2 to drive hardware sales. Think of the relationship in this way: it's not that ATi won the bidding war to bundle HL2, but that Valve *allowed* ATi to win. Valve was going to get beaucoup bucks for marketing tie-ins with HL2 either way, so it's in their best interests to find sponsorships that present HL2 in the best light (thus apparently HL2 will be bundled with ATi DX9 cards, not their DX8 ones).

    You should read page 3 of Anand's article more closely, IMO. Valve coded not to a specific hardware standard, but to the DX9 standard. ATi cards run standard DX9 code much better than nV. Valve had to work extra hard to try to find custom paths to allow for the FX's weaknesses, but even that doesn't bring nV even with ATi in terms of performance. So ATi's current DX9 line-up is the poster-child for HL2 almost by default.

    We'll see what the Det50's do for nV's scores and IQ soon enough, and that should indicate whether Gabe was being mean or just frank.
  • Anonymous User - Friday, September 12, 2003 - link

    #33 To be pedantic, the DX9 spec calls for 24-bit as a minimum; Microsoft has never said it was 24-bit and nothing else. 24-bit is just a minimum.

    Just as 640x480 is a minimum. That doesn't make 1024x768 non-standard.

    But supposing you are right, and 24-bit is a rock-solid standard, doesn't that mean that Valve will be violating the DX9 spec in the future, in your eyes? Doesn't that mean that ATI cards will be left high and dry in the future? After all, there would be no optimizations allowed, or even possible.

    32-bit is the future, according to Valve, after all.
    Nvidia may suck at doing it, but at least they can do it.
  • XPgeek - Friday, September 12, 2003 - link

    edit, post #32-

    should read, "my ATi is so faster than YOUR nVidia"
