By now you've heard that our Half-Life 2 benchmarking time took place at an ATI event called "Shader Day." The point of Shader Day was to educate the press about shaders and their importance, and to give a little insight into how ATI's R3x0 architecture is optimized for the type of shader performance necessary for DirectX 9 applications. Granted, there was a huge marketing push from ATI, despite efforts to tone down the usual marketing present at these sorts of events.

One of the presenters at Shader Day was Gabe Newell of Valve, and it was in Gabe's presentation that the information we published here yesterday first came to light. According to Gabe, during the development of Half-Life 2, the development team encountered some very unusual performance numbers. Taken directly from Gabe's slide in the presentation, here's the performance they saw initially:


[Chart: initial Half-Life 2 benchmark results, taken from Valve's presentation]

As you can guess, the folks at Valve were quite shocked. With NVIDIA's fastest offering unable to outperform a Radeon 9600 Pro (the Pro suffix was omitted from Gabe's chart), something was wrong, given that in any other game, the GeForce FX 5900 Ultra would be much closer to the Radeon 9800 Pro in performance.

Working closely with NVIDIA (according to Gabe), Valve ended up developing a special codepath for NVIDIA's NV3x architecture that made some tradeoffs in order to improve performance on NVIDIA's FX cards. The tradeoffs, as explained by Gabe, were mainly in using 16-bit precision instead of 32-bit precision for certain floats, and in defaulting to Pixel Shader 1.4 (DX8.1) shaders instead of newer Pixel Shader 2.0 (DX9) shaders in certain cases. Valve refers to this new NV3x codepath as a "mixed mode" of operation, as it is a mixture of full-precision (32-bit) and partial-precision (16-bit) floats as well as Pixel Shader 2.0 and 1.4 shader code. There is clearly a visual tradeoff made here, which we will get to shortly, but it was necessary in order to improve performance.
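
To make the idea concrete, here is a minimal, hypothetical sketch (in Python, for readability) of that kind of codepath selection. This is not Valve's actual Source engine code; the "NV3x" label and every function and field name here are invented purely for illustration:

    # A minimal, hypothetical sketch of the codepath selection described
    # above -- NOT Valve's actual Source engine code. All names invented.

    def choose_render_path(gpu_architecture: str) -> dict:
        """Pick shader settings based on the detected GPU architecture."""
        if gpu_architecture == "NV3x":
            # "Mixed mode": drop to 16-bit partial precision and Pixel
            # Shader 1.4 (DX8.1) where the visual cost is acceptable,
            # keeping 32-bit floats and PS2.0 where it is not.
            return {
                "default_float_precision_bits": 16,
                "default_pixel_shader_version": 1.4,
                "full_precision_where_needed": True,
            }
        # Standard DX9 path: full precision, Pixel Shader 2.0 throughout.
        return {
            "default_float_precision_bits": 32,
            "default_pixel_shader_version": 2.0,
            "full_precision_where_needed": False,
        }

    print(choose_render_path("NV3x"))   # mixed mode settings
    print(choose_render_path("R3x0"))   # standard DX9 settings

The key point Gabe made is that the NV3x path only falls back to partial precision and the older shader version in certain cases; elsewhere it still runs full-precision DX9 shader code.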

The resulting performance that the Valve team saw was as follows:


[Chart: Half-Life 2 benchmark results with the mixed-mode codepath, taken from Valve's presentation]

We had to recap the issues here for those who haven't been keeping up with the situation as it unfolded over the past 24 hours, but now that you've seen what Valve has shown us, it's time to dig a bit deeper and answer some very important questions (and of course, get to our own benchmarks under Half-Life 2).

Comments

  • Anonymous User - Friday, September 12, 2003 - link

    I think the insinuation is clear from the nVidia email that was posted and from Gabe's comments. Valve believed nVidia was trying to "cheat" with their D50s by intentionally having fog disabled, etc. Rather than toss around accusations, it was simpler for them to just require that the benchmarks be run with released drivers, avoiding the issue of currently bugged drivers with non-working features, whether the reason was accidental or intentional.

    Considering that the FXes fared poorly with 3DMark and again with HL2 - both using DX9 implementations - I think it might be fair to say that the FXes aren't going to do too much better in the future. Especially considering the way they reacted to 3DMark 03 - fighting the benchmark rather than releasing drivers to remedy the performance issue.

    I'd like to see how the FXes do running HL2 with pure DX8 rather than DX9 or a hybrid, as I think most people owning current nVidia cards are going to have to go that route to achieve the framerates desired.
  • Anonymous User - Friday, September 12, 2003 - link

    I don't see how the minimum requirements set by Valve are going to play this game: 700MHz and a TNT2. The FX5200s could barely keep up.
  • Anonymous User - Friday, September 12, 2003 - link

    #68: 33 fps * 1.73 = 57.09 fps (add the one to account for the initial 33 score).

    This doesn't quite work out based on the 57.3 score of the 9800 Pro, so the corrected score on the NVIDIA card was probably closer to this (see the quick check below):
    57.3 / 1.73 = 33.12 fps

    #69: I would definitely try to find a 9600 Pro before I bought a 9500 Pro. The 9600 fully supports DX9 whereas the 9500 does not.
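
    Spelled out as a quick script, that arithmetic looks like this. The 33 and 57.3 fps figures are simply the ones quoted in this thread; the script is only a sanity check of the multiplication and division above:

        # Sanity check of the fps arithmetic quoted in the comments above.
        fx5900_fps = 33.0          # GeForce FX 5900 score from the thread
        radeon_9800_fps = 57.3     # Radeon 9800 Pro score from the thread

        # Scaling the 33 fps score up by the quoted 73% gap:
        print(fx5900_fps * 1.73)         # 57.09 -- close to, not exactly, 57.3

        # Working backwards from the 9800 Pro's 57.3 fps instead:
        print(radeon_9800_fps / 1.73)    # ~33.12 fps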
  • Anonymous User - Friday, September 12, 2003 - link

    Guess it's time to upgrade...
    Now where's my &*&%%'n wallet!!


    Wonder where I'll be able to find an R9500Pro (Sapphire)
  • Anonymous User - Friday, September 12, 2003 - link

    The performance increase between the FX5900 and Rad9800Pro is not 73%. Do the math correctly and it turns into a 36.5% lead. The article should be revised. (See the check below.)
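
    For what it's worth, the size of the gap depends on which card's score is used as the baseline. A quick sketch using the 33 and 57.3 fps figures quoted in this thread (illustrative only):

        # The same 33 vs 57.3 fps gap, expressed against each baseline.
        fx5900_fps = 33.0
        radeon_9800_fps = 57.3

        lead = (radeon_9800_fps / fx5900_fps - 1) * 100     # ~73.6%
        deficit = (1 - fx5900_fps / radeon_9800_fps) * 100  # ~42.4%

        print(f"9800 Pro's lead over the FX 5900: {lead:.1f}%")
        print(f"FX 5900's deficit vs the 9800 Pro: {deficit:.1f}%")

    Relative to the FX 5900's 33 fps, the 9800 Pro's 57.3 fps is roughly a 73.6% lead; measured the other way around, the FX 5900 trails by roughly 42.4%.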
  • atlr - Friday, September 12, 2003 - link

    If anyone sees benchmarks for 1 GHz computers, please post a URL. Thanks.
  • WooDaddy - Friday, September 12, 2003 - link

    Hmmm... I understand that Nvidia would be upset. But it's not like ATI is using a special setting to run faster. They're using DX9. Nvidia needs to get on the ball. I'm going to have to upgrade my video card since I have a now-obsolete GF4 Ti4200.

    GET IT TOGETHER NVIDIA!!! DON'T MAKE ME BUY ATI!

    I might just sell my Nvidia stock while I'm at it. HL2 is a big mover, and I believe it can make or break a card on the consumer side.
  • Anonymous User - Friday, September 12, 2003 - link

    I had just ordered a 5600 Ultra thinking it would be a great card. It's going back.

    If I can get full DX9 performance with a 9600 Pro for around $180, and that card's performance is better than the 5900 Ultra's - then I'm game.

    I bought a TNT when Nvidia was making a name for itself. I bought a GF2 GTS when Nvidia was destroying 3dfx - now Nvidia seems to have dropped the ball on DX9. I want to play HL2 on whatever card I buy, and a 5600 Ultra doesn't seem like it will cut it. I know the 50's are out there, but I've seen the Aquamark comparison with the 45's and 50's and I'm not impressed.

    I really wanted to buy Nvidia, but I cannot afford it.

  • Anonymous User - Friday, September 12, 2003 - link

    #62: I do have the money but I choose to spend it elsewhere. FYI: I spent $164 US on my 2.4C and I'm running at speeds faster than the system used for this benchmark.

    "The Dell PCs we used were configured with Pentium 4 3.0C processors on 875P based motherboards with 1GB of memory. We were running Windows XP without any special modifications to the OS or other changes to the system."

    Anand was using a single system to show what HL2 performance would be on video cards available on the market today. If he were to run benchmarks on different CPUs, he would have to spend tremendously more time doing so. In the interest of getting the info out as soon as possible, he limited himself to a single system.

    I would deduce from the HL2 performance numbers in Anand's benchmarks that unless you have a 9600 Pro/9800 Pro, your AMD system will not be able to run HL2 effectively.
  • Anonymous User - Friday, September 12, 2003 - link

    Woohoooo!!!
    My ATI 9500@9700 128MB with 8 pixel pipelines and 256-bit memory access beats the crap out of any FX.
    And it only cost me 190 euros/190 dollars.

    Back to the drawing board NVidia.
    Muahahahah!!!
