By now you've heard that our Half-Life 2 benchmarking time took place at an ATI event called "Shader Day." The point of Shader Day was to educate the press about shaders and their importance, and to give a little insight into how ATI's R3x0 architecture is optimized for the type of shader performance necessary for DirectX 9 applications. Granted, there was a huge marketing push from ATI, despite efforts to tone down the usual marketing presence at these sorts of events.

One of the presenters at Shader Day was Gabe Newell of Valve, and it was in Gabe's presentation that we got the information we published here yesterday. According to Gabe, during the development of Half-Life 2, the development team encountered some very unusual performance numbers. Taken directly from Gabe's slide in the presentation, here's the performance they saw initially:


[Chart: initial Half-Life 2 benchmark results, taken from Valve's presentation]

As you can guess, the folks at Valve were quite shocked. With NVIDIA's fastest offering unable to outperform a Radeon 9600 Pro (the Pro suffix was omitted from Gabe's chart), something was wrong, given that in any other game, the GeForce FX 5900 Ultra would be much closer to the Radeon 9800 Pro in performance.

Working closely with NVIDIA (according to Gabe), Valve ended up developing a special codepath for NVIDIA's NV3x architecture that makes some tradeoffs in order to improve performance on NVIDIA's FX cards. The tradeoffs, as explained by Gabe, mainly involve using 16-bit precision instead of 32-bit precision for certain floating point operations, and falling back to Pixel Shader 1.4 (DX8.1) shaders instead of the newer Pixel Shader 2.0 (DX9) shaders in certain cases. Valve refers to this new NV3x codepath as a "mixed mode" of operation, as it mixes full precision (32-bit) and partial precision (16-bit) floats as well as Pixel Shader 2.0 and 1.4 shader code. There's clearly a visual tradeoff made here, which we will get to shortly, but it was necessary in order to improve performance.
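To give a concrete idea of what such a mixed-mode tradeoff can look like at the shader level, here is a minimal, purely hypothetical DX9 HLSL sketch; this is not Valve's actual shader code. The first version does all of its math at full precision, while the second declares its inputs and intermediates as half, which allows NV3x hardware to execute the same work at 16-bit partial precision.

// Hypothetical diffuse-lighting pixel shader, written two ways to illustrate
// the precision tradeoff described above. Illustrative only, not code from
// Half-Life 2; both versions compile against the ps_2_0 target.

sampler2D diffuseMap : register(s0);

// Generic DX9 path: every value is a full-precision (32-bit) float.
float4 PS_FullPrecision(float2 uv : TEXCOORD0, float3 lightDir : TEXCOORD1) : COLOR
{
    float4 albedo = tex2D(diffuseMap, uv);
    float  nDotL  = saturate(dot(normalize(lightDir), float3(0, 0, 1)));
    return albedo * nDotL;
}

// Mixed-mode style path: the same math, but with values declared as half,
// which the driver may run at 16-bit partial precision on NV3x parts.
half4 PS_PartialPrecision(half2 uv : TEXCOORD0, half3 lightDir : TEXCOORD1) : COLOR
{
    half4 albedo = tex2D(diffuseMap, uv);
    half  nDotL  = saturate(dot(normalize(lightDir), half3(0, 0, 1)));
    return albedo * nDotL;
}

The other half of the mixed mode, dropping from Pixel Shader 2.0 to 1.4 for certain effects, would be handled by authoring a separate, simpler shader for the ps_1_4 target and selecting it per material, rather than by a change within a single shader like the one above.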

The resulting performance that the Valve team saw was as follows:


[Chart: performance with the mixed-mode NV3x codepath, taken from Valve's presentation]

We had to recap the issues here for those who haven't been keeping up with the situation as it unfolded over the past 24 hours, but now that you've seen what Valve has shown us, it's time to dig a bit deeper and answer some very important questions (and of course, get to our own benchmarks under Half-Life 2).

111 Comments

  • atlr - Friday, September 12, 2003 - link


    Quote from http://www.anandtech.com/video/showdoc.html?i=1863...
    "The Radeon 9600 Pro manages to come within 4% of NVIDIA's flagship, not bad for a ~$100 card."

    Anyone know where a ~$100 9600 Pro is sold? I thought this was a ~$200 card.
  • Anonymous User - Friday, September 12, 2003 - link

    Time to load up on ATI stock :)
  • Anonymous User - Friday, September 12, 2003 - link

    Quoted from Nvidia.com:

    "Microsoft® DirectX® 9.0 Optimizations and Support
    Ensures the best performance and application compatibility for all DirectX 9 applications."

    Oops, not this time around...
  • Anonymous User - Friday, September 12, 2003 - link

    #74 - No, D3 isn't a DX9 game, it's OpenGL. What it shows is that the FX series isn't bad; they just don't do so well under DX9. If you stick primarily to OpenGL games and run your DX games under the 8.1 spec, the FX should perform fine. It's the DX9 code that the FXes seem to really struggle with.
  • Anonymous User - Friday, September 12, 2003 - link

    #74: I have commonly heard this blamed on a bug in an older release of the CATALYST drivers that was used in the Doom3 benchmark. It is my understanding that if the benchmark were repeated with the 3.7 (RELEASED) drivers, the ATI cards would perform much better.

    #75: I believe this goes back to prior instances where Nvidia has claimed that some new driver would increase performance dramatically to get it into a benchmark, and then never released the driver for public use. If this happened, the benchmark would be unreliable, as it could not be repeated by an end user with similar results.

    Also, the Det50 drivers from Nvidia do not have a working fog system. It has been hinted that this could be intentional to improve performance. Either way, I saw a benchmark today (forgot where) that compared the Det45s to the beta Det50s. The 50s did improve performance in 3DMark03, but nowhere near the 73% gap in performance seen in HL2.
  • Anonymous User - Friday, September 12, 2003 - link

    Because Gabe controls how representative the HL2 beta is of the final HL2 product, but he cannot control how representative the NVIDIA Det50 beta is of the final Det50s.

    And besides that, there have been rumours of "optimisations" in the new Det50s.
  • Anonymous User - Friday, September 12, 2003 - link

    How is it that Gabe can recommend not running benchmarks on a publicly unavailable driver or hardware, yet the game itself is unavailable? Seems a little hypocritical....
  • Anonymous User - Friday, September 12, 2003 - link

    I didn't have time to look into this, but can someone enlighten me as to why the 5900 Ultra outperformed the 9800 PRO in the Doom 3 benchmarks we saw a while back... is that not using DX9 as well? If I am way off the mark here or am even wrong on which outperformed which, go easy on the flames!

    Thanks
  • Anonymous User - Friday, September 12, 2003 - link

    "Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though."

    My bad, should have looked at ATI first. I guess I'm thinking about the 8500. Either way, I would still go 9600 Pro, especially given that it is cheaper than a 9500 non-pro.
  • Anonymous User - Friday, September 12, 2003 - link

    "The 9600 fully supports DX9 whereas the 9500 does not."

    Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though.
