Improving Performance on NVIDIA

If the hypotheses mentioned on the previous page hold true, then there may be some ways around these performance issues. The most obvious is through updated drivers. NVIDIA does have a new driver release on the horizon, the Detonator 50 series. However, Valve instructed us not to use these drivers, as they do not render fog in Half-Life 2. In fact, Valve was quite insistent that we use only publicly available drivers on publicly available hardware, which is one reason you won't see Half-Life 2 benchmarks in our upcoming Athlon 64 review.

Future drivers may be the key for higher performance to be enabled on NVIDIA platforms, but Gabe issued the following warning:

"I guess I am encouraging skepticism about future driver performance."

Only time will tell if updated drivers can close the performance gap, but as you are about to see, it is a decent-sized gap.

One thing that is also worth noting is that the shader-specific workarounds Valve implemented for NVIDIA will not immediately carry over to other games based on Half-Life 2's Source engine. Remember that these restructured shaders are specific to the shaders used in Half-Life 2, which won't necessarily be the shaders used in a different game built on the same engine.

Gabe also cautioned that reverting to 16-bit floating point values will only become more of an issue going forward, as "newer DX9 functionality will be able to use fewer and fewer partial precision functions." The theory, however, is that by the time this happens, NV4x will be upon us and will hopefully have fixed the problems that we're seeing today.
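To make the partial precision trade-off concrete, here is a minimal sketch (in Python/NumPy, purely illustrative and not Valve's actual shader code) of what dropping from 32-bit to 16-bit floating point means numerically. The specular_term function is a hypothetical stand-in for a pixel shader lighting calculation; the numbers it prints are only meant to show where fp16 rounding starts to diverge from fp32.

    import numpy as np

    def specular_term(n_dot_h, exponent, dtype):
        # Toy Blinn-Phong style specular highlight evaluated at a chosen precision.
        x = np.asarray(n_dot_h, dtype=dtype)
        return np.power(x, dtype(exponent))

    # Sample N.H values near the highlight peak, where precision matters most.
    angles = np.linspace(0.90, 1.00, 5)

    full = specular_term(angles, 32.0, np.float32)     # "full precision" (fp32) path
    partial = specular_term(angles, 32.0, np.float16)  # "partial precision" (fp16) path

    for a, f, p in zip(angles, full, partial):
        print(f"N.H={a:.3f}  fp32={f:.6f}  fp16={float(p):.6f}  diff={abs(f - float(p)):.6f}")

On real NV3x hardware the benefit comes from fp16 register pressure and ALU throughput rather than anything shown here, which is presumably why Valve only made the conversion where the rounding error stays invisible on screen.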

NVIDIA's Official Response

Of course, NVIDIA has their official PR response to these issues, which we've published below:

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half-Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half-Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are unaware of any issues with Rel. 50 and the drop of Half-Life 2 that we have. The drop of Half-Life 2 that we currently have is more than 2 weeks old. It is not a cheat or an over optimization. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half-Life 2.

We are committed to working with Gabe to fully understand these issues.

Comments

  • Anonymous User - Friday, September 12, 2003 - link

    I previously posted this in the wrong place, so let me just shamelessly repost it here:

    Let me just get my little disclaimer out before I dive into being a devil's advocate - I own both a 9800 Pro and an FX 5900nu and am not biased toward either ATi or nVidia.
    With that being said, let me take a shot at what Anand opted not to speculate about, and that is the question of the ATi/Valve collaboration and their present and future relationship.
    First of all, the FX's architecture is obviously inferior to R3x0 in terms of native DX9, and that is not going to be my focus. I would rather debate a little about the business/financial side of the ATi/Valve relationship. That's the area of my expertise, and looking at this situation from a financial angle might add another twist to it.
    What got my attention were Gabe Newell's presentation slides, which omitted small but significant things like the "Pro" behind the R9600, and his statement about "optimizations going too far" without actually going into specifics, other than that the new Detonators don't render fog. Those are small but significant details that add a little oil to a very hot issue of "cheating" in regards to nVidia's "optimizations". But I spoke of the financial side of things, so let me get back to it. After clearly stating how superior ATi's hardware is to the FX, and how much effort they have invested to make the game work on the FX (which is absolutely commendable), I cannot help but notice that all of this perfectly leads into the next great thing. Half-Life 2 will be bundled with a new line of ATi cards (or vice versa), and ATi is just getting ready to offer a value DX9 line. Remember how that was the only area they had not covered, while nVidia was selling truckloads of FX 5200s in the meantime. After they have demonstrated how poorly the FX flagship performs, let alone the value parts, isn't it a perfect lead-in to selling shiploads of those bundled cards (games)? Add to that Gabe's shooting down of any optimization efforts on nVidia's part (simply insinuating "cheats") and things are slowly moving in the right direction. And to top it all off, Valve explicitly said that future additions will not be done for DX8 or the so-called mixed mode, but exclusively for DX9. What is Joe Consumer to do then? The only logical thing - get him/herself one of those bundles.
    That concludes my observations on this angle of this newly emerged attraction, and I see only good things on the horizon for ATi stockholders.
    Feel free to debate, disagree, and criticize, but keep in mind that I am not defending or bashing anybody, just offering my opinion on the part I considered equally as interesting as the hardware performance.
  • Anonymous User - Friday, September 12, 2003 - link

    Wow...I buy a new video card every 3 years or so..my last one was a GF2PRO....hehe...I'm so glad to have a 9800PRO right now.
    Snif..I'm proud to be Canadian ;-)
  • Anonymous User - Friday, September 12, 2003 - link

    How come the 9600 Pro hardly loses any performance going from 1024 to 1280? Shouldn't it be affected by only having 4 pipelines?
  • Anonymous User - Friday, September 12, 2003 - link

    MUHAHAHA!!! Go the 9600 Pros, I'd like to bitch slap my friends for telling me the 9600s will not run Half-Life 2. I guess I can now purchase an All-In-Wonder 9600 Pro.
  • Anonymous User - Friday, September 12, 2003 - link

    Man, I burst into a coughing/laughing spree when I saw an ad using nVidia's "The way it's meant to be played" slogan. Funny thing is, I first noticed the ad on the page titled "What's Wrong with NVIDIA?"
  • Anonymous User - Friday, September 12, 2003 - link

    Booyah, I hope my Ti4200 can hold me over at 800x600 until I can switch to ATI! Big up Canada
  • Anonymous User - Friday, September 12, 2003 - link

    You can bet your house nVidia's 50 drivers will get closer performance, but they're STILL thoroughly bitchslapped... Ppl will be buying R9x00's by the ton. nVidia better watch out, or they'll go down like, whatwassitsname, 3dfx?
  • dvinnen - Friday, September 12, 2003 - link

    Hehe, I concur. Seeing a 9500 on there would have been nice. But what I really want to see is some AF turned on. I can live with no AA (ok, 2x AA) but I'll be damned if AF isn't going to be on.
  • Anonymous User - Friday, September 12, 2003 - link

    Anand, you guys rock. It's because of your in-depth reviews that I purchased the Radeon 9500 Pro. I noticed the oddity mentioned of the small performance gap between the 9700 Pro and the 9600 Pro at 1280x1024. I would really like to see how the 9500 Pro is affected by this (and by all the other benchmarks). If you have a chance, could you run a comparison between the 9500 Pro and the 9600 Pro? (I guess what I really want to know is whether my 9500 Pro is better than a 9600 Pro for this game.)

    Arigato,
    The Internal
  • Pete - Friday, September 12, 2003 - link

    (Whoops, that was me above (lucky #13)--entered the wrong p/w.)
