Improving Performance on NVIDIA

If the hypotheses mentioned on the previous page hold true, then there may be some ways around these performance issues. The most obvious is through updated drivers. NVIDIA does have a new driver release on the horizon, the Detonator 50 series of drivers. However, Valve instructed us not to use these drivers as they do not render fog in Half-Life 2. In fact, Valve was quite insistent that we only used publicly available drivers on publicly available hardware, which is a reason you won't see Half-Life 2 benchmarks in our upcoming Athlon 64 review.

Future drivers may be the key to higher performance on NVIDIA platforms, but Gabe issued the following warning:

"I guess I am encouraging skepticism about future driver performance."

Only time will tell if updated drivers can close the performance gap, but as you are about to see, it is a decent-sized gap.

One thing that is also worth noting is that the shader-specific workarounds Valve implemented for NVIDIA will not immediately translate to other games built on Half-Life 2's Source engine. Remember that these restructured shaders are specific to the shaders used in Half-Life 2, which won't necessarily be the shaders used in a different game based on the same engine.

Gabe also cautioned that reverting to 16-bit floating point values will only become more of an issue going forward, as "newer DX9 functionality will be able to use fewer and fewer partial precision functions." The theory, though, is that by the time this happens, NV4x will be upon us and will hopefully have fixed the problems we're seeing today.
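
To make the precision issue concrete, here is a minimal sketch of our own (not Valve's or NVIDIA's code) that approximates 16-bit, 24-bit and 32-bit floats by re-rounding the mantissa after every operation; the blend loop and the exact mantissa widths are illustrative assumptions only, but they show how partial precision drifts over a long chain of dependent shader-style math. Recall that FP24 is the minimum DirectX 9 allows for full precision, while FP16 only qualifies as partial precision.

    // A minimal sketch, not Valve's or NVIDIA's code: it approximates fp16 (10-bit
    // mantissa), fp24 (16-bit mantissa) and fp32 (23-bit mantissa) by re-rounding
    // the mantissa after every operation, to show how partial precision drifts
    // over a long chain of dependent shader-style math.
    #include <cmath>
    #include <cstdio>

    // Round 'value' to roughly 'mantissaBits' bits of mantissa.
    double quantize(double value, int mantissaBits) {
        if (value == 0.0) return 0.0;
        int exponent;
        double mantissa = std::frexp(value, &exponent);   // value = mantissa * 2^exponent
        double scale = std::ldexp(1.0, mantissaBits);     // 2^mantissaBits
        mantissa = std::round(mantissa * scale) / scale;
        return std::ldexp(mantissa, exponent);
    }

    int main() {
        const int   mantissaBits[] = { 10, 16, 23 };       // fp16, fp24, fp32 (approx.)
        const char* names[]        = { "fp16", "fp24", "fp32" };

        for (int f = 0; f < 3; ++f) {
            // A toy stand-in for a dependent shader chain: repeatedly blend a
            // value toward a target, re-quantizing after each operation.
            double color = 0.5;
            for (int i = 0; i < 200; ++i)
                color = quantize(color * 0.731 + 0.137, mantissaBits[f]);
            std::printf("%s result after 200 ops: %.8f\n", names[f], color);
        }
        return 0;
    }

With enough dependent operations the fp16 result ends up on a slightly different value than the fp24 and fp32 results, which is the kind of error Valve has to weigh each time a shader is dropped to partial precision.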

NVIDIA's Official Response

Of course, NVIDIA has its official PR response to these issues, which we've published below:

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half-Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half-Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are unaware of any issues with Rel. 50 and the drop of Half-Life 2 that we have. The drop of Half-Life 2 that we currently have is more than 2 weeks old. It is not a cheat or an over optimization. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half-Life 2.

We are committed to working with Gabe to fully understand his concerns.
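
The "different code paths" point in NVIDIA's statement is the same mixed-mode question discussed earlier in this article. Purely as an illustration, and emphatically not Source engine or driver code, here is a sketch of how an engine might pick a render path per GPU vendor at startup; the PCI vendor IDs are real, but the enum and function names are invented for this example.

    // Hypothetical sketch only - not Source engine or driver code. It illustrates
    // the idea of selecting a vendor-specific render path at startup, the way the
    // article describes Half-Life 2 choosing between its generic DX9 path and an
    // NV3x mixed mode. The PCI vendor IDs are real; every other name is invented.
    #include <cstdint>
    #include <cstdio>

    enum class RenderPath {
        DX9_FullPrecision,  // generic ps_2_0 path, full precision (default for ATI R3xx)
        DX9_MixedMode,      // ps_2_0 with partial-precision hints plus some DX8 shaders (NV3x)
        DX8_Fallback        // ps_1_1/1.4 only, for hardware without DX9 pixel shaders
    };

    constexpr uint32_t kVendorNVIDIA = 0x10DE;  // PCI vendor ID
    constexpr uint32_t kVendorATI    = 0x1002;  // PCI vendor ID

    RenderPath ChooseRenderPath(uint32_t vendorId, bool supportsPS20) {
        if (!supportsPS20)
            return RenderPath::DX8_Fallback;
        if (vendorId == kVendorNVIDIA)
            return RenderPath::DX9_MixedMode;   // hand-tuned, partial-precision shaders
        return RenderPath::DX9_FullPrecision;   // everyone else gets the generic DX9 path
    }

    int main() {
        // Example: a GeForce FX that reports ps_2_0 support gets the mixed-mode path.
        RenderPath path = ChooseRenderPath(kVendorNVIDIA, true);
        std::printf("selected path: %d\n", static_cast<int>(path));
        return 0;
    }

Which path each card is benchmarked on is, of course, the crux of the disagreement between Valve's numbers and NVIDIA's objection above.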

Comments

  • dvinnen - Friday, September 12, 2003 - link

    #31: I know what I said. DX9 doesn't require 32-bit. It's not in the spec, so you couldn't write a shader that uses more than 24-bit precision.
  • XPgeek - Friday, September 12, 2003 - link

    Well #26, if the next gen of games do need 32-bit precision, then the tides will once again be turned, and all these "my ATi is so much faster than nVidia" folks will have to just suck it up and buy another new card, whereas the GFFX's will still be plugging along. By then, who knows, maybe DX10 will support 32-bit precision on the nVidia cards better...
    btw, im still loading down my GF3 Ti500. so regardless, i will have crappy perf. but i also buy cards from the company i like, that being Gainward/Cardex nVidia based boards. no ATi for me, also no Intel for me. Why? bcuz its my choice. so it may be slower, whoopty-doo!

    for all i know, HL2 could run for crap on AMD CPUs as well. so i'll be in good shape then with my XP2400+ and GF3

    sorry, i know my opinions dont matter, but i put em here anyhow.

    buy what you like, dont just follow the herd... unless you like having your face in everyones ass.
  • Anonymous User - Friday, September 12, 2003 - link

    #28 Not 24bit, 32 bit.
  • Anonymous User - Friday, September 12, 2003 - link

    Yeah, like mentioned above, what about whether or not AA and AF were turned on in these tests? Do you talk about it somewhere in your article?

    I can't believe it's not mentioned, since this site was the one that made a detailed (and excellent) presentation of the differences b/w ATI and nVidia's AA and AF back in the day.

    Strange your benchmarks appear to be silent on the matter. I assume they were both turned off.
  • Anonymous User - Friday, September 12, 2003 - link

    >>thus need full 32-bit precision."<<

    Huh? Wha?

    This is an interesting can of worms. So in a few months' time, if ATI stick to 24-bit, or cannot develop 32-bit precision, the tables will have turned on the current situation - but even more so, because there would not be a workaround (or optimization).

    Will ATI users in the future accuse Valve of sleeping with Nvidia because their cards cannot shade with 32-bit precision?

    Will Nvidia users claim that ATI users are "non-compliant with DirectX 9"? Will ATI users respond that 24-bit precision is the only acceptable DirectX 9 standard, and that Valve are traitors?

    Will Microsoft actually force manufacturers to bloody well wait and follow the standard?

    And finally, who did shoot Colonel Mustard in the Dining Room?

    Questions, Questions.
  • dvinnen - Friday, September 12, 2003 - link

    #26: It means it can't cheat and use 16-bit registers to do it, and needs a full 24 bits. So it would waste the rest of the register.
  • Anonymous User - Friday, September 12, 2003 - link

    #26 That was in reference to the FX cards. They can do 16- or 32-bit precision. ATI cards do 24-bit precision, which is the DX9 standard.

    24-bit is the DX9 standard because it's "good enough." It's much faster than 32-bit, and much better looking than 16-bit. So 16-bit will wear out sooner. Of course, someday 24-bit won't be enough, either, but there's no way of knowing when that'll be.
  • Anonymous User - Friday, September 12, 2003 - link

    Valve says no benchmarks on Athlon 64! :-/
    Booo!

    Quote:
    http://www.tomshardware.com/business/20030911/inde...
    "Valve was able to heavily increase the performance of the NVIDIA cards with the optimized path but Valve warns that such optimizations won't be possible in future titles, because future shaders will be more complex and will thus need full 32-bit precision."

    The new ATI cards only have 24-bit shaders!
    So would that leave ALL current ATI cards without any way to run future Valve titles?

    Perhaps I do not understand the technology fully, can someone elaborate on this?
  • Anonymous User - Friday, September 12, 2003 - link

    I agree with #23; in terms of money-making power, the ATI/Valve combo is astounding. ATI's design is superior, as we can see, but the point is that ATI is going to get truckloads of money and recognition for this. It's a good day to have stock in ATI; let's all thank them for buying ArtX!
  • Anonymous User - Friday, September 12, 2003 - link

    I emailed Gabe about my 9600 Pro, but he didn't have to do all this just for me :D

    I love it.
