Improving Performance on NVIDIA

If the hypotheses mentioned on the previous page hold true, then there may be some ways around these performance issues. The most obvious is through updated drivers. NVIDIA does have a new driver release on the horizon, the Detonator 50 series of drivers. However, Valve instructed us not to use these drivers as they do not render fog in Half-Life 2. In fact, Valve was quite insistent that we use only publicly available drivers on publicly available hardware, which is one reason you won't see Half-Life 2 benchmarks in our upcoming Athlon 64 review.

Future drivers may be the key to unlocking higher performance on NVIDIA hardware, but Gabe issued the following warning:

"I guess I am encouraging skepticism about future driver performance."

Only time will tell if updated drivers can close the performance gap, but as you are about to see, it is a decent-sized gap.

One thing that is also worth noting is that the shader-specific workarounds Valve implemented for NVIDIA will not automatically carry over to other games built on Half-Life 2's Source engine. Remember that these restructured shaders are specific to the shaders used in Half-Life 2, which won't necessarily be the shaders used in a different game built on the same engine.

Gabe also cautioned that reverting to 16-bit floating point values will only become more of an issue going forward, as "newer DX9 functionality will be able to use fewer and fewer partial precision functions." The theory, though, is that by the time this happens, NV4x will be upon us and will hopefully have fixed the problems we're seeing today.
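To put the partial-precision issue in concrete terms, here is a minimal, self-contained C sketch. This is purely illustrative and is not Valve's or NVIDIA's code; the round_to_half helper is a simplified conversion that ignores fp16 denormals and rounds ties upward. It rounds a 32-bit float down to the 10 mantissa bits that a 16-bit "partial precision" float actually keeps, which works out to roughly three decimal digits:

#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Illustrative helper: round a 32-bit float to the precision of an IEEE 754
   half (fp16) and hand it back as a float. Simplified on purpose: fp16
   denormals are flushed to zero and ties are rounded upward. */
static float round_to_half(float value)
{
    uint32_t bits;
    memcpy(&bits, &value, sizeof bits);

    uint32_t sign     = bits & 0x80000000u;
    int32_t  exponent = (int32_t)((bits >> 23) & 0xFFu) - 127;  /* unbiased */
    uint32_t mantissa = bits & 0x007FFFFFu;

    if (exponent > 15) {
        bits = sign | 0x7F800000u;             /* out of fp16 range: +/- infinity */
    } else if (exponent < -14) {
        bits = sign;                           /* fp16 denormal range: flush to zero */
    } else {
        uint32_t round_bit = 1u << 12;         /* highest of the 13 bits fp16 drops */
        mantissa += round_bit;                 /* round to nearest (ties upward) */
        if (mantissa & 0x00800000u) {          /* rounding carried out of the mantissa */
            mantissa = 0;
            exponent += 1;
        }
        mantissa &= ~((round_bit << 1) - 1);   /* keep only the 10 fp16 mantissa bits */
        bits = sign | ((uint32_t)(exponent + 127) << 23) | mantissa;
    }

    float out;
    memcpy(&out, &bits, sizeof out);
    return out;
}

int main(void)
{
    float full    = 0.123456789f;              /* a typical fractional shader value */
    float partial = round_to_half(full);

    printf("fp32:  %.9f\n", full);
    printf("fp16:  %.9f\n", partial);
    printf("error: %.9f\n", full - partial);
    return 0;
}

An error on that order is invisible in some shaders but objectionable in others, which helps explain why, as noted above, the partial-precision workaround has to be evaluated shader by shader rather than switched on engine-wide.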

NVIDIA's Official Response

Of course, NVIDIA has their official PR response to these issues, which we've published below:

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half-Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are unaware of any issues with Rel. 50 and the drop of Half-Life 2 that we have. It is not a cheat or an over optimization. Our current drop of Half-Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half-Life 2.

We are committed to working with Gabe to fully understand his concerns.

Comments

  • Anonymous User - Friday, September 12, 2003 - link

    I think the insinuation is clear from that nVidia email posted and Gabe's comments. Valve believed nVidia was trying to "cheat" with their D50s by intentionally having fog disabled etc. Rather than toss around accusations, it was simpler for them to just require that the benchmarks at this point be run with released drivers and avoid the issue of currently bugged drivers with non-working features, whether the reason was accidental or intentional.

    Considering that the FXes fared poorly with 3DMark and again with HL2 - both using DX9 implementations, I think it might be fair to say that the FXes aren't going to do too much better in the future. Especially considering the way they reacted to 3DMark 03 - fighting the benchmark rather than releasing drivers to remedy the performance issue.

    I'd like to see how the FXes do running HL2 with pure DX8 rather than DX9 or a hybrid, as I think most people owning current nVidia cards are going to have to go that route to achieve the framerates desired.
  • Anonymous User - Friday, September 12, 2003 - link

I don't see how the minimum requirements set by Valve are going to play this game. 700MHz and a TNT2? The FX5200s could barely keep up.
  • Anonymous User - Friday, September 12, 2003 - link

#68: 33 fps * 1.73 = 57.09 fps (the 1 accounts for the initial 33 score).

    This doesn't quite work out based on the 57.3 score of the 9800 Pro so corrected score on the Nvidia was probably closer to this:
    57.3 / 1.73 = 33.12 fps

    #69: I would definitely try to find a 9600 Pro before I bought a 9500 Pro. The 9600 fully supports DX9 whereas the 9500 does not.
  • Anonymous User - Friday, September 12, 2003 - link

Guess It's time to upgrade...
    Now where's my &*&%%'n wallet!!


    Wonder where I'll be able to find a R9500Pro (Sapphire)
  • Anonymous User - Friday, September 12, 2003 - link

The performance increase between the FX5900 and Rad9800Pro is not 73%. Do the math correctly and it turns into a 36.5% lead. The article should be revised.
  • atlr - Friday, September 12, 2003 - link

    If anyone sees benchmarks for 1 GHz computers, please post a URL. Thanks.
  • WooDaddy - Friday, September 12, 2003 - link

Hmmm... I understand that Nvidia would be upset. But it's not like ATI is using a special setting to run faster. They're using DX9. Nvidia needs to get on the ball. I'm going to have to upgrade my video card since I have a now-obsolete Ti4200 GF4.

    GET IT TOGETHER NVIDIA!!! DON'T MAKE ME BUY ATI!

I might just sell my Nvidia stock while I'm at it. HL2 is a big mover and I believe it can make or break a card on the consumer side.
  • Anonymous User - Friday, September 12, 2003 - link

    I had just ordered a 5600 Ultra thinking it would be a great card. It's going back.

If I can get full DX9 performance with a 9600 Pro for around $180, and that card's performance is better than the 5900 Ultra's - then I'm game.

I bought a TNT when Nvidia was making a name for itself. I bought a GF2 GTS when Nvidia was destroying 3dfx - now Nvidia seems to have dropped the ball on DX9. I want to play HL2 on whatever card I buy. A 5600 Ultra doesn't seem like it will cut it. I know the 50's are out there, but I've seen the Aquamark comparison with the 45's and 50's and I'm not impressed.

    I really wanted to buy Nvidia, but I cannot afford it.

  • Anonymous User - Friday, September 12, 2003 - link

#62: I do have the money but I choose to spend it elsewhere. FYI: I spent $164 US on my 2.4C and I'm running at speeds faster than the system used for this benchmark.

    "The Dell PCs we used were configured with Pentium 4 3.0C processors on 875P based motherboards with 1GB of memory. We were running Windows XP without any special modifications to the OS or other changes to the system."

Anand was using a single system to show what HL2 performance would be on video cards available on the market today. If he were to run benchmarks on different CPUs, he would have had to spend far more time doing so. In the interest of getting the info out as soon as possible, he limited himself to a single system.

    I would deduce from the performance numbers of HL2 in Anand's benchmarks that unless you have a 9600 Pro/9800 Pro, your AMD will not be able to effectively run HL2.
  • Anonymous User - Friday, September 12, 2003 - link

    Woohoooo!!!
    My ATI 9500@9700 128MB with 8 pixel pipelines and 256-bit access beats the crap out of any FX.
    And it only cost me 190 euros/190 dollars.

    Back to the drawing board NVidia.
    Muahahahah!!!
