Improving Performance on NVIDIA

If the hypotheses mentioned on the previous page hold true, then there may be some ways around these performance issues. The most obvious is updated drivers. NVIDIA does have a new driver release on the horizon, the Detonator 50 series, but Valve instructed us not to use these drivers because they do not render fog in Half-Life 2. In fact, Valve was quite insistent that we use only publicly available drivers on publicly available hardware, which is one reason you won't see Half-Life 2 benchmarks in our upcoming Athlon 64 review.

Future drivers may be the key to higher performance on NVIDIA hardware, but Gabe issued the following warning:

"I guess I am encouraging skepticism about future driver performance."

Only time will tell whether updated drivers can close the performance gap, but as you are about to see, it is a sizable one.

One thing worth noting is that the shader-specific workarounds Valve implemented for NVIDIA will not automatically carry over to other games built on Half-Life 2's Source engine. The restructured shaders are specific to the shaders used in Half-Life 2, and a different game on the same engine won't necessarily use them.

Gabe also cautioned that reverting to 16-bit floating point values will only become more of an issue going forward, as "newer DX9 functionality will be able to use fewer and fewer partial precision functions." The theory, though, is that by the time this happens, NV4x will be upon us and will hopefully have fixed the problems we're seeing today.
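To make the precision trade-off concrete, here is a small, self-contained C++ sketch (ordinary CPU code, not shader code, and not anything from Valve or NVIDIA) that truncates a 32-bit float to a 16-bit half-precision encoding and back. A color-sized value barely changes, while a large texture coordinate loses its fractional part entirely, which is exactly the kind of value where partial precision starts to show.

```cpp
// Minimal illustration (assumption: IEEE-style binary16 with 1 sign, 5 exponent,
// 10 mantissa bits) of why dropping from 32-bit to 16-bit floating point is
// harmless for some shader values and visibly lossy for others.
// Denormals, infinities, and NaNs are ignored for brevity.
#include <cstdint>
#include <cstdio>
#include <cstring>

// Truncate a 32-bit float to a 16-bit encoding, then expand it back.
static float roundTripHalf(float value) {
    uint32_t bits;
    std::memcpy(&bits, &value, sizeof bits);

    uint32_t sign     = (bits >> 16) & 0x8000u;
    int32_t  exponent = static_cast<int32_t>((bits >> 23) & 0xFFu) - 127 + 15; // re-bias
    uint32_t mantissa = (bits >> 13) & 0x3FFu;   // keep only the top 10 mantissa bits

    if (exponent <= 0 || exponent >= 31)         // outside the normal half range
        return 0.0f;                             // clamp to zero; enough for this demo

    uint16_t half = static_cast<uint16_t>(sign | (static_cast<uint32_t>(exponent) << 10) | mantissa);

    uint32_t outBits = (static_cast<uint32_t>(half & 0x8000u) << 16)
                     | ((((half >> 10) & 0x1Fu) - 15 + 127) << 23)
                     | (static_cast<uint32_t>(half & 0x3FFu) << 13);
    float out;
    std::memcpy(&out, &outBits, sizeof out);
    return out;
}

int main() {
    float color    = 0.73f;     // a normalized color term survives essentially intact
    float texcoord = 2049.37f;  // a large coordinate loses its fractional part entirely
    std::printf("color    fp32=%.6f  fp16=%.6f\n", color,    roundTripHalf(color));
    std::printf("texcoord fp32=%.2f  fp16=%.2f\n", texcoord, roundTripHalf(texcoord));
    return 0;
}
```

In actual shader code, the same trade-off is expressed by declaring variables as half rather than float in HLSL/Cg, or via the partial-precision (_pp) modifier in PS 2.0 assembly; the math only goes wrong when a value needs more than 10 bits of mantissa.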

NVIDIA's Official Response

Of course, NVIDIA has an official PR response to these issues, which we've published below:

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half-Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are unaware of any issues with Rel. 50 and the drop of Half-Life 2 that we have. The drop of Half-Life 2 that we currently have is more than 2 weeks old. It is not a cheat or an over optimization. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half-Life 2.

We are committed to working with Gabe to fully understand his concerns.
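NVIDIA's point about different optimal code paths is essentially what Valve's NV3x mixed mode does in practice: the engine picks a rendering path per GPU family. As a purely hypothetical illustration (none of these names or types come from the Source engine), the selection logic might look something like this:

```cpp
// Hypothetical sketch of per-GPU render path selection; the enum values and
// the chooseRenderPath helper are illustrative, not Source engine code.
#include <cstdio>
#include <string>

enum class RenderPath {
    DX8,      // DirectX 8-class shaders (PS 1.1/1.4) for pre-DX9 hardware
    DX9Full,  // full-precision DirectX 9 (PS 2.0) path
    DX9Mixed  // DX9 with partial precision and hand-tuned shaders for NV3x
};

// A real engine would key off device caps and vendor/device IDs rather than
// a family name string; the string is used here only to keep the sketch short.
RenderPath chooseRenderPath(const std::string& gpuFamily, bool supportsPS20) {
    if (!supportsPS20)
        return RenderPath::DX8;        // GeForce4 MX, Radeon 8500, and older
    if (gpuFamily == "NV3x")
        return RenderPath::DX9Mixed;   // trade precision for speed on GeForce FX
    return RenderPath::DX9Full;        // Radeon 9500-9800 class hardware
}

int main() {
    std::printf("GeForce FX 5900 -> path %d\n",
                static_cast<int>(chooseRenderPath("NV3x", true)));   // DX9Mixed
    std::printf("Radeon 9800 Pro -> path %d\n",
                static_cast<int>(chooseRenderPath("R3xx", true)));   // DX9Full
    return 0;
}
```

The practical consequence is the one the article raises: whichever path a benchmark exercises, the result only describes that path, which is why the choice of drivers and defaults matters so much in the numbers that follow.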

Comments

  • Anonymous User - Friday, September 12, 2003 - link

    #61... I take it YOU have the money to shell out for top-of-the-line hardware? I sure as hell don't, but like #42 said, "more widely used comp."

    I myself am running a 1700+ at 2400+ speeds. No way in hell am I going to spend $930 (in CDN funds) on a 3.2C P4, and that's NOT including the mobo and RAM, and I'm not going to spend $700 CDN on a Barton 3200+ either. For the price of that P4 chip I can get a whole decent computer. It may not be able to run Half-Life 2 at its fullest, but still. I'm not even interested in HL2, it's just not the kind of game I play, but if I were, what I typed above would still be valid.

    Anand... RUN THESE HL2 BENCHES ON HARDWARE THE AVERAGE PERSON CAN AFFORD! Not the spoiled rich kid stuff.
  • Anonymous User - Friday, September 12, 2003 - link

    #42 "...should have benchmarked on a more widely used computer like a 2400 or 2500+ AMD...":

    The use of 'outdated' hardware such as your 2400 AMD would have increased the possibility of CPU limitations taking over the benchmark. Historically, video card benchmarks have used the fastest (or near-fastest) CPU available to ensure the GPU is able to operate in the best possible scenario. If you want to know how your 2400 will work with HL2, wait and buy it when it comes out.

    In reference to the 16/32 bit floating point shaders and how that applies to ATI's 24 bit shaders:

    It was my understanding that this quote was referencing the need for NVIDIA to use its 32-bit shaders, as future support for its 16-bit shaders would not exist. I don't see this quote pertaining to ATI's 24-bit shaders, as they meet the DX9 spec. The chance of future HL2-engine-based games leaving ATI users out in the cold is somewhere between slim and none. For an example of how software vendors react to leaving out support for a particular line of video cards, simply look at how much work Valve put into making NVIDIA's cards work. If it were feasible for a software vendor to leave out support for an entire line the way you're suggesting (ATI, in your inference), we would have had HL2 shipping by now (for ATI only, though...).
  • Anonymous User - Friday, September 12, 2003 - link

    58, http://myweb.cableone.net/jrose/Jeremy/HL2.jpg
  • Anonymous User - Friday, September 12, 2003 - link

    Are pixel shader operations anti-aliased on current-generation video cards? I ask because in the latest Half-Life 2 technology demo movie, anti-aliasing is enabled and everything looks smooth except for the specular highlights on the roof and in other areas, which are still full of shimmering. It just seems a little hard on the eyes.
  • Anonymous User - Friday, September 12, 2003 - link

    An observation:

    Brian Burke = the Iraqi Information Minister.

    I mean, this guy rode 3dfx into its dirt nap, and now he's providing the same great service to NVIDIA.

    Note to self: never buy anything from a company that has this guy spewing lies.
  • Anonymous User - Friday, September 12, 2003 - link

    OK, this article was great.

    For us freaks, can you do a supplemental article? Do 1600x1200 benchmarks!

    Things will probably crawl, but it would be nice to know the worst case at this resolution before ATI and NVIDIA come out with next-gen cards.

    Also, was any testing done to see if the benchmarks were CPU- or GPU-limited? Maybe use the CPU utilization monitor in Windows to see what the CPU thought. Maybe a 5.0 GHz processor down the road will solve some headaches. Doubtful, but maybe...
  • Anonymous User - Friday, September 12, 2003 - link

    What's really funny is that Maximum PC magazine built an $11,000 "Dream Machine" using a GeForce FX 5900, and I can build a machine for less than $2,000 that beats it using a 9800 Pro.

    Long live my 9500 Pro!
  • Anonymous User - Friday, September 12, 2003 - link

    I can play Frozen Throne, and I'm doing so on a GeForce2 MX, LOL (on a P2 @ 400MHz).
  • Anonymous User - Friday, September 12, 2003 - link

    Look at my #46 posting - I know it's different engines, different APIs, different driver revisions, etc.,
    but it's still interesting...

    enigma
  • Anonymous User - Friday, September 12, 2003 - link

    #52: Different engines, different results. HL2 is probably more shader-limited than Doom 3. The 9600 Pro has strong shader performance, which narrows the gap in shader-limited situations such as HL2.

    BTW, where did you get those Doom 3 results? The only Doom 3 benches I know about are based on the old alpha or that invalid test from back when the NV35 was launched...
