ATI & Valve - Defining the Relationship

The first thing that comes to mind when you see results like this is a cry of foul play; that Valve has unfairly optimized their game for ATI's hardware and thus, it does not perform well on NVIDIA's hardware. Although it is the simplest accusation, it is actually one of the less frequent accusations we've seen thrown around.

During Gabe Newell's presentation, he insisted that they [Valve] have not optimized or doctored the engine to produce these results. It also doesn't make much sense for Valve to develop an ATI-specific game, since the majority of the market out there has NVIDIA-based graphics cards, and it is in Valve's best interest to make the game run as well as possible on NVIDIA GPUs.

Gabe mentioned that the developers spent 5x as much time optimizing the special NV3x code path (mixed mode) as they did optimizing the generic DX9 path (what ATI's DX9 cards use). Thus, it is clear that a good attempt was made to get the game to run as well as possible on NVIDIA hardware.

To those who fault Valve for spending so much time and effort trying to optimize for the NV3x family, remember that they are in the business of selling games; with the market the way it is, purposefully crippling performance on one graphics manufacturer's hardware in favor of another's would not make much business sense.

Truthfully, we believe that Valve made an honest attempt to get the game running as well as possible on NV3x hardware but simply ran into other unavoidable issues (which we will get to shortly). One could attack the competence of Valve's developers, but we are not qualified to do so; anyone who has developed something comparable in complexity to Half-Life 2's Source engine may feel free to.

According to Gabe, these performance results were the reason that Valve aligned themselves more closely with ATI. As you probably know, Valve has a fairly large OEM deal with ATI that will bring Half-Life 2 as a bundled item with ATI graphics cards in the future. We'll be able to tell you more about the cards with which it will be bundled soon enough (has it been 6 months already?).

With these sorts of deals, there's always money (e.g. marketing dollars) involved, and we're not debating the existence of that in this deal, but as far as Valve's official line is concerned, the deal came after the performance discovery.

Once again, we're not questioning Valve in this sense and honestly don't see much reason to, as it wouldn't make any business sense for them to cripple Half-Life 2 on NVIDIA cards. As always, we encourage you to draw your own conclusions based on the data we've provided.

Moving on…


111 Comments

  • atlr - Friday, September 12, 2003 - link


    Quote from http://www.anandtech.com/video/showdoc.html?i=1863...
    "The Radeon 9600 Pro manages to come within 4% of NVIDIA's flagship, not bad for a ~$100 card."

    Anyone know where a ~$100 9600 Pro is sold? I thought this was a ~$200 card.
  • Anonymous User - Friday, September 12, 2003 - link

    Time to load up on ATI stock :)
  • Anonymous User - Friday, September 12, 2003 - link

    Quoted from Nvidia.com:

    "Microsoft® DirectX® 9.0 Optimizations and Support
    Ensures the best performance and application compatibility for all DirectX 9 applications."

    Oops, not this time around...
  • Anonymous User - Friday, September 12, 2003 - link

    #74 - No, D3 isn't a DX9 game, it's OpenGL. What it shows is that the FX series isn't bad - they just don't do so well under DX9. If you stick primarily to OpenGL games and run your DX games under the 8.1 spec, the FX should perform fine. It's the DX9 code that the FXes seem to really struggle with.
  • Anonymous User - Friday, September 12, 2003 - link

    #74: I have commonly heard this blamed on a bug in an older release of the CATALYST drivers that was used in the Doom3 benchmark. It is my understanding that if the benchmark were repeated with the 3.7 (RELEASED) drivers, the ATI card would perform much better.

    #75: I believe this goes back to prior instances where Nvidia has claimed that some new driver would increase performance dramatically to get it into a benchmark and then never released the driver for public use. If this happened, the benchmark would be unreliable, as it could not be repeated by an end-user with similar results.

    Also, the Det50 drivers from Nvidia do not have a working fog system. It has been hinted that this could be intentional to improve performance. Either way, I saw a benchmark today (forgot where) that compared the Det45's to the beta Det50's. The 50's did improve performance in 3DMark03, but nowhere near the 73% gap in performance seen in HL2.
  • Anonymous User - Friday, September 12, 2003 - link

    Because Gabe controls how representative the HL2 beta is of the final HL2 product, but he cannot control how representative the Nvidia Det50 beta is of the final Det50s.

    And besides that, there have been rumours of "optimisations" in the new Det50s.
  • Anonymous User - Friday, September 12, 2003 - link

    How is it that Gabe can recommend not running benchmarks on a publicly unavailable driver or hardware, yet the game itself is unavailable? Seems a little hypocritical....
  • Anonymous User - Friday, September 12, 2003 - link

    I didn't have time to look into this, but can someone enlighten me as to why the 5900 Ultra outperformed the 9800 PRO in the Doom 3 benchmarks we saw a while back... is that not using DX9 as well? If I am way off the mark here or am even wrong on which outperformed which, go easy on the flames!

    Thanks
  • Anonymous User - Friday, September 12, 2003 - link

    "Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though."

    My bad, should have looked at ATI first. I guess I'm thinking about the 8500. Either way, I would still go 9600 Pro, especially given that it is cheaper than a 9500 non-pro.
  • Anonymous User - Friday, September 12, 2003 - link

    "The 9600 fully supports DX9 whereas the 9500 does not."

    Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though.
