Final Words

When we first heard Gabe Newell speak, what came to mind was that this is the kind of excitement the 3D graphics industry hasn't seen in years. The days when we were waiting to break 40 fps in Quake are long gone; since then we've been left arguing over whose anisotropic filtering was correct. With Half-Life 2, we are seeing the "Dawn of DX9," as one speaker put it - and this is just the beginning.

The performance paradigm changes here: instead of being bound by memory bandwidth and able to produce triple-digit frame rates, we are entering a world of games where memory bandwidth isn't the bottleneck - where we are bound by raw GPU power. This is exactly the type of shift we saw in the CPU world a while ago, where memory bandwidth stopped being the defining performance characteristic and the architecture and computational power of the microprocessor had a much larger impact.

One of the benefits of moving away from memory bandwidth limited scenarios is that enhancements that traditionally ate up memory bandwidth will soon come at virtually no performance penalty. If your GPU is waiting on its ALUs to complete pixel shading operations, then the additional memory bandwidth consumed by something like anisotropic filtering will not negatively impact performance. Things are beginning to change, and they are beginning to do so in a very big way.
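The argument above can be illustrated with a simple roofline-style model. This is our own hypothetical sketch, not anything from Valve or the hardware vendors: frame time is dominated by whichever resource saturates first, so extra bandwidth cost only matters when bandwidth is already the bottleneck.

```python
def frame_time_ms(alu_ms: float, bandwidth_ms: float) -> float:
    """Toy model: the GPU overlaps memory traffic with shading work,
    so the slower of the two determines the frame time."""
    return max(alu_ms, bandwidth_ms)

# Bandwidth-bound era: extra texture traffic from anisotropic filtering
# lands directly on the critical path, so the frame gets slower.
# (All millisecond figures here are made up for illustration.)
base_old = frame_time_ms(alu_ms=5.0, bandwidth_ms=12.0)    # 12.0 ms
aniso_old = frame_time_ms(alu_ms=5.0, bandwidth_ms=15.0)   # 15.0 ms - slower

# Shader-bound era: long pixel shaders make the ALUs the bottleneck,
# so the same extra bandwidth cost hides behind the shading work.
base_new = frame_time_ms(alu_ms=20.0, bandwidth_ms=12.0)   # 20.0 ms
aniso_new = frame_time_ms(alu_ms=20.0, bandwidth_ms=15.0)  # 20.0 ms - free
```

In this model, turning on anisotropic filtering costs 25% in the bandwidth-bound case and nothing in the shader-bound case, which is the shift the article describes.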

In terms of the performance of the cards you've seen here today, the standings shouldn't change by the time Half-Life 2 ships - although NVIDIA will undoubtedly have newer drivers to improve performance. Over the coming weeks we'll be digging even further into the NVIDIA performance mystery to see if our theories are correct; if they are, we may have to wait until NV4x before these issues get sorted out.

For now, Half-Life 2 seems to be best paired with ATI hardware and as you've seen through our benchmarks, whether you have a Radeon 9600 Pro or a Radeon 9800 Pro you'll be running just fine. Things are finally heating up and it's a good feeling to have back...


111 Comments


  • atlr - Friday, September 12, 2003 - link


    Quote from http://www.anandtech.com/video/showdoc.html?i=1863...
    "The Radeon 9600 Pro manages to come within 4% of NVIDIA's flagship, not bad for a ~$100 card."

    Anyone know where a ~$100 9600 Pro is sold? I thought this was a ~$200 card.
  • Anonymous User - Friday, September 12, 2003 - link

    Time to load up on ATI stock :)
  • Anonymous User - Friday, September 12, 2003 - link

    Quoted from Nvidia.com:

    "Microsoft® DirectX® 9.0 Optimizations and Support
    Ensures the best performance and application compatibility for all DirectX 9 applications."

    Oops, not this time around...
  • Anonymous User - Friday, September 12, 2003 - link

    #74 - No, D3 isn't a DX9 game, it's OpenGL. What it shows is that the FX series isn't bad - they just don't do so well under DX9. If you stick primarily to OpenGL games and run your DX games under the 8.1 spec, the FX should perform fine. It's the DX9 code that the FXes seem to really struggle with.
  • Anonymous User - Friday, September 12, 2003 - link

    #74: I have commonly heard this blamed on a bug in an older release of the CATALYST drivers that were used in the Doom 3 benchmark. It is my understanding that if the benchmark were repeated with the 3.7 (RELEASED) drivers, the ATI card would perform much better.

    #75: I believe this goes back to prior instances where Nvidia has claimed that some new driver would increase performance dramatically to get it into a benchmark, and then never released the driver for public use. If this happened, the benchmark would be unreliable, as it could not be repeated by an end user with similar results.

    Also, the Det50 drivers from Nvidia do not have a working fog system. It has been hinted that this could be intentional to improve performance. Either way, I saw a benchmark today (forgot where) that compared the Det45s to the beta Det50s. The 50s did improve performance in 3DMark03, but nowhere near the 73% gap in performance seen in HL2.
  • Anonymous User - Friday, September 12, 2003 - link

    Because Gabe controls how representative the HL2 beta is of the final HL2 product, but he cannot control how representative the Nvidia Det50 beta is of the final Det50s.

    And besides that, there have been rumours of "optimisations" in the new Det50s.
  • Anonymous User - Friday, September 12, 2003 - link

    How is it that Gabe can recommend not running benchmarks on a publicly unavailable driver or hardware, yet the game itself is unavailable? Seems a little hypocritical....
  • Anonymous User - Friday, September 12, 2003 - link

    I didn't have time to look into this, but can someone enlighten me as to why the 5900 Ultra outperformed the 9800 PRO in the Doom 3 benchmarks we saw a while back... is that not using DX9 as well? If I am way off the mark here or am even wrong on which outperformed which, go easy on the flames!

    Thanks
  • Anonymous User - Friday, September 12, 2003 - link

    "Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though."

    My bad, should have looked at ATI first. I guess I'm thinking about the 8500. Either way, I would still go 9600 Pro, especially given that it is cheaper than a 9500 non-pro.
  • Anonymous User - Friday, September 12, 2003 - link

    "The 9600 fully supports DX9 whereas the 9500 does not."

    Not true, the 9500 is a true DX9 part. The 9600 does have faster shader units though.
