Just over a year ago, we ran our very first set of benchmarks using Valve's Half Life 2, and while we thought at the time that the game would clearly be an ATI-dominated title, we weren't counting on it taking a full year to actually come to market. 

With Half Life 2 finally out, we go back and look at how a year's worth of driver releases and GPU launches have changed the playing field in this multipart article on Half Life 2 GPU performance. 

The first part of our Half Life 2 GPU series focuses on the performance of today's latest and greatest GPUs, but we will have follow-up articles for owners of older hardware as well.  There's quite a bit to talk about here, so let's just get right to it.

The Love Triangle: ATI, NVIDIA and Valve

It's no big surprise that ATI and Valve have been working very closely with one another on the development of Half Life 2.

Valve would not let either ATI or NVIDIA have a copy of the final Half Life 2 game, in order to prevent any sort of leak from happening.  Judging by the fact that Half Life 2 is the only game in recent history not to be leaked before its official street date (we don't consider being able to purchase an unlockable copy to be "leaked"), Valve's policy of not letting anyone have the game worked quite well.

ATI and NVIDIA both spent a bit of time at Valve benchmarking and play testing Half Life 2 over the past couple of weeks.  From what ATI tells us, they spent a full week with Half Life 2, while NVIDIA informed us that they spent two long days at Valve.  There's an immediate discrepancy in the amount of time the two companies had to benchmark and toy around with Half Life 2, but then again, ATI is paying the bills and NVIDIA isn't, so you can expect a bit of preferential treatment to be at play.  To be fair, NVIDIA told us that their limited time at Valve wasn't solely dictated by Valve: Valve extended an invitation to NVIDIA, and things just ended up working out so that NVIDIA only had two (albeit long) days with the final version of the game. 

ATI managed to run quite a few benchmarks and even created some of their own demos of Half Life 2, all of which showed ATI hardware outperforming NVIDIA hardware.  While ATI did share the demos with us for use in this article, we elected not to use them in favor of developing our own benchmarks, in order to be as fair to both sides as possible.  We extended the offer to NVIDIA to provide their own demos for us to look at as well, to which NVIDIA responded that they were not allowed to record any timedemos.  Their testbed hard drives were kept at Valve and will be returned to NVIDIA sometime after the launch of the game.  NVIDIA mentioned that their main focus at Valve was to perform QA testing to ensure that the game was playable and that there were no significant issues that needed to be addressed. 
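For those curious about how timedemos like the ones we developed are made, recording and playing one back in Half Life 2 is done from the Source engine's developer console (enabled with the -console launch option).  A minimal sketch follows; the demo name at_demo1 is just a placeholder of our own, not the name of any demo we actually used:

```
record at_demo1    // begin recording your gameplay to at_demo1.dem
stop               // stop recording
timedemo at_demo1  // replay the demo as fast as possible and report an average framerate
```

Because the timedemo playback ignores normal frame pacing, the reported average framerate reflects how quickly the GPU and CPU can chew through identical workloads, which is what makes run-to-run comparisons between cards meaningful.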

Both ATI and NVIDIA have supplied us with beta drivers for use in our Half Life 2 testing.  ATI's driver is the publicly available Catalyst 4.12 beta, which specifically targets improved performance under Half Life 2.  NVIDIA's driver is the most recent internal build of their ForceWare drivers (version 67.02).  There are three main improvements in this version of NVIDIA's drivers:

1) Some game-specific texture shimmering issues have been fixed

2) The Flatout demo no longer crashes

3) Some shader compiler fixes that should improve shader performance

As you can see, only #3 could potentially apply to Half Life 2, but NVIDIA indicated that the fixes were not Half Life 2 specific, although they could yield a positive performance gain. 

ATI seemed quite confident in their Half Life 2 performance in our conversations with them before the game's launch, while the atmosphere at NVIDIA was considerably more cautious and, at times, downright worried.  From our talks with NVIDIA, we got the distinct impression that we had more information about Half Life 2 (courtesy of ATI) than they did; in fact, they were unable to provide us with any insight into how their hardware would handle Half Life 2, other than that it would be playable and seemed to run surprisingly well on even older GeForce2 cards. 

We're here today to find out for ourselves where things stand between ATI and NVIDIA when it comes to Half Life 2 performance.  We've spent every hour since Half Life 2's official online launch play testing, benchmarking and investigating image quality in the game in order to bring you the first of a series of articles on Half Life 2. 

Today's article will focus on the performance of today's most popular DirectX 9 GPUs; our follow-up articles will delve into the performance of older DirectX 7 and DirectX 8 class GPUs.  For those of you on the verge of upgrading your systems today, however, this article will give you the best recommendations based on your price range. 

Benchmarking Half Life 2


Comments

  • Anand Lal Shimpi - Wednesday, November 17, 2004 - link

    Thanks for all of the comments guys. Just so you know, I started on Part 2 the minute the first article was done. I'm hoping to be done with testing by sometime tomorrow and then I've just got to write the article. Here's a list of the new cards being tested:

    9600XT, 9550, 9700, X300, GF 6200, GF 5900XT, GF4 Ti 4600, GF4 MX440

    I'm doing both DX9 and DX8 comparisons, including image quality.

    After Part 2 I think I'll go ahead and do the CPU comparison, although I've been thinking about doing a more investigative type of article into Half Life 2 performance in trying to figure out where its performance limitations exist, so things may get shuffled around a bit.

    We used the PCI Express 6600GT for our tests, but the AGP version should perform quite similarly.

    The one issue I'm struggling with right now is the fact that the X700 XT is still not available in retail, while the X700 Pro (256MB) is. If I have the time I may go back and run some X700 Pro numbers to make this a more realistic present-day comparison.

    Any other requests?

    Take care,
  • Cybercat - Wednesday, November 17, 2004 - link

    You guys made my day comparing the X700XT, 6800, and 6600GT together. One question though (and I apologize if this was mentioned in the article and I missed it), did you guys use the PCIe or AGP version of the 6600GT?
  • Houdani - Wednesday, November 17, 2004 - link

    18: Many users rely on hardware review sites to get a feel for what technology is worth upgrading and when.

    Most of us have financial constraints which preclude us from upgrading to the best hardware, therefore we are more interested in knowing how the mainstream hardware performs.

    You are correct that it would not be an efficient use of resources to have AT repeat the tests on hardware that is two or three generations old ... but sampling the previous generation seems appropriate. Fortunately, that's where part 2 will come in handy.

    I expect that part 2 will be sufficient in showing whether or not the previous generation's hardware will be a bottleneck. The results will be invaluable for helping me establish my minimum level of satisfaction for today's applications.
  • stelleg151 - Wednesday, November 17, 2004 - link

    forget what i said in 34.....
  • pio!pio! - Wednesday, November 17, 2004 - link

    So how do you softmod a 6800NU to a 6800GT???
    or unlock the extra stuff....
  • stelleg151 - Wednesday, November 17, 2004 - link

    What drivers were being used here, 4.12 + 67.02??
  • Akira1224 - Wednesday, November 17, 2004 - link


    lol I should have seen that one coming!
  • nastyemu25 - Wednesday, November 17, 2004 - link

    i bought a 9600XT because it came boxed with a free coupon for HL2. and now i can't even see how it matches up :(
  • coldpower27 - Wednesday, November 17, 2004 - link

    These benchmarks are more in line with what I was predicting: the X800 Pro should be equal to the 6800 GT due to similar pixel shader fillrate, while the X800 XT should have an advantage at higher resolutions due to its higher fillrate from being clocked higher.

    Unlike DriverATIheaven:P.

    This is great I am happy knowing Nvidia's current generation of hardware is very competitive in performance in all aspects when at equal amounts of fillrate.
  • Da3dalus - Wednesday, November 17, 2004 - link

    In the 67.02 Forceware driver there's a new option called "Negative LOD bias", if I understand what I've read correctly it's supposed to reduce shimmering.

    What was that option set to in the tests? And how did it affect performance, image quality and shimmering?
