Just over a year ago we ran our very first set of benchmarks using Valve's Half Life 2, and while we thought at the time that the game would clearly be an ATI-dominated title, we weren't counting on it taking a full year to actually come to market. 

With Half Life 2 finally out, we look back at how a year's worth of driver releases and GPU launches has changed the playing field in this multi-part article on Half Life 2 GPU performance. 

The first part of our Half Life 2 GPU series focuses on the performance of today's latest and greatest GPUs, but we will have follow-up articles for owners of older hardware as well.  There's quite a bit to talk about here, so let's get right to it.

The Love Triangle: ATI, NVIDIA and Valve

It's no big surprise that ATI and Valve have been working very closely with one another on the development of Half Life 2.

Valve would not let either ATI or NVIDIA have a copy of the final Half Life 2 game in order to prevent any sort of leak from happening.  Judging by the fact that Half Life 2 is the only game in recent history not to be leaked before its official street date (we don't consider being able to purchase an unlockable copy to be "leaked"), Valve's policy of not letting anyone have the game worked quite well.

ATI and NVIDIA both spent some time at Valve benchmarking and play testing Half Life 2 over the past couple of weeks.  From what ATI tells us, they spent a full week with Half Life 2, while NVIDIA informed us that they spent two long days at Valve.  Immediately there's a discrepancy in the amount of time the two companies had to benchmark and toy around with Half Life 2, but then again, ATI is paying the bills and NVIDIA isn't, so you can expect a bit of preferential treatment to be at play.  To be fair, NVIDIA told us that their limited time at Valve wasn't solely dictated by Valve: Valve extended an invitation to NVIDIA, and things simply worked out so that NVIDIA had only two (albeit long) days with the final version of the game. 

ATI managed to run quite a few benchmarks and even created some of their own demos of Half Life 2, all of which showed ATI hardware outperforming NVIDIA hardware.  While ATI did share the demos with us for use in this article, we elected not to use them in favor of developing our own benchmarks in order to be as fair to both sides as possible.  We did extend the offer to NVIDIA to provide their own demos for us to look at as well, to which NVIDIA responded that they were not allowed to record any timedemos.  Their testbed hard drives were kept at Valve and will be returned to them sometime after the launch of the game.  NVIDIA mentioned that their main focus at Valve was to perform QA testing to ensure that the game was playable and that there were no significant issues that needed to be addressed. 

Both ATI and NVIDIA have supplied us with beta drivers for use in our Half Life 2 testing.  ATI's driver is the publicly available Catalyst 4.12 beta, which is specifically targeted to improve performance under Half Life 2.  NVIDIA's driver is the most recent internal build of their ForceWare drivers (version 67.02).  There are three main improvements in this version of NVIDIA's drivers:

1) Some game-specific texture shimmering issues have been fixed

2) The Flatout demo no longer crashes

3) Some shader compiler fixes that should improve shader performance

As you can see, only #3 could potentially apply to Half Life 2, but NVIDIA indicated that the fixes were not Half Life 2 specific, although they could yield a positive performance gain. 

ATI seemed quite confident in their performance under Half Life 2 from our conversations with them before the game's launch, while the atmosphere at NVIDIA was considerably more cautious and, at times, downright worried.  From our talks with NVIDIA, we got the distinct impression that we had more information about Half Life 2 (courtesy of ATI) than they did; in fact, they could not provide us with any insight into how their hardware would handle Half Life 2, other than that the game would be playable and seems to run surprisingly well even on older GeForce2 cards. 

We're here today to find out for ourselves where things stand between ATI and NVIDIA when it comes to Half Life 2 performance.  We've spent every hour since Half Life 2's official online launch play testing, benchmarking and investigating image quality in the game in order to bring you the first of a series of articles on Half Life 2. 

Today's article will focus on the performance of the most popular DirectX 9 GPUs; our following articles will delve into the performance of older DirectX 7 and DirectX 8 class GPUs.  For those users on the verge of upgrading their systems today, this article will give you the best recommendations based on your price range. 

Benchmarking Half Life 2
79 Comments

  • ballero - Wednesday, November 17, 2004 - link

It'd be nice to see a comparison between CPUs.
  • Jalf - Wednesday, November 17, 2004 - link

    To those wanting benchmarks on older hardware, remember that this is a hardware site, not a games review site.

Their focus is on the hardware, and honestly, few hardware enthusiasts can get excited about an 800 MHz CPU or a GeForce 3. ;)

    For AT, HL2 is a tool to compare new *interesting* hardware. It's not the other way around.
  • CU - Wednesday, November 17, 2004 - link

I would also like to see slower CPUs and 512 MB systems tested. It seems all recent cards can run it fine, so it would be nice to see how other things affect HL2.
  • CU - Wednesday, November 17, 2004 - link

Based on the 6800NU vs. 6600GT, I would say that HL2 is being limited by fillrate and not bandwidth. I say this since they both have about the same fillrate, but the 6800NU has around 40% more bandwidth than the 6600GT. So, unlocking extra pipes and overclocking the GPU should give the biggest increase in fps. Anyone want to test this?
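    The back-of-the-envelope numbers behind this comment can be sketched quickly. The clocks and bus widths below are the commonly cited reference specs for the two cards, assumed here for illustration:

    ```python
    # Rough theoretical fillrate/bandwidth comparison, GeForce 6800 (NU) vs. 6600GT.
    # Assumed reference specs: 6800NU = 12 pixel pipes @ 325 MHz core, 256-bit bus
    # @ 700 MHz effective DDR; 6600GT = 8 pipes @ 500 MHz core, 128-bit bus @ 1000 MHz.

    def fillrate_mpix(pipes, core_mhz):
        """Theoretical pixel fillrate in megapixels per second."""
        return pipes * core_mhz

    def bandwidth_gbs(bus_bits, mem_mhz_effective):
        """Theoretical memory bandwidth in GB/s (bus width in bits, effective clock in MHz)."""
        return bus_bits / 8 * mem_mhz_effective * 1e6 / 1e9

    cards = {
        "6800NU": {"pipes": 12, "core": 325, "bus": 256, "mem": 700},
        "6600GT": {"pipes": 8,  "core": 500, "bus": 128, "mem": 1000},
    }

    for name, c in cards.items():
        print(name,
              fillrate_mpix(c["pipes"], c["core"]), "Mpix/s,",
              round(bandwidth_gbs(c["bus"], c["mem"]), 1), "GB/s")
    # 6800NU: 3900 Mpix/s, 22.4 GB/s
    # 6600GT: 4000 Mpix/s, 16.0 GB/s
    ```

    With those assumed specs, the fillrates come out within a few percent of each other while the 6800NU holds a 22.4/16.0 ≈ 40% bandwidth advantage, which is the ratio the comment is referring to.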
  • Jeff7181 - Wednesday, November 17, 2004 - link

    ... in addition... this is a case where minimum frame rates would be very useful to know.
  • Jeff7181 - Wednesday, November 17, 2004 - link

    Those numbers are about what I expected. I'm sorta thinking that triple buffering isn't working with the 66.93 drivers and HL2, because with vsync enabled it seems like the frame rate is either 85 or 42.

    I also suspected that anisotropic filtering wasn't particularly necessary... I'll have to try it without and see how it looks... although with 4X AA and 8X AF I'm still getting acceptable frame rates.
  • nserra - Wednesday, November 17, 2004 - link

    #8 I never heard of the 6800's extra pipes being unlocked; where did you see that? Aren't you confusing it with the ATI 9500 cards?
  • MAME - Wednesday, November 17, 2004 - link

    Make some budget video card benchmarks (Ti4200, give or take) and possibly a slower CPU or less RAM, so that people will know if they have to upgrade.
  • Akira1224 - Wednesday, November 17, 2004 - link

    #8 That's not a fair comparison. Yes, at the moment it would seem the 6800NU is a better buy. However, if you go to Gameve you will find the XFX 6600GT (clocked at PCIe speeds) for $218. That's a much better deal than your example using Newegg. You talk about a $5 difference... if you are a smart shopper you can get upwards of a $50 difference.

    THAT makes the 6600GT the better buy. Especially when you consider that the market this card is aimed at is not the same market that will softmod their cards to unlock pipes. Either way you go, you will get great performance.

    I digress off topic... sorry.
  • nserra - Wednesday, November 17, 2004 - link

    You didn't use overclocked NVIDIA cards like HardOCP did. That Kyle has the shame to say he used stock clocks; those BFG OC cards are overclocked from the factory. Just 25 MHz, but it's something.

    Very good review!!! Better than the NVIDIA GeForce 6600GT AGP review, where something was missing.
