Just over one year ago we were able to run our very first set of benchmarks using Valve's Half Life 2, and while we thought at the time that the game was clearly going to be an ATI-dominated title, we weren't counting on it taking a full year to actually come to market. 

With Half Life 2 finally out, we go back and look at how a year's worth of driver releases and GPU launches have changed the playing field in this multipart article on Half Life 2 GPU performance. 

The first part of our Half Life 2 GPU series focuses on the performance of today's latest and greatest GPUs, but we will have follow-up articles for owners of older hardware as well.  There's quite a bit to talk about here, so let's get right to it.

The Love Triangle: ATI, NVIDIA and Valve

It's no big surprise that ATI and Valve have been working very closely with one another on the development of Half Life 2.

Valve would not let either ATI or NVIDIA have a copy of the final Half Life 2 game, in order to prevent any sort of leak from happening.  Judging by the fact that Half Life 2 is the only game in recent history not to be leaked before its official street date (we don't consider being able to purchase an unlockable copy to be "leaked"), Valve's policy of not letting anyone have the game worked quite well.

ATI and NVIDIA both spent a little bit of time at Valve benchmarking and play testing Half Life 2 over the past couple of weeks.  From what ATI tells us, they spent a full week with Half Life 2, while NVIDIA informed us that they spent two long days at Valve.  Immediately there's a discrepancy in the amount of time the two companies had to benchmark and toy around with Half Life 2, but then again, ATI is paying the bills and NVIDIA isn't, so you can expect a bit of preferential treatment to be at play.  To be fair, NVIDIA did tell us that their limited time at Valve wasn't solely dictated by Valve: Valve extended an invitation to NVIDIA, and things just ended up working out so that NVIDIA only had two (albeit long) days with the final version of the game. 

ATI managed to run quite a few benchmarks and even created some of their own demos of Half Life 2, all of which showed ATI hardware outperforming NVIDIA hardware.  While ATI did share those demos with us for use in this article, we elected not to use them in favor of developing our own benchmarks, in order to be as fair to both sides as possible.  We extended the same offer to NVIDIA to provide demos of their own for us to look at, to which NVIDIA responded that they were not allowed to record any timedemos; their testbed hard drives were kept at Valve and will be returned to them sometime after the launch of the game.  NVIDIA mentioned that their main focus at Valve was QA testing, to ensure that the game was playable and that there were no significant issues that needed to be addressed. 
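
For those curious about the mechanics, custom Half Life 2 benchmarks like the ones used here are built on Source's demo recording facilities, driven from the developer console.  A typical session looks something like the listing below; the demo name is our own placeholder, and the exact output format may vary:

    record anandbench      // start recording gameplay to anandbench.dem
    stop                   // end the recording
    timedemo anandbench    // replay as fast as the hardware allows and report an average framerate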

Both ATI and NVIDIA have supplied us with beta drivers for use in our Half Life 2 testing.  ATI's driver is the publicly available Catalyst 4.12 beta, which is specifically targeted at improving performance under Half Life 2.  NVIDIA's driver is the most recent internal build of their ForceWare drivers (version 67.02).  There are three main improvements in this version of NVIDIA's drivers, and they are as follows:

1) Some game-specific texture shimmering issues have been fixed

2) The Flatout demo no longer crashes

3) Some shader compiler fixes have been made that should improve shader performance

As you can see, only #3 could potentially apply to Half Life 2, but NVIDIA indicated that the fixes were not Half Life 2 specific, although they could yield a positive performance gain. 

ATI seemed quite confident in their performance under Half Life 2 in our conversations with them before the game's launch, while the atmosphere at NVIDIA was considerably more cautious and, oftentimes, downright worried.  From our talks with NVIDIA we got the distinct impression that we had more information about Half Life 2 (courtesy of ATI) than they did; in fact, they were not able to provide us with any insight into how their hardware would handle Half Life 2, other than that it would be playable and that it seemed to run surprisingly well even on older GeForce2 cards. 

We're here today to find out for ourselves where things stand between ATI and NVIDIA when it comes to Half Life 2 performance.  We've spent every hour since Half Life 2's official online launch play testing, benchmarking and investigating image quality in the game in order to bring you the first of a series of articles on Half Life 2. 

Today's article will focus on the performance of today's most popular DirectX 9 GPUs; our following articles will delve into the performance of older DirectX 7 and DirectX 8 class GPUs.  For those users on the verge of upgrading their systems today, however, this article will give you the best recommendations based on your price range. 


79 Comments


  • nthexwn - Wednesday, November 17, 2004 - link

    In reply to Jeff7181 (#14):

    I have a Radeon 9700 pro with the 4.11 drivers and I'm having the same problems with my LCD (Samsung Syncmaster 710T @ 1280x1024)! Refresh rate is set to 70hz and with vsync I either get 35 (Interleaving frames to every other) or 70 fps (Matching frames to refresh rate)... Since our cards are from different companies I'm guessing it's a problem with the game itself...

    I've tried both triple buffering and altering the DVI frequency (don't know if that would even help) and it doesn't solve the problem...

    It's rather irritating because I actually PLAY my games instead of just gawking over the benchmark scores (I'm one of those lucky people that has some free time!), and the screen looks like a Freddy Krueger job without vsync on! :*(

    Also, when the game switches between 70 and 35 there is a bit of a stall which, even though 35fps is still playable, can ruin online play in CS:S! Especially since player models running onto the screen tend to temporarily stress the card enough to make it hitch up on me, in which time said player usually caps me in the head and moves on! :*(

    I suppose we could type "fps_max 35" or "fps_max 42.5" in the console (don't use the quotes), assuming it accepts floating point values; you could also just set your monitor to 80Hz and set fps_max to 40. But limiting the framerate to those values isn't what I'd call an ideal solution...

    Oh well...

    GREAT GAME! GREAT HARDWARE! GREAT WEBSITE!
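
The 70/35fps split nthexwn describes above is the textbook behavior of double-buffered vsync: a finished frame has to wait for the next vertical blank, so the effective framerate is always the refresh rate divided by a whole number. The short Python sketch below illustrates the arithmetic; the function name and frame times are our own, purely for illustration:

    import math

    def vsync_fps(render_ms, refresh_hz):
        # Double-buffered vsync rounds each frame's time UP to a whole
        # number of refresh intervals, so a 70Hz display can only show
        # 70, 35, 23.3... frames per second.
        interval_ms = 1000.0 / refresh_hz
        intervals = math.ceil(render_ms / interval_ms)
        return refresh_hz / intervals

    # A 12ms frame fits inside one 14.3ms refresh interval -> 70fps;
    # a 15ms frame just misses the blank and the rate halves -> 35fps.
    print(vsync_fps(12.0, 70.0))  # 70.0
    print(vsync_fps(15.0, 70.0))  # 35.0

This is also why capping fps_max at half the refresh rate can smooth out the stalls the comment mentions: the game simply stops oscillating between the two divisors.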
  • smn198 - Wednesday, November 17, 2004 - link

    I've got a 9800SE 128MB (256-bit) card and would like to know how that compares. I fried my 9500 Pro making it into a 9700 Pro, so it won't do 3D any more (artifacts, then crashes) :(

    Which of the graphics cards being tested would have performance similar to a 9800SE (256-bit RAM)?
  • ElMoIsEviL - Wednesday, November 17, 2004 - link

    "The one issue I'm struggling with right now is the fact that the X700 XT is still not available in retail, while the X700 Pro (256MB) is. If I have the time I may go back and run some X700 Pro numbers to make this a more realistic present-day comparison."

    I should post you a picture.. the X700 XT is available at Future Shop in Canada and has been for about a week now.. :)

    Although not my cup of tea, they are selling quite well, I'm told.

    But then again, ATi cards always sell well in Canada.. so well that ATi usually cannot fill the demand (with the USA taking so many of the chips lol).
  • ElMoIsEviL - Wednesday, November 17, 2004 - link

    Well... for one thing, the numbers are not even close to what other sites are showing, and secondly, where's the X800 XT PE?

    It's the card I own (albeit mine is clocked at 650/625).

    It's good to see ATi in the lead by such significant margins, and to see that the game can be easily played at 1600x1200 with 4xAA and 8xAF on an X800 XT PE. Also great to see that the game runs well without the final HL2 drivers from ATi (yeah, the 4.12s are only interim; the full 4.12s are going to be fully optimised).

    The biggest surprise is how well the 6600GT performed; although it lost convincingly to the X700 XT, it still put on a good showing.

    BTW, other sites are showing the X800 Pro beating the 6800 Ultra with the same drivers, albeit using an Athlon FX-55.

    Meh,

    Looks like ATi can probably offer even greater performance at lower resolutions, judging by how close the 1600x1200 results are to those at the lower resolutions.
  • SMT - Wednesday, November 17, 2004 - link

    Anand,

    My flashlight worked throughout Nova Prospekt. Are you sure yours wasn't available?
  • abravo01 - Wednesday, November 17, 2004 - link

    Was the 6800GT used in the test the 128 or the 256MB version? Huge price difference around here: if it was the 128MB, then it's definitely the best buy.
  • Anand Lal Shimpi - Wednesday, November 17, 2004 - link

    The AA benchmarks actually used 8X Aniso as well.

    Take care,
    Anand
  • OriginalReaper - Wednesday, November 17, 2004 - link

    On pages 8 and 9 you discuss AA and AF, yet on pages 10, 11, 12, and 13 you only list 4xAA being used. Did you forget to note the 8xAF in the results, or did the benchmark only use 4xAA?

    Thanks.
  • CU - Wednesday, November 17, 2004 - link

    I think an investigative article that shows when what hardware becomes a bottleneck for HL2 would be great. I look forward to it.

    "Any other requests?

    Take care,
    Anand"

    Can you send me all the hardware when you are done testing HL2? :-)
  • Cybercat - Wednesday, November 17, 2004 - link

    Nice, I wanted to know how the 9550 performed, mostly to see how it compares with the FX 5200. Is that the 128-bit or the 64-bit memory interface version? I'm pretty excited about the 6200 as well, since it's finally a budget card that performs better than the Ti4200. The performance leap this generation is spectacular.

    Overall, I think you left the other guys in the dust with this one.

    And on the subject of the X700 Pro: it's kind of an odd card because, at its price (the 128MB version at about $180, the 256MB at $200), it's unclear what card it's competing with. It acts like a fifth wheel in this way. People would much rather buy an X700 XT or a 6600GT instead, since they're in the same general price range. The only thing is, like you said, the X700 XT isn't widely available yet, making the X700 Pro a stopgap for now and giving NVIDIA the clear win in the mainstream market until ATI can start shipping the more competitive card. That's the only thing saving the X700 Pro right now from being completely pointless.
