Half-Life 2 Performance

Unfortunately, we were unable to test the Intel platform under Half-Life 2. We aren't quite sure how to explain the issue that we were seeing, but when trying to run the game, the screen would flash between each frame, and there were other visual issues as well. Because of these problems, performance was not comparable to our other systems. Short of a hard crash, this was about the worst kind of problem we could have seen. It is possible that NVIDIA fixed this issue on Intel systems somewhere between the 66.81 and 71.20 drivers, but we are unable to test any other driver at this time.

We are also not including the 6800 Ultra scores, as the numbers that we were using as a reference (from our article on NVIDIA's official SLI launch) were run using the older version 6 of Half-Life 2 as well as older (different) versions of our timedemos.

Continuing our trend with Half-Life 2 graphics card tests, we've benchmarked the game in 5 different levels at two resolutions, with and without 4xAA/8xAF enabled. We've listed the raw results in the tables below for those who are interested. For easier analysis, we've taken the average performance across the 5 level tests for each card and compared the results in the graphs below.

Half-Life 2 1280x1024 noAA/AF (fps)
Card                         | at_c17_12 | at_canals_08 | at_coast_05 | at_coast_12 | at_prison_05
NVIDIA GeForce 6600 GT       |      76.1 |         71.3 |         115 |        94.9 |           80
2x NVIDIA 6600 GT (AMD SLI)  |      77.2 |         98.8 |       118.5 |       117.5 |        116.6
Gigabyte 2x6600GT 3D1        |      77.3 |         99.1 |       118.7 |       117.9 |        118.2

Half-Life 2 1600x1200 noAA/AF (fps)
Card                         | at_c17_12 | at_canals_08 | at_coast_05 | at_coast_12 | at_prison_05
NVIDIA GeForce 6600 GT       |      61.1 |         55.7 |        91.5 |        69.3 |         57.6
2x NVIDIA 6600 GT (AMD SLI)  |      73.5 |         85.8 |       110.8 |       104.9 |         92.9
Gigabyte 2x6600GT 3D1        |      73.6 |           87 |       111.4 |         106 |         94.6

Half-Life 2 1280x1024 4xAA/8xAF (fps)
Card                         | at_c17_12 | at_canals_08 | at_coast_05 | at_coast_12 | at_prison_05
NVIDIA GeForce 6600 GT       |      40.5 |         40.1 |        74.8 |        54.9 |         45.1
2x NVIDIA 6600 GT (AMD SLI)  |      45.2 |         47.8 |        92.4 |          81 |           61
Gigabyte 2x6600GT 3D1        |      45.7 |         47.8 |        94.1 |        82.8 |         62.3

Half-Life 2 1600x1200 4xAA/8xAF (fps)
Card                         | at_c17_12 | at_canals_08 | at_coast_05 | at_coast_12 | at_prison_05
NVIDIA GeForce 6600 GT       |      27.3 |         27.2 |        43.3 |        32.7 |         28.1
2x NVIDIA 6600 GT (AMD SLI)  |      33.8 |         35.3 |        58.1 |        49.2 |         39.8
Gigabyte 2x6600GT 3D1        |      33.8 |         35.3 |        59.3 |        50.8 |         40.6
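
For those curious how the averages in the graphs below are derived, here is a minimal Python sketch; it simply takes the mean of the five timedemo results for each card. The numbers are copied from the 1280x1024 noAA/AF table above, and the helper name average_fps is purely illustrative.

```python
# Minimal sketch: average the five Half-Life 2 timedemo results per card.
# Data copied from the 1280x1024 noAA/AF table above; average_fps is illustrative.

results_1280_noaa = {
    "NVIDIA GeForce 6600 GT":      [76.1, 71.3, 115.0, 94.9, 80.0],
    "2x NVIDIA 6600 GT (AMD SLI)": [77.2, 98.8, 118.5, 117.5, 116.6],
    "Gigabyte 2x6600GT 3D1":       [77.3, 99.1, 118.7, 117.9, 118.2],
}

def average_fps(fps_values):
    """Return the mean frame rate across the timedemo runs."""
    return sum(fps_values) / len(fps_values)

for card, fps in results_1280_noaa.items():
    print(f"{card}: {average_fps(fps):.1f} fps average")
# Prints roughly 87.5, 105.7, and 106.2 fps for the three configurations.
```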

The 3D1 averages about half a frame to one frame per second higher than two stock-clocked 6600 GTs in SLI mode. With frame rates near 100 fps, that amounts to no real difference at all. Even the overclocked RAM doesn't help the 3D1 here.

[Graph: Half-Life 2 Average Performance, 1280x1024 noAA/AF]

[Graph: Half-Life 2 Average Performance, 1600x1200 noAA/AF]

We see more of the same when we look at performance with anti-aliasing and anisotropic filtering enabled: the Gigabyte 3D1 performs on par with two 6600 GT cards in SLI. Since the 3D1's RAM is overclocked yet performance doesn't improve, the bottleneck under HL2 in SLI mode must lie somewhere other than memory. Under single-GPU conditions, we've seen GPU and RAM speed impact HL2 performance fairly evenly.

[Graph: Half-Life 2 Average Performance, 1280x1024 4xAA/8xAF]

[Graph: Half-Life 2 Average Performance, 1600x1200 4xAA/8xAF]
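
To put the SLI scaling into perspective, here is a quick back-of-the-envelope Python sketch (an illustrative calculation on the table data above, not part of our benchmark suite) that divides the dual-GPU frame rate by the single 6600 GT frame rate for one low-resolution and one high-resolution test.

```python
# Illustrative SLI scaling calculation using two tests from the tables above.
single_6600gt = {
    "at_c17_12, 1280x1024 noAA/AF":     76.1,
    "at_coast_05, 1600x1200 4xAA/8xAF": 43.3,
}
sli_6600gt = {
    "at_c17_12, 1280x1024 noAA/AF":     77.2,
    "at_coast_05, 1600x1200 4xAA/8xAF": 58.1,
}

for test, single_fps in single_6600gt.items():
    print(f"{test}: {sli_6600gt[test] / single_fps:.2f}x scaling")
# Roughly 1.01x in the at_c17_12 run, where a second GPU adds almost nothing,
# and about 1.34x in the more GPU-bound 1600x1200 4xAA/8xAF run.
```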
