Final Words

When Valve and ATI came together to show us the first inklings of Half Life 2 performance last year, it did not look pretty for NVIDIA. NVIDIA's highest-end card at the time, the GeForce FX 5900 Ultra, could not even outperform a Radeon 9600 Pro in most tests, much less anything from ATI at its own price point. Although we haven't shown it here (that's coming in Part II), the situation has not changed for NVIDIA's NV3x line of GPUs: they still must be treated as DirectX 8 hardware, otherwise they suffer extreme performance penalties when running Half Life 2 on the DirectX 9 codepath. To give you a little preview of what is to come, in DirectX 9 mode the GeForce FX 5900 Ultra offers about a third of the performance of the slowest card in this test. If you were unfortunate enough to have purchased an NV3x-based graphics card, you're out of luck running Half Life 2 on the DX9 codepath at any reasonable frame rate.
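For readers stuck on NV3x hardware, the Source engine does expose the rendering codepath as a launch option. A minimal sketch, assuming the standard -dxlevel switch that Source titles accept (the exact value worth using may vary with your card and drivers):

    hl2.exe -dxlevel 81

Forcing DirectX level 8.1 keeps an NV3x card on the DirectX 8 path and avoids the DX9 penalty described above; the mat_dxlevel console variable will report which level the engine is actually running.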

What was missing from our look at Half Life 2 performance a year ago was NVIDIA's NV4x line of GPUs, which has effectively "saved" NVIDIA from delivering embarrassing performance under Half Life 2. In fact, NVIDIA's GeForce 6 line of GPUs runs Half Life 2 extremely well, even when pitted against equivalently priced competition from ATI.

Our final Head to Head comparisons revealed a few interesting things:

The GeForce 6800 Ultra performs very similarly to the X800 XT as long as antialiasing and anisotropic filtering are disabled; with AA off, the only noticeable gap comes at 1600 x 1200, where the X800 XT averaged 8% faster than the 6800 Ultra. Enabling antialiasing and anisotropic filtering gives the X800 XT a 4% to 20% advantage depending on resolution, with the lead most apparent at 1280 x 1024 and 1600 x 1200 with 4X AA enabled, which definitely isn't shabby.

At the $400 price point, the X800 Pro and the GeForce 6800GT are basically equal performers at all of the resolutions we tested, regardless of whether AA/aniso was enabled. The recommendation here goes either way; look at how the cards perform in the other games you play to determine which one is right for you.

If you're spending $200 - $300, you've got three choices for PCI Express graphics cards and one for AGP. The NVIDIA GeForce 6800 is a 12-pipe, underclocked version of the 6800GT/Ultra and currently sells for close to $300; however, in Half Life 2 the regular 6800 performs no better than the cheaper 6600GT, which makes our NVIDIA recommendation clear. But how does the 6600GT stack up against the X700 XT? The two GPUs are basically equal performers under Half Life 2, although the X700 XT is faster with AA enabled. If you need an AGP card, however, the 6600GT AGP is your only option (and far from a bad one at that).

We've left a number of questions unanswered today involving older/slower hardware, so be sure to check back for Part II of our Half Life 2 GPU comparison to find out how well older hardware performs under Valve's amazing game. Thanks for taking a break from playing Half Life 2 to read this; now get back to it...

Comments (79)

  • Nuke Waste - Thursday, December 16, 2004 - link

    Would it be possible for AT to update the timedemos to Source Engine 7? Steam "graciously" updated my HL2 platform, and now none of my timedemos work!
  • The Internal - Friday, December 3, 2004 - link

    Which x700 XT card was used? How much RAM did it have?
  • VortigernRed - Tuesday, November 23, 2004 - link

    "Remember that we used the highest detail settings with the exception of anisotropic filtering and antialiasing, "

    That is not what you are showing in the SS on page 2. You are showing there that you have the water detail set to "reflect world", not "reflect all".

    I would be interested to see how that affects the performance in your benchmarks with water in them, as some sites are showing larger wins for ATI and it seems possible that this setting may be the difference.

    It certainly looks much better in game with "reflect all" but does affect the performance.

    PS, sorry for the empty post above, trying to guess my username and password!
  • VortigernRed - Tuesday, November 23, 2004 - link

  • Warder45 - Sunday, November 21, 2004 - link

    I'd like to know what you guys think about Xbit's and other reviews that have ATI way ahead in numbers due to turning on Reflect All and not just Reflect World.

    http://www.chaoticdreams.org/ce/jb/ReflectAll.jpg
    http://www.chaoticdreams.org/ce/jb/ReflectWorld.jp...

    Some SS.
  • Counterspeller - Friday, November 19, 2004 - link

    I forgot about my specs: P4 3.0, 3 HDs (8, 16, 60Gb), MB P4P800-E Deluxe, Samtron 96BDF screen.
  • Counterspeller - Friday, November 19, 2004 - link

    I don't understand... I have a GeForce 256 DDR, and the ONLY game that I have not been able to play is DOOM 3, only because it asks for 64Mb of VRAM, and I only have 32. I'd like to play HL2, but I don't have it. Perhaps it'll be like D3... not enough VRAM, and in that case, the 2nd game I can't play with that board. What I don't understand is this: how can anyone be complaining because x game or y game «only» gives us 200 fps... Can YOU see 200 fps? We're happy with 24fps on TV, 25fps in the theaters, and we're bitchin' about some game that only gives us 56.7 fps instead of the «behold perfection» 67.5. I know there is a difference, and yes, we can see that difference, but is it useful, in terms of gameplay? Will you be fragged because of a 1 or 2 or even 3 fps difference between you and your opponent? Stupidity gets us fragged, not fps. I believe that anything below 30/40 fps is nice, but unplayable, when it comes to action games. I'm happy with 60. Anything above it is extra. I have played many demanding games with this very board, and I can say that yes, some parts are demanding on the board. But I never lost because of it. Summing up: I don't understand this war between ATI lovers and NVIDIA lovers. I've been using the same board for years, and I never needed to change it. Unless it crumbles, I'll stick with it.
  • TheRealSkywolf - Friday, November 19, 2004 - link

    I have an FX 5950; I have turned on the DX9 path and things run great. First of all, the graphics don't look much better: you see slight differences on the water and in some bumpmapping, but minor things.
    So I guess it's time for ATI fans to shut up; both the FX and the 9800 cards run the game great.
    Man, Doom 3 showed all the bells and whistles, why wouldn't HL2? I think it's very unprofessional of Valve to do what they did.
  • SLI - Friday, November 19, 2004 - link

    Umm, why was the Radeon P.E. not tested?
