Head to Head: ATI Radeon X700 XT vs. NVIDIA GeForce 6600GT

As far as PCI Express platforms go, the Radeon X700 XT and the GeForce 6600GT are about as evenly matched as you can get in terms of price and performance, and thankfully both are readily available today. Let's see how they perform head to head in Half Life 2:

In our first demo, the two basically tie - we don't consider performance differences of roughly 3% or less to be significant.
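To make that tie rule concrete, here is a minimal Python sketch (our own illustration, not part of the actual test setup) of how the Performance Advantage figures in the tables below can be derived from two average frame rates; the function names are hypothetical.

```python
def performance_advantage(fps_a: float, fps_b: float) -> float:
    """Return the faster card's lead over the slower one, in percent."""
    faster, slower = max(fps_a, fps_b), min(fps_a, fps_b)
    return (faster / slower - 1.0) * 100.0


def verdict(x700xt_fps: float, gf6600gt_fps: float, tie_threshold: float = 3.0) -> str:
    """Call the result a tie when the lead is at or below the ~3% threshold."""
    lead = performance_advantage(x700xt_fps, gf6600gt_fps)
    if lead <= tie_threshold:
        return f"Tie ({lead:.1f}%)"
    winner = "X700 XT" if x700xt_fps > gf6600gt_fps else "6600GT"
    return f"{winner} by {lead:.1f}%"


# Example: the 1024 x 768 run from AT_canals_08.dem
print(verdict(116.4, 113.5))  # -> Tie (2.6%)
```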

Half Life 2 AT_canals_08.dem

| Resolution         | ATI Radeon X700 XT (fps) | NVIDIA GeForce 6600GT (fps) | Performance Advantage |
|--------------------|--------------------------|-----------------------------|-----------------------|
| 1024 x 768         | 116.4                    | 113.5                       | 2.6%                  |
| 1280 x 1024        | 78.7                     | 78.1                        | 0.8%                  |
| 1600 x 1200        | 55.9                     | 57.7                        | 3.1%                  |
| 1024 x 768 - 4X AA | 75.3                     | 72.7                        | 3.6%                  |
| Winner             | -                        | -                           | Tie                   |

Our second demo shows the X700 XT pulling ahead, with its clearest lead coming once AA is enabled.

Half Life 2 AT_coast_05.dem

| Resolution         | ATI Radeon X700 XT (fps) | NVIDIA GeForce 6600GT (fps) | Performance Advantage |
|--------------------|--------------------------|-----------------------------|-----------------------|
| 1024 x 768         | 133.5                    | 129.6                       | 3.0%                  |
| 1280 x 1024        | 112.3                    | 107.9                       | 4.1%                  |
| 1600 x 1200        | 81.9                     | 78.5                        | 4.3%                  |
| 1024 x 768 - 4X AA | 115.8                    | 107.2                       | 8.0%                  |
| Winner             | -                        | -                           | X700 XT               |

We have a tie once again in our third demo:

Half Life 2 AT_coast_12.dem

| Resolution         | ATI Radeon X700 XT (fps) | NVIDIA GeForce 6600GT (fps) | Performance Advantage |
|--------------------|--------------------------|-----------------------------|-----------------------|
| 1024 x 768         | 115.7                    | 113.6                       | 1.8%                  |
| 1280 x 1024        | 88                       | 88.2                        | 0.2%                  |
| 1600 x 1200        | 63                       | 64.4                        | 2.2%                  |
| 1024 x 768 - 4X AA | 87.5                     | 87.1                        | 0.5%                  |
| Winner             | -                        | -                           | Tie                   |

Our fourth demo shows the X700 XT pulling far ahead with AA enabled, but otherwise the two perform virtually identically:

Half Life 2 AT_prison_05.dem

| Resolution         | ATI Radeon X700 XT (fps) | NVIDIA GeForce 6600GT (fps) | Performance Advantage |
|--------------------|--------------------------|-----------------------------|-----------------------|
| 1024 x 768         | 119                      | 116.1                       | 2.5%                  |
| 1280 x 1024        | 79.4                     | 77.2                        | 2.8%                  |
| 1600 x 1200        | 55.5                     | 55.7                        | 0.4%                  |
| 1024 x 768 - 4X AA | 85.1                     | 74.6                        | 14.1%                 |
| Winner             | -                        | -                           | X700 XT               |

In our final demo, the X700 XT maintains its greatest performance advantage; even at 1600 x 1200 without AA, the X700 XT is over 10% faster than the 6600GT.

Half Life 2 AT_c17_12.dem

| Resolution         | ATI Radeon X700 XT (fps) | NVIDIA GeForce 6600GT (fps) | Performance Advantage |
|--------------------|--------------------------|-----------------------------|-----------------------|
| 1024 x 768         | 87.3                     | 82.9                        | 5.3%                  |
| 1280 x 1024        | 82.2                     | 76.4                        | 7.6%                  |
| 1600 x 1200        | 69.2                     | 61.6                        | 12.3%                 |
| 1024 x 768 - 4X AA | 77.4                     | 70                          | 10.6%                 |
| Winner             | -                        | -                           | X700 XT               |

We averaged the performance advantages from all five demos at each setting to produce the table below. From the looks of it, the X700 XT is significantly faster when AA is enabled, but is otherwise a relative equal to the 6600GT. What is important to note here is that the sweet spot for image quality and performance on these two cards appears to be 1280 x 1024, where they are virtually equal in performance.

Summary

| Resolution         | Average Performance Advantage (X700 XT over 6600GT) |
|--------------------|------------------------------------------------------|
| 1024 x 768         | 3.0%                                                 |
| 1280 x 1024        | 3.1%                                                 |
| 1600 x 1200        | 4.5%                                                 |
| 1024 x 768 - 4X AA | 7.3%                                                 |
Comments

  • Anand Lal Shimpi - Wednesday, November 17, 2004 - link

    Thanks for all of the comments guys. Just so you know, I started on Part 2 the minute the first article was done. I'm hoping to be done with testing by sometime tomorrow and then I've just got to write the article. Here's a list of the new cards being tested:

    9600XT, 9550, 9700, X300, GF 6200, GF 5900XT, GF4 Ti 4600, GF4 MX440

    I'm doing both DX9 and DX8 comparisons, including image quality.

    After Part 2 I think I'll go ahead and do the CPU comparison, although I've been thinking about doing a more investigative type of article into Half Life 2 performance in trying to figure out where its performance limitations exist, so things may get shuffled around a bit.

    We used the PCI Express 6600GT for our tests, but the AGP version should perform quite similarly.

    The one issue I'm struggling with right now is the fact that the X700 XT is still not available in retail, while the X700 Pro (256MB) is. If I have the time I may go back and run some X700 Pro numbers to make this a more realistic present-day comparison.

    Any other requests?

    Take care,
    Anand
  • Cybercat - Wednesday, November 17, 2004 - link

    You guys made my day comparing the X700XT, 6800, and 6600GT together. One question though (and I apologize if this was mentioned in the article and I missed it), did you guys use the PCIe or AGP version of the 6600GT?
  • Houdani - Wednesday, November 17, 2004 - link

    18: Many users rely on hardware review sites to get a feel for what technology is worth upgrading and when.

    Most of us have financial constraints which preclude us from upgrading to the best hardware; therefore, we are more interested in knowing how the mainstream hardware performs.

    You are correct that it would not be an efficient use of resources to have AT repeat the tests on hardware that is two or three generations old ... but sampling the previous generation seems appropriate. Fortunately, that's where part 2 will come in handy.

    I expect that part 2 will be sufficient in showing whether or not the previous generation's hardware will be a bottleneck. The results will be invaluable for helping me establish my minimum level of satisfaction for today's applications.
  • stelleg151 - Wednesday, November 17, 2004 - link

    forget what i said in 34.....
  • pio!pio! - Wednesday, November 17, 2004 - link

    So how do you softmod a 6800NU to a 6800GT???
    or unlock the extra stuff....
  • stelleg151 - Wednesday, November 17, 2004 - link

    What drivers were being used here, 4.12 + 67.02??
  • Akira1224 - Wednesday, November 17, 2004 - link

    Jedi

    lol I should have seen that one coming!
  • nastyemu25 - Wednesday, November 17, 2004 - link

    i bought a 9600XT because it came boxed with a free coupon for HL2. and now i can't even see how it matches up :(
  • coldpower27 - Wednesday, November 17, 2004 - link

    These benchmarks are more in line with what I was predicting: the X800 Pro should be equal to the 6800 GT due to similar pixel shader fillrate, while the X800 XT should have an advantage at higher resolutions due to its higher fillrate from being clocked higher.

    Unlike DriverATIheaven:P.

    This is great; I am happy knowing Nvidia's current generation of hardware is very competitive in performance in all aspects at equal amounts of fillrate.
  • Da3dalus - Wednesday, November 17, 2004 - link

    In the 67.02 Forceware driver there's a new option called "Negative LOD bias"; if I understand what I've read correctly, it's supposed to reduce shimmering.

    What was that option set to in the tests? And how did it affect performance, image quality and shimmering?
