The Test

AMD Athlon 64 FX-55
2 x 512MB OCZ PC3200 EL Dual Channel DIMMs 2-2-2-10
MSI K8N Neo2 nForce3 Motherboard
ATI Catalyst 4.11 Drivers
NVIDIA ForceWare 66.93 Drivers

Doom 3 Performance

Since even before its release, Doom 3 has clearly been an NVIDIA selling point. The game is built around OpenGL, and preliminary benchmarks showed NVIDIA hardware running it faster than ATI's a full year before the game shipped. The final product hasn't deviated from that initial track, and NVIDIA surely couldn't be happier. Here we see the 6600GT outperforming ATI's 12-pipe X800 Pro, with the 6800 GT setting the performance bar very high.

Doom 3

Under Doom 3, the 6600GT is a very powerful midrange card. Our resolution scaling graph shows it maintaining a profile above that of the X800 Pro. All of the previous-generation cards fall a good distance behind the top three contenders in this test.


When 4xAA is enabled under Doom 3, the 6600GT dips below the X800 Pro in resolution scaling. The 6600GT is still a midrange card, and 4xAA at high resolutions is going to be easier to handle on the higher-end X800. Overall, the 6600GT puts in a very good showing.


66 Comments

  • Pythias - Wednesday, November 17, 2004 - link

    >>The impact of the bridge, as I mentioned in the review, is negligible. The bridge + slower memory results in a 0 - 5% performance difference between the PCI Express and AGP versions of the 6600GT (the 5% figure being because of the additional memory bandwidth courtesy of the 500/1000 clock vs. 500/900).

    Just so you guys know, I went out and picked up a vanilla 6800 for inclusion in my upcoming Half Life 2 GPU comparison. Know that your voice has been heard :)

    Take care,
    Anand<<

    Anand, you kick teh bootay.
  • Poser - Tuesday, November 16, 2004 - link

    #42 He's not benching them with the fastest processor he can get his hands on just to show off what cool hardware he's got, you know. If you match up a fast video card with a slower processor, you can get benchmark scores that are CPU limited, instead of GPU limited like you want to see. You can see a little bit of what CPU limiting looks like when you look at the low resolution benchmarks with older games, and even with Unreal Tournament 2004 in this review. Every card ends up with essentially the same score, because it's no longer the video card that's the bottleneck -- it's the rest of the system, chiefly the CPU.

    If you knew all that already, my apologies for the mini-lecture =). I agree that it's nice to occasionally see benchmarks with a range of processors so that you can spot "yours" and see what sort of performance boost you'd get by upgrading, but it hardly seems practical to do that for every video card review, and if you've got to pick ONE processor to test everything on, then the fastest available is a good choice.
  • thebluesgnr - Tuesday, November 16, 2004 - link

    #41,
    the PT894 Pro chipset should be sampling right now.

  • bhtooefr - Tuesday, November 16, 2004 - link

    draazeejs: Anand compared it against other cards at the same price. So, a 2-year-old card that now sells at that same price IS a fair comparison.
  • Niatross - Tuesday, November 16, 2004 - link

    I know you've heard this comment a million times before. I don't have an FX-55; I've got an Athlon 2500 mobile. These benchmarks mean absolutely nothing to me.
  • Tanclearas - Tuesday, November 16, 2004 - link

    "Most enthusiast users appear to be sticking with their AGP platforms and while they would consider a GPU upgrade, they are not willing to upgrade their motherboard (and sometimes CPU and memory) just to get a faster graphics card."

    Don't you think this has something to do with the fact that you still can't purchase AMD PCIe boards? Not to mention that it looks like the only (realistic) SLI solution that will be available in the next several months will be for Athlon 64.
  • Pete - Tuesday, November 16, 2004 - link

    #28, as ATi won't be releasing the X700XT in AGP form for quite some time, and as they're actually going to (continue to) use the 9800P as competition at the $200 price point, your accusation is wholly without merit. If you want to see X700XT vs. 6600GT numbers, just read Anand's X700XT review. As it stands, the 6600GT is unchallenged in the field of new AGP cards at $200.

    But it's way overpriced for the $250 NewEgg is charging for it, dual DVI or not. For $250, you're better off with the BFG 6800OC at Outpost.com (which may even come with Far Cry, making it an even better deal).
  • coldpower27 - Tuesday, November 16, 2004 - link

    The review at firingsquad also seems to paint the same picture; to me, the conclusion there is similar in wording to the conclusion here. It seems the 6600 GT AGP is most definitely a good video card for the mass market :P
  • ChronoReverse - Tuesday, November 16, 2004 - link

    To #28

    A quote from HardOCP

    "One thing is for sure, the GeForce 6600GT and the Radeon X700XT are very competitive products when it comes to overall gaming performance. If we had to edge out a card that offers up the better value we would have to lean towards the GeForce 6600GT at this point in time. In our two days of X700XT experience we saw it get held out of the top spot in terms of both framerate and image quality by the GeForce 6600GT. Keep in mind that the GeForce 6600 series also packs the performance potential of Shader Model 3.0 once games start using it."

    Any nVidia bias you attribute to Anandtech is unfounded.
  • vailr - Tuesday, November 16, 2004 - link

    Some AVSForum.com (/Home Theater Computers) postings had said that the 6600 video processor was fine, and that only the 6800 (the AGP version specifically) had certain hardware problems which "cannot be cured by a driver update". Or that maybe some future Windows Media Player update would be needed to enable hardware-assisted .wmv playback.
    So there's general confusion as to what the real facts are.
    "nVidia admits 6800 has faulty on chip decoder":
    http://www.avsforum.com/avs-vb/showthread.php?s=&a...
