Introduction

Today, we'll be covering the performance of 11 different vendors' versions of the GeForce 6600GT. When that many examples of the same part land in the same room at the same time, you know that we're going to get a good cross-section of what the market looks like. If you're interested in buying a 6600GT, then this is the article for you.

Not only will we see what all these vendors have to offer the customer, but we will also see just how hard the NV43 can be pushed, pulled, and stretched once it hits your system. We don't usually like to test overclocking on a large scale with the engineering sample parts that NVIDIA and ATI send us just after a product launch. These test samples are often held together by the skin of their IHVs' proverbial teeth: it's not uncommon to see wires, resistors, and capacitors soldered onto an early PCB, and in some cases, we're lucky that these things work at all. We received an overclocked 6800 Ultra Extreme from NVIDIA that never booted, as well as an NV41 that was DOA. These preproduction boards are not the kind of hardware that we would actually buy and use in our home systems.

And so, when an incredible number of vendors responded to our call for parts, we were very happy. Testing shipping parts means that we're testing exactly what the end user will get. Heat tests, noise tests, overclocking tests - they all become very relevant and interesting. We will be looking at which vendors offer the best products to the consumer. Cards will be judged on their idle and load thermal diode temperatures, the sound pressure level (in dB) of the system measured at a distance of one meter, overclockability, features, bundle, and price.

We do spend a lot of time looking at the benchmarks of these cards at overclocked speeds, but these benchmarks aren't the be-all, end-all judge of which vendor makes a better card. First of all, the potential of any given ASIC to achieve a certain overclock is not something over which a vendor has any control, unless they bin their chips and sell a special line of overclocker-friendly cards (or, more likely, pre-overclocked cards). None of these 6600GTs falls into that category. This means that our BrandX card running at a certain speed doesn't guarantee anything about yours.

Overclocking tests are still important, as they show whether a board design is actually capable of supporting a GPU running at high speed; some boards are not. Overclocking is more of an art than a science at times, and these numbers shouldn't be used as an absolute metric.

Heat management is especially important when overclocking. With a new breed of game on store shelves, such as Doom 3, Half-Life 2, and the onslaught of titles that will surely be based on their engines, GPU temperatures have nowhere to go but up. Increasing the core clock speed helps performance, but in our tests, it also raised the maximum load temperature by a degree or two. The more a graphics card maker can do to keep heat down, the better. And that will be especially tricky with these cards once they've been in end users' hands for a while. Allow me to explain.

The cooling solution attaches to NVIDIA's reference design with only two holes. Basically, the popular rectangular heatsink design sits precariously on top of the GPU and can pivot easily around the ASIC. This means: don't touch the heatsink. This causes real problems where thermal tape or glue is used. The leverage that the NVIDIA reference design creates is more than powerful enough to tear through tape and snap the strongest glue without a second thought. Once those seals have been broken, cooling is severely compromised. Looking back at our numbers, this may explain some of the extreme temperatures that we recorded. Of course, we were extraordinarily careful to avoid touching any HSFs after we realized what was going on.

Overclocking our GeForce 6600GTs
Comments

  • Pete - Friday, December 10, 2004 - link

    Obviously Derek OCed himself to get this article out, and he's beginning to show error. Better bump your (alarm) clocks down 10MHz (an hour) or so, Derek.
  • pio!pio! - Friday, December 10, 2004 - link

    Noticed a typo. At one point you wrote 'clock stock speed' instead of 'stock clock speed'; easy mistake.
  • Pete - Friday, December 10, 2004 - link

    Another reason to narrow the distance b/w the mic and the noise source is that some of these cards may go into SFFs, or cases that sit on the desk. 12" may well be more indicative of the noise level those users would experience.
  • Pete - Friday, December 10, 2004 - link

    Great article, Derek!

    As usual, I keep my praise concise and my constructive criticism elaborate (although I could argue that the fact that I keep coming back is rather elaborate praise :)). I think you made the same mistake I made when discussing dB and perceived noise: confusing power with loudness. From the following two sources, I see that a 3 dB increase equates to 2x the power, but is only 1.23x as loud. A 10 dB increase corresponds to 10x the power and a doubling of loudness. So apparently the loudest HSFs in this roundup are "merely" twice as loud as the quietest.

    http://www.gcaudio.com/resources/howtos/voltagelou...
    http://www.silentpcreview.com/article121-page1.htm...
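
    To make the arithmetic explicit, here's a quick sketch (my own illustration, assuming the usual rules of thumb: power ratio = 10^(dB/10), and perceived loudness doubling every 10 dB):

        # Rule-of-thumb conversions between a dB difference, acoustic
        # power ratio, and perceived loudness ratio (illustrative only).
        def power_ratio(delta_db: float) -> float:
            # Acoustic power: +10 dB means 10x the power.
            return 10 ** (delta_db / 10)

        def loudness_ratio(delta_db: float) -> float:
            # Perceived loudness: +10 dB sounds about twice as loud.
            return 2 ** (delta_db / 10)

        for delta in (3, 10):
            print(f"+{delta} dB: {power_ratio(delta):.2f}x power, "
                  f"{loudness_ratio(delta):.2f}x as loud")
        # +3 dB: 2.00x power, 1.23x as loud
        # +10 dB: 10.00x power, 2.00x as loud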

    Speaking of measurements, do you think 1m is a bit too far away, perhaps affording less precision than, say, 12"?

    You might also consider changing the test system to a fanless PSU (Antec and others make them), with a Zalman Reserator cooling the CPU and placed at as great a distance from the mic as possible. I'd also suggest simply laying the test system out on a piece of (sound-dampening) foam, rather than fitting it in a case (with potential heat trapping and resonance). The HD should also be as quiet as possible (2.5"?).

    I still think you should buy these cards yourselves, a la Consumer Reports, if you want true samples (and independence). Surely AT can afford it, and you could always resell them in FS/FT for not much of a loss.

    Anyway, again, cheers for an interesting article.
  • redavnI - Thursday, December 9, 2004 - link

    Very nice article, but any chance we could get a part 2 covering any replacement cards the manufacturers send? I'd also like to see the Pine card reviewed. It's being advertised as the Anandtech Deal at the top of this article and has dual DVI like the XFX card. Kind of odd that one of the only cards not reviewed gets a big fat buy-me link.

    To me, it seems that with the 6600GT/6800 series, Nvidia has its best offering since the GeForce4 Ti...I'm sure I'm not the only one still hanging on to my Ti4600.

  • Filibuster - Thursday, December 9, 2004 - link

    Something I've just realized: the Gigabyte NX66T256D is not a GT, yet it supports SLI. Are they using GTs that can't run at the faster speeds and selling them as standard 6600s? It has 256MB.
    We ordered two from a vendor who said it definitely does SLI.

    http://www.giga-byte.com/VGA/Products/Products_GV-...

    Can you guys find out for sure?
  • TrogdorJW - Thursday, December 9, 2004 - link

    Derek, the "enlarged images" all seem to be missing, or else the links are somehow broken. I tested with Firefox and IE6 and neither one would resolve the image links.

    Other than that, *wow* - who knew HSFs could be such an issue? I'm quite surprised that they are only secured at two corners. Would it really have been that difficult to use four mount points? The long-term prospects for these cards are not looking too good.
  • CrystalBay - Thursday, December 9, 2004 - link

    Great job on the quality control inspections of these cards, D.W. Hopefully, IHVs take notice and resolve these potentially damaging problems.
  • LoneWolf15 - Thursday, December 9, 2004 - link

    I didn't see a single card in this review that didn't have a really cheesy-looking fan...the type that might last a couple of years if you're really lucky, but might last six months on some cards if you're not. The GeForce 6600GT is a decent card; for $175-250 (depending on PCIe or AGP), you'd think vendors would include a fan deserving of the price. My PNY 6800NU came with a squirrel-cage fan and a super heavy heatsink that I know will last. Hopefully, Arctic Cooling will come out with an NV Silencer for the 6600 family soon; I wouldn't trust any of the fans I saw here to last.
  • Filibuster - Thursday, December 9, 2004 - link

    What quality settings were used in the games?

    I am assuming that Doom 3 is set to Medium quality, since these are 128MB cards.
    I've read that there are some 6600GT 256MB cards coming out (Gigabyte GV-NX66T256D and MSI 6600GT-256E, maybe more). Please show us some tests with the 256MB models once they hit the streets (or if you know that they definitely aren't coming, please tell us that too).

    Even though these cards only have a 128-bit bus, wouldn't the extra RAM help out in places like Doom 3, where texture quality is a matter of RAM quantity? The local video RAM still has to be faster than fetching from system RAM; a rough sketch of the bandwidth gap is below.
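
    To put rough numbers on that gap (my own back-of-the-envelope sketch; the ~1 GHz effective GDDR3 figure and the ~4 GB/s PCIe x16 figure are assumptions for illustration):

        # Rough comparison: local VRAM bandwidth vs. fetching textures
        # over the bus. All figures are illustrative assumptions.
        BUS_WIDTH_BITS = 128        # 6600GT's 128-bit memory interface
        EFFECTIVE_CLOCK_HZ = 1.0e9  # ~1 GHz effective (GDDR3, double data rate)
        PCIE_X16_BW = 4.0e9         # ~4 GB/s per direction

        local_bw = (BUS_WIDTH_BITS / 8) * EFFECTIVE_CLOCK_HZ  # bytes/second

        print(f"Local VRAM: {local_bw / 1e9:.0f} GB/s")     # 16 GB/s
        print(f"PCIe x16:   {PCIE_X16_BW / 1e9:.0f} GB/s")  # 4 GB/s
        # Local memory is roughly 4x faster here, which is why keeping
        # textures in on-card RAM beats streaming them from system RAM.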
