Introduction

Today, we'll be covering the performance of 11 different vendors' versions of the GeForce 6600GT. When that many examples of the same part end up in the same room at the same time, you know we're going to get a good cross-section of what the market looks like. If you're interested in buying a 6600GT, then this is the article for you.

Not only will we see what all of these different vendors have to offer you as a customer, but we will also see just how hard the NV43 can be pushed, pulled, and stretched when it hits your system. We don't usually like to test overclocking on a large scale with the engineering sample parts that NVIDIA and ATI send us just after a product launch. These test samples are often held together by the skin of their IHVs' proverbial teeth. It's not uncommon to see wires, resistors, and capacitors soldered onto an early PCB. In some cases, we're lucky that these things work at all. We received an overclocked 6800 Ultra Extreme from NVIDIA that never booted, as well as an NV41 that was DOA. These preproduction boards are not the kind of boards that we would actually buy and use in our home systems.

And so, when an incredible number of vendors responded to our call for parts, we were very happy. Retail, shipping parts mean that we are testing exactly what the end user will have. Heat tests, noise tests, overclocking tests - they all become very relevant and interesting. We will be looking at which vendors offer the best products to the consumer. Cards will be judged on their idle and load thermal diode temperatures, the sound pressure level (in dB) of the system measured at a distance of one meter, overclockability, features, bundle, and price.
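
For readers unfamiliar with the decibel scale, the noise numbers are logarithmic rather than linear. As a rough, hypothetical illustration (not part of our test methodology), the standard relationship between a difference in sound pressure level and the corresponding sound pressure ratio looks like this:

    def pressure_ratio(delta_db):
        """Convert a difference in sound pressure level (dB) into a sound pressure ratio."""
        return 10 ** (delta_db / 20.0)

    # Example: a card measured 3 dB quieter than another at the same distance
    # produces roughly 71% of the sound pressure (1 / 1.41).
    print(pressure_ratio(3.0))  # ~1.41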

We do spend a lot of time looking at the benchmarks of these cards at overclocked speeds, but these benchmarks aren't the be-all, end-all judge of which vendor makes a better card. First of all, the potential of any given ASIC to achieve a certain overclock is not something over which a vendor has any control, unless they bin their chips and sell a special line of overclocker-friendly cards (or, more likely, pre-overclocked cards). None of these 6600GTs fall into that category. This means that our BrandX card running at a certain speed doesn't guarantee anything about yours.

Overclocking tests are still important: they show that the boards which do achieve a high, stable clock are able to support a GPU running at that speed. Some boards are not. It's simply more of an art than a science at times, and these numbers shouldn't be used as an absolute metric.

Heat management is especially important when overclocking. With a new breed of game on store shelves, such as Doom 3, Half-Life 2, and the onslaught of titles that will surely be based on their engines, GPU temperatures have nowhere to go but up. Increasing the core clock speed will help performance, but in our tests, it also raised the maximum load temperature by a degree or two. The more a graphics card maker can do to keep heat down, the better. And that will be especially tricky with these cards once they've been in end users' hands for a while. Allow me to explain.

NVIDIA's reference design attaches the cooling solution to the board with only two holes. Basically, the popular rectangular heatsink is positioned precariously on top of the GPU and can pivot easily around the ASIC. This means: don't touch the heatsink. It causes real problems where thermal tape or glue is used. The leverage that the NVIDIA reference mounting creates is more than enough to tear through tape and snap the strongest glue without a second thought. Once those seals have been broken, cooling is severely compromised. Looking back at our numbers, this may explain some of the extreme temperatures that we saw. Of course, we were extraordinarily careful to avoid touching any HSFs after we realized what was going on.
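
As a side note for readers who want to keep an eye on their own card's thermal diode while gaming or overclocking, here is a minimal logging sketch. It assumes a system where NVIDIA's nvidia-smi utility is available (a tool that postdates the drivers used for this review); it simply polls the reported core temperature and prints the minimum and peak readings at the end:

    import subprocess
    import time

    def log_gpu_temp(duration_s=120, interval_s=1.0):
        """Poll the GPU core temperature via nvidia-smi and report min/peak."""
        readings = []
        for _ in range(int(duration_s / interval_s)):
            result = subprocess.run(
                ["nvidia-smi", "--query-gpu=temperature.gpu",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True)
            readings.append(int(result.stdout.splitlines()[0]))
            time.sleep(interval_s)
        print(f"min: {min(readings)} C, peak: {max(readings)} C")
        return readings

    if __name__ == "__main__":
        log_gpu_temp()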

Overclocking our GeForce 6600GTs
84 Comments

  • Bonesdad - Wednesday, February 16, 2005 - link

    Yes, I too would like to see an update here...have any of the makers attacked the HSF mounting problems?
  • 1q3er5 - Tuesday, February 15, 2005 - link

    can we please get an update on this article with more cards, and replacements of defective cards?

    I'm interested in the gigabyte card
  • Yush - Tuesday, February 8, 2005 - link

    Those temperature results are pretty dodge. Surely no regular computer user would have a caseless computer. Those results are only favourable and only shed light on how cool the card CAN be, and not how hot they actually are in a regular scenario. The results would've been much more useful had the temperature been measured inside a case.
  • Andrewliu6294 - Saturday, January 29, 2005 - link

    i like the albatron best. Exactly how loud is it? like how many decibels?
  • JClimbs - Thursday, January 27, 2005 - link

    Anyone have any information on the Galaxy part? I don't find it in a pricewatch or pricegrabber search at all.
  • Abecedaria - Saturday, January 22, 2005 - link

    Hey there. I noticed that Gigabyte seems to have modified their HSI cooling solution. Has anyone had any experience with this? It looks much better.

    Comments?
    http://www.giga-byte.com/VGA/Products/Products_GV-...

    abc
  • levicki - Sunday, January 9, 2005 - link

    Derek, do you read your email at all? I got a Prolink 6600 GT card and I would like to hear a suggestion on improving the cooling solution. I can confirm that the retail card reaches 95 C at full load and idles at 48 C. That is a really bad image for NVIDIA. They should be informed about vendors doing a poor job on cooling design. I mean, you would expect it to be way better because those cards aren't cheap.
  • geogecko - Wednesday, January 5, 2005 - link

    Derek. Could you speculate on what thermal compound is used to interface between the HSF and the GPU on the XFX card? I e-mailed them, and they won't tell me what it is?! It would be great if it was paste or tape. I need to be able to remove it, and then later, would like to re-install it. I might be able to overlook not having the component video pod on the XFX card, as long as I get an HDTV that supports DVI.
  • Beatnik - Friday, December 31, 2004 - link


    I thought I would add about the DUAL-DVI issue, in the new NVIDIA drivers, they show that the second DVI can be used for HDTV output. It appears that even the overscan adjustments are there.

    So not having the component "pod" on the XFX card appears to be less of a concern than I thought it might be. It would be nice to hear if someone tried running 1600x1200 + 1600x1200 on the XFX, just to know if the DVI is up to snuff for dual LCD use.
