Introduction

Today, we'll be covering the performance of 11 different vendors' versions of the GeForce 6600GT. When that many examples of the same part get into the same room at the same time, you know that we're going to have a good cross-section of what the market should look like. If you're interested in buying a 6600GT, then this is the article for you.

Not only will we see what all these different vendors have to offer to you as a customer, but we will really see how hard the NV43 can be pushed, pulled, and stretched when it hits your system. We don't usually like to test overclocking on a large scale with the engineering sample parts that NVIDIA and ATI send us just after a product launch. These test samples are often strung together by the skin of their IHVs' proverbial teeth. It's not uncommon to see wires, resistors, and capacitors soldered onto an early PCB. We're actually lucky that these things work at all in some cases. We received an overclocked 6800 Ultra Extreme from NVIDIA that never booted, as well as an NV41 that was DOA. These preproduction boards are not the kind of boards that we would actually buy and use in our home systems.

And so, when an incredible number of vendors responded to our call for parts, we were very happy. Shipping parts means that we have what the end user will have. Heat tests, noise tests, overclocking tests - they all become very relevant and interesting. We will be looking at which vendors offer the best products to the consumer. Cards will be judged based on their idle and load thermal diode temperatures, the sound pressure level in dB of the system at a one meter distance, overclockability, features, bundle, and price.

We do spend a lot of time looking at the benchmarks of these cards at overclocked speeds, but these benchmarks aren't the "be all, end all" judge of which vendor makes a better card. First of all, the potential of any given ASIC to achieve a certain overclock is not something over which a vendor has any control, unless they bin their chips and sell a special line of overclocker-friendly cards (or, more likely, pre-overclocked cards). None of these 6600GTs fall into that category. This means that our BrandX card running at a certain speed doesn't guarantee anything about yours.

Overclocking tests are still important, as they show which boards can actually support a GPU running at a high, stable clock. Some boards cannot. It's just more of an art than a science sometimes, and these numbers shouldn't be used as an absolute metric.

Heat management is especially important when overclocking. With a new breed of game on store shelves, such as Doom 3, Half-Life 2, and the onslaught of titles that will surely be based on their engines, GPU temperatures have nowhere to go but up. Increasing the core clock speed will help performance, but in our tests, it also raised maximum load temperature by a degree or two. The more a graphics card maker can do to keep heat down, the better. And that will be especially tricky with these cards once they've been in end users' hands for a while. Allow me to explain.

The cooling solution attaches to NVIDIA's reference design with only two holes. Basically, the popular rectangular heatsink design is positioned precariously on top of the GPU and can pivot easily around the ASIC. This means: don't touch the heatsink. This really causes problems where thermal tape or glue is used. The fulcrum that the NVIDIA reference design creates is more than powerful enough to tear through tape and snap the strongest glue without a second thought. Once those seals have been broken, cooling is severely compromised. Looking back at our numbers, this may be the reason why we see some of the extreme temperature numbers that we do. Of course, we were extraordinarily careful to avoid touching any HSFs after we realized what was going on.

Overclocking our GeForce 6600GTs
84 Comments

  • 1q3er5 - Thursday, December 16, 2004 - link

errr, weird how the Albatron, despite its so-called HSF mounting problem, scored so high on all the tests (albeit a bit loud) and didn't get an award!

Also looks like LEADTEK changed the design of the board a bit

    http://www.leadtek.com/3d_graphic/winfast_a6600_gt...

    They added a heatsink on the RAM and you may also notice that the shroud now extends right over the other chips on the card.
  • miketus - Thursday, December 16, 2004 - link

Hi, has anybody any experience with the Albatron 6600GT for AGP?
  • geogecko - Monday, December 13, 2004 - link

    Personally, I'd be willing to spend the extra $15-20 to get a decent HSF on these cards. Of course, the first one I buy will go in an HTPC, which will all be passively cooled, so the HSF in this case doesn't matter, because I'll just be removing it.

    However, for my PC, I sure would like a decent quality HSF. It would stink to have a $200 card burn up in your PC because of a $10 HSF setup.
  • WT - Monday, December 13, 2004 - link

Interesting that GigaByte used a passive HSF on their 6800 card (with great results), but went with a craptastic fan on the 6600GT. I have an MSI 5900 and didn't want to settle for the cheesy HSF setup on the MSI 5900XT cards, so we are seeing the same thing occur with the 6600GTs .... cut costs by using a cheaper HSF.
    Excellent article .. I found it answered every question I had left on the GT cards, further convincing me to buy the 6800 series.
  • DerekWilson - Sunday, December 12, 2004 - link

#49 -- it was a problem with our sample ... the actual issue was not a design flaw, but if the design (of most 6600 GT cards) was different, it might have been possible for our sample to have avoided breakage.

    That's kind of a complicated way of saying that you should be alright as long as you are careful with the card when you install it.

After it's installed, the way to tell whether you have a problem is to run a 3D game/application in windowed mode. Open Display Properties and click on the system tab. Hit the advanced button and select the NVIDIA tab. Select the temperature option, and if you see temperatures of 90 degrees C or higher, you probably have a problem.

If your temp is lower than that, you're fine.
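The 90 degree C threshold above can be sketched as a tiny sanity check. This is just an illustration; the sample reading string and its format are hypothetical stand-ins for whatever your monitoring tool reports, since the NVIDIA control panel only shows the value in its GUI:

```python
# Apply the 90 degree C rule of thumb from the comment above.
# The sample string below is hypothetical monitoring-tool output,
# not the output of any real NVIDIA utility.

def heatsink_seated_ok(temp_c, threshold_c=90):
    """True if the load temperature is below the problem threshold."""
    return temp_c < threshold_c

sample_reading = "GPU Core Temperature: 72 C"  # hypothetical tool output
temp = int(sample_reading.split(":")[1].split()[0])
print(heatsink_seated_ok(temp))  # prints True: 72 C is well under 90 C
```

A reading at or above the threshold would suggest the heatsink seal has been compromised, per the article's warning about the pivoting reference cooler.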
  • Vico26 - Sunday, December 12, 2004 - link

    Derek,

was the MSI 6600 GT a defective sample, or is there a problem with the HS design? Plz let me know, as I bought the MSI card on the same day as you published the article. Now, I am shocked, and I would like to find a solution - a new cooling system? Am I able to install it (I'm not a professional)?

    Anyway many thanks, I should have waited a day...
  • DerekWilson - Sunday, December 12, 2004 - link

    http://www.gfe.com.hk/news/buy.asp
  • Nyati13 - Sunday, December 12, 2004 - link

    What I'd like to know is where are the Galaxy 6600GTs available? I've looked at some e-tailers that I know of, and searched pricewatch and e-bay, and there aren't any Galaxy cards for sale.
  • geogecko - Sunday, December 12, 2004 - link

    Well, I actually meant to say something in that last post.

Anyway, short and sweet. That's the way I like these articles. Who wants to spend more than about 15-30 minutes to find out which card is best for them?

    I do think that the HDTV thing could have been looked at, but other than that, it's a great article.
  • geogecko - Sunday, December 12, 2004 - link
