Introduction

Today, we'll be covering the performance of 11 different vendors' versions of the GeForce 6600GT. When that many copies of the same part get into the same room at the same time, you know that we're going to have a good cross-section of what the market should look like. If you're interested in buying a 6600GT, then this is the article for you.

Not only will we see what all these different vendors have to offer you as a customer, but we will also see just how hard the NV43 can be pushed, pulled, and stretched when it hits your system. We don't usually like to test overclocking on a large scale with the engineering sample parts that NVIDIA and ATI send us just after a product launch. These test samples are often held together by the skin of their IHV's proverbial teeth. It's not uncommon to see wires, resistors, and capacitors soldered onto an early PCB. We're actually lucky that these things work at all in some cases. We received an overclocked 6800 Ultra Extreme from NVIDIA that never booted, as well as an NV41 that was DOA. These preproduction boards are not the kind that we would actually buy and use in our home systems.

And so, when an incredible number of vendors responded to our call for parts, we were very happy. Shipping retail parts means that we have exactly what the end user will have. Heat tests, noise tests, overclocking tests - they all become very relevant and interesting. We will be looking at which vendors offer the best products to the consumer. Cards will be judged on their idle and load thermal diode temperatures, the sound pressure level (in dB) of the system at a one-meter distance, overclockability, features, bundle, and price.

We do spend a lot of time looking at the benchmarks of these cards at overclocked speeds, but these benchmarks aren't the be-all, end-all judge of which vendor makes a better card. First of all, the potential of any given ASIC to achieve a certain overclock is not something over which a vendor has any control, unless they bin their chips and sell a special line of overclocker-friendly cards (or, more likely, pre-overclocked cards). None of these 6600GTs fall into that category. This means that our BrandX card running at a certain speed doesn't guarantee anything about yours.

Overclocking tests are still important, as they show which boards are able to support a GPU running at a high, stable clock. Some boards are not up to the task. Overclocking is sometimes more of an art than a science, and these numbers shouldn't be used as an absolute metric.
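For readers who want to sanity-check a memory overclock themselves, the arithmetic is simple. Here is a minimal sketch, assuming the usual GDDR3 convention that a chip's ns rating implies a rated clock of roughly 1000/ns MHz, and the 6600GT's stock 500MHz (1000MHz effective) memory on a 128-bit bus; the function names are ours, not from any tool:

```python
# Rough sanity checks for memory overclocking headroom on a 6600GT-class card.
# Assumption: a GDDR3 chip's access-time rating (ns) implies a rated real
# clock of about 1000 / ns MHz. Actual stable speeds vary board to board.

def rated_clock_mhz(ns_rating: float) -> float:
    """Approximate rated (real) memory clock in MHz for a given ns rating."""
    return 1000.0 / ns_rating

def bandwidth_gbs(real_clock_mhz: float, bus_width_bits: int = 128) -> float:
    """Peak theoretical bandwidth in GB/s for DDR memory on the given bus."""
    # DDR transfers data twice per clock; bus width in bits -> bytes.
    return real_clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

stock_bw = bandwidth_gbs(500)       # 6600GT stock: 16.0 GB/s
headroom = rated_clock_mhz(1.6)     # 1.6ns chips: ~625 MHz rated
slower   = rated_clock_mhz(2.0)     # 2.0ns chips: ~500 MHz rated, i.e. no headroom
```

By this rule of thumb, 1.6ns chips leave roughly 25% of rated headroom over the stock 500MHz clock, while 2.0ns chips are already running at their rated speed out of the box.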

Heat management is especially important when overclocking. With a new breed of game on store shelves, such as Doom 3, Half-Life 2, and the onslaught of titles that will surely be based on their engines, GPU temperatures have nowhere to go but up. Increasing the core clock speed will help performance, but in our tests, it also raised the maximum load temperature by a degree or two. The more a graphics card maker can do to keep heat down, the better. And that will be especially tricky with these cards once they've been in end users' hands for a while. Allow me to explain.

The cooling solution attaches to NVIDIA's reference design with only two holes. Basically, the popular rectangular heatsink design is positioned precariously on top of the GPU and can pivot easily around the ASIC. This means: don't touch the heatsink. This causes real problems where thermal tape or glue is used. The fulcrum that the NVIDIA reference design creates is more than powerful enough to tear through tape and snap the strongest glue without a second thought. Once those seals have been broken, cooling is severely compromised. Looking back at our numbers, this may be the reason why we see some of the extreme temperatures that we do. Of course, we were extraordinarily careful to avoid touching any HSFs after we realized what was going on.

Overclocking our GeForce 6600GTs

84 Comments


  • princethorpe - Wednesday, May 04, 2005 - link

    I've been checking the various forums and found this one on the 6600GTs excellent. I don't know if anyone else has found them, but Asus are making these cards and do a faster-than-standard model by using faster memory; they reckon, according to their site, that they run 10% faster than the standard. I've ordered the Asus board by preference because of the build quality.
  • GollumSmeagol - Monday, May 02, 2005 - link

    I came across a forum a few months ago here in Hungary, and the people were talking about Leadtek's 6600GTs being faulty/freezing. Strangely enough, a few weeks later, the main distributor of Leadtek took the 6600GTs off their price lists on the web. I wonder if they are waiting for a bugfix, or simply ran out of stock and are waiting for the next shipment.

    Another beauty I've just come across is Gigabyte's TurboForce edition, which is a slightly overclocked version of the 6600 series (both PCI-E and AGP 8x). I'm shopping for a SILENT AGP one (that's how I came across this review), and found this beauty:

    http://www.giga-byte.com/VGA/Products/Products_GV-...

    This one has something they call Silent-Pipe as a cooler. Not many specs on Gigabyte's page, but from the picture, it looks like there is no fan at all, just a huge copper(-colored?) heatsink that covers about 2/3 of the card. (Well, a Zalman FB123 could still be used to move some air.)
    The memory clock is listed as 1120MHz (remember, TurboForce), plus when I zoomed in on the box picture, I could spot "VIVO" written on the box. This is also supported by the info on the local dealer's page, where they say "Y" for the TV-OUT of the regular GV-N66T128D, but "IN/OUT" for the GV-N66T128VP. All this for roughly 20 USD extra (local price).
  • dpp - Saturday, November 19, 2005 - link

    I've bought the Gigabyte GV-NX66T128VP (TurboForce, no fan at all): http://www.giga-byte.com/VGA/Products/Products_GV-...
    Start-up temperature 52C, maximum 65C.
    Is that normal?
  • ylp88 - Monday, April 18, 2005 - link

    I found the article quite informative. Thank you. I purchased two Palit 6600GT cards a week ago and have put them in SLI mode.

    I have a few questions/comments:
    1) The Palit overview is rather short compared to the others. The Palit card is also never mentioned on the last page. Is there a reason for this?
    2) The Palit cards I got DO NOT have memory heatsinks as indicated in the photo of the Palit card. The memory remains cool, however.

    Thanks again for the article.

    ylp88
  • zexe - Wednesday, April 06, 2005 - link

    Do not go for the XFX 6600GT!!!!
    The card is NO longer equipped with 1.6ns memory.
    The chips on my card are Samsung K4J55323QF-GC20.
    THAT MEANS 2.0ns!!!
  • marketmuse - Friday, April 01, 2005 - link

    Does anyone know the difference between the Leadtek A6600GT and PX6600GT, besides one being AGP and the other PCI-E?

    I'm looking to purchase an A6600GT, but I don't know if it will have the same performance as the PX version.

    Thanks
    MM
  • Monypennyuk - Monday, March 14, 2005 - link

    Hello all.

    WOW a great review site.:)

    Just one problem. I was having problems deciding between two of these cards on the ebuyer.co.uk site.

    PNY Verto GeForce 6 6600GT AGP8x £119

    or

    Inno 3D 128MB GeForce 6600 GT 8xAGP TV-Out DVI DirectX9 £116

    This review does not mention the PNY version, although I now notice that they have the Leadtek at about the same price. Going by these comments, I GUESS I should get the Leadtek??? Anyone know about the PNY, 'cos my mate reckons that's the better one...

    Leadtek Winfast Geforce 6600 Gt128mb Ddr3 Agp Dvi-i Tv-out £117.

    Any help much appreciated.

    A

  • BlackMamba - Tuesday, March 08, 2005 - link

    #75: That link to MSI is for the AGP version (note the sink for the bridge chip).

    Not sure if they've fixed the problems with the PCI-E version, and would also like to know.
  • JensErik - Tuesday, March 01, 2005 - link

    Looking at the pictures of the MSI card in the review and the pics on MSI's page, it seems that MSI has changed a lot on their card, including the HSF.

    (Check it out here: http://www.msi.com.tw/program/products/vga/vga/pro...

    Does anyone know if this has solved the HSF mounting problem encountered in the test?