Overclocking our GeForce 6600GTs

Once again, we're going to commend NVIDIA for including the coolbits registry tweak in their drivers, which allows core and memory clock speed adjustment (among other things). No matter how new and pretty ATI makes Overdrive look, it just doesn't give the end user the kind of control that these two slider bars do. Unless something big changes by the end of the year, ATI users will still have to rely on third-party tools for any manual overclocking.

But when all is said and done, overclocking is about much more than just moving some sliders to the right. After spending quite a few hours (each) doing nothing but playing with the clock speeds of eleven different GeForce 6600GT cards, we started to wonder if there was any purpose in life other than increasing the speed of small squares of silicon to a point just before failure. Hopefully, we can pass on what we have learned.

There are multiple things to keep in mind beyond core and memory speed; power delivery and GPU quality are also concerns. If the card can't supply enough power to drive all of its components at the requested clock speeds, something has to give: it may be able to push the memory very high, or the core very high, but not both at the same time. GPU quality, meanwhile, physically limits the maximum speed at which a core can run and still yield correct results. One of the nice things about coolbits is that it won't let you set a clock speed that is impossible. If the card can't provide enough power to its components, or the GPU simply fails to run correctly at that speed, the driver won't let you enable that clock speed. With utilities such as PowerStrip, the end user has only visual glitches and system lockups/reboots to indicate problems of this nature.

For anyone who doesn't know, to enable the clock controls on NVIDIA hardware, simply add a DWORD value named CoolBits with hex value 3 under the registry key: HKEY_LOCAL_MACHINE\Software\NVIDIA Corporation\Global\NVTweak
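
If you'd rather script the change than edit the registry by hand, here is a minimal sketch using Python's standard winreg module. It writes exactly the value described above (registry names are case-insensitive, so "CoolBits" and "coolbits" are equivalent); run it from an administrator prompt, and back up your registry before fiddling with it.

    import winreg

    # The key described above; creating keys under HKLM requires
    # administrator privileges.
    KEY_PATH = r"Software\NVIDIA Corporation\Global\NVTweak"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        # DWORD value 3 unlocks the manual clock controls.
        winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

    print("CoolBits set - reopen the NVIDIA control panel to see the sliders.")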

The beginner's and advanced methods for a stable overclock begin at the same place: basing your core and memory clock speeds on what NVIDIA's driver picks. Go into the NVIDIA control panel after enabling coolbits, choose the clock control panel, and select manual control. Make sure that you are always setting clocks for 3D performance and not 2D. Then let NVIDIA pick some clock speeds for you. At this point, we aren't sure exactly what process NVIDIA uses to determine these clock speeds, but at the very least, it makes sure that both the GPU and RAM will have enough power at those frequencies. We will try to look into the other conditions of this feature for future articles.

The safe route is to look at the clock speeds NVIDIA picked, drop them by 10MHz (core and memory), and leave them there. Then grab Half-Life 2, 3DMark05, or Doom 3 and run a timedemo numerous times, watching closely for glitches and signs of overheating or other issues. Those are the three hottest-running titles in our labs at the moment, but Half-Life 2 is, hands down, the leader in turning video cards into cookware.

If you want more performance, it's possible to go faster than what NVIDIA says you can. The first thing to do is to find the fastest speed at which the driver will let you set the core; that gives you a rough upper bound on what is possible. Of course, that speed itself won't be stable; try halfway between the NVIDIA recommendation and the max clock speed, but leave the memory at its factory setting. Pay close attention, and make sure that you're using a benchmark that you can bail out of quickly if you notice any problems. If there are glitches, cut the distance between where you are and the NVIDIA setting in half and try again. It's essentially a binary search for the sweet spot, except that you can stop as soon as you know you're safe (a rough sketch follows below). When you find a core clock speed that you like, if it's much higher than the NVIDIA driver-determined setting, you may wish to bring the memory clock up slowly to keep from throwing off the balance.
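
For the algorithmically inclined, here is what that trial-and-error process looks like as a binary search, sketched in Python. To be clear, set_core_clock and passes_stress_test are hypothetical stand-ins for dragging the slider and running your timedemo loop (there is no such API in the driver), and the starting numbers are only illustrative.

    def set_core_clock(mhz):
        # Hypothetical stand-in: in reality, you set this by hand in the
        # driver's clock control panel.
        print(f"testing core clock: {mhz}MHz")

    def passes_stress_test():
        # Hypothetical stand-in: run your timedemo several times and watch
        # for glitches, artifacts, or signs of overheating.
        return input("any glitches? (y/n) ").strip().lower() != "y"

    def find_stable_core(nvidia_pick, driver_max, tolerance=5):
        """Binary-search for the highest glitch-free core clock (MHz)."""
        low, high = nvidia_pick, driver_max   # known-good floor, driver ceiling
        best = nvidia_pick
        while high - low > tolerance:
            candidate = (low + high) // 2     # halfway between floor and ceiling
            set_core_clock(candidate)
            if passes_stress_test():
                best = low = candidate        # stable: raise the floor
            else:
                high = candidate              # glitches: lower the ceiling
        return best

    # Illustrative numbers only: a stock 6600GT core (500MHz) and a
    # hypothetical driver-imposed maximum of 600MHz.
    print(find_stable_core(nvidia_pick=500, driver_max=600))

In practice, "passes" should mean several consecutive clean runs, since artifacts often take a few minutes of heat buildup to appear.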

So how do you know if something is wrong once you've overclocked? In newer games like Half-Life 2, the shaders start to render slightly incorrectly. In HL2 especially, the anomalies tend to have high locality of reference (similar problems appear near each other) and form an almost grid-like pattern of disruption on surfaces. It used to be that disappearing geometry and hard locks were the number one telltale signs, but now vertex and pixel shader errors are more subtle and more sensitive. On the memory side, if clocks are too high, you might see speckling or off-color pixels. Edges could become disjoint, and texturing issues can occur.

Comments

  • princethorpe - Wednesday, May 4, 2005 - link

    I've been checking the various forums and found this one on the 6600GTs excellent. I don't know if anyone else has found them, but Asus is making these cards and does a faster-than-standard model by using faster memory; according to their site, it runs 10% faster than the standard one. I've ordered the Asus board by preference because of the build quality.
  • GollumSmeagol - Monday, May 2, 2005 - link

    I came across a forum a few months ago here in Hungary where people were talking about Leadtek's 6600GTs being faulty/freezing. Strangely enough, a few weeks later, the main distributor of Leadtek took 6600GTs off their price lists on the web. I wonder if they are waiting for a bugfix, or simply ran out of stock and are waiting for the next shipment.

    Another beauty I've just come across is Gigabyte's TurboForce edition, which is a slightly overclocked version of the 6600 series (both PCI-E and AGP 8x). I'm shopping for a SILENT AGP one (that's how I came across this review), and found this beauty:

    http://www.giga-byte.com/VGA/Products/Products_GV-...

    This one has something they call Silent-Pipe as a cooler. There aren't many specs on Gigabyte's page, but from the picture, it looks like there is no fan at all, just a huge copper(-colored?) heatsink that covers about two-thirds of the card. (Well, a Zalman FB123 could still be used to move some air.)
    The memory clock is listed as 1120MHz (remember, TurboForce), plus, when I zoomed in on the box picture, I could spot "VIVO" written on the box. This is also supported by the info on the local dealer's page, where they say "Y" for the TV-OUT of the regular GV-N66T128D, but "IN/OUT" for the GV-N66T128VP. All this for roughly 20 USD extra (local price).
  • dpp - Saturday, November 19, 2005 - link

    I've bought the Gigabyte GV-NX66T128VP (TurboForce, no fan at all): http://www.giga-byte.com/VGA/Products/Products_GV-...
    Start-up temperature is 52C, maximum 65C.
    Is that normal?
  • ylp88 - Monday, April 18, 2005 - link

    I found the article quite informative. Thank you. I purchased two Palit 6600GT cards a week ago and have put them in SLI mode.

    I have a few questions/comments:
    1) The Palit overview is rather short compared to the others. The Palit card is also never mentioned on the last page. Is there a reason for this?
    2) The Palit cards I got DO NOT have memory heatsinks as indicated in the photo of the Palit card. The memory remains cool, however.

    Thanks again for the article.

    ylp88
  • zexe - Wednesday, April 6, 2005 - link

    Do not go for the XFX 6600GT!!!!
    The card is NO longer equipped with 1.6ns memory.
    The chips on my card are Samsung K4J55323QF-GC20.
    THAT MEANS 2.0ns!!!
  • marketmuse - Friday, April 1, 2005 - link

    Does anyone know the difference between the Leadtek A6600GT and the PX6600GT, besides one being AGP and the other PCI-E?

    I'm looking to purchase an A6600GT, but I don't know if it will have the same performance as the PX version.

    Thanks
    MM
  • Monypennyuk - Monday, March 14, 2005 - link

    Hello all.

    WOW, a great review site. :)

    Just one problem: I'm having trouble deciding between two of these cards on the ebuyer.co.uk site.

    PNY Verto GeForce 6 6600GT AGP8x £119

    or

    Inno 3D 128MB GeForce 6600 GT 8xAGP TV-Out DVI DirectX9 £116

    This review does not mention the PNY version, although I now notice that they have the Leadtek at about the same price. Going by these comments, I GUESS I should get the Leadtek??? Anyone know about the PNY? My mate reckons that's the better one...

    Leadtek WinFast GeForce 6600GT 128MB DDR3 AGP DVI-I TV-Out £117.

    Any help much appreciated.

    A

  • BlackMamba - Tuesday, March 8, 2005 - link

    #75: That link to MSI is for the AGP version (note the sink for the bridge chip).

    Not sure if they've fixed the problems with the PCI-E version; I'd also like to know.
  • JensErik - Tuesday, March 1, 2005 - link

    Looking at the pictures of the MSI card in the review and the pics on MSI's page, it seems that MSI has changed a lot on their card, including the HSF.

    (Check it out here: http://www.msi.com.tw/program/products/vga/vga/pro... )

    Does anyone know if this has solved the HSF mounting problem encountered in the test?
