Overclocking our GeForce 6600GTs

Once again, we're going to commend NVIDIA for including the coolbits registry tweak in their drivers, which allows core and memory clock speed adjustment (among other things). No matter how new and pretty ATI makes Overdrive look, it just doesn't give the end user the kind of control that these two slider bars do. Unless something pretty big changes by the end of the year, ATI users will still have to rely on third-party tools for any manual overclocking.

But, when all is said and done, overclocking is about much more than just moving some sliders to the right. After spending quite a few hours (each) doing nothing but playing with the clock speeds of eleven different GeForce 6600GT cards, we started to wonder if there was any purpose in life but to increase the speed of small square bits of silicon to a point just before failure. Hopefully, we can pass on what we have learned.

There are multiple things to keep in mind; it's not just core and memory speed. Power and GPU quality are also concerns. If the device doesn't have the power to drive all of its components at the requested clock speeds, something has to give. This means the card may be able to push the memory really high, or the core really high, but not both at the same time. GPU quality also physically limits the maximum speed at which a core can run and still yield correct results. One of the nice things about coolbits is that it won't let you try to run your card at a clock speed that is impossible. If the card can't provide enough power to the components, or the GPU simply fails to run correctly at that speed, the driver won't let you enable that clock speed. With utilities such as PowerStrip, the end user only has visual glitches and system lockups/reboots to indicate problems of this nature.

For anyone who doesn't know, to enable the clock controls on NVIDIA hardware, simply add a DWORD value named CoolBits with a hex value of 3 to the registry key: HKLM\Software\NVIDIA Corporation\Global\NVTweak
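
For those who would rather not poke around in regedit by hand, here is a minimal sketch that writes the same value with Python's built-in winreg module. This is simply the registry edit described above in script form, not an NVIDIA utility; run it from an administrator account and reopen the control panel afterward.

```python
import winreg

# Same key and value described above: CoolBits = 3 exposes the manual clock sliders.
KEY_PATH = r"Software\NVIDIA Corporation\Global\NVTweak"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

print("CoolBits set. Reopen the NVIDIA control panel to find the clock sliders.")
```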

The beginner's and advanced methods for a stable overclock begin at the same place: basing your core and memory clock speeds on what NVIDIA's driver picks. Go into the NVIDIA control panel after enabling coolbits, choose the clock control panel, and select manual control. Make sure that you are always setting clocks for 3D performance and not 2D. Let NVIDIA pick some clock speeds for you. At this point, we aren't sure exactly what process NVIDIA uses to determine these clock speeds, but at the very least, it makes sure that both the GPU and RAM will have enough power at those frequencies. We will try to look into the other conditions of this feature for future articles.

The stable way out is to look at what NVIDIA sets the clock speeds to, drop them by 10MHz (core and memory), and set them there. Then grab Half-Life 2, 3DMark05, or Doom 3 and run a timedemo numerous times, watching closely for glitches, signs of overheating, or other issues. Those are the three hottest-running titles that we have in our labs at the moment, but Half-Life 2 is, hands down, the leader in turning video cards into cookware.
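
For readers who like to script their testing, here is a minimal sketch of that conservative routine: take the clocks the driver detects, back both off by 10MHz, and loop your benchmark of choice several times. The helpers are hypothetical stand-ins rather than part of any NVIDIA tool: set_clocks() would apply the clocks with whatever utility you use, and run_timedemo() would run one benchmark pass and report whether it finished without glitches, artifacts, or alarming temperatures.

```python
def conservative_overclock(driver_core_mhz, driver_mem_mhz,
                           set_clocks, run_timedemo,
                           passes=10, backoff_mhz=10):
    """Back off 10MHz from the driver-detected clocks, then soak test them."""
    core = driver_core_mhz - backoff_mhz
    mem = driver_mem_mhz - backoff_mhz
    set_clocks(core, mem)
    for i in range(passes):            # run the timedemo numerous times
        if not run_timedemo():
            print(f"Pass {i + 1} showed problems; back off further and retest.")
            return None
    print(f"All {passes} passes clean at {core}/{mem}MHz.")
    return core, mem
```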

If you want more performance, it's possible to go faster than what NVIDIA says you can. The first thing to do is to find the fastest speed at which the driver will let you set the core. That gives you a rough range of what is possible. Of course, that maximum won't be your final speed; try halfway between the NVIDIA recommendation and the max clock speed - but leave the memory at its factory setting. Pay close attention, and make sure that you're using a benchmark that you can bail out of quickly in case you notice any problems. If there are glitches, cut the distance between where you are and the NVIDIA setting in half and try again. It's almost like a binary search for the sweet spot, except that you can stop as soon as you know that you're safe. When you find a core clock speed that you like, if it's much higher than the NVIDIA driver-determined setting, you may wish to bring the memory clock up slowly to keep from throwing off the balance.
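
That "halve the gap" process maps almost directly onto a binary search, and the sketch below shows the idea under the same assumptions as before. test_stable() is a hypothetical stand-in that would set the core clock, run your benchmark, and report whether the run was clean; the floor is the driver-picked speed and the ceiling is the highest value the coolbits slider will accept.

```python
def find_core_clock(nvidia_pick_mhz, slider_max_mhz, test_stable, tolerance_mhz=5):
    """Binary search for the highest core clock that still passes a stability test."""
    low, high = nvidia_pick_mhz, slider_max_mhz   # known-good floor, driver-enforced ceiling
    while high - low > tolerance_mhz:
        candidate = (low + high) // 2             # split the remaining gap in half
        if test_stable(candidate):
            low = candidate                       # clean run: raise the floor
        else:
            high = candidate                      # glitches: lower the ceiling
    return low                                    # highest clock known to pass

# Example with made-up numbers: driver picked 525MHz, slider tops out at 600MHz
# best_core = find_core_clock(525, 600, test_stable)
```

Once you have settled on a core clock this way, bring the memory clock up in the same careful fashion, retesting after each change.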

So how do you know if something is wrong when you've overclocked? In newer games like Half-Life 2, the shaders start to render slightly incorrectly. In HL2 especially, the anomalies tend to have high locality of reference (similar problems happen near each other) and form an almost grid-like pattern of disruption on surfaces. It used to be that disappearing geometry and hard locks were the number one telltale sign, but now vertex and pixel shaders are a little more sensitive and subtle. On the memory side, if clocks are too high, we might see speckling or off-color pixels. Edges can become disjointed, and texturing issues can occur.

Comments

  • 1q3er5 - Thursday, December 16, 2004 - link

    Errr, weird how the Albatron, despite its so-called HSF mounting problem, scored so high on all the tests (albeit a bit loud) and didn't get an award!

    Also, it looks like Leadtek changed the design of the board a bit:

    http://www.leadtek.com/3d_graphic/winfast_a6600_gt...

    They added a heatsink on the RAM and you may also notice that the shroud now extends right over the other chips on the card.
  • miketus - Thursday, December 16, 2004 - link

    Hi, does anybody have experience with the Albatron 6600GT for AGP?
  • geogecko - Monday, December 13, 2004 - link

    Personally, I'd be willing to spend the extra $15-20 to get a decent HSF on these cards. Of course, the first one I buy will go in an HTPC, which will all be passively cooled, so the HSF in this case doesn't matter, because I'll just be removing it.

    However, for my PC, I sure would like a decent quality HSF. It would stink to have a $200 card burn up in your PC because of a $10 HSF setup.
  • WT - Monday, December 13, 2004 - link

    Interesting that Gigabyte used a passive HSF on their 6800 card (with great results), but went with a craptastic fan on the 6600GT. I have an MSI 5900 and didn't want to settle for the cheesy HSF setup on MSI's 5900XT cards, so we are seeing the same thing occur with the 6600GTs: cut costs by using a cheaper HSF.
    Excellent article .. I found it answered every question I had left on the GT cards, further convincing me to buy the 6800 series.
  • DerekWilson - Sunday, December 12, 2004 - link

    #49 -- it was a problem with our sample ... the actual issue was not a design flaw, but if the design (of most 6600 GT cards) were different, it might have been possible for our sample to have avoided breakage.

    That's kind of a complicated way of saying that you should be alright as long as you are careful with the card when you install it.

    After it's installed, the way to tell if you have a problem is to run a 3D game/application in windowed mode. Open Display Properties and click on the Settings tab. Hit the Advanced button and select the NVIDIA tab. Select the temperature option, and if you see temperatures of 90 degrees C or higher, you probably have a problem.

    If your temp is lower than that, you're fine.
  • Vico26 - Sunday, December 12, 2004 - link

    Derek,

    was the MSI 6600GT a broken sample, or is there a problem with the HS design? Please let me know, as I bought the MSI card on the same day that you published the article. Now I am shocked, and I would like to find a solution - a new cooling system? Would I be able to install it myself (I'm not a professional)?

    Anyway, many thanks. I should have waited a day...
  • DerekWilson - Sunday, December 12, 2004 - link

    http://www.gfe.com.hk/news/buy.asp
  • Nyati13 - Sunday, December 12, 2004 - link

    What I'd like to know is: where are the Galaxy 6600GTs available? I've looked at some e-tailers that I know of, and searched Pricewatch and eBay, and there aren't any Galaxy cards for sale.
  • geogecko - Sunday, December 12, 2004 - link

    Well, I actually meant to say something in that last post.

    Anyway, short and sweet. That's the way I like these articles. Who wants to spend more than about 15-30 minutes to find out which card is best for them?

    I do think that the HDTV thing could have been looked at, but other than that, it's a great article.