Overclocking our GeForce 6600GTs

Once again, we're going to commend NVIDIA for including the CoolBits registry tweak in its drivers, which allows core and memory clock speed adjustment (among other things). No matter how new and pretty ATI makes Overdrive look, it just doesn't give the end user the kind of control that these two slider bars do. Unless something big changes by the end of the year, ATI users will still have to rely on third-party tools for any manual overclocking.

But, when all is said and done, overclocking is about much more than just moving some sliders to the right. After spending quite a few hours (each) doing nothing but playing with the clock speeds of eleven different GeForce 6600GT cards, we started to wonder if there was any purpose in life other than pushing small square bits of silicon to a point just before failure. Hopefully, we can pass on what we have learned.

There are multiple things to keep in mind; it's not just core and memory speed. Power and GPU quality are also concerns. If the card doesn't have the power to drive all of its components at the requested clock speeds, something has to give: it may be able to push the memory really high, or the core really high, but not both at the same time. GPU quality also physically limits the maximum speed at which a core can run and still yield correct results. One of the nice things about CoolBits is that it won't let you try to run your card at a clock speed that is impossible. If the card can't provide enough power to its components, or the GPU simply fails to run correctly at that speed, the driver won't let you enable that clock speed. With utilities such as PowerStrip, the end user only has visual glitches and system lockups/reboots to indicate problems of this nature.

For anyone who doesn't know, to enable the clock controls on NVIDIA hardware, simply add a DWORD named CoolBits with hex value 3 to the registry key: HKLM\Software\NVIDIA Corporation\Global\NVTweak
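
If you'd rather script the change than edit the registry by hand, the tweak can be applied programmatically. The sketch below uses Python's standard winreg module; it's a minimal illustration assuming administrator rights and the registry path quoted above, and you may need to reopen the display properties afterwards for the sliders to appear.

    # Minimal sketch: set CoolBits = 3 under the NVTweak key so the driver
    # exposes its manual core/memory clock sliders. Assumes admin rights.
    import winreg

    KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

    # Open (or create) the NVTweak key under HKEY_LOCAL_MACHINE.
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

    print("CoolBits set; reopen the NVIDIA control panel to see the clock controls.")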

The beginner's and advanced methods for a stable overclock begin at the same place: basing your core and memory clock speeds near what NVIDIA's driver picks. Go into the NVIDIA control panel after enabling CoolBits, choose the clock control panel, and select manual control. Make sure that you are always setting clocks for 3D performance and not 2D. Let NVIDIA pick some clock speeds for you. At this point, we aren't sure exactly what process NVIDIA uses to determine these clock speeds, but at the very least, it makes sure that both the GPU and RAM will have enough power at those frequencies. We will try to look into the other conditions of this feature for future articles.

The stable way out is to look at what NVIDIA sets the clock speeds to, drop them by 10MHz (core and memory), and leave them there. Then grab Half-Life 2, 3DMark05, or Doom 3 and run a timedemo numerous times, watching closely for glitches and signs of overheating or other issues. Those are the three hottest-running titles that we have in our labs at the moment, but Half-Life 2 is, hands down, the leader in turning video cards into cookware.

If you want more performance, it's possible to go faster than what NVIDIA says you can do. The first thing to do is find the fastest core clock speed that the driver will let you set. That gives you a rough range of what is possible. Of course, that maximum won't be the final number; try halfway between the NVIDIA recommendation and the max clock speed, but leave the memory at its factory setting. Pay close attention, and make sure that you're using a benchmark that you can bail out of quickly in case you notice any problems. If there are glitches, cut the distance between where you are and the NVIDIA setting in half and try again. It's almost like a binary search for the sweet spot (sketched below), except that you can stop as soon as you know you're safe. When you find a core clock speed that you like, if it's much higher than the NVIDIA driver-determined setting, you may wish to bring the memory clock up slowly to keep from throwing off the balance.
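
For readers who want to see that search spelled out, here is a minimal sketch of the bisection in Python. The set_core_clock and benchmark_is_clean helpers are hypothetical placeholders for whatever clock utility and visual stability check you actually use, and the clock numbers are made up for illustration.

    # Hypothetical helpers: in practice, you set the clock in the driver panel
    # and watch a timedemo for glitches; these stubs only model that process.
    def set_core_clock(mhz: int) -> None:
        print(f"set core clock to {mhz} MHz (manually, via the driver panel)")

    def benchmark_is_clean(mhz: int) -> bool:
        # Replace with a real pass/fail judgment after a few timedemo runs.
        return mhz <= 560  # assumed failure point, for illustration only

    def find_sweet_spot(nvidia_pick: int, driver_max: int, step: int = 5) -> int:
        """Bisect between the driver's auto-detected clock and the allowed
        maximum, keeping the highest clock that still renders cleanly."""
        low, high = nvidia_pick, driver_max  # low is known-good, high is suspect
        best = low
        while high - low > step:
            candidate = (low + high) // 2
            set_core_clock(candidate)
            if benchmark_is_clean(candidate):
                best, low = candidate, candidate  # clean run: raise the floor
            else:
                high = candidate                  # glitches: back off toward low
        return best

    print(find_sweet_spot(nvidia_pick=525, driver_max=600))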

So how do you know if something is wrong when you've overclocked? In newer games like Half-Life 2, the shaders start to render slightly incorrectly. In HL2 especially, the anomalies tend to have high locality of reference (similar problems happen near each other) and form an almost grid-like pattern of disruption on surfaces. It used to be that disappearing geometry and hard locks were the number one telltale sign, but now vertex and pixel shader errors are more subtle and appear sooner. On the memory side, if clocks are too high, you might see speckling or off-color pixels, disjointed edges, and other texturing issues.

Comments

  • geogecko - Friday, December 10, 2004 - link

    #33. I agree completely. That's why I'm curious about the HDTV output. I want to build an HTPC that is somewhat future-proof, and if that is the case, then I need the HDTV Out feature to work. In a review of the XFX card on newegg.com's web site, one user couldn't seem to get HDTV Out to work with the card.

    NVDVD would also be a plus if it was included, but I doubt it. I sent an e-mail to tech support over at XFX, asking these particular questions. Hopefully, I'll get an answer.
  • jamawass - Friday, December 10, 2004 - link

    Well, there's more to a graphics card than gaming. The 6600 series is causing quite a stir in the HTPC community because of its video decoding capabilities and HDTV output. It would've been helpful if the reviewer had mentioned the various manufacturers' support for HDTV output out of the box, which cards come with the NVDVD codec, component adapter, etc.
  • bigpow - Friday, December 10, 2004 - link

    -> hex value 3 = decimal value 3
  • geogecko - Friday, December 10, 2004 - link

    Great article, Derek. Been looking for a 6600GT round-up article for a while now.

    Question, though. A few of these cards come with an HDTV cable, and I guess I'm a little confused as to what this actually is. I prefer the XFX card because of the dual DVI outputs (and no need to overclock the card). It isn't listed as coming with an HDTV cable, so I'm wondering: what is the impact of not having this cable? What is the cable? Can't one usually just hook up a DVI cable to an HDTV?
  • Spacecomber - Friday, December 10, 2004 - link

    Nice round-up, and your bringing attention to the potential problems with some of the heatsinks is very much appreciated.

    Maybe as a follow up we need a round-up of some after-market heatsinks for the 6600GT.

    Any reason to assume that these conclusions reached for the PCIe cards do or do not apply to the AGP versions? I know the AGP versions typically have their heatsinks set on a diagonal in order to accommodate the bridge chip.

    Space
  • arswihart - Friday, December 10, 2004 - link

    my last comment was in response to #25

    #27 - Derek is talking about his "IT friendly" list, those cards he felt had the most reliable hsf implementation

  • arswihart - Friday, December 10, 2004 - link

    it does consistently lead in performance, which is worth noting by all means, but also, as mentioned in the review, these cards probably all perform even better on an nForce4, and the relative ranking among these cards might be a little different on an nForce4

    but I would definitely get an Arctic cooler for the Albatron anyway (if a compatible one is released) to quiet it down; the fan on it is tiny, thin, and loud
  • Houdani - Friday, December 10, 2004 - link

    On the last page [Final Words] you are listing why some cards aren't worthy of an Editor's Choice award. The next to last paragraph states:

    "XFX doesn't make the list because..."

    But, ummm, isn't XFX the Silver choice?
  • Aquila76 - Friday, December 10, 2004 - link

    I know the Albatron didn't have a great mounting mechanism, but it was better than many of the others
  • Aquila76 - Friday, December 10, 2004 - link

    Why wasn't the Albatron given any medal? It has the best OC, best or near-best performance in all the tests, and great temps even under load with the high OC. So the fan is a little noisier than the rest; is that any reason to dump on this card?
