Overclocking our GeForce 6600GTs

Once again, we're going to commend NVIDIA for including the coolbits registry tweak in their drivers, which allows core and memory clock speed adjustment (among other things). No matter how new and pretty ATI makes Overdrive look, it just doesn't give the end user the kind of control that these two slider bars offer. Unless something big changes by the end of the year, ATI users will still have to rely on third-party tools for any manual overclocking.

But when all is said and done, overclocking is about much more than just moving some sliders to the right. After spending quite a few hours (each) doing nothing but playing with the clock speeds of eleven different GeForce 6600GT cards, we started to wonder if there was any purpose in life but to increase the speed of small square bits of silicon to a point just before failure. Hopefully, we can pass on what we have learned.

There are multiple things to keep in mind; it's not just core and memory speed. Power and GPU quality are also concerns. If the card can't deliver enough power to drive all of its components at the requested clock speeds, something has to give. This means that it may be able to push the memory very high, or the core very high, but not both at the same time. GPU quality also physically limits the maximum speed at which a core can run and still yield correct results. One of the nice things about coolbits is that it won't let you set a clock speed that is impossible: if the card can't supply enough power to its components, or the GPU simply fails to run correctly at that speed, the driver won't let you enable that clock speed. With utilities such as PowerStrip, the end user only has visual glitches and system lockups/reboots to indicate problems of this nature.

For anyone who doesn't know, to enable the clock controls on NVIDIA hardware, simply add a DWORD named coolbits with hex value 3 under the registry key: HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak
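
For those who would rather script the tweak than fire up regedit, here's a minimal Python sketch using the standard winreg module. This is our illustration, not something NVIDIA ships: it just writes the value described above, needs administrator rights, and the sliders won't appear until the driver reloads (a reboot is the sure way).

```python
# Minimal sketch: enable coolbits by writing the registry value above.
# Run with administrator rights; reboot (or reload the driver) before
# looking for the new clock sliders in the control panel.
import winreg

key = winreg.CreateKeyEx(
    winreg.HKEY_LOCAL_MACHINE,
    r"SOFTWARE\NVIDIA Corporation\Global\NVTweak",
    0,
    winreg.KEY_SET_VALUE,
)
winreg.SetValueEx(key, "coolbits", 0, winreg.REG_DWORD, 3)  # hex value 3
winreg.CloseKey(key)
```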

The beginner's and the advanced methods for a stable overclock begin at the same place: basing your core and memory clock speeds on what NVIDIA's driver picks. Go into the NVIDIA control panel after enabling coolbits, choose the clock control panel, and select manual control. Make sure that you are always setting clocks for 3D performance and not 2D. Then let NVIDIA pick some clock speeds for you. At this point, we aren't sure exactly what process NVIDIA uses to determine these clock speeds, but at the very least, it makes sure that both the GPU and the RAM will have enough power at those frequencies. We will try to look into the other conditions of this feature for future articles.

The stable way out is to look at what NVIDIA set the clock speeds to, drop them both by 10MHz (core and memory), and set them there. Then grab Half-Life 2, 3DMark05, or Doom 3 and run a timedemo numerous times, watching closely for glitches and signs of overheating or other issues (a rough sketch of this soak test follows below). Those are the three hottest-running titles that we have in our labs at the moment, but Half-Life 2 is, hands down, the leader in turning video cards into cookware.
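
To make that routine concrete, here is a rough Python sketch of the soak test. The set_clocks() and run_timedemo() helpers are hypothetical stand-ins of our own invention; in practice, you move the coolbits sliders and launch the timedemo yourself, but the shape of the loop is the same.

```python
# Rough sketch of the conservative method: clocks 10MHz under the
# driver's auto-detected values, then a repeated timedemo soak test.
# Both helpers below are hypothetical stand-ins, not a real API.

NV_CORE, NV_MEM = 525, 1050  # example only: use whatever the driver picked (MHz)

def set_clocks(core, mem):
    # Stand-in for the coolbits sliders.
    print(f"setting core={core}MHz, mem={mem}MHz")

def run_timedemo():
    # Stand-in for one HL2/3DMark05/Doom 3 timedemo pass; should return
    # False at the first glitch, artifact, or sign of overheating.
    return True

def soak_test(runs=10):
    """Run the timedemo repeatedly; fail on the first sign of trouble."""
    return all(run_timedemo() for _ in range(runs))

set_clocks(NV_CORE - 10, NV_MEM - 10)
if not soak_test():
    print("Unstable even below the auto-detected clocks -- check cooling.")
```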

If you want more performance, it's possible to go faster than what NVIDIA says you can. The first thing to do is to find the fastest speed at which the driver will let you set the core. That gives you a rough range of what is possible. Of course, that maximum won't be stable; try halfway between the NVIDIA recommendation and the max clock speed, but leave the memory at its factory setting. Pay close attention, and make sure that you're using a benchmark that you can bail out of quickly if you notice any problems. If there are glitches, cut the distance between where you are and the NVIDIA setting in half and try again. It's almost like a binary search for the sweet spot, except that you can stop as soon as you know that you're safe (a sketch of the search follows). When you find a core clock speed that you like, and it's much higher than the NVIDIA driver-determined setting, you may wish to bring the memory clock up slowly to keep from throwing off the balance.
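
For the curious, the search described above maps almost directly onto code. This sketch reuses the hypothetical set_clocks() and soak_test() stand-ins from the previous snippet; it converges on the highest core clock that tests clean, though in practice you can stop the moment you're happy.

```python
def find_max_core(nv_core, slider_max, mem, min_step=5):
    """Binary-search the core clock between the driver's auto-detected
    value (known good) and the highest speed the driver will let you set.
    Reuses the hypothetical set_clocks()/soak_test() stand-ins above."""
    good, bad = nv_core, slider_max
    while bad - good > min_step:
        trial = (good + bad) // 2
        set_clocks(trial, mem)    # leave the memory at its factory setting
        if soak_test(runs=3):     # a short run you can bail out of quickly
            good = trial          # clean run: the sweet spot is at or above this
        else:
            bad = trial           # glitches: back off toward NVIDIA's pick
    return good                   # highest core clock that tested clean
```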

So how do you know whether something is wrong once you've overclocked? In newer games like Half-Life 2, the shaders start to render slightly incorrectly. In HL2 especially, the anomalies tend to have high locality of reference (similar problems happen near each other) and form an almost grid-like pattern of disruption on surfaces. It used to be that disappearing geometry and hard locks were the number one telltale signs, but now vertex and pixel shaders fail in more sensitive and subtle ways. On the memory side, if clocks are too high, you might see speckling or off-color pixels. Edges can look disjointed, and texturing issues can occur.

84 Comments

  • ChineseDemocracyGNR - Saturday, December 11, 2004 - link

    #41, please remember this is a 20-page article, and things were written in a way that people can easily read all 20 pages.
  • overclockingoodness - Saturday, December 11, 2004 - link

    #41: What do you mean, barely readable? You are not some scholar who needs perfect writing in order to understand something. If you don't like it, don't read it.

    The reason the review style reads like a "quickly-patched email" is that it is a round-up of 11 cards.

    The point of a round-up is to cover the positives and negatives of a plethora of similar products at the same time. Since AnandTech has already done extensive 6600 benchmarks, they decided to do a quick comparison and be done with it.

    Now you know which 6600 to go for.

    If you don't know how things work, it's better to be quiet.
  • skunkbuster - Saturday, December 11, 2004 - link

    #41, let's see if you can do better, then
  • mrscintilla - Saturday, December 11, 2004 - link

    Sorry to say this, but the article Derek wrote was barely readable. It reads more like a quickly-patched email than an edited article. The writing quality has to improve in the future.

  • SleepNoMore - Saturday, December 11, 2004 - link

    Thank God XFX offers an AGP version of this card. I am not FORCED to buy a PCI Express motherboard and trash my current system.
  • QuestMGD - Friday, December 10, 2004 - link

    The MSI heatsink really sucks. I had suspicions about the heatsink after I got my MSI card from NewEgg, and this article verified them. Since the card isn't in a computer yet, I pulled off the heatsink and sanded it down.

    I'm not done yet, but it does look like I can get it to fit tightly after a while; it was just a PIA. The mounting springs seem to have been designed correctly originally; the heatsink casting was just crap.

    BTW, the heatsink is just a copper-colored coating over aluminum or whatever; that's probably why the casting ended up so poor.

    Could anyone e-mail me about whether I can use CPU thermal compound on my graphics card's memory chips, or should I go out and get something else? I've heard mixed opinions on this. Thanks.
  • threeply - Friday, December 10, 2004 - link

    I noticed that no eVGA card was included in the review. Any reason why it was left out?
  • Momental - Friday, December 10, 2004 - link

    Cobbling with your bogus dink is not recommended. See your doctor if condition persists. ;)

    A really great article. Extremely informative, and it gives the "down and dirty", which I like. I'm in the market for a PCI-E 6600GT (sounds like a new motorcycle from Suzuki), and this article really gives one some serious food for thought rather than just the standard angle of "which one is the fastest and/or cheapest?"

    The last thing I want is to have to handle one of these things like it was some sort of rare antiquity from the Ming Dynasty. While I don't do my best imitation of a ferret on crack inside a case, it's good to know that it's possible to damage the HSF quite easily. Who'd have thought!
  • ShadowVlican - Friday, December 10, 2004 - link

    Thanks for the excellent write-up, Derek. I hope the vendors follow your advice to improve the contact issues between the HSF and GPU, since I won't be purchasing a graphics card with a design flaw that could be fixed so easily.

    The Leadtek will be at the top of my list, and likely in my next computer, as soon as A64 PCI-E motherboards come out.
  • JClimbs - Friday, December 10, 2004 - link

    Excellent article, focusing on a few key issues that performance buffs tend to overlook in their quest for higher framerates.
    My overall take after reading this was that the 6600GT's market is really limited to people/companies willing to pull things apart and fix them up right. The cooling solutions all seem either bogus or cobbled, with cobbled being the best of the bunch. If you don't want to dink with your purchase, get a cobbled one; if you WILL dink with it, you can get a bogus model and fix it.
    One thing I would like to have seen compared is power usage. I'm curious to see what the spread is there. And also, harking back to an earlier article, whether improving the power supply improves overclocking performance.
    Once again, excellent article, Derek!
