Overclocking our GeForce 6600GTs

Once again, we're going to commend NVIDIA for including the coolbits registry tweak in their drivers, which allows core and memory clock speed adjustment (among other things). No matter how new and pretty ATI makes Overdrive look, it just doesn't give the end user the kind of control that these two slider bars do. Unless something pretty big changes by the end of the year, ATI users will still have to rely on third-party tools for any manual overclocking.

But, when all is said and done, overclocking is about much more than just moving some sliders to the right. After spending quite a few hours (each) doing nothing but playing with the clock speeds of eleven different GeForce 6600GT cards, we started to wonder if there was any purpose in life but to increase the speed of small square bits of silicon to a point just before failure. Hopefully, we can pass on what we have learned.

There are multiple things to keep in mind; it's not just core and memory speed. Power and GPU quality are also concerns. If the card can't deliver enough power to drive all of its components at the requested clock speeds, something has to give. This means that a card may be able to push the memory really high, or the core really high, but not both at the same time. GPU quality also physically limits the maximum speed at which a core can run and still yield correct results. One of the nice things about coolbits is that it won't let you try to run your card at an impossible clock speed. If the card can't provide enough power to the components, or the GPU simply fails to run correctly at that speed, the driver won't let you enable that clock speed. With utilities such as PowerStrip, the end user has only visual glitches and system lockups/reboots to indicate problems of this nature.

For anyone who doesn't know, to enable the clock controls on NVIDIA hardware, simply add a DWORD value named CoolBits with hex value 3 to the registry key HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak.
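If you would rather script the tweak than click through regedit, here is a minimal sketch using Python's standard winreg module. It assumes administrator rights and a ForceWare-era driver; back up your registry before running anything like this.

```python
# Minimal sketch: write the CoolBits DWORD described above to enable the
# driver's clock controls. Assumes administrator rights; back up the
# registry first.
import winreg

key = winreg.CreateKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"SOFTWARE\NVIDIA Corporation\Global\NVTweak",
)
winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 0x3)
winreg.CloseKey(key)
```

Reopen the display properties afterward, and the clock control page should show up under the NVIDIA tab.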

The beginner's and advanced methods for a stable overclock begin at the same place: basing your core and memory clock speeds on what NVIDIA's driver picks. Go into the NVIDIA control panel after enabling coolbits, choose the clock control panel, and select manual control. Make sure that you are always setting clocks for 3D performance and not 2D. Let NVIDIA pick some clock speeds for you. At this point, we aren't sure exactly what process NVIDIA uses to determine these clock speeds, but at the very least, it makes sure that both the GPU and RAM will have enough power at those frequencies. We will try to look into the other conditions of this feature for future articles.

The stable way out is to look at the clock speeds that NVIDIA picks, drop them by 10MHz (core and memory), and set them there. Then grab Half-Life 2, 3DMark05, or Doom 3 and run a timedemo numerous times, watching closely for glitches and signs of overheating or other issues. Those are the three hottest-running titles that we have in our labs at the moment, but Half-Life 2 is, hands down, the leader in turning video cards into cookware.
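For the procedurally minded, the safe method boils down to something like the sketch below. To be clear, detect_optimal_clocks(), set_clocks(), and run_timedemo() are hypothetical stand-ins for the driver's clock panel and whatever benchmark you prefer, not a real API.

```python
# Hedged sketch of the "safe" method described above. All three helper
# functions are hypothetical placeholders, not part of any real driver API.
SAFETY_MARGIN_MHZ = 10
STABILITY_RUNS = 5   # repeat the timedemo several times, watching for glitches

def find_safe_clocks():
    core, mem = detect_optimal_clocks()   # the speeds NVIDIA's driver picks
    core -= SAFETY_MARGIN_MHZ             # back both clocks off by 10MHz
    mem -= SAFETY_MARGIN_MHZ
    set_clocks(core, mem)
    for _ in range(STABILITY_RUNS):
        if not run_timedemo():            # False on glitches or overheating
            raise RuntimeError("unstable even below driver-detected clocks")
    return core, mem
```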

If you want more performance, it's possible to go faster than what NVIDIA says you can do. The first thing to do is find the fastest speed at which the driver will let you set the core; that gives you a rough range of what is possible. Of course, that maximum won't be stable, so try halfway between the NVIDIA recommendation and the max clock speed, but leave the memory at its factory setting. Pay close attention, and make sure that you're using a benchmark that you can bail out of quickly if you notice any problems. If there are glitches, cut the gap between where you are and the NVIDIA setting in half and try again. It's almost like a binary search for the sweet spot, except that you can stop as soon as you know that you're safe. When you find a core clock speed that you like, and it's much higher than the NVIDIA driver-determined setting, you may wish to bring the memory clock up slowly to keep from throwing off the balance.
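That search maps naturally onto code. Here is a sketch of the loop just described, again with hypothetical stand-ins (max_core_allowed(), set_core_clock(), passes_benchmark()) rather than any real driver interface.

```python
# Sketch of the near-binary search for the core clock sweet spot. It stops
# once the window shrinks to a few MHz -- "when you know that you're safe."
def find_core_sweet_spot(driver_core_mhz):
    low = driver_core_mhz          # known good: NVIDIA's recommended core
    high = max_core_allowed()      # fastest speed the driver will accept
    best = low
    while high - low > 5:          # a 5MHz window is close enough
        mid = (low + high) // 2    # halfway between known good and the max
        set_core_clock(mid)
        if passes_benchmark():     # the timedemo ran with no glitches
            best = mid
            low = mid              # stable: move the window up
        else:
            high = mid             # glitches: cut the gap in half, try again
    return best
```

The memory clock stays at its factory setting the whole time; only after a core speed is settled does it make sense to nudge memory upward in the same careful way.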

So how do you know whether something is wrong once you've overclocked? In newer games like Half-Life 2, the shaders start to render slightly incorrectly. In HL2 especially, the anomalies tend to have high locality of reference (similar problems happen near each other) and form an almost grid-like pattern of disruption on surfaces. It used to be that disappearing geometry and hard locks were the number one telltale signs, but vertex and pixel shaders are now a little more sensitive and subtle. On the memory side, if clocks are too high, we might see speckling or off-color pixels. Edges could be disjoint, and texturing issues can occur.

Comments

  • arswihart - Friday, December 10, 2004 - link

    my mistake, didn't actually read some of the article
  • bbomb - Friday, December 10, 2004 - link

    #20 - Derek said that Pine = XFX, so he did review the card with the "buy me" link.
  • arswihart - Friday, December 10, 2004 - link

    #21 - I guess that's what Arctic Cooling is for...

    Speaking of which, it's good to see some makers adopting HSF designs similar to the Arctic coolers, except for the part about shunting the air directly out of the case.

    I just saw that ASUS does have a 6600GT coming, and it has a very Arctic Cooling-esque design, which I like. The Albatron in this round-up bears some resemblance as well; too bad its fan is so loud.
  • mindless1 - Friday, December 10, 2004 - link

    I agree, the heatsinks (and particularly the fans) are disappointing. If they just abandoned the idea of the fansink only taking up one slot-height, they'd have a lot more freedom to improve things (like fan thickness, which could help both noise and longevity). It might even be better to keep someone from sandwiching another card in next to the video card anyway, so taking up more than one slot thickness could be a positive thing all around. Not that it would "need" to be two slots thick, but even an extra 5mm is a lot on such a thin 'sink.
  • arswihart - Friday, December 10, 2004 - link

    The Leadtek has looked like a solid card since I first saw it, so I'm not surprised by the results; this card is clearly the best in this roundup.

    #6 - I agree that most of the other HSFs look really cheap just from eyeballing them, especially the Chaintech, Galaxy, Gigabyte, Inno3D, and MSI.

    #10 - nice point. Anand, why do you even include these deals on the review pages? I can only assume it's basically an ad that the company is paying for, and you are halfway endorsing the product, and in this case, not even reviewing it while we read about 11 other competing cards.

    Overall, these cards look pretty cheap; I think the quality control issues highlight this.

    Anyone know if ASUS or ABIT (or any other manufacturer) plans to make 6600GTs?
  • Filibuster - Friday, December 10, 2004 - link

    Thanks for the info, Derek!
  • ocyl - Friday, December 10, 2004 - link

    Derek > This is a follow-up to my post at #16. I have done some quick research, and here is a simple comparison chart of the video-in implementations of these cards.

    Albatron PC6600GTV/PC6600GT: Yes/No (not sure which one was tested)

    Chaintech SE6600G: No

    Galaxy GF6600GT: No

    Gigabyte GV-NX66T128D/GV-NX66256D: No/No (Did Gigabyte send you an NX66256D? I don't know if they have the wrong picture on their website, but it looks like they may have sent you an overclocked 6600 instead of a real 6600GT.)

    Inno3D GeForce 6600 GT: No

    Leadtek WinFast PX6600 GT TDH: No

    MSI NX6600GT-VTD128E/NX6600GT-TD128E: Yes/No (not sure which one was tested)

    Palit GeForce 6600GT: No

    Prolink PV-N43E(128LD): No

    Sparkle SP-PX43GVH/SP-PX43GDH: Yes/No (not sure which one was tested)

    XFX PVT43GNDD7: No

    In terms of full product lines (6600 series PCI Express + AGP), MSI has 4 out of 8 cards featuring video-in, followed by Sparkle (1 out of 4) and Albatron (1 out of 8).

    Oscar
  • DerekWilson - Friday, December 10, 2004 - link

    I would like to apologize -- Galaxy just informed me that they are, in fact, shipping their 6600GTs at 525/550 ...

    This modest overclock comes basically free to the end user -- it gets them an Editors' Choice award, as no other vendor has shipped with a default core overclock.
  • ocyl - Friday, December 10, 2004 - link

    Derek > Thank you for paying attention to the noise issue in the report. It would be great if we could also see a discussion/comparison of VIVO implementations (or lack thereof) in the future, since video processing is now a built-in feature of the GPU :)
  • DerekWilson - Friday, December 10, 2004 - link

    Yes, I OC'd myself ... but I'd like to know where I said clock stock so I can fix it ;-)

    Trogdor -- 3 mount points would have worked fine. There aren't any larger images... that was a mistake -- I apologize.

    redavnl -- Pine is XFX

    Filibuster -- high quality for Doom 3, and the Gigabyte card may be called 6600 series, but it is a 6600GT (clocked at 500/500 with SLI)

    Pete -- as always, thanks for the constructive feedback. I've altered the sound bits to reflect 6dB to 10dB being a doubling in perceived volume. I knew 3dB was the power doubling point, not perception; I was just overclocking myself too much that night :-) ...

    We stick to 1m distances for a few logistical reasons. After this article: http://anandtech.com/video/showdoc.aspx?i=2126&... it was pointed out to us that a 5cm distance skews the results because of things like turbulence from the fan. Talking to some audio engineers, it seems that measuring the SPL of a system at 1 meter is pretty standard.

    We do actually measure with no case. The system sits on a desk on a layer of foam, though any sound deadening is secondary. I don't think I have any cases in my lab.

    We'll continue to look into the sound and SPL issue, but I wouldn't think that having a box literally 12 inches from your ear is a common situation (I can't even get my monitor 12 inches from my eyes). I could see 5 decimeters, maybe ...
