Overclocking Comparison

With overclocking, every GPU is different, so a side effect of this comparison is a general sense of how the 6600 GT overclocks. We can't say that all Albatron cards will overclock by 90MHz; believe us, if they could all run at that speed, they would all ship at that speed and out-sell the competition. Too many factors go into it, which is why we base most of our recommendation and ranking decisions on cooling and noise levels rather than on overclocking. It is still a factor, though.

[Graph: Core Clock Speed]

Those with a calculator handy will notice that the mean, median, and standard deviation are:

Mean: 568.10MHz
Median: 571MHz
Std. Dev.: 17.1840MHz
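
For those who want to check the numbers, here is a minimal sketch of the computation in Python. The per-card core clocks in the list are hypothetical placeholders, not our measured results; substitute the final core clocks from the graph above:

```python
# Minimal sketch: computing the summary statistics quoted above.
# The values below are HYPOTHETICAL placeholders, not our measured
# results -- substitute each card's final core clock in MHz.
from statistics import mean, median, stdev

core_clocks_mhz = [540, 550, 560, 571, 575, 580, 590]  # hypothetical

print(f"Mean:      {mean(core_clocks_mhz):.2f}MHz")
print(f"Median:    {median(core_clocks_mhz)}MHz")
print(f"Std. Dev.: {stdev(core_clocks_mhz):.4f}MHz")  # sample standard deviation
```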

Knowing NVIDIA, QA will ensure that chips leaving the labs run a little higher than stock clock speeds, which translates to a little breathing room. What we take away from this testing is that we expect GeForce 6600 GTs to achieve at least a 7% overclock. A 9% to 12% overclock should be possible for most people who own this card; anything beyond that is icing on the cake. Of course, we are working with a very small sample size, and we don't know much about the population as a whole either. We would have been more comfortable making predictions had this data looked more like a bell curve, but what we see is a little too flat for us to say anything with statistical confidence.
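
To translate those percentages into clock speeds, a quick sketch, assuming the 6600 GT's 500MHz stock core clock:

```python
# The GeForce 6600 GT ships with a 500MHz stock core clock, so a
# percentage overclock maps onto an absolute clock speed like this.
STOCK_CORE_MHZ = 500

def target_clock_mhz(pct_over: float) -> float:
    """Core clock corresponding to a given percentage overclock."""
    return STOCK_CORE_MHZ * (1 + pct_over / 100)

for pct in (7, 9, 12):
    print(f"{pct:2d}% over stock = {target_clock_mhz(pct):.0f}MHz")
# Prints: 7% -> 535MHz, 9% -> 545MHz, 12% -> 560MHz
```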

Our memory clock speed graph shows Sparkle on top, but that's 2ns RAM running a 110MHz overclock, while the XFX card runs its 1.6ns RAM at a 10MHz overclock. Sparkle may simply have gotten lucky here; it isn't likely to happen on most boards. A 22% memory overclock, even with the advantages of GDDR3, is still tough to pull off, especially when it only matched the performance of the 1.6ns memory. Inno3D also uses 1.6ns memory, but our final overclock ended up lower than the 600MHz that should have been possible with this part.

All the other solutions use 2ns memory, which overclocks between 50 and 100MHz. All the memory we looked at on 6600 GT boards is Samsung GDDR3.
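
As a rough guide to the arithmetic behind these memory numbers: the rated clock implied by a chip's ns rating is approximately 1000 divided by the access time, and overclock percentages are measured against the 6600 GT's 500MHz stock memory clock. A short sketch:

```python
# Rule of thumb: rated clock (MHz) ~= 1000 / access time (ns).
# Overclock percentages are relative to the 500MHz stock memory clock
# (1GHz effective, since GDDR3 is double data rate).
STOCK_MEM_MHZ = 500

def rated_mhz(access_time_ns: float) -> float:
    return 1000 / access_time_ns

def overclock_pct(achieved_mhz: float) -> float:
    return (achieved_mhz - STOCK_MEM_MHZ) / STOCK_MEM_MHZ * 100

print(f"2ns RAM rated clock:   {rated_mhz(2.0):.0f}MHz")  # stock speed, so any gain is a real overclock
print(f"1.6ns RAM rated clock: {rated_mhz(1.6):.0f}MHz")  # ~625MHz by this rule; we cite a conservative 600MHz
print(f"Sparkle at 610MHz: {overclock_pct(610):.0f}% over stock")  # the 22% figure above
```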

[Graph: Memory Clock Speed]

84 Comments

  • Bonesdad - Wednesday, February 16, 2005 - link

    Yes, I too would like to see an update here...have any of the makers attacked the HSF mounting problems?
  • 1q3er5 - Tuesday, February 15, 2005 - link

    Can we please get an update on this article with more cards, and replacements of defective cards?

    I'm interested in the Gigabyte card.
  • Yush - Tuesday, February 8, 2005 - link

    Those temperature results are pretty dodgy. Surely no regular computer user would have a caseless computer. Those results are only favourable, and only shed light on how cool the card CAN be, not how hot it actually runs in a regular scenario. The results would've been much more useful had the temperature been measured inside a case.
  • Andrewliu6294 - Saturday, January 29, 2005 - link

    I like the Albatron best. Exactly how loud is it? Like, how many decibels?
  • JClimbs - Thursday, January 27, 2005 - link

    Anyone have any information on the Galaxy part? I don't find it in a Pricewatch or PriceGrabber search at all.
  • Abecedaria - Saturday, January 22, 2005 - link

    Hey there. I noticed that Gigabyte seems to have modified their HSI cooling solution. Has anyone had any experience with this? It looks much better.

    Comments?
    http://www.giga-byte.com/VGA/Products/Products_GV-...

    abc
  • levicki - Sunday, January 9, 2005 - link

    Derek, do you read your email at all? I got a Prolink 6600 GT card and I would like to hear a suggestion on improving the cooling solution. I can confirm that the retail card reaches 95 C at full load and idles at 48 C. That is a really bad image for NVIDIA; they should be informed about vendors doing a poor job on cooling design. I mean, you would expect it to be way better, because those cards aren't cheap.
  • geogecko - Wednesday, January 5, 2005 - link

    Derek, could you speculate on what thermal compound is used to interface between the HSF and the GPU on the XFX card? I e-mailed them, and they won't tell me what it is?! It would be great if it were paste or tape. I need to be able to remove it and then later re-install it. I might be able to overlook not having the component video pod on the XFX card, as long as I get an HDTV that supports DVI.
  • Beatnik - Friday, December 31, 2004 - link

    I thought I would add, about the dual-DVI issue: the new NVIDIA drivers show that the second DVI can be used for HDTV output. It appears that even the overscan adjustments are there.

    So not having the component "pod" on the XFX card appears to be less of a concern than I thought it might be. It would be nice to hear if someone tried running 1600x1200 + 1600x1200 on the XFX, just to know if the DVI is up to snuff for dual LCD use.
