Noise and Heat

Here are the meat and potatoes of our comparison between the 11 GeForce 6600 GT cards today: how quiet and efficient are the cooling solutions attached to the cards? We are about to find out.

Our noise test was done using an SPL meter in a very quiet room. Unfortunately, we haven't yet been able to baffle the walls of the lab with sound deadening material, and the CPU and PSU fans were still running as well. But in each case, the GPU fan was by a large margin the loudest contributor to the SPL in the room. Thus, the GPU fan is the factor that drove the measured SPL of the system.

Our measurement was taken at 1 meter from the caseless computer. Please keep in mind when looking at this graph that everyone perceives sound and decibel levels a little differently. Generally, though, a change of about 1 dB in SPL is the smallest perceivable difference in volume, and somewhere between a 6 dB and a 10 dB difference, people perceive a sound as twice as loud. That means the Inno3D fan is over twice as loud as the Galaxy fan. Two newcomers to our labs round out the top and bottom of our chart.
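That rule of thumb can be turned into a rough formula: if perceived loudness doubles roughly every 10 dB, a gap of ΔdB corresponds to a perceived ratio of about 2^(ΔdB/10). A minimal sketch of the arithmetic (the 4 dB input below is an illustrative value, not a figure from our chart):

```python
def perceived_loudness_ratio(delta_db, db_per_doubling=10.0):
    """Rule of thumb: perceived loudness roughly doubles every ~10 dB."""
    return 2 ** (delta_db / db_per_doubling)

print(perceived_loudness_ratio(10.0))           # a 10 dB gap: about 2x as loud
print(round(perceived_loudness_ratio(4.0), 2))  # a 4 dB gap: about 1.32x
```

With a 6 dB-per-doubling assumption instead, the same 4 dB gap works out to roughly 1.59x, which is why we hedge the doubling point between 6 and 10 dB.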

The very first thing to notice about our chart is that the top three spots are held by our custom round HSF solutions with no shroud. This is clearly the way to go for quiet cooling.

Chaintech, Palit, and Gigabyte are the quietest of the shrouded solutions, and going by our rule of thumb, the Palit and Gigabyte cards may be just barely audibly louder than the Chaintech card.

The Albatron, Prolink, and XFX cards show no audible difference among them, and they are all very loud. The fact that the Sparkle card comes in a little over 1 dB below the XFX card is a bit surprising: the two use the same cooling solution with a different sticker attached. Of course, you'll remember from the XFX page that XFX appears to have attached a copper plate and a pad to the bottom of the HSF. The fact that Sparkle's mounting is more stable (while XFX gets tighter spring pressure on the GPU) could account for the slight difference in sound here.

All of our 6600 GT solutions are fairly quiet. These fans are nothing like the ones on larger models, and the volume levels are nothing to be concerned about. The Inno3D fan, however, had a sort of whine to it that we could have done without. It wasn't shrill, but it was clearly higher pitched than the low drone of the other fans we listened to.

Fan Noise

NVIDIA clocks its 6600 GT cores at 300 MHz when not in 3D mode, and since large sections of the chip are not in use, not much power is needed, and less heat is dissipated than if a game were running. But there is still work going on inside the silicon, and the fan is still spinning its heart out.

Running the coolest is the XFX card. That extra copper plate and tension must be doing something for it, and glancing down at the Sparkle card, perhaps we can see the full picture of why XFX decided to forgo the rubber nubs on their HSF.

The Leadtek and Galaxy cards come in second, performing well in both the noise and idle temperature categories.

We have the feeling that the MSI and Prolink cards had their thermal tape or thermal glue seals broken at some point at the factory or during shipping. We know that the seal on the thermal glue on the Gigabyte card was broken, as this card introduced us to the problems with handling 6600 GT solutions that don't have good 4-corner support under the heatsink. We tried our best to reseat it, but we don't think that these three numbers are representative of what the three companies can offer in terms of cooling. We will see similar numbers in the load temperature graphs as well.

Idle Temp

Our heat test consists of running a benchmark over and over and over again on the GPU until we hit a maximum temperature. There are quite a few factors that go into the way a GPU is going to heat up in response to software, and our goal in this test was to push maximum thermal load. Since we are looking at the same architecture, only the particular variance in GPU and the vendor's implementation of the product are factors in the temperature reading we get from the thermal diode. These readings should be directly comparable.

We evaluated Doom 3, Half-Life 2, 3DMark05, and UT2K4 as thermal test platforms. We selected resolutions that were not CPU bound but had to try very hard not to push memory bandwidth beyond saturation. Looping demos in different levels and different resolutions with different settings while observing temperatures gave us a very good indication of the sweet spot for putting pressure on the GPU in these games, and the winner for the hardest hitting game in the thermal arena is: Half-Life 2.

The settings we used for our 6600 GT test were 1280x1024 with no AA or AF. The quality settings were cranked up. We looped our at_coast_12-rev7 demo until a stable maximum temperature was found.

We had trouble in the past observing the thermal diode temperature, but this time around, we set up multiple monitors. Our second monitor ran at 640x480x8@60 in order to minimize the framebuffer impact. We kept the driver's temperature panel open on the second monitor while the game ran and observed the temperature fluctuations. We still really want an application from NVIDIA that can record these temperatures over time, as the part heats and cools very rapidly; this would also eliminate any impact from running a second display. The performance impact was minimal, so we don't believe the temperature impact was large either. Of course, that's no excuse for not trying to do things in the optimal way. All we want is an MBM5 plugin; is that too much to ask?
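The loop-until-stable procedure above amounts to a simple plateau detector: keep polling until the running maximum stops climbing. A minimal sketch of that logic, where `read_temp` is a stand-in (our assumption, since no logging API existed) for whatever temperature source the driver exposes:

```python
def stable_max_temp(read_temp, window=10, tolerance=0.5, max_polls=10000):
    """Poll read_temp() until the running maximum stops climbing.

    The temperature is considered stable once `window` consecutive
    readings fail to exceed the running peak by more than `tolerance`.
    """
    peak = float("-inf")
    steady = 0
    for _ in range(max_polls):
        t = read_temp()
        if t > peak + tolerance:
            steady = 0        # still climbing meaningfully: reset the counter
        else:
            steady += 1       # holding near (or below) the peak
        peak = max(peak, t)
        if steady >= window:
            break
    return peak

# Simulated thermal diode readings: a GPU warming up and leveling off near 68 C.
readings = iter([45, 52, 58, 62, 65, 66.5, 67.5, 68.0] + [68.0] * 12)
print(stable_max_temp(lambda: next(readings)))  # prints 68.0
```

In practice a real poller would also sleep between reads and timestamp each sample, since the part heats and cools so quickly.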

It is somewhat surprising that Galaxy is the leader in handling load temps. The fact that it carried the lightest overclock of the bunch probably helped a little, but most of the other cards were running at about 68 degrees under load before we overclocked them. The XFX card slips down a few slots from its idle-clock standing thanks to its relatively low core overclock, while our core clock speed leader moves up to be the second coolest card in the bunch, making this a very interesting graph.

For such a loud fan, we would have liked to see Inno3D cool the chip a little better under load, but its placement in the pack is still very respectable. The Sparkle card again shows that XFX had some reason behind its design change: the added copper plate really helped, even at the cost of some mounting stability.

Load Temp

The Gigabyte (despite my best efforts to repair my damage), MSI, and Prolink cards all ran way too hot, even at stock speeds. I actually had to add a clamp and a rubber band to the MSI card to keep it from exceeding 110 degrees C at stock clocks. The problem was that the thermal tape on the RAM had come loose from the heatsink: rather than the heatsink being held down against both banks of RAM as well as the two spring pegs, it was lifting off of the top of the GPU. We didn't notice this until we started testing, because the HSF had pulled away by less than a millimeter. The MSI design is great, and we wish that we could have had an undamaged board. MSI could keep this from happening by putting a spacer between the board and the HSF on the opposite side from the RAM, near the PCIe connector.

Comments

  • princethorpe - Wednesday, May 4, 2005 - link

    I've been checking the various forums and found this one on the 6600GTs excellent. I don't know if anyone else has found them, but Asus are making these cards and do a faster-than-standard model by using faster memory; they reckon, according to their site, they run 10% faster than the standard. I've ordered the Asus board by preference because of the build quality.
  • GollumSmeagol - Monday, May 2, 2005 - link

    I came across a forum a few months ago here in Hungary, and the people were talking about Leadtek's 6600GTs being faulty/freezing. Strangely enough, a few weeks later, the main distributor of Leadtek took 6600GTs off their price lists on the web. I wonder if they are waiting for a bugfix, or simply ran out of stock and are waiting for the next shipment.

    Another beauty I've just came across, is Gigabyte's TurboForce edition, which is a slightly overclocked version of the 6600 series (both PCI-Ex and AGP 8x). I'm shopping for a SILENT AGP one, (that's where I came across this review), and found this beauty

    http://www.giga-byte.com/VGA/Products/Products_GV-...

    This one has something they call Silent-Pipe as a cooler. Not many specs on Gigabyte's page, but from the picture, it looks like there is no fan at all, just a huge copper(-colored?) heatsink that covers about 2/3 of the card. (Well, a Zalman FB123 could still be used to move some air.)
    The memory clock is listed as 1120MHz (remember, TurboForce), plus when I zoomed in on the box picture, I could spot "VIVO" written on the box. This is also supported by the info on the local dealer's page, where they say "Y" to the TV-OUT of the regular GV-N66T128D, but they say "IN/OUT" for the GV-N66T128VP. All this for roughly 20 USD extra (local price).
  • dpp - Saturday, November 19, 2005 - link

    I've bought http://www.giga-byte.com/VGA/Products/Products_GV-...">Gigabyte GV-NX66T128VP (TurboForce, no fan at all)
    Start up temperature 52C, maximum 65C.
    Is that normal?
  • ylp88 - Monday, April 18, 2005 - link

    I found the article quite informative. Thank you. I purchased two Palit 6600GT cards a week ago and have put them in SLI mode.

    I have a few questions/comments:
    1) The Palit overview is rather short compared to the others. The Palit card is also never mentioned on the last page. Is there a reason for this?
    2) The Palit cards I got DO NOT have memory heatsinks as indicated in the photo for the Palit card. The memory remains cool, however.

    Thanks again for the article.

    ylp88
  • zexe - Wednesday, April 6, 2005 - link

    Do not go for XFX 6600GT !!!!
    The card is NO longer equipped with 1.6ns memory
    The chips on my card are Samsung K4J55323QF-GC20
    THAT MEANS 2.0ns !!!
  • marketmuse - Friday, April 1, 2005 - link

    does anyone know the difference between the Leadtek A6600GT and PX6600GT, besides the PCI-E and AGP?

    I'm looking to purchase a A6600GT, but I don't know if it will have the same performance as the PX version.

    Thanks
    MM
  • Monypennyuk - Monday, March 14, 2005 - link

    Hello all.

    WOW a great review site.:)

    Just one problem. I was having problems deciding between two of these cards on the ebuyer.co.uk site.

    PNY Verto GeForce 6 6600GT AGP8x £119

    or

    Inno 3D 128MB GeForce 6600 GT 8xAGP TV-Out DVI DirectX9 £116

    This review does not mention the PNY version, although I now notice that they have the Leadtek at about the same price. Going by these comments I GUESS I should get the Leadtek??? Anyone know about the PNY, cos my mate reckons that's the better one...

    Leadtek Winfast Geforce 6600 Gt128mb Ddr3 Agp Dvi-i Tv-out £117.

    Any help much appreciated.

    A

  • BlackMamba - Tuesday, March 8, 2005 - link

    #75: That link to MSI is for the AGP version (note the sink for the bridge chip).

    Not sure if they've fixed the problems with the PCI-E version, and would also like to know.
  • JensErik - Tuesday, March 1, 2005 - link

    Looking at the pictures of the MSI card in the review and the pics at MSI's page, it seems that MSI has changed a lot on their card, including the HSF.

    (Check it out here: http://www.msi.com.tw/program/products/vga/vga/pro...

    Does anyone know if this has solved the HSF mounting problem encountered in the test??
