Original Link: http://www.anandtech.com/show/1563
Introduction

Today, we'll be covering the performance of eleven different vendors' versions of the Geforce 6600GT. When that many examples of the same part get into the same room at the same time, you know that we're going to have a good cross-section of what the market should look like. If you're interested in buying a 6600GT, then this is the article for you.
Not only will we see what all these different vendors have to offer to you as a customer, but we will really see how hard the NV43 can be pushed, pulled, and stretched when it hits your system. We don't usually like to test overclocking on a large scale with the engineering sample parts that NVIDIA and ATI send us just after a product launch. These test samples are often just strung together by the skin of their IHV's proverbial teeth. It's not uncommon to see wires, resistors, and capacitors soldered onto an early PCB. We're actually lucky that these things work at all in some cases. We received an overclocked 6800 Ultra Extreme from NVIDIA that never booted, as well as an NV41 that was DOA. These preproduction boards are not the kind of boards that we would actually buy and use in our home systems.
And so, when an incredible number of vendors responded to our call for parts, we were very happy. Shipping parts means that we have what the end user will have. Heat tests, noise tests, overclocking tests - they all become very relevant and interesting. We will be looking at which vendors offer the best products to the consumer. Cards will be judged based on their idle and load thermal diode temperatures, the sound pressure level in dB of the system at a one meter distance, overclockability, features, bundle, and price.
We do spend a lot of time looking at the benchmarks of these cards at overclocked speeds, but these benchmarks aren't the be-all, end-all judge of which vendor makes a better card. First of all, the potential of any given ASIC to achieve a certain overclock is not something that a vendor can control, unless they bin their chips and sell a special line of overclocker-friendly cards (or, more likely, pre-overclocked cards). None of these 6600GTs fall into that category. This means that our BrandX card running at a certain speed doesn't guarantee anything about yours.
Overclocking tests are still important, as they show whether a board is able to support a GPU running at a high speed. Some boards are not. It's just more of an art than a science sometimes, and these numbers shouldn't be used as an absolute metric.
Heat management is especially important when overclocking. With a new breed of game on store shelves, such as Doom 3, Half-Life 2, and the onslaught of titles that will surely be based on their engines, GPU temperatures have nowhere to go but up. Increasing the core clock speed will help performance, but in our tests, it also raised maximum load temperature by a degree or two. The more a graphics card maker can do to keep heat down, the better. And that will be especially tricky with these cards once they've been in end users' hands for a while. Allow me to explain.
NVIDIA's reference design attaches the cooling solution to the board with only two mounting holes. Basically, the popular rectangular heatsink design is positioned precariously on top of the GPU and can pivot easily around the ASIC. This means: don't touch the heatsink. This causes real problems where thermal tape or glue is used. The leverage that the NVIDIA reference design creates is more than enough to tear through tape and snap the strongest glue without a second thought. Once those seals have been broken, cooling is severely compromised. Looking back at our numbers, this may be the reason why we see some of the extreme temperature numbers that we do. Of course, we were extraordinarily careful to avoid touching any HSFs after we realized what was going on.
Overclocking our Geforce 6600GTs

Once again, we're going to commend NVIDIA for including the coolbits registry tweak in their drivers that allows core and memory clock speed adjustment (among other things). No matter how new and pretty ATI makes Overdrive look, it just doesn't give the end user the kind of control that these two slider bars have. Unless something pretty big changes by the end of the year, ATI users still have to rely on 3rd party tools for any manual overclocking.
But, when all is said and done, overclocking is about much more than just moving some sliders to the right. After spending quite a few hours (each) doing nothing but playing with the clock speeds of eleven different Geforce 6600GT cards, we started to wonder if there was any purpose in life but to increase the speed of small square bits of silicon to a point just before failure. Hopefully, we can pass on what we have learned.
There are multiple things to keep in mind; it's not just core and memory speed. Power and GPU quality are also concerns. If the device doesn't have the power to drive all its components at the requested clock speed, something has to give. This means it may be able to push memory really high, and core really high, but not both at the same time. Also, GPU quality physically limits the maximum speed at which a core can run and yield correct results. One of the nice things about coolbits is that it won't let you try to run your card at a clock speed that is impossible. If the card can't provide enough power to the components, or the GPU simply fails to run correctly at that speed, the driver won't let you enable that clock speed. With utilities such as PowerStrip, the end user only has visual glitches and system lockups/reboots to indicate problems of this nature.
For anyone who doesn't know, to enable the clock controls on NVIDIA hardware, simply add a DWORD named CoolBits with hex value 3 to the registry key: "HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak"
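The same tweak can be applied by merging a .reg file. A minimal sketch; the value-name casing "CoolBits" is our choice (the registry matches value names case-insensitively, and the article only specifies a DWORD named coolbits with hex value 3):

```
Windows Registry Editor Version 5.00

; Enables the hidden core/memory clock controls in NVIDIA's driver panel.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```

Save the text above as coolbits.reg and double-click it, then reopen the NVIDIA control panel to find the clock frequency page.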
The beginner's and the advanced methods for a stable overclock begin at the same place: basing your core and memory clock speeds near what NVIDIA's driver picks. Go into the NVIDIA control panel after enabling coolbits, choose the clock control panel, and select manual control. Make sure that you are always setting clocks for 3D performance and not 2D. Let NVIDIA pick some clock speeds for you. At this point, we aren't sure exactly what process NVIDIA uses to determine these clock speeds, but at the very least, it makes sure that both the GPU and RAM will have enough power at those frequencies. We will try to look into the other conditions of this feature for future articles.
The stable way out is to look at what NVIDIA set the clock speeds to, drop them by 10MHz (core and mem), and set them there. Then grab Half-Life 2, 3dmark05, or Doom 3 and run a timedemo numerous times, watching closely for glitches and signs of overheating or other issues. Those are the three hottest running titles that we have in our labs at the moment, but Half-Life 2 is, hands down, the leader in turning video cards into cookware.
If you want more performance, it's possible to go faster than what NVIDIA says you can do. The first thing to do is to find the fastest speed at which the driver will let you set the core. That gives you a rough range of what is possible. Of course, that maximum won't be attainable; try halfway between the NVIDIA recommendation and the max clock speed, but leave the memory at its factory setting. Pay close attention, and make sure that you're using a benchmark that you can bail out of quickly in case you notice any problems. If there are glitches, cut the distance between where you are and the NVIDIA setting in half and try again. It's almost like a binary search for the sweet spot, except that you can stop as soon as you know that you're safe. When you find a core clock speed that you like, if it's much higher than the NVIDIA driver-determined setting, you may wish to bring the memory clock up slowly to keep from throwing off the balance.
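That halving procedure is easy to express as code. This is a sketch, not a real overclocking tool: `is_stable` stands in for a manual timedemo pass, and the simulated card's 561 MHz stability limit is made up for illustration.

```python
def find_max_stable_clock(low, high, is_stable, step=1):
    """Binary-search the highest core clock (in MHz) that passes a stability test.

    low  -- a clock known to be stable (e.g. the driver-suggested speed)
    high -- the maximum clock the driver will let you set
    """
    while high - low > step:
        mid = (low + high) // 2
        if is_stable(mid):
            low = mid    # no glitches: the sweet spot is at or above mid
        else:
            high = mid   # glitches: back off toward the last good clock
    return low

# Hypothetical card that renders correctly below 561 MHz and glitches above.
best = find_max_stable_clock(500, 590, lambda mhz: mhz < 561)
print(best)  # 560
```

In practice each `is_stable` call is minutes of watching a timedemo, which is why stopping early at a "safe" clock beats hunting for the exact limit.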
So how do you know if something is wrong when you've overclocked? In newer games like Half-Life 2, the shaders start to render slightly incorrectly. In HL2 especially, the anomalies tend to have high locality of reference (similar problems happen near each other) and form an almost grid-like pattern of disruption on surfaces. It used to be that disappearing geometry and hard locks were the number one telltale sign, but now vertex and pixel shaders are a little more sensitive and subtle. On the memory side, if clocks are too high, we might see speckling or off-color pixels. Edges could be disjoint, and texturing issues can occur.
Albatron

Albatron has a great design with their Trinity board. The top overclocker in our tests, with a 590 MHz core clock speed, the Albatron was also able to maintain the second lowest load temperature even at the highest clock speed that we ran.
The fatal flaw with this card is the same problem that we mentioned on the first page. The rectangular HSF is only attached to the board at two diagonal corners with the GPU somewhere underneath. This can and will cause contact problems if the heatsink is nudged or bumped too hard.
This really is a nice, solid board. It's equipped with a powerful (if loud) HSF that keeps it cool even under the most adverse conditions. GDDR3 doesn't usually get that hot, so ramsinks are just a little icing on the cake. And they might help out a little for the occasional overclock.
Chaintech

The Chaintech 6600 GT is most notable as the quietest of the shrouded HSF offerings. This is a good card that puts in numbers somewhere in the middle of the pack every time. There is quite a lot to be said about consistency, especially in the graphics market. So many companies just focus on one selling point and let everything else fall by the wayside.
Unfortunately, this Geforce 6600 GT also has the aforementioned HSF attachment issue. Even so, idle temperatures fell in the middle of the pack, and this card performs well.
Galaxy

We haven't had a Galaxy card in our labs before, and we were happy to see a card with a custom round HSF attached very firmly after all the loosely attached rectangular solutions that we had the pleasure of handling. The springs on the hardware are much tighter than on the other solutions that we've seen, and the shape of the heatsink doesn't give us enough leverage to pop it off the GPU when we press on it. Though it is still possible to twist the HSF around the axis created by the two spring pin mounts, it is something that a consumer would have to try to do.
Since this card has no shroud on its fan, this will also sound different than some of the other solutions that we've seen. Shrouds can help to direct airflow on larger heatsinks, but since this cooling solution doesn't try to cool the RAM as well as the core, a small circular design is fine.
We were happy with this card's ability to stay cool at idle and under load. The only downside that we saw was the board's overclocking performance. Galaxy sent us this part pre-overclocked to 525/550. We originally thought that this was a press sample card, as the packaging materials that came with the box indicated a 500/500 clock speed. In fact, Galaxy has informed us that all of their 6600 GT products ship at 525/550 clock speeds. In the end, this part was our worst overclocker, which is a surprising combination when talking about our coolest part. It really could have been the luck of the draw - the worst case GPU and worst case RAM configuration of the bunch - but that does seem like an awful bit of luck.
If you aren't planning to overclock from stock, this is absolutely a wonderful 6600 GT option. For someone who needs good cooling and low noise in their case, Galaxy is top notch. Since this product comes with added performance at no extra hassle, it's definitely a top pick in our books.
Gigabyte

The Gigabyte 6600 GT uses the standard rectangular HSF. This card has good fan noise characteristics and overclocked well in our tests.
Unfortunately, we had a little lab accident, which dislodged the HSF unit from the GPU. Wiggling the heatsink on other cards didn't matter as much because they used a thermal grease, but Gigabyte went with a thermal glue. The glue cracked, and we could never quite get the same connection that the chip had originally to the HSF.
This made it so that our idle and load temp numbers were a little higher than what we would expect otherwise.
We would love to get our hands on a fresh sample from Gigabyte for testing, but at the same time, this could easily have happened to any end user without anyone knowing any better. The importance of designing a thermal solution that will stay attached to the GPU can't be overstated. This type of problem shouldn't be an issue.
Inno3D

Inno3D didn't overclock the highest, has the loudest fan, and isn't the coolest card of the bunch (though it was competitive with other box-shaped HSF solutions). But the one thing that Inno3D did better than everyone else that we looked at was the way that they attached their heatsink to the board. The HSF doesn't move and makes excellent contact with the GPU.
The sound of the HSF had a higher-pitched, hair dryer quality to it, though we are still talking about mid-range fan speeds. This was the loudest card that we ran, but that speaks well for Geforce 6600 GT cards as a group. On the upside, the heatsink is all copper, as are the ramsinks. Inno3D also used 1.6ns GDDR3 on this board, but clocked it at 500MHz. In light of this, a 600MHz memory clock speed should be achievable, even though we were unable to push it that high.
Leadtek

The Leadtek card uses a circular HSF. It has a larger circumference than the Galaxy card's, so it is still possible to tilt the cooling solution, but Leadtek has added a pad around the silicon that helps to stabilize the heatsink. While support at the edges of the HSF would have been even more stable, this design is much improved over some of the other solutions that we've seen.
As far as noise goes, the lack of a shroud helped. This was also one of the coolest cards that we tested, and it achieved a very high overclock. Overall, the Winfast PX6600 GT TDH is a very nice implementation of a Geforce 6600 GT.
MSI

MSI has an excellent concept, bringing the only cooling solution that attaches the ramsinks to the main body of the heatsink. This should allow the fan to cool the block of metal that stretches across the RAM as well. Our tests also showed this to be one of the quietest cards in the roundup.
When we began testing, we noticed that we had a problem. Even though MSI went with a mostly round design, there was apparently enough leverage between the two spring pins to rip the thermal tape loose from the RAM. The card got far too hot even at stock clocks, so we needed to use makeshift clamps on the RAM to hold the heatsink in place on the GPU.
It is unfortunate to see a card with such potential overcome by HSF mounting issues. We would love the opportunity to retest this card with a properly mounted cooling solution, and update our cooling numbers. This could have been an unnoticed manufacturing defect, but the setup lends itself to easily pulling up off the RAM if the end user were to press down too hard on the opposite side. In fact, it seemed as if the spring pins held the heatsink off the GPU rather than down onto it. This would have been useful to add pressure to the GPU if the thermal tape had held on the RAM, but again, we received a part in non-working order, so we aren't sure what it should have looked like.
Palit

This is another rounded design, though it's not entirely round, and while the card has a shroud, it's not a loud solution. The bottom right corner of the HSF doesn't have any problem spots on which to push, but watch out for that top right corner. There are no spacers or pads on this card, and though the leverage problem is less of an issue due to the implementation, the HSF can still be lifted off the surface of the GPU.
Prolink

It looks like Prolink just dropped a generic cooling solution on their Pixelview part. It's not pretty, and it should have done the job, but we ended up running into a heat issue, as we did with the MSI and Gigabyte cards.
We extend the same offer to Prolink that we do to MSI and Gigabyte - send us an updated card and we will retest your part's cooling ability and include an update to our article. Aside from being the hottest part that we tested due to its thermal contact issues, this card was also fairly loud.
The Pixelview solution from Prolink was equipped with an ASIC that lends itself well to overclocking. Unfortunately, this doesn't do much to make up for the problems that could have been solved if the heatsink had been mounted differently.
Sparkle

Sparkle is using the reference NVIDIA HSF, which, in this case, earns a great deal of points. The NVIDIA reference cooling solution has rubber nubs around all four corners of the HSF. This adds a stability to the part that many other cards in this review lack. Unfortunately, the spring tension isn't very high, so cooling is a little lower than what might be desired. The spacers may add stability, but they also mean that, in order to get really good contact for cooling, the part would need to be pressed down harder than it currently is.
There were a couple of vendors who sent us "press samples" that were clocked at non-shipping clock speeds. Sparkle is included in this group, but we've asked them to only send us products that we can buy on the shelves. The part that we have in our labs boots up at 507/550, but we clocked it higher than that, easily.
You have to wonder who made the decision that seven MHz and a tiny bit of memory bandwidth were really worth sending us a press sample when we asked them not to do so. Because of that, we also can't be sure that the components that went into the part weren't binned beforehand - the 2ns GDDR3 memory easily hit 610MHz.
Getting 2ns memory to clock that high is obviously not impossible. It can happen by chance (we've seen it before on earlier parts that used it), but we'd rather have no doubts.
At the same time, we asked for shipping parts and this is what we received. We'll report all the information that we have along with the numbers that we get and let the readers make the call.
XFX

This card is the beast of our roundup: the XFX 6600GT Extreme Gamer Edition. XFX is the only vendor that we've seen take a stand and do something different. The first thing to notice is the pair of DVI connectors on the board. Dual DVI isn't normally something one would need on a mid-range solution, but a quick check of newegg.com shows the standard XFX card with dual DVI costing less than some PCIe 6600 GT parts without it, so cost is no argument for leaving dual DVI off these cards.
There is something that this card has for which a premium may be charged: 1.6ns GDDR3 running at 600MHz. We haven't seen pricing yet, but this part is obviously not going to be the be-all, end-all value in graphics cards. Adding memory bandwidth is a good thing for the 6600 GT, considering its 128-bit bus. The problem is that the performance benefit is maybe half the increase in memory bandwidth, if we are lucky. We might see better scaling with AA enabled, but on a mainstream part, that's pushing the limits.
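To put rough numbers on that, peak memory bandwidth is just the effective (double data rate) clock times the bus width in bytes. A back-of-the-envelope sketch using the stock 500MHz and XFX's 600MHz memory clocks:

```python
def peak_bandwidth_gbs(mem_clock_mhz, bus_width_bits=128):
    """Peak GDDR3 bandwidth in GB/s: clock x 2 (DDR) x bus width in bytes."""
    return mem_clock_mhz * 2 * (bus_width_bits // 8) / 1000

stock = peak_bandwidth_gbs(500)  # reference 6600 GT memory clock
xfx = peak_bandwidth_gbs(600)    # XFX Extreme Gamer's 1.6ns GDDR3

print(f"{stock} GB/s -> {xfx} GB/s")  # 16.0 GB/s -> 19.2 GB/s
```

If the real-world gain is half of that 20% bandwidth increase, as suggested above, the faster memory is worth something on the order of 10% in bandwidth-bound scenes.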
In modifying the stock HSF, XFX placed a copper plate between the die and the heatsink in order to increase the tension in the spring pegs and keep firmer pressure on the GPU. They also did the same thing that we saw Leadtek do - there is a bit of material around the silicon that acts as a spacer between the rest of the GPU and the heatsink. This is necessary because the copper plate lifted the rubber nubs off the PCB, making them ineffective stabilizers.
This card was loud, but it cooled well due to XFX's innovative adaptation of the stock cooling solution. The 1.6ns GDDR3 at a default clock speed of 600MHz will also be very attractive. But this will not be appealing if the card is priced much higher than the current round of 6600 GT products, especially since (whether by design or chance) Sparkle's 6600 GT had 2ns RAM that overclocked to 610MHz as well.
The Test

For this test, we used an Intel based PCI Express system. We would have preferred an nForce4 based setup, but there are only so many to go around the lab these days. Here's the setup that we used:
|Performance Test Configuration|
|Processor(s):||3.4 GHz Pentium 4 Extreme Edition|
|RAM:||2 x 256MB Samsung DDR2 (4:4:4:11)|
|Hard Drive(s):||Seagate Barracuda 7200.7 120GB PATA|
|Chipset Drivers:||Intel Chipset INF v18.104.22.1681|
|Video Card(s):||Albatron Trinity Geforce 6600 GT
Chaintech Geforce 6600 GT
Galaxy Geforce 6600 GT
Gigabyte Geforce 6600 GT
Inno3D Geforce 6600 GT
Leadtek Winfast PX6600 GT TDH
MSI Geforce 6600 GT
Palit Geforce 6600 GT
Prolink Geforce 6600 GT
Sparkle Geforce 6600 GT
XFX Geforce 6600 GT
|Video Drivers:||NVIDIA 67.03 Beta|
|Operating System(s):||Windows XP Professional SP2|
|Power Supply:||OCZ PowerStream 520 PSU|
And now on to the comparisons.
Overclocking Comparison

With overclocking, every GPU is different, so a side effect of looking at this many cards is that we get a good idea of how the 6600GT will overclock in a general sense. We can't really say that all Albatron cards will overclock by 90MHz. Believe us: if every card could run at that speed, they would all ship at that speed and out-sell the competition. There are a lot of factors that go into it. That's why we base most of our recommendation and ranking decisions on cooling and noise levels rather than overclocking. It is still a factor, though.
Those with a calculator handy will notice that the mean, median, and standard deviation are:
Std. Dev.: 17.1840
Knowing NVIDIA, QA is going to assure that chips leaving the labs will run at a little higher than stock clock speeds. This translates to a little bit of breathing room. What we take away from this testing is that we expect Geforce 6600GTs to achieve a minimum 7% overclock. A 9% to 12% overclock should be possible for most people who own this card. Beyond that is icing on the cake. Of course, we are working with a very small sample size, and we don't know much about the population as a whole either. We would have been more comfortable making predictions had this data looked more like a bell curve, but what we see is a little too flat for us to say anything with statistical confidence.
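Relative to the reference 500MHz core clock, those percentages translate into concrete targets (simple arithmetic, not measured results):

```python
stock_core = 500  # reference Geforce 6600 GT core clock in MHz

# The minimum, typical-low, and typical-high overclocks we expect.
for pct in (7, 9, 12):
    target = stock_core * (1 + pct / 100)
    print(f"{pct:2d}% -> {target:.0f} MHz")  # 535, 545, and 560 MHz
```

So a card that won't hold roughly 535MHz on the core is below what we'd consider a normal sample.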
Our memory clock speed graph shows Sparkle on top, but that's 2ns RAM on a 110MHz overclock. The XFX RAM is running 1.6ns RAM at a 10MHz overclock. This could be really lucky for Sparkle, but it isn't likely to happen on most boards. A 22% memory overclock, even with the added features of GDDR3, is still tough to pull off, especially when the 1.6ns memory only matched its performance. Inno3D also uses 1.6ns memory, but our final overclock ended up lower than the 600MHz that should have been possible with this part.
All the other solutions are 2ns memories which overclock between 50 and 100MHz. All the memories we looked at on 6600 GT boards are Samsung GDDR3 solutions.
Overclocked Doom 3 Performance

Even the minimum overclock configuration of 7% core and 12% memory was able to bring a 7% performance improvement in Doom 3. We actually see the XFX and Sparkle parts do better than parts with higher core clocks because of the added memory bandwidth.
The memory bandwidth need is even stronger in Doom 3 with 4x AA turned on. These benchmark score improvements tend to be more influenced by the percent improvement in memory clock than core clock.
If you don't tend to overclock, and you want to play with AA on, you will absolutely want the XFX part for its 1.6ns GDDR3.
Overclocked Half-Life 2 Performance

Just as we saw with Doom 3, Half-Life 2 wants all the GPU power and memory bandwidth that you can throw at it. Even the Galaxy card's modest overclock affords it a nice jump over stock performance.
Turning on AA and AF just ensures that the memory clock speed increases have some extra help to offer the performance.
Overclocked Unreal Tournament 2004 Performance

Overclocking really doesn't do anything for the old DX8.1 game that we have lying around the lab.
We turn on 4xAA and 8xAF and let the cards stretch their legs a bit. The resulting performance scaling just goes to show how memory bandwidth limited these cards become once AA and AF enter the picture, even when rendering an engine without the flexibility of current technology (at least on the software side).
Noise and Heat

Here are the meat and potatoes of our comparison of the 11 Geforce 6600GT cards today. How quiet and efficient are the cooling solutions attached to the cards? We are about to find out.
Our noise test was done using an SPL meter in a very quiet room. Unfortunately, we haven't yet been able to baffle the lab's walls with sound deadening material, and the CPU and PSU fans were still running as well. But in each case, the GPU fan was the loudest contributor to the SPL in the room by a large margin. Thus, the SPL of the GPU cooler is the factor that drove the measured SPL of the system.
Our measurement was taken at 1 meter from the caseless computer. Please keep in mind when looking at this graph that everyone experiences sound and decibel levels a little bit differently. Generally, though, a one dB change in SPL translates to a perceivable change in volume. Somewhere between a 6 dB and 10 dB difference, people perceive the volume of a sound to double. That means the Inno3D fan is over twice as loud as the Galaxy fan. Two newcomers to our labs end up rounding out the top and bottom of our chart.
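That rule of thumb can be turned into a rough loudness estimate. A sketch using the 10 dB-per-doubling end of the range (the 6 dB figure would make the differences look even larger, and perception varies by listener):

```python
def loudness_ratio(delta_db, doubling_db=10.0):
    """Approximate perceived-loudness ratio for an SPL difference in dB.

    Assumes every `doubling_db` decibels doubles perceived volume; the
    rule of thumb above puts a doubling somewhere between 6 and 10 dB.
    """
    return 2 ** (delta_db / doubling_db)

print(loudness_ratio(10))            # 2.0 -- twice as loud
print(round(loudness_ratio(3), 2))   # 1.23 -- a clearly audible step
```

By this estimate, a card measuring 10 dB above another would sound roughly twice as loud, which is why a few dB of difference between coolers matters more than it looks on paper.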
The very first thing to notice about our chart is that the top three spots are held by our custom round HSF solutions with no shroud. This is clearly the way to go for quiet cooling.
Chaintech, Palit, and Gigabyte are the quietest of the shrouded solutions, and going by our rule of thumb, the Palit and Gigabyte cards may be barely audibly louder than the Chaintech card.
The Albatron, Prolink, and XFX cards show no audible difference among them, and they are all very loud cards. The fact that the Sparkle card comes in a little over 1 dB below the XFX card is a little surprising: they use the same cooling solution with a different sticker attached. Of course, you'll remember from the XFX page that XFX attached a copper plate and a pad to the bottom of the HSF. The fact that Sparkle's solution is more stable (while XFX has tighter spring pressure on the GPU) could explain the slight difference in sound here.
All of our 6600 GT solutions are fairly quiet. These fans are nothing like the ones on larger models, and the volume levels are nothing to be concerned about. Of course, the Inno3D fan did have a sort of whine to it that we could have done without. It wasn't shrill, but it was clearly a relatively higher pitch than the low drone of the other fans that we had listened to.
NVIDIA clocks its 6600 GT cores at 300 MHz when not in 3D mode, and since large sections of the chip are not in use, not much power is needed, and less heat is dissipated than if a game were running. But there is still work going on inside the silicon, and the fan is still spinning its heart out.
Running the coolest is the XFX card. That extra copper plate and tension must be doing something for it, and glancing down at the Sparkle card, perhaps we can see the full picture of why XFX decided to forgo the rubber nubs on their HSF.
The Leadtek and Galaxy cards come in second, pulling in well in two categories.
We have the feeling that the MSI and Prolink cards had their thermal tape or thermal glue seals broken at some point at the factory or during shipping. We know that the seal on the thermal glue on the Gigabyte card was broken, as this card introduced us to the problems with handling 6600 GT solutions that don't have good 4 corner support under the heatsink. We tried our best to reset it, but we don't think that these three numbers are representative of what the three companies can offer in terms of cooling. We will see similar numbers in the load temp graphs as well.
Our heat test consists of running a benchmark over and over and over again on the GPU until we hit a maximum temperature. There are quite a few factors that go into the way a GPU is going to heat up in response to software, and our goal in this test was to push maximum thermal load. Since we are looking at the same architecture, only the particular variance in GPU and the vendor's implementation of the product are factors in the temperature reading we get from the thermal diode. These readings should be directly comparable.
We evaluated Doom 3, Half-Life 2, 3dmark05, and UT2K4 as thermal test platforms. We selected resolutions that were not CPU bound, but had to try very hard not to push memory bandwidth beyond saturation. Looping demos in different levels and at different resolutions with different settings while observing temperatures gave us a very good indication of the sweet spot for putting pressure on the GPU in these games, and the winner for the hardest hitting game in the thermal arena is: Half-Life 2.
The settings that we used for our 6600 GT test were 1280x1024 with no AA or AF. The quality settings were cranked up. We looped our at_coast_12-rev7 demo until a stable maximum temperature was found.
We had trouble in the past observing the thermal diode temperature, but this time around, we set up multiple monitors. Our second monitor was running at 640x480x8@60 in order to minimize the framebuffer impact. We kept the driver open to the temperature panel on the second monitor while the game ran and observed the temperature fluctuations. We still really want an application from NVIDIA that can record these temperatures over time, as the part heats and cools very rapidly. This would also eliminate any impact from running a second display. The performance impact was minimal, so we don't believe that the temperature impact was large either. Of course, that's no excuse for not trying to do things in the optimal way. All we want is an MBM5 plugin; is that too much to ask?
It is somewhat surprising that Galaxy is the leader in handling load temps. Of course, the fact that it ran the lightest overclock of the bunch probably helped it a little bit, but most other cards were already running at about 68 degrees under load before we overclocked them. The XFX card slips down a few slots from the idle numbers with its relatively low core overclock, while our core clock speed leader moves up to become the second coolest card of the bunch, making this a very interesting graph.
For such a loud fan, we would have liked to see Inno3D cool the chip a little better under load, but their placement in the pack is still very respectable. The Sparkle card again shows that XFX had some reason behind their design change. The added copper bar really helped them even though they lost some stability.
The Gigabyte (despite our best efforts to repair the damage), MSI, and Prolink cards were all far too hot, even at stock speeds. We actually had to add a clamp and a rubber band to the MSI card to keep it from climbing over 110 degrees C at stock clocks. The problem was that the thermal tape on the RAM had come loose from the heatsink. Rather than the heatsink being stuck down to both banks of RAM as well as to the two spring pegs, it was lifting off of the top of the GPU. We didn't notice this until we started testing, because the HSF had pulled away less than a millimeter. The MSI design is great, and we wish that we could have had an undamaged board. MSI could keep this from happening by putting a spacer between the board and the HSF on the side opposite the RAM, near the PCIe connector.
Final Words

Working with all of these cards has been difficult, especially when one wrong move could cause quite a few problems. We apologize for not being able to report numbers that reflect what cards from Gigabyte, MSI, and Prolink can do in terms of cooling. We extend to these three companies an invitation: send us another sample, and we will retest the card and append the updated numbers to our tests. At the same time, we went ahead and included these numbers because this is how the cards were sent to us. It is rather alarming how few precautions were taken to prevent loss of contact between the GPU and the HSF on Geforce 6600 GT cards as a whole.
Just to clarify, our specific problem is with companies that used a thermal adhesive that cracks, tape that tears, or some other attachment that is compromised by the torque of a freely rotating heatsink. Vendors using non-adhesive solutions tended not to suffer the kind of lasting damage that we saw with the aforementioned cards, but the ultimate solution is really to stop the heatsink from moving in the first place.
In the end, we survived the test, and we have handed out our awards. Here they are without further ado.
|Among the top three performers in Noise, Idle Temp, and Load Temp, this card overclocked well to boot. Its padded surface mounted around the exposed silicon and its circular HSF solution avoided many of the stability issues that plagued other implementations. We are very pleased to award the AnandTech Gold Editor's Choice to the Leadtek Winfast PX6600 GT TDH.|
|This card might not be the cheapest of the bunch, but it surpasses everyone else easily with Dual DVI, 1.6ns GDDR3, a loud fan, an even louder retail package, an attempt at stabilizing the HSF, and a load temperature that never rose above 69 degrees C. The only problem with this card is that all the added features likely contribute to its less than stellar overclocking capability. And thus, the XFX Geforce 6600GT Extreme Gamer is awarded our Silver Editor's Choice.|
|The Galaxy 6600 GT has quite a lot going for it. It is the coolest, quietest, fastest stock card that we tested. It also has a good HSF solution that doesn't fall off as easily as some of its competition. The problem is that this coolest card is also the worst overclocker. This could be bad luck, but it could also be indicative of something else. This is the first time we've had them in our labs, and shipping 525/550 while leading in cooling secures the Galaxy 6600GT AnandTech's Bronze Editor's Choice Award.|
As for the rest of the pack, they all had many strengths that are spread among many cards. We'll tell you why potential candidates didn't quite make the Editor's Choice list.
When it comes to Inno3D, we liked their firmly attached HSF solution and very solid all-around performance. The real downside to Inno3D was the noise level. They weren't the best overclocker in the bunch, but they weren't the worst, either.
Chaintech and Albatron missed Editor's Choice because they didn't have any stabilization on their heatsinks. The problems that afflicted Gigabyte and MSI could just as easily have happened to them.
Solving this HSF mounting problem was one of the top issues for us today, and it should be a key factor in the decision for anyone in an IT build room or whose idea of a good time is playing around in their case. Being careful (taunting fate?) is fine if you open your box once every year-and-a-half to dust and upgrade. If your job has anything to do with video cards, and you might be seeing one of the cards that we mentioned in this review, don't get anything without a completely stable HSF mounting system. The expanded pads are a little more stable than the solutions that only make contact with the silicon, but if we were building systems with these cards, we would limit purchasing decisions to cards with some sort of 4-corner support (or zero leverage). Of the products we tested, here's our short list of IT-friendly 6600 GT parts:
Inno3D - solid mounting foam at two non-attached corners
Sparkle - rubber nubs around 4 corners
Galaxy - very tight springs and no leverage around the circle to move the HSF
XFX doesn't make the list because, at this point, we aren't sure which way they are going to go with the design. It looks as if they are adopting a design more like Leadtek's, simply expanding the contact area to include the area around the core, so they may be dropping the rubber altogether. Hopefully, they'll just find some rubber that fits and squeeze it on in there.
When all is said and done, we have to put a good part of the responsibility for the HSF mounting issues on NVIDIA. They do come up with the reference board design, and they end up placing the mounting holes for the cooling solutions on these boards. Obviously, these boards aren't 6800 Ultra Extreme parts and they don't need to have the cooling solution torqued down onto the core. But, at the same time, it would be nice if vendors didn't have to rely on spacers, pads, or other tricks in order to keep their cooling solutions in contact with the GPU.