OC: Power, Temperature, & Noise

Our final task is a look at the overclocking capabilities of our GTX 660 Ti cards. Based on what we’ve seen thus far with the GTX 660 Ti, these factory overclocked parts are undoubtedly eating into overclocking headroom, so we’ll have to see just what we can get out of them. The very similar GTX 670 topped out at around 1260MHz for the max boost clock, and between 6.6GHz and 6.9GHz for the memory clock.

GeForce GTX 660 Ti Overclocking

                             EVGA GTX 660 Ti SC   Zotac GTX 660 Ti AMP   Gigabyte GTX 660 Ti OC
Shipping Core Clock          980MHz               1033MHz                1033MHz
Shipping Max Boost Clock     1150MHz              1175MHz                1228MHz
Shipping Memory Clock        6GHz                 6.6GHz                 6GHz
Shipping Max Boost Voltage   1.175v               1.175v                 1.175v

Overclock Core Clock         1030MHz              1033MHz                1083MHz
Overclock Max Boost Clock    1200MHz              1175MHz                1278MHz
Overclock Memory Clock       6.5GHz               6.8GHz                 6.6GHz
Overclock Max Boost Voltage  1.175v               1.175v                 1.175v

As we suspected, starting with factory overclocked cards isn’t helping here. Our Zotac card wouldn’t accept any kind of meaningful GPU core overclock, so it shipped practically as fast as it could go. We were able to squeeze out another 200MHz on the memory clock though.

Meanwhile our EVGA and Gigabyte cards fared slightly better. We could push another 50MHz out of their GPU clocks, bringing us to a max boost clock of 1200MHz on the EVGA card and 1278MHz on the Gigabyte card. Memory overclocking was similarly consistent; we were able to hit 6.5GHz on the EVGA card and 6.6GHz on the Gigabyte card.

Altogether these are sub-5% GPU overclocks and, at best, a 10% memory overclock, which all things considered is fairly low. The good news is that reference-clocked cards should fare better, since their headroom hasn’t already been consumed by factory overclocking; the catch is that binning means the best chips are likely going out in factory overclocked models anyway.
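For reference, the headroom figures quoted above fall straight out of the table. A quick sketch in plain Python (clock values copied from the table above; the percentage formula is just standard relative gain, not anything NVIDIA-specific) reproduces them:

```python
# Overclock headroom as a percentage: (overclocked / shipping) - 1.
# Clock values are taken from the review's table; tuples are
# (shipping max boost MHz, OC max boost MHz, shipping mem GHz, OC mem GHz).
cards = {
    "EVGA GTX 660 Ti SC":     (1150, 1200, 6.0, 6.5),
    "Zotac GTX 660 Ti AMP":   (1175, 1175, 6.6, 6.8),
    "Gigabyte GTX 660 Ti OC": (1228, 1278, 6.0, 6.6),
}

for name, (boost, oc_boost, mem, oc_mem) in cards.items():
    gpu_gain = (oc_boost / boost - 1) * 100   # GPU boost clock gain, %
    mem_gain = (oc_mem / mem - 1) * 100       # memory clock gain, %
    print(f"{name}: GPU +{gpu_gain:.1f}%, memory +{mem_gain:.1f}%")
```

Running this gives roughly 4.3%/8.3% for the EVGA card, 0%/3.0% for the Zotac, and 4.1%/10.0% for the Gigabyte, matching the sub-5% GPU and at-best-10% memory headroom described above.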

Moving on to our performance charts, we’re going to once again start with power, temperature, and noise, before moving on to gaming performance.

Unsurprisingly, given the small power target difference between the GTX 670 and the GTX 660 Ti, any kind of overclocking that involves raising the power target quickly pushes power consumption past the GTX 670’s power consumption. How much depends on the test and the card, with the higher power target Gigabyte card starting with a particular disadvantage here as its power consumption ends up rivaling that of the GTX 680.

We also see the usual increase in load temperatures due to the increased power consumption. The Zotac and Gigabyte cards fare well enough due to their open air coolers, but the blower-type EVGA card is about as high as we want to go at 80C under OCCT.

Last but not least, looking at noise levels we can see an increase similar to the temperature increases we just saw. For the Zotac and EVGA cards noise levels are roughly equal with the reference GTX 680, which will be important to remember for when we’re looking at performance. Meanwhile the Gigabyte card continues to shine in these tests thanks to its oversized cooler; even OCCT can only push it to 46.8dB.


313 Comments


  • Oxford Guy - Thursday, August 16, 2012 - link

    What is with the 285 being included? It's not even a DX 11 card.

    Where is the 480? Why is the 570 included instead of the 580?

    Where is the 680?
  • Ryan Smith - Saturday, August 18, 2012 - link

    The 285 was included because I wanted to quickly throw in a GTX 285 card where applicable, since NVIDIA is promoting the GTX 660 Ti as a GTX 200 series upgrade. Basically there was no harm in including it where we could.

    As for the 480, it's equivalent to the 570 in performance (eerily so), so there's never a need to break it out separately.

    And the 680 is in Bench. It didn't make much sense to include a card $200 more expensive which would just compress the results among the $300 cards.
  • CeriseCogburn - Sunday, August 19, 2012 - link

    So you're saying the 680 is way faster than the 7970 which you included in every chart, since the 7970 won't compress those $300 card results.
    Thanks for admitting that the 7970 is so much slower.
  • Pixelpusher6 - Friday, August 17, 2012 - link

    Thanks Ryan. Great review as always.

    I know one of the differentiating factors for the Radeon 7950s is the 3GB of ram but I was curious are there any current games which will max out 2GB of RAM with high resolution, AA, etc.?

    I think it's interesting how similar AMD's and Nvidia's GPUs are this generation. I believe Nvidia will be releasing the GTX 660 non Ti based on GK106. Leaked specs seem to be similar to this card but the texture units will be reduced to 64. I wonder how much of a performance reduction this will account for. I think it will be hard for Nvidia to get the same type of performance / $ as say GTX 460 / 560 Ti this generation because of having to have GK104 fill in more market segments.

    Also I wasn't aware that Nvidia was still having trouble meeting demand with GK104 chips; I thought those issues were all cleared up. I think when AMD released their 7000 series chips they should have taken advantage of being first to market and been more competitive on price to grow market share rather than increase margins. At that time someone sitting on 8800GT era hardware would be hard pressed to upgrade knowing that AMD's inflated prices would come down once Nvidia brought their GPUs to market. People who hold on to their cards for a number of years are unlikely to upgrade 6 months later to Nvidia's product. If AMD cards were priced lower at this time a lot more people would have bought them, thereby beating Nvidia before they even have a card to market. I do give some credit to AMD for preparing for this launch and adjusting prices, but in my opinion this should have been done much earlier. AMD management needs to be more aggressive and catch Nvidia off guard, rather than just reacting to whatever they do. I would "preemptively" strike at the GTX 660 non Ti by lowering prices on the 7850 to $199. Instead it seems they'll follow the trend and keep it at $240-250 right up until the launch of the GTX 660 then lower it to $199.
  • Ryan Smith - Saturday, August 18, 2012 - link

    Pixelpusher, there are no games we test that max out 2GB of VRAM out of the box. 3GB may one day prove to be advantageous, but right now even at multi-monitor resolutions 2GB is doing the job (since we're seeing these cards run out of compute/render performance before they run out of RAM).
  • Sudarshan_SMD - Friday, August 17, 2012 - link

    Where are naked images of the card?
  • CeriseCogburn - Thursday, August 23, 2012 - link

    You don't undress somebody you don't love.
  • dalearyous - Friday, August 17, 2012 - link

    it seems the biggest disappointment i see in comments is the price point.

    but if this card comes bundled with borderlands 2, and you were already planning on buying borderlands 2 then this puts the card at $240, worth it IMO.
  • rarson - Friday, August 17, 2012 - link

    but it's the middle of freaking August. While Tahiti was unfortunately clocked a bit lower than it probably should have been, and AMD took a bit too long to bring out the GE edition cards, Nvidia is now practically 8 months behind AMD, having only just released a $300 card. (In the 8 months that have gone by since the release of the 7950, its price has dropped from $450 to $320, effectively making it a competitor to the 660 Ti. AMD is able to compete on price with a better-performing card by virtue of the fact that it simply took Nvidia too damn long to get their product to market.) By the time the bottom end appears, AMD will be ready for Canary Islands.

    It's bad enough that Kepler (and Fermi, for that matter) was so late and so not available for several months, but it's taking forever to simply roll out the lower-tier products (and yes, I know 28nm wafers have been in short supply, but that's partially due to Nvidia's crappy Kepler yields... AMD have not had such supply problems). Can you imagine what would have happened if Nvidia actually tried to release GK110 as a consumer card? We'd have NOTHING. Hot, unmanufacturable nothing.

    Nvidia needs to get their shit together. At the rate they're going, they'll have to skip an entire generation just to get back on track. I liked the 680 because it was a good performer, but that doesn't do consumers any good when it's 4 months late to the party and almost completely unavailable. Perhaps by the end of the year, 28nm will have matured enough and Nvidia will be able to design something that yields decently while still offering the competitiveness that the 680 brought us, because what I'd really like to see is both companies releasing good cards at the same time. Thanks to Fermi and Kepler, that hasn't happened for a while now. Us consumers benefit from healthy competition and Nvidia has been screwing that up for everyone. Get it together, Nvidia!
  • CeriseCogburn - Sunday, August 19, 2012 - link

    So as any wacko fanboy does, you fault nVidia for releasing a card later that drives the very top end tier amd cards down from the 579+ shipping I paid to $170 less plus 3 free games.
    Yeah buddy, it's all nVidia's fault, and they need to get their act together, and if they do in fact get their act together, you can buy the very top amd card for $150, because that's likely all it will be worth.
    Good to know it's all nVidia's fault. AMD from $579 plus shipping to $409 and 3 free games, and nVidia sucks for not having its act together.
    The FDA as well as the EPA should ban the koolaid you're drinking.
