OC: Power, Temperature, & Noise

Before wrapping things up, we wanted to quickly take a look at the overclocking potential of the GTX 660. As the first GK106 product, the GTX 660 should give us some idea of how capable GK106 is at overclocking, though as with GK104 we're ultimately at the mercy of NVIDIA's locked voltages and limited power target control.

In its rawest form, GTX 660 will have two things going against it for overclocking. First and foremost, as the highest-clocked GK106 part it's already starting out at a fairly high clockspeed – 980MHz for reference cards, and upwards of 1050MHz for factory overclocked cards – so there may not be a great deal of overclocking headroom left to exploit. Furthermore, because NVIDIA is keeping the power consumption of the card low (it needs to stay under 150W at its maximum), the maximum power target is the lowest we've seen for any GTX 600 card yet: a mere 110%. As a result, even if we can hit a large GPU clock offset, there may not be enough power headroom available to let the GPU regularly reach those speeds.

Memory overclocking, on the other hand, looks much better. With the same memory controllers and the same spec of RAM as the other high-end GTX 600 cards, there's no reason the GTX 660 shouldn't be able to hit equally high memory clocks, which makes 6.5GHz+ a reasonable goal.

GeForce GTX 660 Overclocking
                             Ref GTX 660   EVGA GTX 660 SC   Zotac GTX 660   Gigabyte GTX 660 OC
Shipping Core Clock          980MHz        1046MHz           993MHz          1033MHz
Shipping Max Boost Clock     1084MHz       1123MHz           1110MHz         1123MHz
Shipping Memory Clock        6GHz          6GHz              6GHz            6GHz
Shipping Max Boost Voltage   1.175v        1.175v            1.162v          1.175v

Overclock Core Clock         1080MHz       1096MHz           1093MHz         1083MHz
Overclock Max Boost Clock    1185MHz       1174MHz           1215MHz         1174MHz
Overclock Memory Clock       6.7GHz        6.9GHz            6.7GHz          6.5GHz
Overclock Max Boost Voltage  1.175v        1.175v            1.162v          1.175v

Throwing in the factory overclocked cards from our companion roundup, our core overclocking experience was remarkably consistent. The difference in the max boost clock between the slowest and fastest card was a mere 41MHz, with the Zotac card standing out as the lone high outlier among our cards. This comes as no great surprise, since all of these launch cards use the NVIDIA reference PCB, so there's little room at the moment for overclocking innovation.

Memory overclocking is as volatile as ever, with a 400MHz spread between our best and worst cards. Again with the use of the reference PCB (and the same Samsung RAM), memory overclocking is entirely the luck of the draw.

For the moment at least, GTX 660 overclocking looks to be on a level playing field, as all partners are using the same PCB. For overclockers the choice of card will come down to pricing, preferred cooler, and any vendor preference.

The end result of all of this is that at best we're seeing 100MHz overclocks (going by the max boost clock), which represents roughly a 10% overclock. Coupling this with a good memory overclock and the 10% increase in the power target results in around a 10% increase in performance, which isn't shabby, but it's also the same kind of shallow overclocking potential that we've seen on cards like the GTX 670 and GTX 660 Ti. All told, the GTX 660 isn't a poor overclocker – 10% more performance for free is nothing to sneeze at – but it's not going to endear itself to hardware overclockers who like to chase 20% or more.
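As a quick sanity check on that ~10% figure, here is the arithmetic using the reference card's numbers from the overclocking table above (a rough sketch only; how often the card actually sustains its max boost clock depends on the power target):

```python
# Reference GTX 660 figures from the overclocking table above.
stock_max_boost_mhz = 1084
oc_max_boost_mhz = 1185

# Relative overclock, judged by the max boost clock.
oc_percent = (oc_max_boost_mhz - stock_max_boost_mhz) / stock_max_boost_mhz * 100
print(f"Max boost overclock: {oc_percent:.1f}%")  # ~9.3%, i.e. roughly 10%
```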

Moving on to our performance charts, we’re going to once again start with power, temperature, and noise, before moving on to gaming performance. Due to popular demand we’ll also be including overclocking results with just a 110% power target so that you can see the impact of adjusting the power target separately from the clock offsets.

With a 110% power target we should be seeing an 11W-14W increase in power consumption, which is indeed roughly what we're seeing at the wall after accounting for PSU inefficiencies. In Metro this is just enough of a difference to erase most of the GTX 660's power consumption advantage over the GTX 660 Ti, though the GTX 660 still draws marginally less power than the stock 7870. Meanwhile under OCCT the GTX 660 now draws more power than the 7870, but is still drawing over 20W less than the stock GTX 660 Ti.
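The expected delta is simple percentage math. A minimal sketch, using illustrative board-power figures chosen to be consistent with the 11W-14W range cited above (actual draw depends on whether the card sustains its full power target):

```python
# A 110% power target allows 10% more power on top of the board power limit.
power_target_increase = 0.10

# Illustrative board-power figures consistent with the 11W-14W delta above.
for board_power_w in (110, 140):
    extra_w = board_power_w * power_target_increase
    print(f"{board_power_w}W limit -> +{extra_w:.0f}W at a 110% power target")
```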

Our increased power consumption pushes temperatures up by another 2-3C. This is nothing a blower can’t handle, let alone an open-air cooler.

Interestingly enough, despite the increase in power consumption and temperatures, overclocking has almost no impact on noise. In the worst case scenario our GTX 660s increased their fan speeds by all of 2%, which increases noise by less than 1dB. As a result the amount of noise generated by the overclocked GTX 660 is practically identical to that generated by the stock GTX 660, and still below the reference 7870.

147 Comments

  • TemjinGold - Thursday, September 13, 2012 - link

    "For today’s launch we were able to get a reference clocked card, but in order to do so we had to agree not to show the card or name the partner who supplied the card."

    "Breaking open a GTX 660 (specifically, our EVGA 660 SC using the NV reference PCB),"

    So... didn't you just break your promise as soon as you made it AND show a pic of the card right underneath?
  • Sufo - Thursday, September 13, 2012 - link

Haha, shhhh!
  • Homeles - Thursday, September 13, 2012 - link

    Reading comprehension is such an endangered resource...

    If it's the super clocked edition, it's obviously not a reference clocked card.
  • jonup - Thursday, September 13, 2012 - link

Exactly my thoughts.
  • Ryan Smith - Thursday, September 13, 2012 - link

Homeles is correct. That's one of the cards from the launch roundup we're publishing later today. The reference-clocked GTX 660 we tested is not in any way pictured (I'm not quite that daft).
  • knutjb - Saturday, September 15, 2012 - link

No matter what you try to say, it still reads poorly. It should be blatantly obvious up front which card was which, and in the article it wasn't. I shouldn't have to dig when scanning through.

Also, you're picking it as the better choice over a card that has been out for how long, over slight differences. If nvidia really wanted me to say "wow, I'll buy it now," the card would have been no more than 199 at launch. 10 bucks under is the best they can do for being late to the party? And you bought the strategy. I have been equally disappointed with AMD when they have done the same thing.
  • MrSpadge - Sunday, September 16, 2012 - link

When reading AnandTech articles it's almost always safe to assume "he actually means what he's saying". Helps a lot with understanding.
  • thomp237 - Sunday, September 23, 2012 - link

So where is this roundup? We are now 10 days on from your comment and still no sign of a roundup.
  • CeriseCogburn - Friday, October 12, 2012 - link

    I have been wondering where all the eyefinity amd fragglers have gone to, and now I know what has occurred.

    Eyefinity is Dead.

These Kepler GPUs from nVidia can all do 4 monitors out of the box. Sure you might find a cheap version with 3 ports, whatever - that's the minority.

So all the amd fanboys have shut their fat traps about eyefinity, since nVidia surpassed them with A+ 4 easy monitors out of the box on all the Keplers.

    Thank you nVidia dearly for shutting the idiot pieholes of the amd fanboys.

    It took me this long to comment on this matter because nVidia fanboys don't all go yelling in unison sheep fashion about stuff like the little angry losing amd fans do.

    I have also noticed all the reviewers who are so used to being amd fan rave boys themselves almost never bring up multimonitor and abhor pointing out nVidia does 4 while amd only does 3 except in very expensive special cases.

    Yeah that's notable too. As soon as amd got utterly and totally crushed, it was no longer a central topic and central theme for all the review sites like this place.

    That 2 week Island vacation every year amd puts hundreds of these reporters on must be absolutely wonderful.
    I do hope they are treated very well and have a great time.
  • EchoOne - Wednesday, November 21, 2012 - link

LOL dude, the 660 Ti vs the 7950 in Eyefinity would get destroyed. I know this because my friend has a comp built with a Phenom 965BE at 4.2GHz and a 660 Ti with 16GB of RAM (I built this for him), and I have an FX 6100 at 4.7GHz, 16GB of RAM, and a 7950, and I run a triple monitor setup.

    https://www.youtube.com/watch?v=ZRXGveviruw&fe...

And his 660 Ti DIED trying to play the games at that res and at the same settings as I do. He had to take his graphics settings in, say, GTA4 down from max settings to about medium and high (I run very high).

So yeah, sure it can run a couple monitors out of the box, but same with Eyefinity. And trust me, their nvidia Surround is not as polished as Eyefinity. But they get props for trying.
