OC: Power, Temperature, & Noise

Our final task is to take a look at the overclocking capabilities of our GTX 660 Ti cards. Based on what we’ve seen thus far with the GTX 660 Ti, these factory overclocked parts are undoubtedly eating into overclocking headroom, so we’ll have to see just what we can get out of them. The very similar GTX 670 topped out at around 1260MHz for the max boost clock, and between 6.6GHz and 6.9GHz for the memory clock.

GeForce GTX 660 Ti Overclocking
                             EVGA GTX 660 Ti SC   Zotac GTX 660 Ti AMP   Gigabyte GTX 660 Ti OC
Shipping Core Clock          980MHz               1033MHz                1033MHz
Shipping Max Boost Clock     1150MHz              1175MHz                1228MHz
Shipping Memory Clock        6GHz                 6.6GHz                 6GHz
Shipping Max Boost Voltage   1.175v               1.175v                 1.175v

Overclock Core Clock         1030MHz              1033MHz                1083MHz
Overclock Max Boost Clock    1200MHz              1175MHz                1278MHz
Overclock Memory Clock       6.5GHz               6.8GHz                 6.6GHz
Overclock Max Boost Voltage  1.175v               1.175v                 1.175v

As we suspected, starting with factory overclocked cards isn’t helping here. Our Zotac card wouldn’t accept any kind of meaningful GPU core overclock, so it shipped practically as fast as it could go. We were able to squeeze out another 200MHz on the memory clock though.

Meanwhile our EVGA and Gigabyte cards fared slightly better. We could push another 50MHz out of their GPU clocks, bringing us to a max boost clock of 1200MHz on the EVGA card and 1278MHz on the Gigabyte card. Memory overclocking was similarly consistent; we were able to hit 6.5GHz on the EVGA card and 6.6GHz on the Gigabyte card.

Altogether these are sub-5% GPU overclocks and at best 10% memory overclocks, which all things considered is fairly limited headroom. The good news is that reference-clocked cards should fare better since their headroom has not already been consumed by factory overclocking, but binning also means the best chips are going out in factory overclocked models.
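For reference, those percentages fall straight out of the clocks in the table above. The snippet below is a minimal Python sketch of our own (not part of our test suite) that computes the GPU boost and memory gains per card:

# Sanity check on the overclock percentages quoted above, using the
# shipping and overclocked max boost/memory clocks from the table.

cards = {
    # card: (ship boost MHz, OC boost MHz, ship mem GHz, OC mem GHz)
    "EVGA GTX 660 Ti SC":     (1150, 1200, 6.0, 6.5),
    "Zotac GTX 660 Ti AMP":   (1175, 1175, 6.6, 6.8),
    "Gigabyte GTX 660 Ti OC": (1228, 1278, 6.0, 6.6),
}

for card, (boost0, boost1, mem0, mem1) in cards.items():
    gpu_gain = (boost1 - boost0) / boost0 * 100
    mem_gain = (mem1 - mem0) / mem0 * 100
    print(f"{card}: GPU +{gpu_gain:.1f}%, memory +{mem_gain:.1f}%")

# Output:
# EVGA GTX 660 Ti SC: GPU +4.3%, memory +8.3%
# Zotac GTX 660 Ti AMP: GPU +0.0%, memory +3.0%
# Gigabyte GTX 660 Ti OC: GPU +4.1%, memory +10.0%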

Moving on to our performance charts, we’ll once again start with power, temperature, and noise before getting to gaming performance.

Unsurprisingly, given the small power target difference between the GTX 670 and the GTX 660 Ti, any kind of overclocking that involves raising the power target quickly pushes power consumption past that of the GTX 670. How much depends on the test and the card, with the higher power target Gigabyte card starting at a particular disadvantage here, as its power consumption ends up rivaling that of the GTX 680.

We also see the usual increase in load temperatures due to the increased power consumption. The Zotac and Gigabyte cards fare well enough due to their open air coolers, but the blower-type EVGA card is about as high as we want to go at 80C under OCCT.

Last but not least, looking at noise levels we can see an increase similar to the temperature increases we just saw. For the Zotac and EVGA cards noise levels are roughly equal to those of the reference GTX 680, which will be important to remember when we’re looking at performance. Meanwhile the Gigabyte card continues to shine in these tests thanks to its oversized cooler; even OCCT can only push it to 46.8dB.

Comments

  • rarson - Friday, August 17, 2012

    I might have said that ten years ago, but when I read stuff like "the GTX 680 marginalized the Radeon HD 7970 virtually overnight," I wonder what kind of bizarro universe I've stumbled into.
  • CeriseCogburn - Sunday, August 19, 2012

    That's the sales numbers referred to there, rarson - maybe you should drop the problematic amnesia (I know you can't, since having no clue isn't amnesia), but as a reminder, amd's crap card was $579 and beyond, and nVidia dropped in the 680 at $499 across the board...
    Amd was losing sales in rapid fashion, and the 680 was piling up so many backorders and pre-purchases that resellers were begging for relief, and a few reviewers were hoping something could be done to stem the immense backorders for the 680.
    So:
    "the GTX 680 marginalized the Radeon HD 7970 virtually overnight"
    That's the real world, RECENT HISTORY, that real bizarro world you don't live in, don't notice, and most certainly will have a very difficult time admitting exists.
    Have a nice day.
  • Biorganic - Saturday, August 18, 2012

    Go look up Bias in a dictionary instead of flinging around insults like a child. When the adults converse amongst themselves they like to Add things to the actual conversation, not unnecessarily degrade people. Thanks! @$$-O
  • Jamahl - Thursday, August 16, 2012

    The point I was making was that Nvidia has seeded overclocked cards to the majority of the tech press, while you had a go at AMD for their 7950 boost.

    After all the arguments and nonsense over the 7950 boost, hardly anyone benchmarked it but still plenty went ahead and benched the overclocked cards sent by Nvidia. Two AMD partners have shown they are releasing the 7950 boost edition asap, prompting a withdrawal of the criticisms from another nvidia fansite, hardwarecanucks.com

    So again I ask, AMD's credibility? The only credibility at stake is the reviewers who continually bend over to suit Nvidia. Nvidia has no credibility to lose.
  • silverblue - Friday, August 17, 2012

    I'm afraid I have to back you up on this one. NVIDIA released not one, not two but THREE GT 640s, and I think people have forgotten about that one. AMD have replaced the 7950 BIOS and as such have overclocked it to the performance level where it probably should've been to start with (the gap between 7950 and 7970 was always far more than the one between 7870 and 7950).

    Yes, AMD should've given it a new name - 7950 XT as I said somewhere recently - but it's not even two-thirds as bad as the GT 640 fiasco. At least this time, we're talking two models separated only by a BIOS change and the consequently higher power usage, not two separate GPU generations with vastly different clocks, shader counts, memory types and so on.

    If I'm wrong, I'm wrong, however I don't understand how AMD's GPU division's credibility could be damaged by any of this. Feel free to educate me. :)
  • CeriseCogburn - Sunday, August 19, 2012

    For your education and edification: amd failed in their card release by clocking it too low, because they had lost the power usage war (and they knew it), and by charging way too much on release.
    They suck, and their cred is ZERO, because of this.
    Now it not only harmed amd, it harmed all of us, and all their vendor partners; we all got screwed and all lost money because of amd's greed and incompetence.
    Now amd, in a desperate panic, and far too long after the immense and debilitating blunder that also left all their shareholders angry (ahem), after dropping the prices in two or three steps and adding 3 games to try to quell the massive kicking from their falling sales to nVidia...
    FINALLY pulled their head out of its straitjacket, well, halfway out, and issued permission for a GE version.
    Now, maybe all you amd fans have been doing much and very excessive lying about 78xx/79xx OC capabilities, or amd is just dumb as rocks and quite literally dangerous to themselves, the markets, their partners, all of us.
    I think it's a large heaping of BOTH.
    So there we have it - amd cred is where amd fanboy cred is - at the bottom of the barrel of slime.
  • Galidou - Sunday, August 19, 2012

    Anyway, with you AMD fails, always failed and will continue to fail at everything... I don't know if you think people will read your posts like religious madmen and believe them 100%; you're making it so exaggerated that it's barely readable.

    The words nazi and such come back so often when you go on the madman route that it's a wonder if anyone gives you any credibility. A little sad, because you have nice arguments; you just display them surrounded by so much hate, it's hard to give you any credit for them.

    We do exaggerate AMD's performance just for the sake of being fanboys, but not to the point of saying such debilitating stuff like you're so good at. Not to the point of totally destroying Nvidia and saying it's worth NOTHING like you do for AMD. I may lean a little on AMD's side because for my money they gave me more performance from the radeon 4xxx to the 6xxx series. I won't forget my 8800gt either; that was a delight for the price too. But I can reckon when a video card wins at EVERYTHING and is doing WONDERS, and none of that is happening now; it's a mixed bag of feelings between overclockability, optimization on certain games, etc...

    When the 8800gt and radeon 4870 came out, there was nothing people could say, just nothing; for the price, they were wonders, trampling over anything before and after, but at the same time you said they were mistakes because they were not greedy enough moves.

    Wanna speak about greed? Why is Nvidia so rich? You defend the richest video card maker in history but you accuse the others of being greedy; society is built on greed, go blame others. Couldn't they sell their GPUs at lower prices to kill AMD and be less greedy? No. If AMD dies, you'll see greed and $800 GPUs. Speak about greed.
  • CeriseCogburn - Thursday, August 23, 2012

    Didn't read your loon spiel, again - not even a gloss, just part of the 1st sentence.
    I won't tell you to shut up or change what you say, because I'm not a crybaby like you.
    AMD sucks, they need help, and they only seem to fire more people.
  • Galidou - Thursday, August 23, 2012

    To date your best argument, the one that repeats itself, is "AMD sucks", which is something you learn to say when you're a kid. You're not a crybaby? Ohhh, that's new; you keep crying more than anyone else I've seen. TheJian might be a fanboy, but you're more related to the fanatic side of the thing.

    Still, they are the richest video card maker in history, but they still try to manipulate opinions like every company does. Why? If their product is so good and perfect, why do they have to manipulate? I hear you already saying something like: it's because AMD sucks, they suck so much that Nvidia has to make 'em suck even more by manipulating the stoopid reviewers, because the world is against Nvidia and I'm their Crusader.... good job.
  • CeriseCogburn - Thursday, August 23, 2012

    Yes, I've never presented a crapload of facts nor a single argument of note, and your head is a bursting purple strawberry too, mr whiner.
