Power, Temperature, & Noise

As always, we’re wrapping up our look at a video card’s stock performance with a look at power, temperature, and noise. Unlike the GTX 660 Ti, which was a harvested GK104 GPU, the GTX 660 is based on the brand-new GK106 GPU, and that has interesting repercussions for power consumption. Scaling down a GPU by disabling functional units often has diminishing returns, so GK106 will effectively “reset” NVIDIA’s position as far as power consumption goes. As a reminder, NVIDIA’s power target here is a mere 115W, while their TDP is 140W.

GeForce GTX 660 Series Voltages
  Ref GTX 660 Ti Load   Ref GTX 660 Ti Idle   Ref GTX 660 Load   Ref GTX 660 Idle
  1.175v                0.975v                1.175v             0.875v

Stopping to take a quick look at voltages, even with a new GPU the load voltage hasn’t changed. NVIDIA’s standard voltage remains 1.175v, the same as we’ve seen with GK104. Idle voltages, however, are much lower, with the GK106-based GTX 660 idling at 0.875v versus 0.975v for the various GK104 desktop cards. As we’ll see later, this is an important distinction for GK106.

Up next, before we jump into our graphs, let’s take a look at the average core clockspeed during our benchmarks. Because GPU Boost means the boost clock alone doesn’t give us the whole picture, we’ve recorded the clockspeed of our GTX 660 during each of our benchmarks at 1920x1200 and computed the average clockspeed over the duration of each run.

GeForce GTX 600 Series Average Clockspeeds
                   GTX 670    GTX 660 Ti   GTX 660
Max Boost Clock    1084MHz    1058MHz      1084MHz
Crysis             1057MHz    1058MHz      1047MHz
Metro              1042MHz    1048MHz      1042MHz
DiRT 3             1037MHz    1058MHz      1054MHz
Shogun 2           1064MHz    1035MHz      1045MHz
Batman             1042MHz    1051MHz      1029MHz
Portal 2           988MHz     1041MHz      1033MHz
Battlefield 3      1055MHz    1054MHz      1065MHz
Starcraft II       1084MHz    N/A          1080MHz
Skyrim             1084MHz    1045MHz      1084MHz
Civilization V     1038MHz    1045MHz      1067MHz

With an official boost clock of 1033MHz and a maximum boost of 1084MHz on our GTX 660, we see clockspeeds regularly vary between the two points. For the most part our average clockspeeds are slightly ahead of NVIDIA’s boost clock, while in CPU-heavy workloads (Starcraft II, Skyrim), we can almost sustain the maximum boost clock. Ultimately this means that the GTX 660 is spending most of its time near or above 1050MHz, which will have repercussions when it comes to overclocking.
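
For reference, the averaging itself is trivial; the sketch below (Python) shows the kind of calculation involved, assuming a simple CSV log of periodically sampled core clocks. The log format and the 'core_clock_mhz' field name are purely illustrative, not the output of any particular logging tool.

```python
import csv

def average_clockspeed(log_path):
    """Average the periodically sampled core clock (in MHz) over a benchmark run.

    Assumes an illustrative CSV log with one sample per row and a
    'core_clock_mhz' column; substitute whatever your logging tool produces.
    """
    samples = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            samples.append(float(row["core_clock_mhz"]))
    return sum(samples) / len(samples)

# e.g. average_clockspeed("crysis_1920x1200.csv") -> ~1047 for our GTX 660 Crysis run
```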

Starting as always with idle power, we immediately see an interesting outcome: the GTX 660 has the lowest idle power usage of the bunch. And it’s not just one or two watts either, but rather a 6W (at the wall) difference between the GTX 660 and both the Radeon HD 7800 series and the rest of the GTX 600 series. All of the current 28nm GPUs have offered refreshingly low idle power usage, but with the GTX 660 we’re seeing NVIDIA cut into what was already relatively low idle power usage and shrink it even further.

NVIDIA’s claim is that their idle power usage is around 5W, and while our testing methodology doesn’t allow us to isolate the video card, our results corroborate a near-5W value. The biggest factors here seem to be a combination of die size and idle voltage; we naturally see a reduction in idle power usage as we move to smaller GPUs with fewer transistors to power up, but NVIDIA’s idle voltage of 0.875v is also 0.1v below GK104’s idle voltage and 0.075v lower than the GT 640 (GK107)’s idle voltage. The combination of these factors has pushed the GTX 660’s idle power usage to the lowest point we’ve ever seen for a GPU of this size, which is quite an accomplishment. Though I suspect the real payoff will be in the mobile space, as even with Optimus, mobile GPUs have to spend some time idling, which is another opportunity to save power.
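
As a quick sanity check on that figure, a 6W difference at the wall works out to roughly 5W at the card once power supply losses are backed out. The arithmetic below assumes a PSU efficiency of about 85% at idle-level loads; that efficiency figure is our own assumption for illustration, not a measured value.

```python
# Back out an approximate card-level power delta from the measured wall delta.
wall_delta_w = 6.0       # measured difference at the wall (W)
psu_efficiency = 0.85    # assumed PSU efficiency at idle-level loads (illustrative)

card_delta_w = wall_delta_w * psu_efficiency
print(f"~{card_delta_w:.1f}W at the card")   # ~5.1W, in line with NVIDIA's ~5W claim
```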

At this point the only area in which NVIDIA doesn’t outperform AMD is in the so-called “long idle” scenario, where AMD’s ZeroCore Power technology gets to kick in. 5W is nice, but next-to-0W is even better.

Moving on to load power consumption, given NVIDIA’s focus on efficiency with the Kepler family it comes as no great surprise that they continue to hold the lead here. The gap between the GTX 660 and 7870 isn’t quite as large as the gap we saw between the GTX 680 and 7970, but NVIDIA still has a convincing lead, with the GTX 660 consuming 23W less at the wall than the 7870. This puts the GTX 660 at around the power consumption of the 7850 (a card with a similar TDP) or the GTX 460. On AMD’s part, Pitcairn is a more petite (and less compute-heavy) part than Tahiti, which means AMD doesn’t face nearly the same disparity that they do at the high end.

OCCT on the other hand has the GTX 660 and 7870 much closer, thanks to AMD’s much more aggressive throttling through PowerTune. This is one of the few times where the GTX 660 isn’t competitive with the 7850 in some fashion, though based on our experience our Metro results are more meaningful than our OCCT results right now.

As for idle temperatures, there are no great surprises. A good blower can hit around 30C in our testbed, and that’s exactly what we see.

Temperatures under Metro look good enough, though despite their power advantage NVIDIA can’t keep up with the blower-equipped 7800 series. At the risk of spoiling our noise results, the 7800 series doesn’t do significantly worse for noise, so it’s not immediately clear why the GTX 660 is 6C warmer here. Our best guess would be that the GTX 660’s cooler just isn’t quite up to the potential of the 7800 series’ reference cooler.

OCCT actually closes the gap between the 7870 and the GTX 660 rather than widening it, which is the opposite of what we would expect given our earlier temperature data. Reaching the mid-70s, neither card is particularly cool, but both are still well below their thermal limits, meaning there’s plenty of thermal headroom to play with.

Last but not least we have our noise tests, starting with idle noise. Again there are no surprises here; the GTX 660’s blower is solid, producing no more noise than any other standard blower we’ve seen.

While the GTX 660 couldn’t beat the 7870 on temperatures under Metro, it can certainly beat the 7870 when it comes to noise. The difference isn’t particularly great – just 1.4dB – but every bit adds up, and 47.4dB is historically very good for a blower. However the use of a blower on the GTX 660 means that NVIDIA still can’t match the glory of the GTX 560 Ti or GTX 460; for that we’ll have to take a look at retail cards with open air coolers.

Similar to how AMD’s temperature lead eroded with OCCT, AMD’s slight loss in load noise testing becomes a much larger gap under OCCT. A 4.5dB difference is now solidly in the realm of noticeable, and further reinforces the fact that the GTX 660 is the quieter card under both normal and extreme situations.
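
To put those numbers in perspective, decibels are logarithmic, so the jump from a 1.4dB gap to a 4.5dB gap is larger than it looks. The quick calculation below is a rough illustration; it converts the dB(A) deltas to sound power ratios to show why the latter is plainly noticeable.

```python
# A delta of X dB corresponds to a 10**(X/10) ratio in sound power.
for delta_db in (1.4, 4.5):
    print(f"{delta_db}dB -> {10 ** (delta_db / 10):.2f}x the sound power")
# 1.4dB -> 1.38x, 4.5dB -> 2.82x
```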

We’ll be taking an in-depth look at some retail cards later today with our companion retail card article, but with those results already in hand we can say that despite the use of a blower the “reference” GTX 660 holds up very well. Open air coolers can definitely beat a blower with the usual drawbacks (that heat has to go somewhere), but when a blower is only hitting 47dB, you already have a fairly quiet card. So even a reference GTX 660 (as unlikely as it is to appear in North America) looks good all things considered.
