Power, Temperature, & Noise

As always, we’re wrapping up our look at a video card’s stock performance with a look at power, temperature, and noise. Unlike GTX 660 Ti, which was a harvested GK104 GPU, GTX 660 is based on the brand-new GK106 GPU, which will have interesting repercussions for power consumption. Scaling down a GPU by disabling functional units often has diminishing returns, so GK106 will effectively “reset” NVIDIA’s position as far as power consumption goes. As a reminder, NVIDIA’s power target here is a mere 115W, while their TDP is 140W.

GeForce GTX 660 Series Voltages
                 Load      Idle
Ref GTX 660 Ti   1.175v    0.975v
Ref GTX 660      1.175v    0.875v

Stopping to take a quick look at voltages, even with a new GPU nothing has changed on the load side. NVIDIA’s standard voltage remains 1.175v, the same as we’ve seen with GK104. However, idle voltages are much lower, with the GK106-based GTX 660 idling at 0.875v versus 0.975v for the various GK104 desktop cards. As we’ll see later, this is an important distinction for GK106.

Up next, before we jump into our graphs, let’s take a look at the average core clockspeed during our benchmarks. Because of GPU Boost, the boost clock alone doesn’t give us the whole picture, so we’ve recorded the clockspeed of our GTX 660 during each of our benchmarks at 1920x1200 and computed the average clockspeed over the duration of each benchmark.

GeForce GTX 600 Series Average Clockspeeds
                  GTX 670    GTX 660 Ti    GTX 660
Max Boost Clock   1084MHz    1058MHz       1084MHz
Crysis            1057MHz    1058MHz       1047MHz
Metro             1042MHz    1048MHz       1042MHz
DiRT 3            1037MHz    1058MHz       1054MHz
Shogun 2          1064MHz    1035MHz       1045MHz
Batman            1042MHz    1051MHz       1029MHz
Portal 2          988MHz     1041MHz       1033MHz
Battlefield 3     1055MHz    1054MHz       1065MHz
Starcraft II      1084MHz    N/A           1080MHz
Skyrim            1084MHz    1045MHz       1084MHz
Civilization V    1038MHz    1045MHz       1067MHz

With an official boost clock of 1033MHz and a maximum boost of 1084MHz on our GTX 660, we see clockspeeds regularly vary between the two points. For the most part our average clockspeeds are slightly ahead of NVIDIA’s boost clock, while in CPU-heavy workloads (Starcraft II, Skyrim), we can almost sustain the maximum boost clock. Ultimately this means that the GTX 660 is spending most of its time near or above 1050MHz, which will have repercussions when it comes to overclocking.
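
Since these averages are computed from sampled clockspeeds over the length of each run rather than from a single reading, here is a minimal Python sketch of that kind of time-weighted averaging. The log format and function name are our own illustration, not the actual tooling used for testing.

```python
# Minimal sketch: time-weighted average of a GPU core clock log.
# The (timestamp, clock) sample format is a hypothetical stand-in for
# whatever a monitoring tool exports during a benchmark run.
def average_clock(samples):
    """samples: list of (timestamp_s, clock_mhz) tuples, sorted by time."""
    if len(samples) < 2:
        return samples[0][1] if samples else 0.0
    total_time = 0.0
    weighted_sum = 0.0
    for (t0, clk), (t1, _) in zip(samples, samples[1:]):
        dt = t1 - t0            # how long this clock state was held
        total_time += dt
        weighted_sum += clk * dt
    return weighted_sum / total_time

# A card bouncing between boost bins during a run:
log = [(0, 1084), (10, 1058), (15, 1084), (30, 1084)]
print(f"Average clock: {average_clock(log):.0f}MHz")  # ~1080MHz
```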

Starting as always with idle power we immediately see an interesting outcome: the GTX 660 has the lowest idle power usage. And it’s not just a watt or two either, but rather a 6W (at the wall) difference between the GTX 660 and both the Radeon HD 7800 series and the rest of the GTX 600 series. All of the current 28nm GPUs have offered refreshingly low idle power usage, but with the GTX 660 we’re seeing NVIDIA cut into what was already a relatively low idle power usage and shrink it even further.

NVIDIA’s claim is that their idle power usage is around 5W, and while our testing methodology doesn’t allow us to isolate the video card, our results corroborate a near-5W value. The biggest factors here seem to be a combination of die size and idle voltage; we naturally see a reduction in idle power usage as we move to smaller GPUs with fewer transistors to power up, but NVIDIA’s idle voltage of 0.875v is also a full 0.1v below GK104’s idle voltage and 0.075v lower than the GK107-based GT 640’s idle voltage. The combination of these factors has pushed the GTX 660’s idle power usage to the lowest point we’ve ever seen for a GPU of this size, which is quite an accomplishment. Though I suspect the real payoff will be in the mobile space, as even with Optimus, mobile GPUs have to spend some time idling, which is another opportunity to save power.
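
As a rough illustration of why that idle voltage matters, dynamic power scales approximately with the square of voltage at a fixed clock. Treat the following as a back-of-the-envelope estimate under that assumption (leakage scales differently), not a measurement:

```python
# Back-of-the-envelope only: dynamic power scales roughly with V^2 at a
# fixed clock, so the lower idle voltage alone buys a sizable reduction.
v_gk104_idle = 0.975  # volts (GK104 desktop cards, from the table above)
v_gk106_idle = 0.875  # volts (GTX 660)

scale = (v_gk106_idle / v_gk104_idle) ** 2
print(f"Dynamic power scale factor: {scale:.3f}")        # ~0.805
print(f"Savings from voltage alone: {1 - scale:.1%}")    # ~19.5%
```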

At this point the only area in which NVIDIA doesn’t outperform AMD is in the so-called “long idle” scenario, where AMD’s ZeroCore Power technology gets to kick in. 5W is nice, but next-to-0W is even better.

Moving on to load power consumption, given NVIDIA’s focus on efficiency with the Kepler family it comes as no great surprise that NVIDIA continues to hold the lead when it comes to load power consumption. The gap between the GTX 660 and 7870 isn’t quite as large as the gap we saw between the GTX 680 and 7970, but NVIDIA still has a convincing lead here, with the GTX 660 consuming 23W less at the wall than the 7870. This puts the GTX 660 at around the power consumption of the 7850 (a card with a similar TDP) or the GTX 460. On AMD’s part, Pitcairn is a more petite (and less compute-heavy) part than Tahiti, which means AMD doesn’t face nearly the disparity here that they do at the high end.
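
Keep in mind that at-the-wall deltas overstate at-the-card deltas by the PSU’s efficiency. A quick conversion, under an assumed (not measured) efficiency range for a testbed power supply:

```python
# Rough conversion of the 23W at-the-wall gap to an at-the-card gap.
# The 85-90% PSU efficiency range is our assumption; we have not
# isolated the testbed PSU's actual efficiency at this load.
wall_delta_w = 23.0
for eff in (0.85, 0.88, 0.90):
    print(f"{eff:.0%} efficient PSU -> ~{wall_delta_w * eff:.0f}W at the cards")
```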

OCCT on the other hand has the GTX 660 and 7870 much closer, thanks to AMD’s much more aggressive throttling through PowerTune. This is one of the few times where the GTX 660 isn’t competitive with the 7850 in some fashion, though based on our experience, our Metro results are more meaningful than our OCCT results right now.

As for idle temperatures, there are no great surprises. A good blower can hit around 30C in our testbed, and that’s exactly what we see.

Temperatures under Metro look good enough, though despite their power advantage NVIDIA can’t keep up with the blower-equipped 7800 series. At the risk of spoiling our noise results, the 7800 series doesn’t do significantly worse on noise, so it’s not immediately clear why the GTX 660 is 6C warmer here. Our best guess is that the GTX 660’s cooler just isn’t quite up to the potential of the 7800 series’ reference cooler.

OCCT actually closes the gap between the 7870 and the GTX 660 rather than widening it, which is the opposite of what our earlier power data would lead us to expect. Reaching the mid-70s, neither card is particularly cool, but both are still well below their thermal limits, meaning there’s plenty of thermal headroom to play with.

Last but not least we have our noise tests, starting with idle noise. Again there are no surprises here; the GTX 660’s blower is solid, producing no more noise than any other standard blower we’ve seen.

While the GTX 660 couldn’t beat the 7870 on temperatures under Metro, it can certainly beat the 7870 when it comes to noise. The difference isn’t particularly great – just 1.4dB – but every bit adds up, and 47.4dB is historically very good for a blower. However, the use of a blower on the GTX 660 means that NVIDIA still can’t match the glory of the GTX 560 Ti or GTX 460; for that we’ll have to take a look at retail cards with open air coolers.

Similar to how AMD’s temperature lead eroded with OCCT, AMD’s slight loss in load noise testing becomes a much larger gap under OCCT. A 4.5dB difference is now solidly in the realm of noticeable, and further reinforces the fact that the GTX 660 is the quieter card under both normal and extreme situations.
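
To put those deltas in perspective, a dB difference maps onto a sound power ratio via the standard 10·log10 relationship. A quick check, assuming that relationship holds for our A-weighted readings:

```python
def power_ratio(delta_db):
    """Sound power ratio implied by a dB difference."""
    return 10 ** (delta_db / 10)

print(f"1.4dB (Metro): {power_ratio(1.4):.2f}x the sound power")  # ~1.38x
print(f"4.5dB (OCCT):  {power_ratio(4.5):.2f}x the sound power")  # ~2.82x
```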

We’ll be taking an in-depth look at some retail cards later today with our companion retail card article, but with those results already in hand we can say that despite the use of a blower the “reference” GTX 660 holds up very well. Open air coolers can definitely beat a blower with the usual drawbacks (that heat has to go somewhere), but when a blower is only hitting 47dB, you already have a fairly quiet card. So even a reference GTX 660 (as unlikely as it is to appear in North America) looks good all things considered.

