Power, Temperature, & Noise

As always, we’re wrapping up our look at a video card’s stock performance with a look at power, temperature, and noise. Unlike GTX 660 Ti, which was a harvested GK104 GPU, GTX 660 is based on the brand-new GK106 GPU, which will have interesting repercussions for power consumption. Scaling down a GPU by disabling functional units often has diminishing returns, so GK106 will effectively “reset” NVIDIA’s position as far as power consumption goes. As a reminder, NVIDIA’s power target here is a mere 115W, while their TDP is 140W.

GeForce GTX 660 Series Voltages
  Ref GTX 660 Ti Ref GTX 660
Load 1.175v 1.175v
Idle 0.975v 0.875v

Stopping to take a quick look at voltages, even with a new GPU nothing has changed on the load side. NVIDIA’s standard voltage remains 1.175v, the same as we’ve seen with GK104. However, idle voltages are much lower, with the GK106-based GTX 660 idling at 0.875v versus 0.975v for the various GK104 desktop cards. As we’ll see later, this is an important distinction for GK106.

Up next, before we jump into our graphs, let’s take a look at the average core clockspeed during our benchmarks. Because of GPU Boost, the boost clock alone doesn’t give us the whole picture, so we’ve recorded the clockspeed of our GTX 660 during each of our benchmarks at 1920x1200 and computed the average clockspeed over the duration of each benchmark.

GeForce GTX 600 Series Average Clockspeeds
  GTX 670 GTX 660 Ti GTX 660
Max Boost Clock 1084MHz 1058MHz 1084MHz
Crysis 1057MHz 1058MHz 1047MHz
Metro 1042MHz 1048MHz 1042MHz
DiRT 3 1037MHz 1058MHz 1054MHz
Shogun 2 1064MHz 1035MHz 1045MHz
Batman 1042MHz 1051MHz 1029MHz
Portal 2 988MHz 1041MHz 1033MHz
Battlefield 3 1055MHz 1054MHz 1065MHz
Starcraft II 1084MHz N/A 1080MHz
Skyrim 1084MHz 1045MHz 1084MHz
Civilization V 1038MHz 1045MHz 1067MHz

With an official boost clock of 1033MHz and a maximum boost of 1084MHz on our GTX 660, we see clockspeeds regularly vary between the two points. For the most part our average clockspeeds are slightly ahead of NVIDIA’s boost clock, while in CPU-heavy workloads (Starcraft II, Skyrim), we can almost sustain the maximum boost clock. Ultimately this means that the GTX 660 is spending most of its time near or above 1050MHz, which will have repercussions when it comes to overclocking.
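
For anyone curious how those averages are derived, the sketch below shows the basic idea. It assumes a plain-text log of core clock samples taken at a fixed interval over a benchmark run; the log format and file name are hypothetical stand-ins rather than our actual logging setup.

    # Minimal sketch: average a GPU core clock log over a benchmark run.
    # Assumes one MHz reading per line, sampled at a fixed interval while
    # the benchmark was running (hypothetical log format).
    def average_clockspeed(log_path):
        with open(log_path) as f:
            samples = [float(line) for line in f if line.strip()]
        if not samples:
            raise ValueError("no clockspeed samples found")
        return sum(samples) / len(samples)

    # e.g. an average near the 1033MHz boost clock, below the 1084MHz cap
    print(f"{average_clockspeed('gtx660_crysis_clocks.log'):.0f} MHz")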

Starting as always with idle power, we immediately see an interesting outcome: the GTX 660 has the lowest idle power usage. And it’s not just by a watt or two, either, but rather a 6W (at the wall) difference between the GTX 660 and both the Radeon HD 7800 series and the rest of the GTX 600 series. All of the current 28nm GPUs have offered refreshingly low idle power usage, but with the GTX 660 we’re seeing NVIDIA cut into what was already relatively low idle power usage and shrink it even further.

NVIDIA’s claim is that their idle power usage is around 5W, and while our testing methodology doesn’t allow us to isolate the video card, our results corroborate a near-5W value. The biggest factors here seem to be a combination of die size and idle voltage; we naturally see a reduction in idle power usage as we move to smaller GPUs with fewer transistors to power up, but NVIDIA’s idle voltage of 0.875v is also nearly 0.1v below GK104’s idle voltage and 0.075v lower than the GT 640 (GK107)’s idle voltage. The combination of these factors has pushed the GTX 660’s idle power usage to the lowest point we’ve ever seen for a GPU of this size, which is quite an accomplishment. Though I suspect the real payoff will be in the mobile space, as even with Optimus, mobile GPUs have to spend some time idling, which is another opportunity to save power.
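
As a rough illustration of why that idle voltage matters, dynamic (switching) power in CMOS scales roughly with the square of voltage. The calculation below is only a first-order sketch (it ignores leakage, clock gating, and frequency differences), but it shows what a ~0.1v drop is worth on paper.

    # First-order approximation: dynamic power scales with V^2 at a fixed clock.
    # Real idle power also depends heavily on leakage and power gating, so
    # treat this as illustrative only.
    gk104_idle_v = 0.975
    gk106_idle_v = 0.875

    relative_power = (gk106_idle_v / gk104_idle_v) ** 2
    print(f"Relative switching power: {relative_power:.2f}")  # ~0.81, i.e. ~19% lower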

At this point the only area in which NVIDIA doesn’t outperform AMD is in the so-called “long idle” scenario, where AMD’s ZeroCore Power technology gets to kick in. 5W is nice, but next-to-0W is even better.

Moving on to load power consumption, given NVIDIA’s focus on efficiency with the Kepler family it comes as no great surprise that NVIDIA continues to hold the lead here. The gap between the GTX 660 and 7870 isn’t quite as large as the gap we saw between the GTX 680 and 7970, but NVIDIA still has a convincing lead, with the GTX 660 consuming 23W less at the wall than the 7870. This puts the GTX 660 at around the power consumption of the 7850 (a card with a similar TDP) or the GTX 460. On AMD’s part, Pitcairn is a more petite (and less compute-heavy) part than Tahiti, which means AMD doesn’t face nearly the same disparity as they do at the high end.
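
One caveat worth keeping in mind when interpreting these numbers is that they’re measured at the wall, so PSU efficiency sits between the reading and what the card itself draws. As a hedged example (the efficiency figure is an assumption, not something we measured), a 23W gap at the wall works out to roughly 20W on the DC side:

    # Rough conversion of an at-the-wall power delta to a DC-side delta.
    # The PSU efficiency is an assumed figure for illustration only.
    wall_delta_w = 23.0
    psu_efficiency = 0.87  # assumed; typical for a quality PSU at this load

    dc_delta_w = wall_delta_w * psu_efficiency
    print(f"~{dc_delta_w:.0f} W difference on the DC side")  # ~20 W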

OCCT on the other hand has the GTX 660 and 7870 much closer together, thanks to AMD’s far more aggressive throttling through PowerTune. This is one of the only times where the GTX 660 isn’t competitive with the 7850 in some fashion, though based on our experience our Metro results are more meaningful than our OCCT results right now.

As for idle temperatures, there are no great surprises. A good blower can hit around 30C in our testbed, and that’s exactly what we see.

Temperatures under Metro look good enough, though despite their power advantage NVIDIA can’t keep up with the blower-equipped 7800 series. At the risk of spoiling our noise results, the 7800 series doesn’t do significantly worse on noise, so it’s not immediately clear why the GTX 660 is 6C warmer here. Our best guess is that the GTX 660’s cooler just isn’t quite up to the potential of the 7800 series’ reference cooler.

OCCT actually closes the gap between the 7870 and the GTX 660 rather than widening it, which is the opposite of what we would expect given our earlier temperature data. Reaching the mid-70s, neither card is particularly cool, but both are still well below their thermal limits, meaning there’s plenty of thermal headroom to play with.

Last but not least we have our noise tests, starting with idle noise. Again there are no surprises here; the GTX 660’s blower is solid, producing no more noise than any other standard blower we’ve seen.

While the GTX 660 couldn’t beat the 7870 on temperatures under Metro, it can certainly beat the 7870 when it comes to noise. The difference isn’t particularly great – just 1.4dB – but every bit adds up, and 47.4dB is historically very good for a blower. However the use of a blower on the GTX 660 means that NVIDIA still can’t match the glory of the GTX 560 Ti or GTX 460; for that we’ll have to take a look at retail cards with open air coolers.

Similar to how AMD’s temperature lead eroded with OCCT, AMD’s slight loss in load noise testing becomes a much larger gap under OCCT. A 4.5dB difference is now solidly in the realm of noticeable, and further reinforces the fact that the GTX 660 is the quieter card under both normal and extreme situations.
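
To put those decibel figures in perspective, dB is a logarithmic scale, so small-looking differences compound quickly. The quick conversion below uses standard decibel math (nothing specific to our meter or methodology) to show the corresponding sound power ratios.

    # Convert an SPL difference in dB to the corresponding sound power ratio.
    # Standard decibel math: ratio = 10 ** (delta_db / 10).
    def db_to_power_ratio(delta_db):
        return 10 ** (delta_db / 10)

    print(f"1.4 dB -> {db_to_power_ratio(1.4):.2f}x")  # ~1.4x the sound power
    print(f"4.5 dB -> {db_to_power_ratio(4.5):.2f}x")  # ~2.8x the sound power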

We’ll be taking an in-depth look at some retail cards later today with our companion retail card article, but with those results already in hand we can say that despite the use of a blower the “reference” GTX 660 holds up very well. Open air coolers can definitely beat a blower with the usual drawbacks (that heat has to go somewhere), but when a blower is only hitting 47dB, you already have a fairly quiet card. So even a reference GTX 660 (as unlikely as it is to appear in North America) looks good all things considered.

147 Comments

  • chizow - Sunday, September 16, 2012 - link

    But SB had no impact on 980X pricing; Intel is very deliberate in their pricing and EOL schedules, so these parts do not lose much value before they gracefully go EOL. Besides, the 980X still offered a benefit over SB with 6 cores, something that was not replaced until SB-E over a year later. Even then, there was plenty of indication before Intel launched their SB-E platform to mitigate any sense of buyer's remorse.

    As for the German site you're critical of, do you need to read German to be able to understand numbers and bar graphs? Not to mention Computerbase is internationally acclaimed as one of the best resources for PC-related topics. I linked their site because they were one of the first to use such easy-to-read performance summaries and even break them down by resolution and settings.

    If you prefer, since you listed it, TechPowerUp has similar listings; of course, they only copied the performance summaries after ground-breaking sites like Computerbase had been using them for some years.

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_...

    As you can see, even at your lower 1680x1050 resolution, the GTX 280 still handily outclasses the 4870 512MB by ~21%, so I guess there goes your theory? I've seen more recent benchmarks in games like Skyrim or any title with 4xAA or high-res textures where the gap widens as the 4870 chokes on the size of the requisite framebuffer.

    As for your own situation and your wife's graphics card, TigerDirect has the same GTX 670 GC card I bought for $350 after rebate. Not as good as the deal I got, but again, there are certainly new ones out there to be had for cheap. I'm personally going to wait another week or two to see if the GTX 660 price and its impact on AMD prices (another round of cuts expected next week, LOL) forces Nvidia to drop their prices, as I also need to buy another GPU for the gf.
  • Galidou - Sunday, September 16, 2012 - link

    And then once again you're off the subject; you send me the relative performance of the GTX 280 to the 4870 two years after their launch... We were speaking of the rebate issued in 2008, for the performance it had back then. All the links I sent you were from 4 years ago, and there's a reason for that: they show the 280 with a 10% difference in performance on average, and sometimes losing big time.

    Sure, the 1GB of RAM on the GTX 280 paid off in the long run. But we're speaking of the 2008 situation that forced them to issue rebates... which is, 1 month after its launch there's a part that performs ''similarly'' and costs less than HALF of its price. Computerbase is a good website, not the first time I've seen it, but it's the only one (and I never use only one website to base the REAL average on) that shows a 20% difference in performance for a part that still cost 115% more than the Radeon 4870... 115 FREAKING % within one month!! Nothing else to say.

    From a % point of view, the 7970 and GTX 680 were a REALLY different fight... and there were 3 months separating them, which is something we commonly see in the video card industry. Compare that to a part that's 115% more pricey and performs, let's say 15% better for your pleasure, averaged from all websites, than another part...

    ''980X still offered benefit over SB with 6 cores''

    Never said it didn't; that's why I specified: ''A gamer buying an i7 980x 1 week prior to the Sandy Bridge launch''. For the gamer there was NO benefit at ALL, it even lost to Sandy Bridge in games.

    But that's true, the 980x was out for a while, unlike the GTX 280, which barely had any time to keep its amazing lead over last-gen parts. That's the reason why they HAD to issue rebates, and the reason why I switched back to ATI from my 7800GT that died.

    The GTX 660 Ti is a fine card; I'm just worried about the ROPs for future proofing. She'll keep the card a VERY long time, so I regret not buying the 670 in the first place.
  • chizow - Sunday, September 16, 2012 - link

    No, I'm not off the subject; you're obviously basing your performance differences on a specific low-resolution setting that was important to you at launch, while I'm showing performance numbers at all resolutions that have only increased over time. The TPU link I provided was from a later review because, as I already stated, Computerbase was one of the first sites to use these aggregate performance numbers; only later did other sites like TPU follow suit. The GTX 280 was always the best choice for enthusiasts running higher resolutions and more demanding AA, and those differences only increased over time, just as I stated.

    Nvidia didn't feel they needed to drop the price any more than the initial cut on the 280 because after the cut they had a "similar" part, their own GTX 260, to compete with the 4870. Once again, AMD charged too little for their effort, but that has no bearing on the fact that the 280's launch price was *JUSTIFIED* based on relative performance to last-gen parts, unlike the situation with AMD's 28nm launch prices.

    As for the 680 and 7970? It just started driving home the fact that the 7970 was grossly overpriced, as the 680 offered 10-15% *more* performance at *10%* less price, which began the tumble in AMD prices we see today. I've also been critical of the GTX 680 though, as it only offers a ~35-40% increase in performance over the GTX 580 at 100% of the price, which is still the worst increase in the last 10 years for Nvidia, but still obviously better than the joke AMD launched with Tahiti. 115% performance for 110% of the price compared to last-gen after 18 months is an absolute debacle.

    As for the 980X and SB, again, the whole tangent is irrelevant. What would make it applicable would be if AMD had launched a Bulldozer variant that offered 90% of 980X performance at a $400 price point and forced Intel to drop prices and issue rebates, but that obviously didn't happen. You're comparing factors that Intel has complete control over, where in the case of the GTX 280, Nvidia obviously had no control over what AMD decided to do with the 4870.
  • chizow - Sunday, September 16, 2012 - link

    There were numerous other important resolutions that took advantage of the 280's larger frame buffer: 1600x1200, 1920x1080 and 1920x1200. While they were obviously not as prevalent as they are now, they were certainly not uncommon for anyone shopping for a $300+ or $500 video card.

    As for the 7970 asking price, are you kidding? I had 10x as many AMD fanboys saying the 7970 price was justified at launch (not just Rarson), and where do you get 70% more perf? It's 50% being generous.

    So you got 150% of the performance for 150% of the last-gen AMD price compared to the 6970; how is that a good deal? Or similarly, you got 120% of the performance for 110% of the price compared to the GTX 580, both last-gen parts.

    What you *SHOULD* expect is 150-200% performance for 100% of last-gen price, which is what the GTX 280 offered relative to 8800GTX, which is why I stated its pricing was justified.

    We've already covered the RV770; AMD could've easily priced it higher, even matched the GTX 260 price at $400, and still won, but they admittedly chose to go after market/mindshare instead after being beaten so badly by Nvidia since R600. Ever since then, they have clearly admitted their pricing mistake and have done everything in their power to slowly creep those prices upwards, culminating in the HUGE price increase we saw with Tahiti (see the 150% price increase from the 6970).
  • Galidou - Sunday, September 16, 2012 - link

    They went for a price related to the size of the die. The Radeon 4870 was less than half the size of the GTX 280, thus costing less than half as much to produce, thereby justifying the value of the chip by its size and not its performance.

    We all know why this is happening now: AMD was battling to get back into competition at the top, because they had abandoned that idea by building smaller dies with the HD 3xxx, leaving the high end to dual-chip boards and keeping their parts below $400. So I knew it had to happen one day or another.

    So if you're really into making a Wikipedia page about pricing schemes for video cards and expanding on it, go on. But in my opinion, as long as they do not sell us something worse than last gen for a higher price, I'll leave it to people to discern what they need. With all the competition in the market, it's hard to settle on anything that's a real winner; it's mostly based on personal usage and the money someone is willing to spend.

    If someone wants to upgrade his video card for $xxx, the only thing he has to do is look at the benchmarks for the game(s) he plays, not look at the price of the last generation of video cards to see if it's relevant to the price he pays now. Usually a good 30 minutes on 3-4 different websites, looking at the graphs and reading a little, will give you a good indication without sending you into the dust arguing about last-gen stuff compared to what's out now...

    You speak like you're trying to justify what people should buy now because of how things were priced in the past... Not working. You take your X bucks, check out benchies, go out there and buy the card you want, end of the freaking line. No video card interests you now? Wait until something does. Stop living in the past and get to another chapter ffs...
  • chizow - Sunday, September 16, 2012 - link

    Heh, Wikipedia page? Obviously it's necessary to set the record straight as revisionists like yourself are only going to emphasize the lowlights rather than the highlights.

    What you don't seem to understand is that transactions in a free market are not conducted in a vacuum, so the purchases of others do directly impact you if you are in the same market for these goods.

    It's important for reviewers to emphasize important factors like historical prices and changes in performance; otherwise it reinforces and encourages poor pricing practices like we've seen from AMD. It just sets a bad precedent.

    Obviously the market has reacted by rejecting AMD's pricing scheme, and as a result we've seen huge price drops over the last few months on their 28nm parts. All that's left is the ill will from the AMD early adopters. You think all those people who got burned are OK with all the price drops, and that AMD won't have to deal with those repercussions later?

    You want to get dismissive and condescending? If all it took was a good 30 minutes looking at 3-4 different websites to get it, why haven't people like you and rarson gotten it yet?
  • rarson - Tuesday, September 18, 2012 - link

    Yeah, it was rhetorical. It was also pointless and off-topic.

    "And even after the launch of 28nm, they still held their prices because there was no incentive or need to drop in price based on relative price and performance?"

    Your problem is that you just don't pay any goddamn attention. You have the attention span of a fruit fly. Let me refresh your memory. Kepler "launched" way back in March. All throughout April, the approximate availability of Kepler was zero. AMD didn't drop prices immediately because Kepler only existed in a few thousand parts. They dropped prices sometime around early May, when Kepler finally started appearing in decent quantities, because by then, the cards had already been on the market for FOUR MONTHS. See, even simple math escapes you.

    "You have no idea what you're talking about, stop typing."

    You're projecting and need to take your own advice.
  • CeriseCogburn - Thursday, November 29, 2012 - link

    AMD's 7970/7950 series supply finally became "available" on average a few days before Kepler launched.
    LOL
    You said something about amnesia?
    You, rarson, are a sad joke.
  • CeriseCogburn - Thursday, November 29, 2012 - link

    I love it when the penny-pinching amd fanboy who's been whining about 5 bucks in amd "big win!" pricing loses their mind and their cool and starts yapping about how new technology is expensive, achieving the highest amd price apologist marks one could hope for.
    LOL
    It's awesome seeing amd fanboys with zero cred and zero morality.
    The GTX570 made the 7850 and 7870 the moron's choice from the very first date of release.
    You cannot expect the truth from the amd fans. It never happens. If there's any exception to that hard and fast rule, it's a mistake, soon to be corrected, with a vengeance, as the brainwashing and emotional baggage is all powerful.
  • Galidou - Monday, September 17, 2012 - link

    While we're speaking about all that, the pricing of the 4870 and 7970, do you really know everything around it? It seems not when you argue; you just put everything on the shoulders of one company without knowing any of the background.

    Do you know that the price of the 4870 was already decided, and that it was set in correlation with Nvidia's 9000 series performance? That the 4870 was supposed to compete against $400 cards and not win, and the 4850 was supposed to compete against $300 cards and not win? You heard right, the 9k series, not the GTX 2xx.

    Even just before the cards came out, the results were already ''known''. The real numbers were quite different with the final product and the last driver enhancements. The performance of the card was actually a surprise; AMD never thought it was supposed to compete against the GTX 280, because they already knew the performance of the latter and that it was ''unattainable'' considering the size of the thing. Life is full of surprises, you know.

    Do you know that after that, Nvidia sued AMD/ATI over price fixing, asking for more communication between launches and fewer ''surprises''? Yes, they SUED them because they got a nasty surprise... AMD couldn't play with prices too much because they were already published by the media, and it was not supposed to compete against the GTX 2xx series. They had hoped that at $300 it would ''compete'' against the GTX 260 and not win against it, thus justifying the price of the things at launch. And here you are saying it's a mistake, launching insults at me, telling me I have low intelligence and showing you're a know-it-all...

    Do you know that this price fixing obligation is behind the pricing of the 7970? I bet AMD would have loved to price it at $400, and could have, but it would have resulted in another war and more suing from Nvidia, which wanted to price its GTX 680 at $500 three months later. So, to not break their consumers' joy, they communicate A LOT more than before so everyone is happy, except now it hurts AMD because you compare to last gen and it makes things seem like less of a deal. But with things back to normal, we will be able to compare against last gen after the refreshed Radeon 7xxx parts, and against the new gen after that.

    Nvidia the ''giant'' suing companies on the edge of ''extinction'', nice image indeed. Imagine rich bankers starting to sue people in the streets, and they are the ones you defend so vigorously. If they are that rich, do you really think the GTX 280 was well priced, even considering it was double the last generation?

    It just means one thing: they could sell their card for less money, but instead they sue the other company to take more money from our pockets. Nice image... very nice... But that doesn't mean I won't buy an Nvidia card, I just won't defend them as vigorously as you do... For every Goliath, we need a David, and I prefer David over Goliath... even if I admire the strength of the latter...
