Power, Temperature, & Noise

As always, we’re wrapping up our look at a video card’s stock performance by examining power, temperature, and noise. Unlike the GTX 660 Ti, which was a harvested GK104 GPU, the GTX 660 is based on the brand-new GK106 GPU, which will have interesting repercussions for power consumption. Scaling down a GPU by disabling functional units often has diminishing returns, so GK106 will effectively “reset” NVIDIA’s position as far as power consumption goes. As a reminder, NVIDIA’s power target here is a mere 115W, while their TDP is 140W.

GeForce GTX 660 Series Voltages
Ref GTX 660 Ti Load   Ref GTX 660 Ti Idle   Ref GTX 660 Load   Ref GTX 660 Idle
1.175v                0.975v                1.175v             0.875v

Stopping to take a quick look at voltages, even with a new GPU nothing has changed. NVIDIA’s standard voltage remains 1.175v, the same as we’ve seen with GK104. However, idle voltages are much lower, with the GK106-based GTX 660 idling at 0.875v versus 0.975v for the various GK104 desktop cards. As we’ll see later, this is an important distinction for GK106.

Up next, before we jump into our graphs, let’s take a look at the average core clockspeed during our benchmarks. Because of GPU Boost, the boost clock alone doesn’t give us the whole picture, so we’ve recorded the clockspeed of our GTX 660 during each of our benchmarks when running at 1920x1200 and computed the average clockspeed over the duration of each benchmark.

GeForce GTX 600 Series Average Clockspeeds
                  GTX 670    GTX 660 Ti    GTX 660
Max Boost Clock   1084MHz    1058MHz       1084MHz
Crysis            1057MHz    1058MHz       1047MHz
Metro             1042MHz    1048MHz       1042MHz
DiRT 3            1037MHz    1058MHz       1054MHz
Shogun 2          1064MHz    1035MHz       1045MHz
Batman            1042MHz    1051MHz       1029MHz
Portal 2          988MHz     1041MHz       1033MHz
Battlefield 3     1055MHz    1054MHz       1065MHz
Starcraft II      1084MHz    N/A           1080MHz
Skyrim            1084MHz    1045MHz       1084MHz
Civilization V    1038MHz    1045MHz       1067MHz

With an official boost clock of 1033MHz and a maximum boost of 1084MHz on our GTX 660, we see clockspeeds regularly vary between the two points. For the most part our average clockspeeds are slightly ahead of NVIDIA’s boost clock, while in CPU-heavy workloads (Starcraft II, Skyrim), we can almost sustain the maximum boost clock. Ultimately this means that the GTX 660 is spending most of its time near or above 1050MHz, which will have repercussions when it comes to overclocking.
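
For those curious about the methodology, the averaging itself is simple. Below is a minimal sketch in Python of how such an average might be computed from a clockspeed log; the CSV layout, column name, and file names are hypothetical, not the exact format of any particular logging tool.

```python
import csv

def average_clock(log_path):
    # Read every core clock sample from the log; each row is assumed to be
    # one sample taken at a fixed polling interval over the benchmark run.
    with open(log_path, newline="") as f:
        samples = [float(row["core_clock_mhz"]) for row in csv.DictReader(f)]
    # With uniform polling, the mean of the samples is the average clockspeed.
    return sum(samples) / len(samples)

# Hypothetical per-benchmark logs, one per game tested.
for log in ("crysis.csv", "metro.csv", "dirt3.csv"):
    print(f"{log}: {average_clock(log):.0f}MHz average")
```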

Starting as always with idle power, we immediately see an interesting outcome: the GTX 660 has the lowest idle power usage. And it’s not just one or two watts either, but rather a 6W (at the wall) difference between the GTX 660 and both the Radeon HD 7800 series and the rest of the GTX 600 series. All of the current 28nm GPUs have offered refreshingly low idle power usage, but with the GTX 660 we’re seeing NVIDIA take what was already relatively low idle power usage and shrink it even further.

NVIDIA’s claim is that their idle power usage is around 5W, and while our testing methodology doesn’t allow us to isolate the video card, our results corroborate a near-5W value. The biggest factors here seem to be a combination of die size and idle voltage; we naturally see a reduction in idle power usage as we move to smaller GPUs with fewer transistors to power up, but NVIDIA’s idle voltage of 0.875v is also nearly 0.1v below GK104’s idle voltage and 0.075v lower than the GT 640 (GK107)’s idle voltage. The combination of these factors has pushed the GTX 660’s idle power usage to the lowest point we’ve ever seen for a GPU of this size, which is quite an accomplishment. Though I suspect the real payoff will be in the mobile space, as even with Optimus, mobile GPUs have to spend some time idling, which is another opportunity to save power.
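
To put the idle voltage difference in rough perspective, dynamic power scales with the square of voltage. A back-of-the-envelope estimate, which deliberately ignores leakage and any differences in idle clocks, looks like this:

$$P_{dyn} \propto C V^2 f \qquad \left(\frac{0.875\,\text{v}}{0.975\,\text{v}}\right)^2 \approx 0.81$$

All else being equal, the lower idle voltage alone would trim dynamic power by roughly 19%, before GK106’s smaller die is even taken into account.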

At this point the only area in which NVIDIA doesn’t outperform AMD is in the so-called “long idle” scenario, where AMD’s ZeroCore Power technology gets to kick in. 5W is nice, but next-to-0W is even better.

Moving on to load power consumption, given NVIDIA’s focus on efficiency with the Kepler family it comes as no great surprise that NVIDIA continues to hold the lead here. The gap between the GTX 660 and the 7870 isn’t quite as large as the gap we saw between the GTX 680 and the 7970, but NVIDIA still has a convincing lead, with the GTX 660 consuming 23W less at the wall than the 7870. This puts the GTX 660 at around the power consumption of the 7850 (a card with a similar TDP) or the GTX 460. On AMD’s part, Pitcairn is a more petite (and less compute-heavy) part than Tahiti, which means AMD doesn’t face nearly the same disparity as they do at the high end.

OCCT, on the other hand, has the GTX 660 and 7870 much closer, thanks to AMD’s much more aggressive throttling through PowerTune. This is one of the few times where the GTX 660 isn’t competitive with the 7850 in some fashion, though based on our experience, our Metro results are more meaningful than our OCCT results right now.

As for idle temperatures, there are no great surprises. A good blower can hit around 30C in our testbed, and that’s exactly what we see.

Temperatures under Metro look good enough, though despite their power advantage NVIDIA can’t keep up with the blower-equipped 7800 series. At the risk of spoiling our noise results, the 7800 series doesn’t do significantly worse on noise, so it’s not immediately clear why the GTX 660 is 6C warmer here. Our best guess is that the GTX 660’s cooler just isn’t quite up to the potential of the 7800 series’ reference cooler.

OCCT actually closes the gap between the 7870 and the GTX 660 rather than widening it, which is the opposite of what we would expect given our earlier temperature data. Reaching the mid-70s, neither card is particularly cool, but both are still well below their thermal limits, meaning there’s plenty of thermal headroom to play with.

Last but not least we have our noise tests, starting with idle noise. Again there are no surprises here; the GTX 660’s blower is solid, producing no more noise than any other standard blower we’ve seen.

While the GTX 660 couldn’t beat the 7870 on temperatures under Metro, it can certainly beat the 7870 when it comes to noise. The difference isn’t particularly great – just 1.4dB – but every bit adds up, and 47.4dB is historically very good for a blower. However, the use of a blower on the GTX 660 means that NVIDIA still can’t match the glory of the GTX 560 Ti or GTX 460; for that we’ll have to take a look at retail cards with open air coolers.

Similar to how AMD’s temperature lead eroded with OCCT, AMD’s slight loss in load noise testing becomes a much larger gap under OCCT. A 4.5dB difference is solidly in the realm of the noticeable, and it further reinforces the fact that the GTX 660 is the quieter card under both normal and extreme situations.
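
For context, decibel readings are logarithmic, so a seemingly small gap represents a sizable ratio. Treating our meter readings as sound pressure levels, the conversion works out to:

$$\frac{p_2}{p_1} = 10^{\Delta L / 20} = 10^{4.5/20} \approx 1.68$$

In other words, a 4.5dB gap corresponds to roughly 1.7x the sound pressure, which is why it crosses into clearly noticeable territory.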

We’ll be taking an in-depth look at some retail cards later today in our companion retail card article, but with those results already in hand we can say that despite the use of a blower, the “reference” GTX 660 holds up very well. Open air coolers can definitely beat a blower, albeit with the usual drawbacks (that heat has to go somewhere), but when a blower is only hitting 47dB you already have a fairly quiet card. So even a reference GTX 660 (as unlikely as it is to appear in North America) looks good all things considered.

Comments

  • chrnochime - Thursday, September 13, 2012

    You mean that they spun the results in NV's favor when the 670 came out, and then again in AMD's favor when comparing OC results from 7950 against 660TI OC and 670 OC?
  • chizow - Thursday, September 13, 2012

    Is AMD going to issue rebates for the 7870? $150 price drop in 4 months is pretty sad beans for all the AMD early adopters.
  • RussianSensation - Thursday, September 13, 2012

    No, because if you want the latest tech on latest 28nm, you understand you are paying a premium for it. If not, you sit out for 6-7 months and wait for more price drops. This is how it always worked. I am sure early AMD adopters don't care since their cards already paid for most of their cost with bitcoin mining on the side and they have enjoyed a cool and efficient card for 7 months. How are your 680s doing that you dropped $1k on?

    Care to remind everyone that the GTX 280 launched at $649 on June 16, 2008, dropped $150 almost immediately when the 4870 launched a month later, and then 9.5 months later AMD delivered a $269 HD 4890 that offered similar performance?

    I guess in that generation the early adopter lost $380 by going with the 280 in just 9.5 months, but you failed to mention that's how it works in the GPU industry.
  • chizow - Thursday, September 13, 2012

    Except we already knew the 7870 wasn't worth the asking price when it was released, so the natural response, for those who already knew price drops were imminent on grossly inflated 28nm parts, would've been to wait.

    And I guess you already forgot: Nvidia did right by its early-adopting customers by issuing $100-$150 rebates to those who bought a GTX 260 or 280 before the price drops, which is why I asked. It's the same reason I asked if AMD was going to do the same if and when Nvidia adjusted the pricing landscape with Kepler to force cuts across the board for AMD's ridiculous pricing structure. So again, where are AMD's rebates, given that every 28nm part they released is worth roughly 30-40% less than its original MSRP? That's more than even your referenced drops on the GTX 260/280.

    4890 was nothing special, Nvidia released an equivalent GTX 275 for similar price and those prices were due largely to price wars in the midst of a massive global recession.

    As for my $1K GTX 680s, they don't exist because I wouldn't pay that much for such a small increase in generational performance on a midrange ASIC. I paid $660 for 2x GTX 670s on 680 PCBs instead, which is probably still a bit more than I think they are worth, but I figure after the 2x Borderlands promos they are much closer to the $300 price point a 2nd-tier GK104 SKU should have been sold at anyways. :D
  • rarson - Friday, September 14, 2012

    "Except we already knew the 7870 wasn't worth the asking price when it was released"

    So the people who "knew" this weren't buying it anyway, and hence do not need a rebate. You really should read what you write before posting your comment.

    Here's how technology works, dude: new technology is expensive. As time goes on, it becomes cheaper as more people start adopting it. The 7870 really was worth the price when it came out (no shit, it really was). You can figure this out by seeing that people actually went and bought them. Supply was constrained and the process was very expensive (more so than previous process shrinks), so even getting the wafers allocated was tougher than before. On top of that, AMD had to adjust their pricing to deal with the constrained supply. Price it too low, and whatever stock you have sells out too quickly and you sit for months with no stock on the shelf, selling nothing (just ask Nvidia).

    I know, I know, you can't grasp basic economics. I'm wasting my breath. Maybe once you move out of the basement you'll figure out how the real world works.
  • chizow - Friday, September 14, 2012

    Yeah, it was a rhetorical question. I know AMD isn't issuing rebates; they don't have the money to return, they'd just be borrowing more from Abu Dhabi to cut a check that might very well bounce.

    As for how technology works, you once again demonstrate how little you know about the industry. Prices drop, like the GTX 580/570 and 6970/6950 that held their prices for a good full 20 months before the launch of 28nm parts? And even after the launch of 28nm, they still held their prices because there was no incentive or need to drop in price based on relative price and performance?

    You have no idea what you're talking about, stop typing. Parts lose their value and drop in price when a new part forces that change. Usually this happens when a new generation of product or a new process/fabrication node forces the change by introducing a dramatic increase in price:performance. In this case, the price drops are being forced by products on the *SAME* process and of the same relative generation (from Nvidia).

    What this *SHOULD* tell you is that the 28nm offerings from AMD were grossly overpriced and offered FAR less improvement for the asking price, but these simple concepts obviously escape you.
  • Galidou - Saturday, September 15, 2012

    $660 for 2 GTX 670s on 680 PCBs, you got engineering samples? I've been looking on eBay for USED 670s and the best price one ended at was $355 with shipping, and that was without the Borderlands 2 coupon. I've seen some reference cards going down to $344 before taxes and after a $20 mail-in rebate (around $380 shipped), but they were FAR from 680 PCBs.

    I'd really like to see those 670s at $330 with 680 PCBs, would really like.....

    One comment about the rebate on the GTX 280: it's quite different from now. The $549 Radeon 7970 lost to a $499 GTX 680 3 months after its launch.

    The $650 GTX 280 was on average 10% better and sometimes 10% worse than the $300 Radeon 4870 one month after its launch...
  • chizow - Saturday, September 15, 2012

    Hi Galidou, you're not here to defend AMD's launch prices again too are you?

    670s on 680 PCBs are quite commonplace; maybe you've heard of the EVGA FTW versions? Galaxy has a similar one with their GC parts, no engineering samples needed:
    http://forums.anandtech.com/showthread.php?t=22660...
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    Please feel free to check the reviews on the Newegg link; you will see I'm a verified owner. ;)

    As for the GTX 280, once again more revisionist history from the usual suspects. The GTX 280 was closer to 15-20% faster, especially at high resolutions with AA, due to its 1GB of VRAM compared to the 4870's 512MB. The gap widens even further if you look at later reviews.

    http://www.computerbase.de/artikel/grafikkarten/20...

    Only after the 1GB 4870 and 4890 releases months later did this change, so the price difference, even after the $150 price cut to $500, was still justifiable. I still have a GTX 280 as a backup card and it still runs modern games great; a 512MB 4870 would barely be able to handle the default frame buffer....

    Secondly, GTX 280's asking price was reasonable compared to last-gen parts, unlike AMD's 28nm parts, as it offered 2x the performance of the 8800/9800GTX, more performance than the 9800GX2 or 3870X2, and almost tripled the performance of AMD's fastest single GPU, the 3870. What Nvidia did not account for was both AMD's return to competitiveness with the RV770 *AND* their massive undercut on pricing simultaneously.

    Lastly, of course, is Nvidia actually did right by their customers by issuing those rebates, which is just good business to ensure they took care of their most enthusiastic customers. Certainly more than we can say for AMD though. AMD may have made some short-term profit, but at what cost? They certainly have more than a few fans who are going to be outraged by the massive cuts so soon after launch.
  • Galidou - Saturday, September 15, 2012

    I just removed the 4870 512MB I had and it still runs perfectly, but my wife used it on a 1680*1050 monitor. The 280 took advantage at 2560*1600, but you can imagine it was mainly for benchmark purposes. Not a lot of people use this resolution NOW, so imagine 4 years ago... let's say >0.05% back in 2008 (most of them being artists and not gamers), considering the cost of one of those monitors... yep, you guessed it, about the price of 2 GTX 280s at launch, for ONE monitor.... The price of my whole computer with the 4870 back in 2008...

    ''Secondly, GTX 280's asking price was reasonable compared to last-gen parts, unlike AMD's 28nm parts, as it offered 2x the performance of the 8800/9800GTX''

    When AMD fanboys said the 7970 was priced right compared to last-gen parts (which it was; there was no bargain for sure, but it was a 40% higher price for 70% more perf), Nvidia fanboys said: ''You gotta be freaking kidding me, are you blind, ffs remove your red glasses and wake up''.

    ''Hi Galidou, you're not here to defend AMD's launch prices again too are you?''

    Not necessarily, but it seems you are here to attack them again. A $550 card loses in 65% of the games to a $500 card 3 months after its launch (Radeon 7970 vs GTX 680)... a $650 card is 10% faster than a $300 card in 60% of the games one month after its launch (GTX 280 vs 4870). My reflections above seemed logical to me; I guess life is a question of perception.
  • Galidou - Saturday, September 15, 2012

    I think there's one thing we should say about pricing mistakes and rebates. Think of a gamer buying an i7 980X one week prior to the Sandy Bridge launch, and I know it happened... poor them...

    Nice link on the German website; you must have looked at a lot of websites to find a 20% advantage on average for a GTX 280, because Tom's Hardware, TechPowerUp, and AnandTech show it averaging 10% faster, but nice finding you got there. If only I could read German. And we both know the websites I named above are certainly more highly regarded than computerbase.de..........

    And good job, because I'm really looking forward to changing the GTX 660 Ti I got for my wife for a GTX 670, but the best I could get to is either $355 USED (an auction, not a buy-it-now) on eBay or $344 before taxes and after a mail-in rebate (hate those; I usually don't even use them because they use your information to harass you). But I guess it's possible to find them at $330 after mail-in rebate and before taxes, though in the end it's closer to the $400 mark than the $300 mark.
