Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

Having already seen the Maxwell architecture in action with the GTX 750 series, the GTX 980 and its GM204 Maxwell 2 GPU have a strong reputation to live up to. The GTX 750 Ti shattered old energy efficiency marks, and we expect much the same of the GTX 980. After all, NVIDIA tells us that they can deliver more performance than the GTX 780 Ti for less power than the GTX 680, and that is no easy feat.

GeForce GTX 980 Voltages
Boost Voltage: 1.225v
Base Voltage: 1.075v
Idle Voltage: 0.856v

We’ll start as always with voltages, which in this case I think makes for one of the more interesting aspects of GTX 980. Despite the fact that GM204 is a pretty large GPU at 398mm² and is clocked at over 1.2GHz, NVIDIA is still promoting a TDP of just 165W. One way to curb power consumption is to build a processor wide-and-slow, and these voltage numbers are solid proof that NVIDIA has not done that.

With a load voltage of 1.225v, NVIDIA is driving GM204 as hard as (if not harder than) any of the Kepler GPUs. This means that all of NVIDIA’s power savings – the key to driving 5.2 billion transistors at under 165W – come from the other optimizations the company has made elsewhere in the architecture, because at over 1.2v they certainly aren’t deriving any advantage from operating at low voltages.
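
To put the voltage in perspective, dynamic switching power scales roughly with CV²f, so running Kepler-class voltages at higher clocks should, all else being equal, mean more power rather than less. A back-of-the-envelope sketch (the Kepler figures below are hypothetical round numbers, not measured values):

```python
# Illustrative only: dynamic switching power scales roughly as
# P ~ C * V^2 * f. C here is a stand-in constant; only the ratio matters.
def relative_dynamic_power(voltage_v, freq_mhz, c=1.0):
    return c * voltage_v ** 2 * freq_mhz

gm204 = relative_dynamic_power(1.225, 1252)   # GTX 980 at max boost
kepler = relative_dynamic_power(1.2, 1110)    # hypothetical Kepler point

print(f"{gm204 / kepler:.2f}x")  # >1x: naive scaling says *more* power,
                                 # so the savings must be architectural
```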

Next up, let’s take a look at average clockspeeds. As we alluded to earlier, NVIDIA has maintained the familiar 80C default temperature limit for the GTX 980 that we saw on all of their other high-end GPU Boost 2.0 enabled cards. Furthermore, as a result of reinvesting most of their efficiency gains into acoustics rather than clockspeeds, what we are going to see is that the GTX 980 still throttles. The question then is by how much.

GeForce GTX 980 Average Clockspeeds
Max Boost Clock 1252MHz
Metro: LL 1192MHz
CoH2 1177MHz
Bioshock 1201MHz
Battlefield 4 1227MHz
Crysis 3 1227MHz
TW: Rome 2 1161MHz
Thief 1190MHz
GRID 2 1151MHz
FurMark 923MHz

What we find is that while our GTX 980 has an official boost clock of 1216MHz, our sustained benchmarks are often not able to maintain clockspeeds at or above that level. Of our games, only Bioshock Infinite, Crysis 3, and Battlefield 4 maintain an average clockspeed over 1200MHz, with everything else falling to between 1151MHz and 1192MHz. This still ends up being above NVIDIA’s base clockspeed of 1126MHz – by nearly 100MHz at times – but it’s clear that unlike with our 700 series cards, NVIDIA is rating their boost clock much more aggressively. The GTX 980’s performance is still spectacular even if it doesn’t get to run over 1.2GHz all of the time, but I would argue that the boost clock metric is less useful this time around if it’s going to overestimate clockspeeds rather than underestimate them. (ed: always underpromise and overdeliver)
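
For readers who want to check their own card’s sustained clocks, a minimal sketch of the idea is below. This is not our actual benchmarking methodology, and it assumes nvidia-smi is on the PATH; it simply averages the SM clock sampled once per second.

```python
# Minimal sketch for logging sustained GPU clocks during a benchmark.
# Assumes nvidia-smi is available; averages once-per-second samples.
import subprocess
import time

def sample_sm_clock_mhz():
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=clocks.sm",
        "--format=csv,noheader,nounits",
    ])
    return int(out.decode().strip().splitlines()[0])  # first GPU only

samples = []
for _ in range(60):  # one minute of once-per-second samples
    samples.append(sample_sm_clock_mhz())
    time.sleep(1)
print(f"average SM clock: {sum(samples) / len(samples):.0f}MHz")
```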

Idle Power Consumption

Starting as always with idle power consumption, while NVIDIA is not quoting specific power numbers it’s clear that the company’s energy efficiency efforts have been invested in idle power consumption as well as load power consumption. At 73W idle at the wall, our GTX 980-equipped testbed draws several watts less than with any other high-end card, including the GK104 based GTX 770 and even AMD’s cards. In desktops this isn’t going to make much of a difference, but in laptops with always-on dGPUs it would help extend battery life.

Load Power Consumption - Crysis 3

Our first load power test is our gaming test, Crysis 3. Because we measure at the wall, this test captures CPU power consumption as well as GPU power consumption, which means high performance cards will drive up system power numbers merely by giving the CPU more work to do. This is exactly what happens in the case of the GTX 980: at 304W it lands between the GK104 based GTX 680 and GTX 770, yet it’s also delivering 30% better framerates. Accordingly, the power consumption of the GTX 980 itself should be lower than either card, even though we can’t isolate it in a system-level power measurement.
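
As a rough way to put numbers to that intuition, you can estimate the DC-side power delta from wall measurements by subtracting the idle draw and applying a PSU efficiency figure. The 88% efficiency below is an assumption rather than a measured value for our testbed, and the result still lumps in the extra CPU power a faster card induces:

```python
# Rough, illustrative estimate of DC-side power from wall measurements.
# The 88% PSU efficiency is an assumption, not a measured value, and the
# result still includes the extra CPU power a faster card induces.
PSU_EFFICIENCY = 0.88

def dc_power_estimate(load_wall_w, idle_wall_w):
    return (load_wall_w - idle_wall_w) * PSU_EFFICIENCY

# GTX 980 testbed: 304W under Crysis 3, 73W idle (both at the wall)
print(f"{dc_power_estimate(304, 73):.0f}W")  # ~203W of GPU + induced CPU load
```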

Load Power Consumption - FurMark

For that reason, when looking at recent generation cards implementing GPU Boost 2.0 or PowerTune 3, we prefer to turn to FurMark, as it essentially nullifies the power consumption impact of the CPU. In this case we can clearly see what NVIDIA is promising: GTX 980’s power consumption is lower than everything else on the board, and noticeably so. At 294W at the wall, it draws 20W less than the GTX 770, 29W less than the 290X, and some 80W less than the previous NVIDIA flagship, the GTX 780 Ti. At these power levels the GTX 980 is essentially drawing midrange-class power while delivering chart-topping performance.

Idle GPU Temperature

Moving on to temperatures, at idle we see nothing remarkable. All of these well-designed, low idle power cards idle in the low 30s Celsius, no more than a few degrees over room temperature.

Load GPU Temperature - Crysis 3

With an 80C throttle point in place for the GTX 980, this is where we see the card top out. The fact that we’re hitting 80C is the reason the card exhibits the clockspeed throttling we saw earlier. NVIDIA’s chosen fan curve is tuned for noise over temperature, so it lets the GPU reach its temperature throttle point rather than ramp up the fan (and the noise) too much.
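
Conceptually, a noise-biased fan curve looks something like the toy sketch below: the fan tops out well short of 100%, so once the GPU hits its 80C target the card sheds clockspeed rather than ramping the fan. The curve points here are invented for illustration and are not NVIDIA’s actual fan table.

```python
# Toy illustration of a noise-biased fan curve: fan speed is capped well
# below 100%, so once the GPU reaches its 80C target the card sheds
# clockspeed instead of ramping the fan. Curve points are invented.
FAN_CURVE = [(30, 25), (60, 35), (70, 45), (80, 55)]  # (temp C, fan %)

def fan_percent(temp_c):
    """Linearly interpolate between curve points, clamped at the ends."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]  # past 80C the clocks drop, not the fan speed

print(fan_percent(80))  # 55.0 -- the fan never ramps toward 100%
```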

Load GPU Temperature - FurMark

Once again we see the 80C throttle in action. As with all of their GPU Boost 2.0 cards, NVIDIA makes sure their products don’t get much above 80C no matter the workload.

Idle Noise Levels

Last but not least we have our noise results. Right off the bat the GTX 980 is looking strong; even though its cooler shares its heritage with the GTX 780 series, the GTX 980 is slightly but measurably quieter at idle than any other high-end NVIDIA or AMD card. At 37.3dB, the GTX 980 comes very close to being silent relative to the rest of the system.

Load Noise Levels - Crysis 3

Our Crysis 3 load noise testing showcases the GTX 980’s well-built blower in action. The GTX 980 doesn’t perform appreciably better than the GTX Titan cooler equipped GTX 770 and GTX 780, but then again the GTX 980 is also not using quite as advanced a cooler (forgoing the vapor chamber). Still, this is enough to edge ahead of the GTX 770 by 0.1dB, technically making it the quietest video card in this roundup, though for all practical purposes it’s better to consider it tied with the GTX 770.
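
For context on why a 0.1dB gap amounts to a tie: decibels are logarithmic, so small level differences correspond to tiny sound-power ratios. A quick illustration:

```python
# Decibels are logarithmic: a level difference maps to a sound-power
# ratio of 10 ** (delta_dB / 10).
def power_ratio(delta_db):
    return 10 ** (delta_db / 10)

print(power_ratio(0.1))  # ~1.02: a 0.1dB gap is ~2% -- inaudible
print(power_ratio(3.0))  # ~2.0: 3dB is roughly double the sound power
```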

Load Noise Levels - FurMark

FurMark noise testing, on the other hand, drives a wedge between the GTX 980 and all the other cards, and it’s in the GTX 980’s favor. Despite the similar noise performance between the various NVIDIA cards under Crysis 3, under our maximum, pathological FurMark workload the GTX 980 pulls ahead thanks to its 165W TDP. That lower TDP limit means the GTX 980 never has too much heat to dissipate, and as a result it never gets too loud; in fact it can’t. 48.1dB is as loud as the GTX 980 ever gets, which is why the GTX 980’s cooler and overall build are so impressive. There are open air cooled cards that underperform the GTX 980 and still can’t hit noise levels this low, never mind the other cards with blowers.

Between the GTX Titan and its derivatives and now the GTX 980, NVIDIA has spent quite a bit of time and effort on building a better blower, and with their latest effort it really shows. All things considered we prefer blower type coolers for their heat exhaustion benefits – just install the card and go, with almost no need to worry about what the chassis cooling can handle – and with NVIDIA’s efforts to build such a solid cooler for a moderately powered card, the end result is a card with a cooler that offers all the benefits of a blower with acoustics that can rival an open air cooler. It’s a really good design and one of our favorite aspects of the GTX Titan, its derivatives, and now the GTX 980.

Comments

  • Kutark - Sunday, September 21, 2014 - link

    I'd hold on to it. That's still a damn fine card. Honestly you could probably find a used one on eBay for a decent price and SLI it up.

    IMO though I'd splurge for a 970 and call it a day. I've got dual 760's right now, first time I've done SLI in probably 10 years, and honestly the headaches just aren't worth it. Yeah, most games work, but some games will have weird graphical issues (BF4 near release was a big one, DOTA 2 doesn't seem to like it), others don't utilize it well, etc. I kind of wish I'd just stuck with the single 760. Either way, my 2p.
  • SkyBill40 - Wednesday, September 24, 2014 - link

    @ Kutark:

    Yeah, I tried to buy a nice card at that time despite wanting something higher than a 660 Ti. But, as my wallet was the one doing the dictating, it's what I ended up with and I've been very happy. My only concern with a used one is just that: it's USED. Electronics are one of those "no go" zones for me when it comes to buying second hand, since you have no idea about the circumstances surrounding the device; seeing as it's a video card and not a Blu-ray player or something, I'd like to know how long it's run, whether it's been OC'd or not, and the like. I'd be fine with buying another one new, but not for the prices I'm seeing that are right in line with a 970. That would be dumb.

    In the end, I'll probably wait it out a bit more and decide. I'm good for now and will probably buy a new 144Hz monitor instead.
  • Kutark - Sunday, September 21, 2014 - link

    Psshhhhh.... I still have my 3dfx Voodoo SLI card. Granted it's just sitting on my desk, but still!!!

    In all seriousness though, my roommate, who is NOT a gamer, is still using an old 7800 GT card I had laying around because the video card in his ancient computer decided to go out and he didn't feel like building a new one. Can't say I blame him; Core 2 Quads are juuust fine for browsing the web and such.
  • Kutark - Sunday, September 21, 2014 - link

    Voodoo 2, I meant; realized I didn't type the 2.
  • justniz - Tuesday, December 9, 2014 - link

    >> the power bills are so ridiculous for the 8800 GTX!

    Sorry but this is ridiculous. Do the math.

    Best info I can find is that your card is consuming 230W.
    Assuming you're paying 15¢/kWh, even gaming for 12 hours a day, every day, for a whole month will cost you $12.59. Doing the same with a GTX 980 (165W) would cost you $9.03/month.

    So you'd be paying maybe $580 to save $3.56 a month.
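
    That arithmetic checks out. A quick sanity check, assuming the quoted 15¢/kWh rate and an average month of 365/12 ≈ 30.4 days:

    ```python
    # Sanity check of the arithmetic above: 15 cents/kWh, 12 hours of
    # gaming per day, and an average month of 365/12 ~= 30.4 days.
    RATE_USD_PER_KWH = 0.15
    HOURS_PER_DAY = 12
    DAYS_PER_MONTH = 365 / 12

    def monthly_cost_usd(watts):
        kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
        return kwh * RATE_USD_PER_KWH

    old, new = monthly_cost_usd(230), monthly_cost_usd(165)
    print(f"8800 GTX: ${old:.2f}  GTX 980: ${new:.2f}  saved: ${old - new:.2f}")
    # -> 8800 GTX: $12.59  GTX 980: $9.03  saved: $3.56
    ```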
  • LaughingTarget - Friday, September 19, 2014 - link

    There is a major difference between market capitalization and available capital for investment. Market Cap is just a rote multiplication of the number of shares outstanding by the current share price. None of this is available for company use and is only an indirect measurement of how well a company is performing. Nvidia has $1.5 billion in cash and $2.5 billion in available treasury stock. Attempting to match Intel's process would put a significant dent into that with little indication it would justify the investment. Nvidia already took on a considerable chunk of debt going into this year as well, which would mean that future offerings would likely go for a higher cost of debt, making such an investment even harder to justify.

    While Nvidia is blowing out AMD 3:1 on R&D and capacity, Intel is blowing both of them away, combined, by a wide margin. Intel is dropping $10 billion a year on R&D, which is a full $3 billion beyond the entire asset base of Nvidia. It's just not possible to close the gap right now.
  • Silma - Saturday, September 20, 2014 - link

    I don't think you realize how many billions of dollars you need to spend to open a 14nm factory, not even counting R&D and yearly costs.
    It's humongous; there's a reason why there are so few foundries in the world.
  • sp33d3r - Saturday, September 20, 2014 - link

    Well, if the NVIDIA/AMD CEOs are blind enough not to see it coming, then Intel is going to manufacture their next integrated graphics on a 10 or 8nm chip, and though immature it will be tough competition for them in terms of power and efficiency and even weight.

    Remember, PCs currently ship with Intel's integrated graphics as a given, and people add third party graphics only because Intel's isn't good enough, literally adding the weight of two GPUs (Intel's and the third party's) to the product. It would be far more convenient if integrated graphics could outperform or at least challenge third party GPUs; we would just throw away NVIDIA, and they wouldn't remain a monopoly anymore, rather be completely wiped out.

    Besides, Intel's integrated graphics get more mature with every launch, and not just in die size; just compare the 4000 series with the 5000 series. It won't be long before they catch up.
  • wiyosaya - Friday, September 26, 2014 - link

    I have to agree that it's partly not about the verification cost breaking the bank. What I think is the more likely reason is that since the current node works, they will try to wring every penny out of it. Look at the prices for the Titan Z. If that's not an attempt to fleece the "gotta have it" buyer, I don't know what is.
  • Ushio01 - Thursday, September 18, 2014 - link

    Wouldn't paying to use the 22nm fabs be a better idea, as they're about to become underused and all the teething troubles have been fixed?
