Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

As the GM200 flagship card, GTX Titan X gets the pick of the litter as far as GM200 GPUs go. GTX Titan X needed fully-functional GM200 GPUs, and even then needed GPUs that were good enough to meet NVIDIA's power requirements. GTX 980 Ti on the other hand, as a cut-down/salvage card, gets second pick. So we expect these chips to be just a bit worse, with functional units that either came out of the fab damaged or have been disabled for power reasons.

GeForce GTX Titan X/980 Ti/980 Voltages
GTX Titan X Boost Voltage    GTX 980 Ti Boost Voltage    GTX 980 Boost Voltage
1.162v                       1.187v                      1.225v

Looking at voltages, we can see just that in our samples. GTX 980 Ti has a slightly higher boost voltage – 1.187v – than our GTX Titan X's 1.162v. NVIDIA sometimes bins their second-tier cards for lower voltage, but that isn't something we're seeing here. Nor is there necessarily a need to bin in such a manner, since the 250W TDP is unchanged from GTX Titan X.

GeForce GTX 980 Ti Average Clockspeeds
Game                  GTX 980 Ti   GTX Titan X
Max Boost Clock       1202MHz      1215MHz
Battlefield 4         1139MHz      1088MHz
Crysis 3              1177MHz      1113MHz
Mordor                1151MHz      1126MHz
Civilization: BE      1101MHz      1088MHz
Dragon Age            1189MHz      1189MHz
Talos Principle       1177MHz      1126MHz
Far Cry 4             1139MHz      1101MHz
Total War: Attila     1139MHz      1088MHz
GRID Autosport        1164MHz      1151MHz
Grand Theft Auto V    1189MHz      1189MHz

The far more interesting story here is GTX 980 Ti's clockspeeds. As we have pointed out time and time again, GTX 980 Ti's gaming performance trails GTX Titan X by just a few percent, despite the fact that GTX 980 Ti is down 2 of GM200's 24 SMMs and is clocked identically. On paper that works out to a roughly 9% deficit in shader throughput, a deficit we're not seeing in the real world. So what's going on?

The answer is that what GTX 980 Ti lacks in SMMs it's making up in clockspeeds. With GPU Boost stepping in 13MHz bins, the card's average clockspeeds are frequently two or more bins ahead of GTX Titan X, topping out at a 64MHz advantage under Crysis 3. All of this comes despite the fact that GTX 980 Ti has a lower maximum boost clock than GTX Titan X, topping out one bin lower at 1202MHz to GTX Titan X's 1215MHz.

Ultimately the higher clockspeeds are a result of the increased power and thermal headroom GTX 980 Ti picks up from halving the number of VRAM chips and disabling two SMMs. With those components no longer consuming power or generating heat, and the TDP still at 250W, GTX 980 Ti can spend its power savings to boost just a bit higher. This in turn compresses the performance gap between the two cards, despite what the specs say. Couple that with the fact that performance doesn't scale linearly with SMM count or clockspeed (you rarely lose the full theoretical performance amount when shedding frequency or functional units), and GTX 980 Ti ends up trailing GTX Titan X by an average of just 3%.
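To put some rough numbers to this, here is a minimal back-of-the-envelope sketch in Python (our own illustration, not part of our test suite) that approximates shader throughput as SMM count multiplied by average clockspeed, using the measured clocks from the table above:

# A crude model: shader throughput ~ SMM count x average clockspeed.
# Clock figures are the review's measured averages from the table above;
# GTX 980 Ti has 22 SMMs enabled to GTX Titan X's 24.

avg_clocks = {  # game: (GTX 980 Ti MHz, GTX Titan X MHz)
    "Battlefield 4":      (1139, 1088),
    "Crysis 3":           (1177, 1113),
    "Mordor":             (1151, 1126),
    "Civilization: BE":   (1101, 1088),
    "Dragon Age":         (1189, 1189),
    "Talos Principle":    (1177, 1126),
    "Far Cry 4":          (1139, 1101),
    "Total War: Attila":  (1139, 1088),
    "GRID Autosport":     (1164, 1151),
    "Grand Theft Auto V": (1189, 1189),
}
SMM_980TI, SMM_TITANX = 22, 24

# On paper, at identical clocks, 22/24 SMMs is a ~8.3% shader deficit
print(f"On-paper deficit: {1 - SMM_980TI / SMM_TITANX:.1%}")

# Factoring in the observed clockspeed advantage shrinks that deficit
for game, (ti, tx) in avg_clocks.items():
    deficit = 1 - (SMM_980TI * ti) / (SMM_TITANX * tx)
    print(f"{game:20} theoretical deficit: {deficit:5.1%}")

By this crude measure the theoretical deficit shrinks to anywhere from 3% (Crysis 3) to 8% (the games where both cards hold identical clocks), and since real-world performance is rarely bound purely by shader throughput, the measured gap comes in smaller still.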

Idle Power Consumption

Starting off with idle power consumption, there's nothing new to report here. GTX 980 Ti idles just like GTX Titan X; at 74W, the two trail only GTX 980, and by just a single watt.

Load Power Consumption - Crysis 3

Load Power Consumption - FurMark

Meanwhile load power consumption is also practically identical to the GTX Titan X. With the same GPU on the same board operating at the same TDP, GTX 980 Ti ends up right where we expect it, next to GTX Titan X. GTX Titan X did very well as far as energy efficiency is concerned – setting a new bar for 250W cards – and GTX 980 Ti in turn does just as well.

Idle GPU Temperature

Load GPU Temperature - Crysis 3

Load GPU Temperature - FurMark

As was the case with power consumption, video card temperatures are similarly unchanged. NVIDIA’s metal cooler does a great job here, keeping temperatures low at idle while NVIDIA’s GPU Boost mechanism keeps temperatures from exceeding 83C under full load.

Idle Noise Levels

Load Noise Levels - Crysis 3

Load Noise Levels - FurMark

Finally for noise, the situation is much the same. Though not what we would have predicted, it's not all that surprising that GTX 980 Ti ends up doing a hair worse than GTX Titan X here. NVIDIA has not changed the fan curves or the TDP, so this ultimately comes down to manufacturing variability in NVIDIA's metal cooler, with our GTX 980 Ti sample faring ever so slightly worse than the Titan. Which is to say that it's still right at the sweet spot for noise versus power consumption, dissipating 250W at no more than 53dB, and once again proving the mettle of NVIDIA's metal cooler.

Comments

  • RaistlinZ - Sunday, May 31, 2015 - link

    What more would a review of the 960 tell you that you don't already know, honestly? I'd rather read reviews about interesting products like the 980Ti. People need to let the 960 review go already, geez.
  • Michael Bay - Sunday, May 31, 2015 - link

    I only trust AT numbers and am in no hurry to upgrade.

    God I wish they would compare Baytrail/Cherrytrail to i3s.
  • Brett Howse - Sunday, May 31, 2015 - link

    I did compare Cherry Trail to the i3 SP3 in the Surface 3 review. Was there more you were looking for?
  • Michael Bay - Monday, June 1, 2015 - link

    I'm trying to get a cheap small notebook for my father. He is currently on an i3-380UM and the choice is between the N3558 and the i3-4030U. Workload is strictly internet browsing/MS Office.

    Not much point in changing anything if performance is going to be worse than it was...
  • sandy105 - Monday, June 1, 2015 - link

    Exactly, it would be interesting to see how much faster than Baytrail they are.
  • DanNeely - Sunday, May 31, 2015 - link

    DVI may be an obsolescent standard at this point; but 4/5k gaming is still expensive enough that a lot of the people buying into it now are ones who're upgrading from older 2560x1600 displays that don't do DP/HDMI 2. A lot of those people will probably keep using their old monitor as a secondary display after getting a new higher resolution one (I know I plan to); and good DL-DVI to display port adapters are still relatively expensive at ~$70. (There're cheaper ones; but they've all got lots of bad reviews from people who found they weren't operating reliably and were generating display artifacts: messed up scan lines.) Unless it dies first, I'd like to be able to keep using my existing NEC 3090 for a few more years without having to spend money on an expensive dongle.
  • YazX_ - Sunday, May 31, 2015 - link

    Dude, the majority are still playing at 1920x1080 and just a few are now making the leap to 2560x1440. I have been gaming at 1440p for two years and am not planning to go 4K anytime soon, since the hardware is still not mature enough to play at 4K comfortably with a single video card.

    Thus, DVI is not going anywhere, since dual-link DVI supports 1440p and probably most 1080p gamers are using DVI, unless they have G-Sync or want to use Adaptive V-Sync, in which case they have to use DP. And don't forget that there are plenty of people who bought 27" Korean 1440p monitors that don't have anything except DVI ports.
  • DanNeely - Sunday, May 31, 2015 - link

    If you're playing at 1920/60Hz this card's massive overkill, and in any event it's a non-issue for you because your monitor is only using a single link in the DVI and you can use a dirt cheap passive DVI-HDMI/DP adapter now; worst case, you'd only need a cheap single-link adapter in the future.

    My comment was directed toward Ryan's comment on page 2 (near the bottom, above the last picture) suggesting that the DVI port wasn't really needed, since any monitor it could drive wouldn't need this much horsepower to run games.
  • FlushedBubblyJock - Wednesday, June 10, 2015 - link

    Totally disagree - I game at 1920x1200, the only rez the 980 Ti is capable of without knocking down the eye candy.
  • Kutark - Monday, June 1, 2015 - link

    Exactly. I literally just now upgraded to a 1440p monitor, and I can't even express in words how little of a sh*t I give about 4K gaming. I've been a hardware nerd for a long time, but when I got into home theater I learned just how much resolution actually matters. 4K is overkill for a 120" projected image at a 15' seating distance. 4K at normal desk viewing distances is way beyond overkill. They've done tests on fighter pilots who have ridiculous vision, like 20/7.5 and such, and even they can't see a difference at those seating distances. 4K is almost as much of a marketing BS gimmick as 3D was for TVs.

    Anyways, I'm clearly getting angry. But the point still stands: every single gamer I know is still on 1080p; I was the first to splurge on a 1440p monitor. And now it's put me into a position where my SLI'd 760s aren't really doing the deed, especially being 2GB cards. So, the 980 Ti fits the bill for my G-Sync 144Hz 1440p monitor just about perfectly.
