Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

As the GM200 flagship card, GTX Titan X gets the pick of the litter as far as GM200 GPUs go. GTX Titan X needed fully-functional GM200 GPUs, and even then needed GPUs that were good enough to meet NVIDIA’s power requirements. GTX 980 Ti on the other hand, as a cut-down/salvage card, gets second pick. So we expect these chips to be just a bit worse: to have either functional units that came out of the fab damaged, or functional units that have been turned off for power reasons.

GeForce GTX Titan X/980 Ti/980 Voltages
GTX Titan X Boost Voltage:  1.162v
GTX 980 Ti Boost Voltage:   1.187v
GTX 980 Boost Voltage:      1.225v

Looking at voltages, we can see just that in our samples. GTX 980 Ti has a slightly higher boost voltage – 1.187v – than our GTX Titan X. NVIDIA sometimes bins their second-tier cards for lower voltage, but this isn’t something we’re seeing here. Nor is there necessarily a need to bin in such a manner since the 250W TDP is unchanged from GTX Titan X.

GeForce GTX 980 Ti Average Clockspeeds
Game                  GTX 980 Ti    GTX Titan X
Max Boost Clock       1202MHz       1215MHz
Battlefield 4         1139MHz       1088MHz
Crysis 3              1177MHz       1113MHz
Mordor                1151MHz       1126MHz
Civilization: BE      1101MHz       1088MHz
Dragon Age            1189MHz       1189MHz
Talos Principle       1177MHz       1126MHz
Far Cry 4             1139MHz       1101MHz
Total War: Attila     1139MHz       1088MHz
GRID Autosport        1164MHz       1151MHz
Grand Theft Auto V    1189MHz       1189MHz

The far more interesting story here is GTX 980 Ti’s clockspeeds. As we have pointed out time and time again, GTX 980 Ti’s gaming performance trails GTX Titan X by just a few percent, this despite the fact that GTX 980 Ti is down by 2 SMMs and is clocked identically. On paper there is a 9% performance difference that in the real world we’re not seeing. So what’s going on?

The answer to that is that what GTX 980 Ti lacks in SMMs it’s making up in clockspeeds. The card’s average clockspeeds are frequently two or more bins ahead of GTX Titan X, topping out at a 64MHz advantage under Crysis 3. All of this comes despite the fact that GTX 980 Ti has a lower maximum boost clock than GTX Titan X, topping out one bin lower at 1202MHz to GTX Titan X’s 1215MHz.
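For those who want to check the bin math, here's a quick sketch in Python. The per-game averages are copied straight from the table above, and the 13MHz bin size is implied by the two cards' maximum boost clocks (1215MHz vs. 1202MHz):

```python
# Per-game average clockspeeds from the table above, in MHz.
clocks = {
    "Battlefield 4":      (1139, 1088),  # (GTX 980 Ti, GTX Titan X)
    "Crysis 3":           (1177, 1113),
    "Mordor":             (1151, 1126),
    "Civilization: BE":   (1101, 1088),
    "Dragon Age":         (1189, 1189),
    "Talos Principle":    (1177, 1126),
    "Far Cry 4":          (1139, 1101),
    "Total War: Attila":  (1139, 1088),
    "GRID Autosport":     (1164, 1151),
    "Grand Theft Auto V": (1189, 1189),
}

BIN_MHZ = 13  # boost bin size implied by 1215MHz - 1202MHz

for game, (ti, titan) in clocks.items():
    delta = ti - titan
    print(f"{game:<20} +{delta:>2}MHz (~{delta / BIN_MHZ:.0f} bins)")
```

Running this shows the GTX 980 Ti two or more bins ahead in six of the ten games, peaking at roughly five bins under Crysis 3.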

Ultimately the higher clockspeeds are a result of the increased power and thermal headroom that GTX 980 Ti picks up from halving the number of VRAM chips and disabling two SMMs. With those components no longer consuming power or generating heat, and the TDP staying at 250W, GTX 980 Ti can spend its power savings to boost just a bit higher. This in turn compresses the performance gap between the two cards, despite what the specs say. Coupled with the fact that performance doesn't scale linearly with SMM count or clockspeed – you rarely lose the full theoretical amount when shedding frequency or functional units – this leads to the GTX 980 Ti trailing the GTX Titan X by an average of just 3%.
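To put rough numbers on that reasoning, here's a back-of-the-envelope sketch. It naively assumes throughput scales with SMM count times average clockspeed; since games rarely scale perfectly with either, the observed ~3% gap sits below both estimates:

```python
# Naive throughput model: performance ~ SMM count x clockspeed.
# Average clockspeeds below are taken from the table above, in MHz.
clocks_980ti  = [1139, 1177, 1151, 1101, 1189, 1177, 1139, 1139, 1164, 1189]
clocks_titanx = [1088, 1113, 1126, 1088, 1189, 1126, 1101, 1088, 1151, 1189]

avg_980ti  = sum(clocks_980ti) / len(clocks_980ti)    # ~1157MHz
avg_titanx = sum(clocks_titanx) / len(clocks_titanx)  # ~1126MHz

on_paper    = 24 / 22 - 1                               # equal clocks: ~9%
with_clocks = (24 * avg_titanx) / (22 * avg_980ti) - 1  # ~6%

print(f"Titan X advantage on paper:           {on_paper:.1%}")
print(f"Titan X advantage at observed clocks: {with_clocks:.1%}")
```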

Idle Power Consumption

Starting off with idle power consumption, there's nothing new to report here. GTX 980 Ti performs just like the GTX Titan X, which at 74W is second only to the GTX 980 by a single watt.

Load Power Consumption - Crysis 3

Load Power Consumption - FurMark

Meanwhile load power consumption is also practically identical to the GTX Titan X. With the same GPU on the same board operating at the same TDP, GTX 980 Ti ends up right where we expect it, next to GTX Titan X. GTX Titan X did very well as far as energy efficiency is concerned – setting a new bar for 250W cards – and GTX 980 Ti in turn does just as well.

Idle GPU Temperature

Load GPU Temperature - Crysis 3

Load GPU Temperature - FurMark

As was the case with power consumption, video card temperatures are similarly unchanged. NVIDIA’s metal cooler does a great job here, keeping temperatures low at idle while NVIDIA’s GPU Boost mechanism keeps temperatures from exceeding 83C under full load.
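NVIDIA doesn't publish GPU Boost's exact control algorithm, so as a deliberately simplified, hypothetical illustration of the temperature-target behavior described above: the card sheds one 13MHz bin when it reaches its 83C target, and reclaims a bin when there's headroom. The real mechanism also weighs power draw and voltage, but the core idea looks something like this:

```python
# Simplified, hypothetical model of a temperature-target boost loop.
# NVIDIA's actual GPU Boost logic is more sophisticated.
TEMP_TARGET_C = 83    # GPU Boost temperature target
BIN_MHZ = 13          # clockspeed adjustment granularity
MAX_BOOST_MHZ = 1202  # our GTX 980 Ti sample's maximum boost clock
BASE_MHZ = 1000       # GTX 980 Ti's base clock

def next_clock(current_mhz: int, gpu_temp_c: float) -> int:
    """Pick the clockspeed for the next control interval."""
    if gpu_temp_c >= TEMP_TARGET_C:
        return max(BASE_MHZ, current_mhz - BIN_MHZ)   # shed a bin to cool off
    return min(MAX_BOOST_MHZ, current_mhz + BIN_MHZ)  # reclaim headroom
```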

Idle Noise Levels

Load Noise Levels - Crysis 3

Load Noise Levels - FurMark

Finally for noise, the situation is much the same, though the GTX 980 Ti ends up doing a hair worse than the GTX Titan X here. NVIDIA has not changed the fan curves or the TDP, so this ultimately comes down to manufacturing variability in NVIDIA’s metal cooler, with our GTX 980 Ti sample faring ever so slightly worse than the Titan. That said, the card is still right at the sweet spot for noise versus power consumption, dissipating 250W at no more than 53dB, once again proving the mettle of NVIDIA's metal cooler.

Comments

  • Kosiostin - Monday, June 1, 2015 - link

    I beg to differ. 4K at monitor viewing distance is not overkill; it's actually quite pleasantly sharp. Phones, tablets and laptops are already pushing for 2K+ displays, which is phenomenally sharp and out of the league of normal FHD monitors. 4K gaming isn't quite here yet, but when it arrives it will blow our minds, I am sure.
  • Oxford Guy - Monday, June 1, 2015 - link

    People who care so much for immersion should be using 1440 with HDTV screen sizes, not sitting way up close with small monitors.

    Too bad HDTVs have so much input lag, though.
  • Kutark - Monday, June 1, 2015 - link

    Basically, at a 5' viewing distance you would have to have a 40" monitor before 4K would start to become noticeable.

    Even with a 30" monitor you would have to be sitting roughly 3.5' or closer to be able to begin to tell the difference.

    We also have to keep in mind we're talking about severely diminishing returns. 1440p is about perfect for normal seating distances with a 27" monitor. At 30" some arguments can be made for 4K, but it's a minor case. It's not like we're going from 480p to 1080p or something; 1440p is still very good at "normal" computer seating distances.
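Those distance figures line up with the common one-arcminute (20/20) visual acuity rule of thumb. A rough sketch of that math, assuming 16:9 panels; individual acuity varies, so treat the outputs as ballpark figures:

```python
import math

# Beyond the distance at which a 1440p panel's pixel pitch subtends
# less than one arcminute, a same-size 4K panel adds (by this model)
# no visible detail.
ARCMIN_RAD = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    """Distance (feet) beyond which individual pixels can't be resolved."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # assume a 16:9 panel
    pitch_in = width_in / horizontal_px
    return pitch_in / math.tan(ARCMIN_RAD) / 12

for diag in (27, 30, 40):
    print(f'{diag}" 1440p: ~{max_useful_distance_ft(diag, 2560):.1f} ft')
# -> 27": ~2.6 ft, 30": ~2.9 ft, 40": ~3.9 ft
```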
  • mapesdhs - Wednesday, June 3, 2015 - link

    Human vision varies as to who can discern what at a particular distance; there are no fixed cutoffs for this. Personally, when wandering around a TV store back in January (without knowing what type of screen I was looking at), the only displays that looked properly impressive for visual clarity turned out to be 4Ks. However, they're still a bit too pricey atm for a good one, with the cheaper models employing too many compromises to bring down the pricing, such as reduced chroma sampling or much lower refresh rates (notice how stores use lots of static imagery to advertise their cheaper 4K TVs?).

    Btw, here's a wonderful irony for you: recent research, mentioned in New Scientist, suggests that long exposure by gamers to high-refresh displays makes them more able to tell the difference between standard displays and high-refresh models, i.e. simply using a 144Hz monitor can make one less tolerant of standard 60Hz displays in the long term. :D It's like a self-reinforcing quality tolerance level. Quite funny IMO. No surprise to me though; years working in VR & suchlike resulted in my being able to tell the difference in refresh rates much more than I could beforehand.

    Anyway, I'm leaving 4K until cheaper models are better quality, etc. In the meantime I bought a decent (but not high-end) 48" Samsung which works pretty well. Certainly looks good for Elite Dangerous running off a 980, and Crysis looks awesome.
  • Laststop311 - Monday, June 1, 2015 - link

    Why would most people be using DVI? DVI is big and clunky and just sucks. Everyone that gets new stuff nowadays uses DisplayPort; it has the easiest-to-use plug.
  • Crest - Sunday, May 31, 2015 - link

    Thank you for including the GTX580. I'm still living and working on a pair of 580's and it's nice to know where they stand in these new releases.
  • TocaHack - Monday, June 1, 2015 - link

    I upgraded from SLI'd 580s to a 980 at the start of April. Now I'm wishing I'd waited for the Ti! It wasn't meant to launch this soon! :-/
  • mapesdhs - Wednesday, June 3, 2015 - link

    Indeed, one of the few sites to include 580 numbers, though it's a shame it's missing in some of the graphs (people forget there are lots of 3GB 580s around now, I bought ten last month).

    If it's of any use, I've done a lot of 580 SLI vs. 980 (SLI) testing; PM me for a link to the results. I tested with 832MHz 3GB 580s, though I sold the reference 783MHz 3GB models I was already using for a nice profit to a movie company (excellent cards for CUDA, two of them beat a Titan), reducing the initial 980 upgrade to a mere +150.

    Overall, a 980 easily beats 580 SLI, and often comes very close to 3-way 580 SLI. The heavier the load, the bigger the difference, eg. for Firestrike Ultra, one 980 was between 50% and 80% faster than two 3GB 580s. I also tested 2/3-way 980 SLI, so if you'd like the numbers, just PM me or Google "SGI Ian" to find my site, contact page and Yahoo email adr.

    I've been looking for a newer test. I gather GTA V has a built-in benchmark, so finally I may have found something suitable, need to look into that.

    Only one complaint about the review though, why no CUDA test??? I'd really like to know how the range of NV cards stacks up now, and whether AE yet supports MW CUDA V2. I've tested 980s with Arion and Blender, it came close to two 580s, but not quite. Would be great to see how the 980 Ti compares to the 980 for this. Still plenty of people using CUDA with pro apps, especially AE.

    Ian.
  • mapesdhs - Wednesday, June 3, 2015 - link

    Btw Crest, which model 580s are you using? I do have some 1.5GB 580s as well, but I've not really done much yet to expose where VRAM issues kick in, though it does show up in Unigine pretty well at 1440p.

    For reference, I do most testing with a 5GHz 2700K and a 4.8GHz 3930K, though I've also tested three 980s on a P55 with an i7 870 (currently the fastest P55 system on 3DMark for various tests).
  • Mikemk - Sunday, May 31, 2015 - link

    Since it has 2 SMMs disabled, does it have the memory issue of the 970? (Haven't read the full article yet, sorry if it's answered there.)
