Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

GeForce Video Card Voltages
GTX 1080 Ti Boost: 1.062v
GTX 1080 Boost: 1.062v
GTX 980 Ti Boost: 1.187v
GTX 1080 Ti Idle: 0.65v

Starting with voltages, there aren’t any big surprises with the GTX 1080 Ti. The underlying GP102 GPU has the same load voltages as the GP104 GPU in the GTX 1080, resulting in a load voltage of 1.062v.

Moving on, let’s take a look at average clockspeeds. The GTX 1080 Ti’s official base and boost clockspeeds are lower than the GTX 1080’s, but as we’ve seen before with other NVIDIA video cards, the actual clockspeeds are a bit more variable and almost always higher than NVIDIA’s official values. Consequently, the GTX 1080 Ti’s average clockspeeds may trail the GTX 1080’s by less than the specifications would suggest.

GeForce Video Card Average Clockspeeds
Game                  GTX 1080 Ti    GTX 1080
Max Boost Clock       1898MHz        1898MHz
Tomb Raider           1620MHz        1721MHz
DiRT Rally            1721MHz        1771MHz
Ashes                 1680MHz        1759MHz
Battlefield 4         1657MHz        1771MHz
Crysis 3              1632MHz        1759MHz
The Witcher 3         1645MHz        1759MHz
The Division          1645MHz        1721MHz
Grand Theft Auto V    1746MHz        1797MHz
Hitman                1657MHz        1771MHz

On the whole, the GTX 1080 Ti does average lower clockspeeds than the GTX 1080. Whereas the latter would frequently average clockspeeds in the 1700MHz range, the GTX 1080 Ti averages clockspeeds in the 1600MHz range. This, in part, is why NVIDIA is promoting the GTX 1080 Ti as being 35% faster than the GTX 1080, despite the card having a 40% advantage in total hardware units.

It is interesting to note though that our GTX 1080 Ti sample has the same maximum boost clock as the GTX 1080: 1898MHz. If the GTX 1080 Ti didn’t hit its thermal limit as often, it likely would come even closer to the GTX 1080 in average clockspeeds.
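To put rough numbers on that relationship, the sketch below combines the published CUDA core counts (3584 for the GTX 1080 Ti versus 2560 for the GTX 1080, the 40% unit advantage mentioned above) with the per-game averages from the table. This is only a naive throughput estimate, since real scaling also depends on memory bandwidth and how well games can fill the extra SMs, but it lands in the same ballpark as NVIDIA's 35% figure.

```python
# Back-of-the-envelope scaling estimate: more units, but lower average clocks.
# CUDA core counts are the published specs; the clocks are the per-game
# averages from the table above.

ti_cores, gtx1080_cores = 3584, 2560

ti_clocks      = [1620, 1721, 1680, 1657, 1632, 1645, 1645, 1746, 1657]  # MHz
gtx1080_clocks = [1721, 1771, 1759, 1771, 1759, 1759, 1721, 1797, 1771]  # MHz

ti_avg = sum(ti_clocks) / len(ti_clocks)
gtx1080_avg = sum(gtx1080_clocks) / len(gtx1080_clocks)

unit_ratio = ti_cores / gtx1080_cores    # ~1.40x more hardware units
clock_ratio = ti_avg / gtx1080_avg       # ~0.95x, the clockspeed deficit

print(f"Unit advantage:   {unit_ratio:.2f}x")
print(f"Clock ratio:      {clock_ratio:.2f}x")
print(f"Naive throughput: {unit_ratio * clock_ratio:.2f}x")  # ~1.33x, near NVIDIA's ~35% claim
```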

Idle Power Consumption

Moving on to power consumption, we’ll start as always with idle power. All told, there are no surprises here. The GTX 1080 Ti’s idle power consumption lands right next to the GTX 980 Ti’s, which is where we’d expect a similarly positioned 250W card to be.

Load Power Consumption - Crysis 3

System power consumption under Crysis 3 is also right where we’d expect it to be for the GTX 1080 Ti. In absolute terms it’s second only to the R9 Fury X – which is to say that it’s high – but as Crysis 3 is a real-world test, power numbers here are influenced by the rest of the system. The faster the card, the more work required of the CPU, RAM, etc, and that’s exactly what’s happening here.

Load Power Consumption - FurMark

Switching over to FurMark, which is a much more GPU-focused test, we find that our GTX 1080 Ti-equipped testbed draws 394W at the wall. This is a smidge higher than the GTX 980 Ti, but not meaningfully so. All three 250W NVIDIA cards are closely clustered together, showing that NVIDIA’s power throttling is working as expected, and at the levels expected. The GTX 1080 Ti is rated for 70W higher than the GTX 1080, and our results back this rating up. With NVIDIA’s well-established power/performance tiers, the GTX 1080 Ti makes the expected leap in power consumption in order to reach its loftier performance target.
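As a rough illustration of how that 70W rating difference maps onto measurements taken at the wall, here is a minimal sketch; the 90% PSU efficiency figure is purely an assumption for illustration, not a measured value from our testbed.

```python
# Minimal sketch: relating a card-level TDP gap to the power delta seen at the wall.
# ASSUMPTION: ~90% PSU efficiency at these load levels (illustrative, not measured).

PSU_EFFICIENCY = 0.90

GTX_1080_TI_TDP = 250  # watts, official rating
GTX_1080_TDP = 180     # watts, official rating

card_gap = GTX_1080_TI_TDP - GTX_1080_TDP  # 70W difference at the card
wall_gap = card_gap / PSU_EFFICIENCY       # PSU conversion losses inflate the wall-side delta

print(f"Card-level gap:          {card_gap}W")
print(f"Expected wall-side gap: ~{wall_gap:.0f}W")  # ~78W under the assumed efficiency
```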

Idle GPU Temperature

Moving on to temperatures, at idle the GTX 1080 Ti settles at 30C, the same as its siblings.

Load GPU Temperature - Crysis 3

As for Crysis 3, the GTX 1080 Ti ends up being the hottest card here despite the cooling improvements, though it should be noted that this is intentional. While NVIDIA doesn’t publish this information directly, the GTX 1080 Ti’s preset thermal throttle point is 84C, which is a degree or two higher than on NVIDIA’s previous cards. As a result, the card reaches equilibrium at a slightly higher temperature than NVIDIA’s other cards.

It’s interesting to note that the throttle point has been slowly creeping up over the years; going back to the original Titan, it was only 80C. As far as reference specification designs go, the higher temperatures improve the efficiency of the cooler. The downside to higher temperatures is that power leakage increases with the temperature. So there’s a careful balancing act here in getting better cooling performance without drowning it out in more leakage-induced heat. In the case of the GTX 1080 Ti, I suspect NVIDIA paid their dues here with the additional MOSFETs, giving them a bit more headroom for leakage.

Load GPU Temperature - FurMark

The story is much the same under FurMark. The GTX 1080 Ti settles at 84C here as well – though it did peak at 86C before reaching equilibrium – showcasing that regardless of the workload, the card always levels out at its thermal throttling point.

Idle Noise Levels

Finally we have our look at noise, starting with idle noise. Relative to the GTX 1080, NVIDIA has tweaked the fan curve a bit here, but at idle the GTX 1080 Ti is already below our noise floor.

Load Noise Levels - Crysis 3

Moving over to Crysis 3, we find that the GTX 1080 Ti holds a small edge over our other 250W Ti-series cards. NVIDIA’s latest card hits 50.3dB, about 1.5dB below the GTX 980 Ti and GTX 780 Ti. This won’t make much of a difference, but it does close the gap between the 250W cards and the GTX 1080 by a bit.

Load Noise Levels - FurMark

Finally, the situation with FurMark is much the same. The GTX 1080 Ti is still ever so slightly quieter than the other 250W NVIDIA cards, but a few dB louder than the GTX 1080.

Overall, judging from the power and noise characteristics of the GTX 1080 Ti, along with its throttling behavior, it looks like NVIDIA invested most of the gains from the improved cooling system in removing more heat from the card itself. With few exceptions, the GTX 1080 Ti thermal throttles before it TDP throttles, and anecdotally, it tends to run closer to its TDP limit than most of the other high-end NVIDIA cards we’ve seen in the past couple of years.

Going back to what NVIDIA said earlier, they are claiming that the GTX 1080 Ti offers the largest performance uplift over its non-Ti counterpart. Given that the TDP gap between the GTX 1080 Ti and GTX 1080 is actually smaller than the gap between the GTX 980 Ti and GTX 980 – 70W versus 85W – if anything we’d expect the uplift to be smaller. But by investing their gains from the improved cooler in better heat removal, NVIDIA is actually getting the GTX 1080 Ti closer to its performance/TDP limit than the previous generations of Ti cards. The only downside here is that you can only remove the DVI port once, so this isn’t an act that NVIDIA will be able to repeat in the next generation.

Comments

  • ddriver - Thursday, March 9, 2017 - link

    It is kinda both, although I wouldn't really call it a job, because that's when you are employed by someone else to do what he says. More like it's my work and hobby. Building a supercomputer on a budget out of consumer grade hardware turned out very rewarding in every possible aspect.
  • Zingam - Friday, March 10, 2017 - link

    This is something I'd like to do. Not necessarily with GPUs, but I have no idea how to make any money to pay the bills yet. I've only started thinking about it recently.
  • eddman - Thursday, March 9, 2017 - link

    "nvidia will pay off most game developers to sandbag"

    AMD, nvidia, etc. might work with developers to optimize a game for their hardware.

    Suggesting that they would pay developers to deliberately not optimize a game for the competition, or even make it perform worse, is a conspiracy theory made up on the internet.

    Not to mention it is illegal. No one would dare do it in this day and age when everything leaks eventually.
  • DanNeely - Thursday, March 9, 2017 - link

    Something that blatant would be illegal. What nVidia does do is offer a bunch of blobs that do various effects simulations/etc. that can save developers a huge amount of time vs. coding their own versions, but which run much faster on their own hardware than on nominally equivalent AMD cards. I'm not even going to accuse them of deliberately gimping AMD (or Intel) performance; having only a single code path that is optimized for the best results on their hardware will be sub-optimal on anything else. And because Gameworks is offered up as blobs (or source with can't-show-it-to-AMD NDA restrictions), AMD can't look at the code to suggest improvements to the developers or to fix things after the fact with driver optimizations.
  • eddman - Thursday, March 9, 2017 - link

    True, but most of these effects are CPU-only, and fortunately the ones that run on the GPU can be turned off in the options.

    Still, I agree that vendor specific, source-locked GPU effects are not helping the industry as a whole.
  • ddriver - Thursday, March 9, 2017 - link

    Have you noticed anyone touching nvidia lately? They are in bed with the world's most evil bstards. Nobody can touch them. Their practice is to offer assistance on exclusive terms; all of this aims to lock developers into their infrastructure, or at the very least comes with the implied condition that they don't break a sweat optimizing for radeons.

    I have very close friends working at AAA game studios and I know first hand. It all goes without saying. And nobody talks about it, not if they'd like to keep their job, or be able to get a good job in the industry in general.

    nvidia pretty much does the same thing intel was found guilty of on every continent. But it is kinda less illegal, because it doesn't involve discounts, so nobody can really pin bribery on them, in case anyone would dare challenge them.

    amd is actually very competitive hardware wise, but failing at their business model, they don't have the money to resist nvidia's hold on the market. I run custom software at a level as professional as it gets, and amd gpus totally destroy nvidia's at the same or even higher price point. Well, I haven't been able to do a comparison lately, as I have migrated my software stack to OpenCL 2, which nvidia deliberately does not implement in order to prop up their cuda, but a couple of years back I was able to do direct comparisons, and as mentioned above, nvidia offered 2 to 3 times worse value than amd. And nothing has really changed in that aspect; architecturally amd continues to offer superior compute performance, even if their DP rates have been significantly slashed in order to stay competitive with nvidia silicon.

    A quick example:
    ~$2500 buys you either:
    a firepro with 32 gigs of memory, 2.6 tflops FP64 perf, and top notch CL support, or
    a quadro with 8 gigs of memory, 0.13 tflops FP64 perf, and CL support years behind.

    Better compute features, 4 times more memory, and 20 times better compute performance at the same price. And yet the quadro outsells the firepro. Amazing, ain't it?

    It is true that 3rd party cad software still runs a tad better on a quadro, for the reasons and nvidian practices outlined above, but even then, the firepro is still fast enough to do the job, while completely annihilating quadros in compute. Which is why at this year's end I will be buying amd gpus by the dozens rather than nvidia ones.
  • eddman - Friday, March 10, 2017 - link

    So you're saying nvidia constantly engages in illegal activities with developers?

    I don't see what pro cards and software have to do with geforce and games. There is no API lock-in for games.
  • thehemi - Friday, March 10, 2017 - link

    > "And nobody talks about it, not if they'd like to keep their job"

    Haha, we're not scared of NVIDIA, they are just awesome. I've been in AAA for over a decade; they almost bought my first company and worked closely with my next three, so I know them very well. Nobody is "scared" of NVIDIA. NVIDIA have their devrel down. They are much more helpful with optimizations, free hardware, support, etc. Try asking AMD for the same and they treat you like you're a peasant. When NVIDIA gives us next-generation graphics cards for all our developers for free, we tend to use them. When NVIDIA sends their best graphics engineers onsite to HELP us optimize for free, we tend to take them up on their offers. Don't think I haven't tried getting the same out of AMD; they just don't run the company that way, and that's their choice.

    And if you're really high up, their dev-rel includes $30,000 nights out that end up at the strip club. NVIDIA have given me some of the best memories of my life, they've handed me a next generation graphics card at GDC because I joked that I wanted one, they've funded our studio when it hit a rough patch and tried to justify it with a vendor promotion on stage at CES with our title. I don't think that was profitable for them, but the good-will they instilled definitely has been.

    I should probably write a "Secret diaries of..." blog about my experiences, but the bottom line is they never did anything but offer help that was much appreciated.

    Actually, heh, the worst thing they did was turn on PhysX support by default for a game we made with them for benchmarks, back when they bought Ageia. My game engine was used for their launch demo, and the review sites (including here, I think) found out that if you turned a setting off to software mode, Intel chips doing software physics were faster than NVIDIA's physics-accelerated mode. Still not illegal, and still not afraid of losing my job, since I've made it pretty obvious who I am to the right people.
  • ddriver - Friday, March 10, 2017 - link

    Well, for you it might be the carrot, but for others it's the stick. Not all devs are as willing as you to leave their products unoptimized in exchange for a carrot. Nor do they need nvidia to hold them by the hand and walk them through everything that is remotely complex in order to be productive.

    In reality both companies treat you like a peasant; the difference is that nvidia has the resources to make you into a peasant they can use, while to poor old amd you are just a peasant they don't have the resources to pamper. Try this if you dare - instead of being a lazy grateful slob, take the time and effort to optimize your engine to make the most of amd hardware and brag about that marvelous achievement, and see if nvidia's pampering will continue.

    It is still technically a bribe - helping someone for free to do something that ends up putting them at an unfair advantage. It is practically the same thing as giving you the money to hire someone who is actually competent to do what you evidently cannot be bothered with or are unable to do. They still pay the people who do that for you, which would be the same thing as if you paid them with money nvidia gave you for it. And you are so grateful for that assistance that you won't even be bothered to optimize your software for that vile amd, who don't rush to offer to do your job for you like noble, caring nvidia does.
  • ddriver - Friday, March 10, 2017 - link

    It is actually a little sad to see developers so cheap. nvidia took you to see strippers once and now you can't get your tongue out their ass :)

    but it is understandable, as a developer there is a very high chance it was the first pussy you've seen in real life :D
