Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance, these are among the most important aspects of a GPU, and noise in particular can make or break a card. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to put up with the noise.

GeForce Video Card Voltages
             GTX 1080 Ti Boost   GTX 1080 Boost   GTX 980 Ti Boost   GTX 1080 Ti Idle
Voltage      1.062v              1.062v           1.187v             0.65v

Starting with voltages, there aren’t any big surprises with the GTX 1080 Ti. The underlying GP102 GPU has the same load voltages as the GP104 GPU in the GTX 1080, resulting in a load voltage of 1.062v.

Moving on, let’s take a look at average clockspeeds. The GTX 1080 Ti’s official base and boost clockspeeds are lower than the GTX 1080’s, but as we’ve seen before with other NVIDIA video cards, actual clockspeeds are a bit more variable and almost always higher than NVIDIA’s official values. Consequently, the GTX 1080 Ti’s average clockspeeds may trail the GTX 1080’s by less than the specifications suggest.

GeForce Video Card Average Clockspeeds
Game                     GTX 1080 Ti     GTX 1080
Max Boost Clock          1898MHz         1898MHz
Tomb Raider              1620MHz         1721MHz
DiRT Rally               1721MHz         1771MHz
Ashes                    1680MHz         1759MHz
Battlefield 4            1657MHz         1771MHz
Crysis 3                 1632MHz         1759MHz
The Witcher 3            1645MHz         1759MHz
The Division             1645MHz         1721MHz
Grand Theft Auto V       1746MHz         1797MHz
Hitman                   1657MHz         1771MHz

On the whole, the GTX 1080 Ti does average lower clockspeeds than the GTX 1080. Whereas the latter would frequently average clockspeeds in the 1700MHz range, the GTX 1080 Ti averages clockspeeds in the 1600MHz range. This, in part, is why NVIDIA is promoting the GTX 1080 Ti as being 35% faster than the GTX 1080, despite the card having a 40% advantage in total hardware units.

It is interesting to note though that our GTX 1080 Ti sample has the same maximum boost clock as the GTX 1080: 1898MHz. If the GTX 1080 Ti didn’t hit its thermal limit as often, it likely would come even closer to the GTX 1080 in average clockspeeds.
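
As a quick sanity check on that 35% figure, the averages from the table above can be combined with the published CUDA core counts (3584 for the GTX 1080 Ti versus 2560 for the GTX 1080, the source of the 40% figure) in a back-of-the-envelope estimate. The sketch below is illustrative only; it looks purely at shader throughput and ignores memory bandwidth, ROPs, and CPU limits.

```python
# Back-of-the-envelope estimate: 40% more shader hardware running at lower
# average clocks. Illustrative only; real game performance also depends on
# memory bandwidth, geometry/ROP throughput, and CPU limits.

# Average gaming clockspeeds from the table above (MHz)
clocks_1080ti = [1620, 1721, 1680, 1657, 1632, 1645, 1645, 1746, 1657]
clocks_1080   = [1721, 1771, 1759, 1771, 1759, 1759, 1721, 1797, 1771]

avg_ti   = sum(clocks_1080ti) / len(clocks_1080ti)  # ~1667 MHz
avg_1080 = sum(clocks_1080)   / len(clocks_1080)    # ~1759 MHz

cores_ti, cores_1080 = 3584, 2560  # published CUDA core counts

ratio = (cores_ti * avg_ti) / (cores_1080 * avg_1080)
print(f"GTX 1080 Ti average clock: {avg_ti:.0f} MHz")
print(f"GTX 1080 average clock:    {avg_1080:.0f} MHz")
print(f"Estimated shader throughput advantage: {(ratio - 1) * 100:.0f}%")
# Prints roughly +33%, in the same ballpark as NVIDIA's ~35% claim once the
# wider memory bus is factored in.
```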

Idle Power Consumption

Moving on to power consumption, we’ll start as always with idle power. All told, there are no surprises here. GTX 1080 Ti’s idle power consumption is right next to GTX 980 Ti, which is where we’d expect it given the 250W design.

Load Power Consumption - Crysis 3

System power consumption under Crysis 3 is also right where we’d expect it to be for the GTX 1080 Ti. In absolute terms it’s second only to the R9 Fury X – which is to say that it’s high – but as Crysis 3 is a real-world test, power numbers here are influenced by the rest of the system. The faster the card, the more work required of the CPU, RAM, etc, and that’s exactly what’s happening here.

Load Power Consumption - FurMark

Switching over to FurMark, which is a much more GPU-focused test, we find that our GTX 1080 Ti-equipped testbed draws 394W at the wall. This is a smidge higher than the GTX 980 Ti, but not meaningfully so. All 3 250W NVIDIA cards are closely clustered together, showing that NVIDIA’s power throttling is working as expected, and at the levels expected. GTX 1080 Ti is rated for 70W higher than the GTX 1080, and our results back this rating up. With NVIDIA’s well-established power/performance tiers, GTX 1080 Ti makes the expected leap in power consumption in order to reach its loftier performance target.

Idle GPU Temperature

Moving on to temperatures, at idle the GTX 1080 Ti settles at 30C, the same as its siblings.

Load GPU Temperature - Crysis 3

As for Crysis 3, the GTX 1080 Ti ends up being the hottest card here despite the cooling improvements, though it should be noted that this is intentional. While NVIDIA doesn’t publish this information directly, the GTX 1080 Ti’s preset thermal throttle point is 84C, which is a degree or two higher than on NVIDIA’s previous cards. As a result, the card reaches equilibrium at a slightly higher temperature than NVIDIA’s other cards.

It’s interesting to note that the throttle point has been slowly creeping up over the years; going back to the original Titan, it was only 80C. As far as reference specification designs go, the higher temperatures improve the efficiency of the cooler. The downside to higher temperatures is that power leakage increases with the temperature. So there’s a careful balancing act here in getting better cooling performance without drowning it out in more leakage-induced heat. In the case of the GTX 1080 Ti, I suspect NVIDIA paid their dues here with the additional MOSFETs, giving them a bit more headroom for leakage.
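
To make that balancing act a bit more concrete, the toy model below sketches it out. To be clear, the coefficients here are invented purely for illustration and are not NVIDIA's figures; the model simply assumes leakage grows roughly exponentially with die temperature, while the heat a fixed-speed cooler can remove scales with the GPU-to-ambient delta.

```python
import math

# Toy model only -- the coefficients are made up for illustration and are
# not NVIDIA's numbers. The point is the shape of the trade-off, not the values.

AMBIENT_C = 25.0

def leakage_w(temp_c, base_w=30.0, ref_c=80.0, growth_per_c=0.02):
    """Leakage power rises roughly exponentially with die temperature."""
    return base_w * math.exp(growth_per_c * (temp_c - ref_c))

def removable_heat_w(temp_c, w_per_c=4.0):
    """Heat a fixed-speed cooler can shed scales with the GPU-to-ambient delta."""
    return w_per_c * (temp_c - AMBIENT_C)

for target in (80, 82, 84):
    headroom = removable_heat_w(target) - leakage_w(target)
    print(f"throttle point {target}C: cooler sheds {removable_heat_w(target):.0f} W, "
          f"leakage ~{leakage_w(target):.0f} W, budget for useful work {headroom:.0f} W")
```

In this made-up example the extra cooling capacity from running hotter outpaces the extra leakage, which is the bet NVIDIA appears to be making by nudging the throttle point upward; if leakage grew faster, the higher throttle point would be a net loss.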

Load GPU Temperature - FurMark

The story is much the same under FurMark. The GTX 1080 Ti settles at 84C here as well – though it did peak at 86C before reaching equilibrium – showcasing that regardless of the workload, the card always levels out at its thermal throttling point.

Idle Noise Levels

Finally we have our look at noise, starting with idle noise. Relative to the GTX 1080 NVIDIA has tweaked the fan curve a bit here, but at idle the GTX 1080 Ti is already below our noise floor.

Load Noise Levels - Crysis 3

Moving over to Crysis 3, we find that the GTX 1080 Ti holds a small edge over our other 250W Ti-series cards. NVIDIA’s latest card hits 50.3dB, about 1.5dB below the GTX 980 Ti and GTX 780 Ti. This won’t make much of a difference, but it does close the gap between the 250W cards and the GTX 1080 by a bit.
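
For a sense of scale, a quick calculation (a sketch only, using the standard decibel formula and the common rule of thumb that +10dB reads as roughly twice as loud) shows why 1.5dB is audible but minor:

```python
# What a ~1.5 dB(A) gap works out to. Sketch only.
delta_db = 1.5

power_ratio = 10 ** (delta_db / 10)    # ratio of acoustic power
loudness_ratio = 2 ** (delta_db / 10)  # rule of thumb: +10 dB ~ twice as loud

print(f"{delta_db} dB higher = {power_ratio:.2f}x the sound power")
print(f"...but only about {(loudness_ratio - 1) * 100:.0f}% louder to the ear")
```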

Load Noise Levels - FurMark

Finally, the situation with FurMark is much the same. The GTX 1080 Ti is still ever so slightly quieter than the other 250W NVIDIA cards, but a few dB louder than the GTX 1080.

Overall, judging from the power and noise characteristics of the GTX 1080 Ti, along with its throttling practices, it looks like NVIDIA invested most of their gains with the improved cooling system in removing more heat from the card itself. With few exceptions, the GTX 1080 Ti thermal throttles before it TDP throttles, and anecdotally, it tends to be closer to its TDP limit than most of the other high-end NVIDIA cards we’ve seen in the past couple of years.

Going back to what NVIDIA said earlier, they are claiming that the GTX 1080 Ti offers the largest performance uplift yet for a Ti card over its non-Ti counterpart. Given that the TDP gap between the GTX 1080 Ti and GTX 1080 is actually smaller than the gap between the GTX 980 Ti and GTX 980 – 70W versus 85W – if anything we’d expect the uplift to be smaller. But by investing the gains from the improved cooler in better heat removal, NVIDIA is actually getting the GTX 1080 Ti closer to its performance/TDP limit than previous generations of Ti cards. The only downside here is that you can only remove the DVI port once, so this isn’t an act that NVIDIA will be able to repeat in the next generation.
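
For reference, those gaps fall straight out of the official board power ratings. The snippet below assumes the usual published TDPs of 180W for the GTX 1080 and 165W for the GTX 980; the 250W figures for the two Ti cards are the ones discussed throughout this review.

```python
# Official TDP figures (W). The 180 W and 165 W values are the commonly
# published ratings for the GTX 1080 and GTX 980; treat them as assumptions here.
tdp = {"GTX 1080 Ti": 250, "GTX 1080": 180, "GTX 980 Ti": 250, "GTX 980": 165}

print("Pascal Ti gap: ", tdp["GTX 1080 Ti"] - tdp["GTX 1080"], "W")   # 70 W
print("Maxwell Ti gap:", tdp["GTX 980 Ti"] - tdp["GTX 980"], "W")     # 85 W
```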

Comments

  • eddman - Friday, March 10, 2017 - link

    So we moved from "nvidia pays devs to deliberately not optimize for AMD" to "nvidia works with devs to optimize the games for their own hardware, which might spoil them and result in them not optimizing for AMD properly".

    How is that bribery, or illegal? If they did not prevent the devs from optimizing for AMD, then nothing illegal happened. It was the devs' own doing.
  • ddriver - Friday, March 10, 2017 - link

    Nope, there is an implicit, unspoken condition to receiving support from nvidia. To lazy slobs, that's welcome, and most devs are lazy slobs. Their line of reasoning is quite simple:

    "Working to optimize for amd is hard, I am a lazy and possibly lousy developer, so if they don't do that for me like nvidia does, I won't do that either, besides that would angry nvidia, since they only assist me in order to make their hardware look better, if I do my job and optimize for amd and their hardware ends up beating nvidia's, I risk losing nvidia's support, since why would they put money into helping me if they don't get the upper hand in performance. Besides, most people use nvidia anyway, so why even bother. I'd rather be taken to watch strippers again than optimize my software."

    Manipulation, bribery and extortion. nvidia uses its position to create a situation in which game developers have a lot to profit from NOT optimizing for amd, and a lot to lose if they do. Much like intel did with its exclusive discounts. OEMs weren't exactly forced to take those discounts in exchange for not selling amd, they did what they knew would please intel to get rewarded for it. Literally the same thing nvidia does. Game developers know nvidia will be pleased to see their hardware getting an unfair performance advantage, and they know amd doesn't have the money to pamper them, so they do what is necessary to please nvidia and ensure they keep getting support.
  • akdj - Monday, March 13, 2017 - link

    Where to start?
    Best not to start, as you are completely, 100% insane and I've spent two and a half 'reads' of your replies... trying to grasp WTH you're talking about and I'm lost
    Totally, completely lost in your conspiracy theories about two major GPU silicon builders while being apparently and completely clueless about ANY of it!
    Lol - Wow, I'm truly astounded that you were able to make up that much BS ...
  • cocochanel - Friday, March 10, 2017 - link

    You forgot to mention one thing. Nvidia tweaking the drivers to force users into hardware updates. Say, there is a bunch of games coming up this Christmas. If you have a card that's 3-4 years old, they release a new driver which performs poorly on your card (on those games) and another driver which performs way better on the newest cards. Then, if you start crying, they say: It's an old card, pal, why don't you buy a new one!
    With DX11 they could do that a lot. With DX12 and Vulkan it's a lot harder. Most if not all optimizations have to be done by the game programmers. Very little is left to the driver.
  • eddman - Friday, March 10, 2017 - link

    That's how the ENTIRE industry is. Do you really expect developers to optimize for old architectures? Everyone does it: nvidia, AMD, intel, etc.

    It is not deliberate. Companies are not going to spend time and money on old hardware with little market share. That's how it's been forever.

    Before you say that's not the case with radeons, it's because their GCN architecture hasn't changed dramatically since its first iteration. As a result, any optimization done for the latest GCN, affects the older ones to some extent too.
  • cocochanel - Friday, March 10, 2017 - link

    There is good news for the future. As DX12 and Vulkan become mainstream APIs, game developers will have to roll up their sleeves and sweat it hard. Architecturally, these APIs are totally different from the ground up and both trace their origins to Mantle. And Mantle was the biggest advance in graphics APIs in a generation. The good days for lazy game developers are coming to an end, since these new APIs put just about everything back into their hands whether they like it or not. Tweaking the driver won't make much of a difference. Read the APIs' documentation.
  • cmdrdredd - Monday, March 13, 2017 - link

    Yes hopefully this will be the future where games are the responsibility of the developer. Just like on Consoles. I know people hate consoles sometimes but the closed platform shows which developers have their stuff together and which are lazy bums because Sony and Microsoft don't optimize anything for the games.
  • Nfarce - Friday, March 10, 2017 - link

    Always amusing watching the tin foil hat Nvidia conspiracy nuts talk. Here's my example: working on Project Cars as an "early investor." Slightly Mad Studios gave both Nvidia and AMD each 12 copies of the beta release to work on, the same copy I bought. Nvidia was in constant communication with SMS developers and AMD was all but never heard from. After about six months, Nvidia had a demo of the racing game ready for a promotion of their hardware. Since AMD didn't take Project Cars seriously with SMS, Nvidia was able to get the game tweaked better for Nvidia. And SMS hat-tipped Nvidia by putting billboards in the game showing Nvidia logos.

    Of course all the AMD fanboys claimed unfair competition and the usual whining when their GPUs do not perform as well in some games as Nvidia (they amazingly stayed silent when DiRT Rally, another development I was involved with, ran better on AMD GPUs and had AMD billboards).
  • ddriver - Friday, March 10, 2017 - link

    So was there anything preventing the actual developers from optimizing the game? They didn't have nvidia and amd hardware, so they sent betas to the companies to profile things and see how it runs?

    How silly must one be to expect that nvidia - a company that rakes in billions every year - and amd - a company that is in the red most of the time and has lost billions - will have the same capacity to do game developers' jobs for them?

    It is the game developer's job to optimize. Alas, it seems nvidia has bred a new breed of developers - those who do their job half-assedly and then wait on nvidia to optimize, conveniently creating an unfair advantage for its hardware.
  • ddriver - Friday, March 10, 2017 - link

    Also talking about fanboys - I am not that. Yes, I am running dozens of amd gpus, and I don't see myself buying any nvidia product any time soon, but that's only because they offer superior value for what I need them for.

    I don't give amd extra credit for offering a better value. I know this is not what they want. It is what they are being forced into.

    I am in a way grateful to nvidia for sandbagging amd, because this way I can get a much better value products. If things were square between the two, and all games were equally optimized, then both companies would offer products with approximately identical value.

    Which I would hate, because I'd lose the current 2-3x better value for the money I get with amd. I benefit and profit from nvidia being crooks, and I am happy that I can do that.

    So nvidia, keep doing what you are doing. I am not really objecting, I am simply stating the facts. Of course, nvidia fanboys would have a problem understanding that, and a problem with anyone tarnishing the good name of that helpful awesome and paying for strippers company.
