Gaming Performance, Power, Temperature, & Noise

So with the basics of the architecture and core configuration behind us, let’s dive into some numbers.

Rise of the Tomb Raider - 3840x2160 - Very High (DX11)

Dirt Rally - 3840x2160 - Ultra

Ashes of the Singularity - 3840x2160 - Extreme

Battlefield 4 - 3840x2160 - Ultra Quality (0x MSAA)

Crysis 3 - 3840x2160 - Very High Quality + FXAA

The Witcher 3 - 3840x2160 - Ultra Quality (No Hairworks)

The Division - 3840x2160 - Ultra Quality

Grand Theft Auto V - 3840x2160 - Very High Quality

Hitman - 3840x2160 - Ultra Quality

As the first high-end card of this generation to launch, NVIDIA gets to set the pace for the market. At the risk of sounding redundant, the GTX 1080 is now the fastest single-GPU card on the market, and even at 4K it wins every single gaming benchmark, typically by a good margin. In practice we’re looking at a 31% performance lead over the GTX 980 Ti – the card the GTX 1080 essentially replaces – and a similar 32% lead over AMD’s Radeon R9 Fury X. Meanwhile against the slightly older GTX 980, that gap grows to 70%.

On a generational basis this ends up being very close to the 74% jump in 4K performance going from the GTX 680 to the GTX 980. And although the pricing comparison is not especially flattering for NVIDIA here, it should be evident that NVIDIA isn’t just looking to sell the GTX 1080 as an upgrade for high-end Kepler cards, but as an upgrade for the GTX 980 as well, just 20 months after that card launched.

The Witcher 3 - 1920x1080 - Ultra Quality (No Hairworks)

I also wanted to quickly throw in a 1080p chart, both in the interest of comparing the GTX 1080 to the first-generation 28nm cards, and for gamers who are playing on high refresh rate 1080p monitors. Though this will of course vary from game to game, roughly speaking the GTX 1080 should be 3x faster than the GTX 680 or Radeon HD 7970. This is a good reminder of how architectural efficiency has played a greater role in recent years, as this is a much larger gain than we saw jumping from 55nm to 40nm or from 40nm to 28nm, both of which were much closer to the historical norm of 2x.

Load Power Consumption - Crysis 3

Meanwhile when it comes to power, temperature, and noise, NVIDIA continues to execute very well. Power consumption under Crysis 3 is some 20W higher than the GTX 980 and 52W lower than the GTX 980 Ti, generally in line with NVIDIA’s own TDP ratings after accounting for the slightly higher CPU power consumption incurred by the card’s higher performance. The end result is that the GTX 1080 is a bit more power hungry than the GTX 980, but still within the sweet spot NVIDIA has carved out in the gaming market. Broadly speaking, this amounts to a 54% increase in energy efficiency over the GTX 980 in the case of Crysis 3.
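As a sanity check on that efficiency figure, perf-per-watt follows directly from the performance ratio and the power ratio. A minimal sketch of the arithmetic, where the 300 W baseline system draw for the GTX 980 is an assumed, hypothetical figure (the article only states the 20W delta and the 70% performance lead directly):

```python
def efficiency_gain(perf_ratio: float, power_old_w: float, power_delta_w: float) -> float:
    """Relative perf-per-watt improvement:
    (perf_new / watt_new) / (perf_old / watt_old) - 1."""
    power_ratio = (power_old_w + power_delta_w) / power_old_w
    return perf_ratio / power_ratio - 1

# Assumed ~300 W system draw for the GTX 980 under Crysis 3 (hypothetical),
# a +20 W measured delta, and the 70% 4K performance lead cited above.
gain = efficiency_gain(perf_ratio=1.70, power_old_w=300.0, power_delta_w=20.0)
print(f"{gain:.0%}")  # → 59%
```

At the wall this lands near 59%; the article’s ~54% is lower because some of the measured power increase is the CPU working harder to feed the faster card, which this simple ratio doesn’t separate out.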

Load GPU Temperature - Crysis 3

Load Noise Levels - Crysis 3

Otherwise from a design perspective the GTX 1080 Founders Edition carries on from NVIDIA’s high-end GTX 700/900 reference design, allowing NVIDIA to once again offer a superior blower-based solution. NVIDIA’s temperature management technology has not changed relative to Maxwell, so like their other cards, the GTX 1080 tops out in the low 80s for load temperature. More significantly, at 47.5 dB(A) load noise, the card is on par with the GTX 780 and half a dB off of the GTX 980.

Ultimately NVIDIA has designed the GTX 1080 to be a drop-in replacement for the GTX 980, and this data confirms just that, indicating that GTX 1080’s much higher performance comes with only a slight increase in power consumption and no meaningful change in temperatures or acoustics.


  • doggface - Tuesday, May 17, 2016 - link

    The problem with low-end AMD cards atm is they lack features. Give us a $150-200 card with 4K 10-bit hardware H.265 decode, HDMI 2.0, DP 1.4, etc. and moderate gaming performance and it will sell. Give us great performance per cost with shitty features and watch it sit on the shelf.
  • Michael Bay - Wednesday, May 18, 2016 - link

    Everything destroys the GT 730; extrapolating anything out of such comparisons is wishful thinking at best.
  • BrokenCrayons - Wednesday, May 18, 2016 - link

    The 730 was a cheap upgrade I did to get a hotter-running and far older GeForce 8800 GTS out of my system last year to take some load off the power supply (only 375 watts) so I could upgrade the CPU from a tired Xeon 3065 to a Q6600 without pushing too hard on the PSU. The only feature I really did bother with making sure I got was GDDR5, so the chip wasn't hamstrung by 64-bit DDR3's bandwidth issues. The A10's iGPU would indeed make it look underpowered, but I'm not in the market for integrated graphics for my desktop.

    However, it's long overdue for a rebuild, for which I'm gathering parts now. I would have considered an A10, but instead I just picked up an Athlon x4 and will carry the 730 forward onto the new motherboard for a little while until 16/14nm makes its way down the product stack into lower end cards. Since I plan to eventually purchase whatever new generation hardware is out on the smaller process node anyway, a CPU with an iGPU that ultimately ends up being unused doesn't make a lot of sense. In the short term the 730 should be fine for anything I do anyway, since I have no reason to push higher resolutions or use any sort of spatial anti-aliasing. None of that really matters once the game's video and audio are rolled up in an h.264 stream and pushed across my network from my gaming box to my netbook, where I ultimately end up playing any games on a low resolution screen anyway.

    I think something around a GTX 950's performance would be perfectly fine for anything I need to do, so I'm content to wait until I can get that performance for around $100 or less. Spending my fun money on a computer is a very low priority, and I can always wait until later to get newer/faster hardware if a game I'm interested in playing doesn't run on my current PC. Such is the case with Fallout 4, but I won't bother with it until all of its DRM is out, there are patches that address most of its issues, and it's got a GOTY edition on discount through Steam for $20.

    By then, whatever I'm running will be more than fast enough to offer an enjoyable gaming experience without me struggling and grubbing around to find high end gear for it or diverting money from seeing films, traveling, or dining out. I also don't have to bother overclocking, buying aftermarket cooling solutions, managing cables to optimize airflow, or any of that other garbage I used to deal with years ago. I don't know how many hours I spent playing IDE cable origami so those big ribbons wouldn't impede a case fan's air current over a heatsink so I could eke out one or two meaninglessly fewer degrees C on an unimportant component. Now, screw it: I put crap together once and forget about it for a few years, enjoying the fun it provides along the way, because I finally figured out that the parts are just a means to obtain a few hours a week of recreation and not the ends themselves.
  • paulemannsen - Thursday, May 19, 2016 - link

    Man, you really are thinking too hard just for Angry Birds.
  • BrokenCrayons - Thursday, May 19, 2016 - link

    I understand that the idea of someone playing casual games while also keeping tabs on computer hardware is somehow a really threatening concept, but don't let it cloud your thoughts so much that you assume Angry Birds and Fallout 4 are mutually exclusive. You can be smarter and better than that if you try.
  • lashek37 - Friday, May 20, 2016 - link

    As soon as AMD comes out with their card, NVIDIA will unleash the GTX 1080 Ti, lol. 😂
  • Yojimbo - Tuesday, May 17, 2016 - link

    They won't have the middle of the market completely to themselves. They'll have the only new cards in the segment for 2 or 3 months. But during that time those cards will be competing with the 980 and 970. AMD, on the other hand, probably can't make much money selling Fury cards priced to compete with the 1070, and they'll have virtually nothing competing with the 1080, and that situation will last for 6 or more months. That's the reason AMD will be hurt, not because of "ignorant customers", as you claim.
  • Yojimbo - Tuesday, May 17, 2016 - link

    As an aside, if consumers were ignorant to choose new Maxwell cards over older AMD cards competing against them, why will they not similarly be ignorant to choose new Polaris cards over the older Maxwell cards competing with them?
  • etre - Tuesday, May 24, 2016 - link

    I fail to see how choosing old tech over new tech for a price difference of a few euros is the smart thing to do.
    Everyone wants new tech; it's a psychological and practical factor.

    As an example, where I live, in winter we can have -10 or -20 C, but in summer it's not uncommon to exceed 40 C. For me power consumption is a factor: less heat, less noise. The GTX line is well worth the money.
  • cheshirster - Tuesday, May 17, 2016 - link

    Last time they went middle-of-the-market first it was a big success (4870 vs GTX 260).
    Don't see a problem for them if their P10 can touch 1070 performance for <$400.
