Gaming Performance, Power, Temperature, & Noise

So with the basics of the architecture and core configuration behind us, let’s dive into some numbers.

Rise of the Tomb Raider - 3840x2160 - Very High (DX11)

Dirt Rally - 3840x2160 - Ultra

Ashes of the Singularity - 3840x2160 - Extreme

Battlefield 4 - 3840x2160 - Ultra Quality (0x MSAA)

Crysis 3 - 3840x2160 - Very High Quality + FXAA

The Witcher 3 - 3840x2160 - Ultra Quality (No Hairworks)

The Division - 3840x2160 - Ultra Quality

Grand Theft Auto V - 3840x2160 - Very High Quality

Hitman - 3840x2160 - Ultra Quality

As the first high-end card of this generation to launch, NVIDIA gets to set the pace for the market. At the risk of being redundant, the GTX 1080 is now the fastest single-GPU card on the market, and even at 4K it wins every single gaming benchmark, typically by a good margin. In practice we’re looking at a 31% performance lead over the GTX 980 Ti – the card the GTX 1080 essentially replaces – with a similar 32% lead over AMD’s Radeon R9 Fury X. Meanwhile against the somewhat older GTX 980, that gap grows to 70%.
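For readers curious how a headline figure like that 31% lead is typically derived from a suite of individual games, here is a quick sketch. The per-game ratios below are hypothetical placeholders, not the review's measured data; the point is the aggregation method, which conventionally uses a geometric mean for performance ratios.

```python
# Sketch: aggregating per-game speedups into a single average lead.
# The ratios below are hypothetical placeholders, not measured data.
from math import prod

def geomean(ratios):
    """Geometric mean: the conventional way to average performance ratios,
    since it weights a 2x gain and a 0.5x loss symmetrically."""
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical GTX 1080 / GTX 980 Ti frame-rate ratios for a 9-game suite
speedups = [1.28, 1.35, 1.30, 1.27, 1.33, 1.31, 1.29, 1.34, 1.32]
lead_pct = (geomean(speedups) - 1) * 100
print(f"Average lead: {lead_pct:.0f}%")  # ~31% with these example ratios
```

A geometric mean is preferred over an arithmetic mean here because frame-rate ratios multiply rather than add; the arithmetic mean would slightly overstate the lead when the per-game spread is wide.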

On a generational basis this ends up being very close to the 74% jump in 4K performance going from the GTX 680 to GTX 980. And although the pricing comparison is not especially flattering for NVIDIA here, it should be evident that NVIDIA isn’t just looking to sell GTX 1080 as an upgrade for high-end Kepler cards, but as an upgrade for GTX 980 as well, just 20 months after it launched.

The Witcher 3 - 1920x1080 - Ultra Quality (No Hairworks)

I also wanted to quickly throw in a 1080p chart, both in the interest of comparing the GTX 1080 to the first-generation 28nm cards, and for gamers who are playing on high refresh rate 1080p monitors. Though this will of course vary from game to game, roughly speaking the GTX 1080 should be 3x faster than the GTX 680 or Radeon HD 7970. This is a good reminder of how architectural efficiency has come to play a greater role in recent years, as this is a much larger gain than we saw jumping from 55nm to 40nm or from 40nm to 28nm, both of which were much closer to the historical norm of 2x.

Load Power Consumption - Crysis 3

Meanwhile when it comes to power, temperature, and noise, NVIDIA continues to execute very well. Power consumption under Crysis 3 is some 20W higher than the GTX 980 and 52W lower than the GTX 980 Ti, generally in line with NVIDIA’s own TDP ratings after accounting for the slightly higher CPU power consumption incurred by the card’s higher performance. The end result is that the GTX 1080 is a bit more power hungry than the GTX 980, but still in the sweet spot NVIDIA has carved out in the gaming market. Broadly speaking, this amounts to a 54% increase in energy efficiency in the case of Crysis 3.
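As a rough sketch of the perf-per-watt arithmetic behind a figure like that 54%: energy efficiency is performance divided by power, so the generational gain is the performance ratio divided by the power ratio. The wattages below are illustrative placeholders (total system power at the wall), chosen only so the arithmetic lands near the review's figure, not the review's measured numbers.

```python
# Sketch of the energy-efficiency arithmetic: efficiency = performance / power,
# so the relative gain is perf_ratio / power_ratio. Wattages are placeholders.
def efficiency_gain(perf_ratio, power_new, power_old):
    """Relative perf-per-watt improvement, as a percentage."""
    power_ratio = power_new / power_old
    return (perf_ratio / power_ratio - 1) * 100

# e.g. ~70% faster than a GTX 980 while system power at the wall rises
# from ~300W to ~331W (illustrative figures only)
gain = efficiency_gain(perf_ratio=1.70, power_new=331, power_old=300)
print(f"Energy efficiency gain: {gain:.0f}%")  # ~54% with these figures
```

Note that small differences in how power is measured (GPU-only vs. at the wall, which folds in CPU power) shift the result by a few points, which is why reviews qualify such efficiency numbers.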

Load GPU Temperature - Crysis 3

Load Noise Levels - Crysis 3

Otherwise, from a design perspective, the GTX 1080 Founders Edition carries on from NVIDIA’s high-end GTX 700/900 reference design, allowing NVIDIA to once again offer a superior blower-based solution. NVIDIA’s temperature management technology has not changed relative to Maxwell, so like their other cards, the GTX 1080 tops out in the low 80s (Celsius) for load temperature. More significantly, at 47.5 dB(A) load noise, the card is on par with the GTX 780 and half a dB off of the GTX 980.

Ultimately NVIDIA has designed the GTX 1080 to be a drop-in replacement for the GTX 980, and this data confirms just that, indicating that GTX 1080’s much higher performance comes with only a slight increase in power consumption and no meaningful change in temperatures or acoustics.

262 Comments

  • QinX - Tuesday, May 17, 2016 - link

    Thanks for the explanation, I was worried that support for older games was already going down.
  • Badelhas - Tuesday, May 17, 2016 - link

    What about including the HTC Vive in your benchmarks? If you talk about the VR benefits, you have to show them in graphs, it's your speciality, AnandTech! ;)
  • JeffFlanagan - Tuesday, May 17, 2016 - link

    Seconded. At this point VR gaming is much more interesting to me than even 4K gaming, and will drive my video card upgrades from now on. It's really nice to be able to play a game like it's the real world, rather than using a controller and looking at a screen.
  • MFK - Tuesday, May 17, 2016 - link

    Completely agreed.
    I'm a casual gamer, and my i5-2500k + GTX760 serve me perfectly fine.
    I have a 1440p monitor but I reduce the resolution to 1080p or 720p depending on how demanding the game is.

    My upgrade will be determined and driven by VR. Whoever manages to deliver acceptable VR performance in a reasonable price will get my $.

    And they will be competing in price and content against the PS4k + Move + Morpheus combo.
  • Ryan Smith - Tuesday, May 17, 2016 - link

    It's in the works, though there's an issue with how many games can be properly tested in VR mode without a headset attached.
  • haplo602 - Tuesday, May 17, 2016 - link

    It will be interesting to see how much GDDR5X affects the scores vs GDDR5. 1080 vs 1070 will be very telling, or alternatively a downclocked 1080 vs a 980 Ti ....
  • fanofanand - Tuesday, May 17, 2016 - link

    excellent preview, little typo here.

    Translating this into numbers, at 4K we’re looking at 30% performance gain versus the GTX 980 and a 70% performance gain over the GTX 980, amounting to a very significant jump in efficiency and performance over the Maxwell generation. That durn GTX 980 is just all over the board!
  • tipoo - Tuesday, May 17, 2016 - link

    How does Pascal do on async compute? I know that was the big bugbear with Maxwell, with Nvidia promising it but it looking like they were doing it in CPU for scheduling, not GPU like GCN.

    http://www.extremetech.com/extreme/213519-asynchro...

    https://forum.beyond3d.com/threads/dx12-performanc...
  • Stuka87 - Tuesday, May 17, 2016 - link

    I do find it a bit annoying that you guys are still using a junk reference 290X instead of a properly cooled 390X.
  • TheinsanegamerN - Tuesday, May 17, 2016 - link

    That's what AMD provided. A custom cooled NVIDIA 980 Ti will perform better than the stock model, yet people don't complain about that.

    When anand DID use a third party card (460s IIRC) there was a massive backlash from the community saying they were 'unfair' in their reviews. So now they just use stock cards. Blame AMD for dropping the ball on that one.
