Gaming Performance, Power, Temperature, & Noise

So with the basics of the architecture and core configuration behind us, let’s dive into some numbers.

[Benchmark charts: Rise of the Tomb Raider (Very High, DX11), Dirt Rally (Ultra), Ashes of the Singularity (Extreme), Battlefield 4 (Ultra Quality, 0x MSAA), Crysis 3 (Very High Quality + FXAA), The Witcher 3 (Ultra Quality, No Hairworks), The Division (Ultra Quality), Grand Theft Auto V (Very High Quality), and Hitman (Ultra Quality) - all at 3840x2160]

With the first high-end card of this generation to launch, NVIDIA gets to set the pace for the market. At the risk of being redundant, the GTX 1080 is now the fastest single-GPU card on the market, and even at 4K it wins every single gaming benchmark, typically by a good margin. In practice we’re looking at a 31% performance lead over GTX 980 Ti – the card the GTX 1080 essentially replaces – with a similar 32% lead over AMD’s Radeon R9 Fury X. Meanwhile against the slightly older GTX 980, that gap is 70%.

On a generational basis this ends up being very close to the 74% jump in 4K performance going from the GTX 680 to GTX 980. And although the pricing comparison is not especially flattering for NVIDIA here, it should be evident that NVIDIA isn’t just looking to sell GTX 1080 as an upgrade for high-end Kepler cards, but as an upgrade for GTX 980 as well, just 20 months after it launched.
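
For readers curious how the per-game numbers roll up into a single figure like the 31% lead above, the sketch below aggregates a handful of frame rates with a geometric mean and compares the results. The FPS values and the use of a geometric mean are illustrative assumptions only, not the measured data behind the charts or a claim about the exact methodology used here.

```python
# Illustrative only: hypothetical 4K average-FPS numbers, NOT measured data.
# A geometric mean is one common way to aggregate per-game results so that
# no single title dominates the overall comparison.
from math import prod

fps = {
    # game:                    (GTX 1080, GTX 980 Ti, GTX 980)  <- hypothetical
    "Rise of the Tomb Raider": (55.0, 42.0, 32.0),
    "Crysis 3":                (53.0, 41.0, 31.0),
    "The Witcher 3":           (48.0, 37.0, 28.0),
    "Grand Theft Auto V":      (60.0, 45.0, 35.0),
}

def geomean(values):
    return prod(values) ** (1.0 / len(values))

gtx1080  = geomean([v[0] for v in fps.values()])
gtx980ti = geomean([v[1] for v in fps.values()])
gtx980   = geomean([v[2] for v in fps.values()])

print(f"GTX 1080 vs GTX 980 Ti: {gtx1080 / gtx980ti - 1:+.0%}")
print(f"GTX 1080 vs GTX 980:    {gtx1080 / gtx980 - 1:+.0%}")
```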

[Benchmark chart: The Witcher 3 - 1920x1080 - Ultra Quality (No Hairworks)]

I also wanted to quickly throw in a 1080p chart, both in the interest of comparing the GTX 1080 to the first-generation 28nm cards, and for gamers who are playing on high refresh rate 1080p monitors. Though this will of course vary from game to game, roughly speaking the GTX 1080 should be 3x faster than the GTX 680 or Radeon HD 7970. This is a good reminder of how big a role architectural efficiency has played in recent generations, as this is a much larger gain than we saw jumping from 55nm to 40nm or from 40nm to 28nm, both of which were much closer to the historical norm of 2x.

[Chart: Load Power Consumption - Crysis 3]

Meanwhile, when it comes to power, temperature, and noise, NVIDIA continues to execute very well. Power consumption under Crysis 3 is some 20W higher than GTX 980 and 52W lower than GTX 980 Ti, generally in line with NVIDIA’s own TDP ratings after accounting for the slightly higher CPU power consumption incurred by the card’s higher performance. The end result is that the GTX 1080 is a bit more power hungry than the GTX 980, but still in the sweet spot NVIDIA has carved out in the gaming market. Broadly speaking, this amounts to a 54% increase in energy efficiency in the case of Crysis 3.
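
Since energy efficiency here is simply performance per watt, a figure like 54% falls out of dividing the performance uplift by the power increase. The following sketch walks through that arithmetic with placeholder frame rates and at-the-wall power draws; the specific numbers are assumptions for illustration, not the measured values from the charts.

```python
# Hypothetical numbers for illustration only; the measured values live in the charts.
crysis3_fps_gtx980  = 30.0    # assumed 4K average FPS
crysis3_fps_gtx1080 = 49.3    # assumed 4K average FPS (~64% faster)

system_power_gtx980  = 300.0  # assumed load power at the wall, in watts
system_power_gtx1080 = 320.0  # assumed: roughly 20W higher, per the text

eff_980  = crysis3_fps_gtx980 / system_power_gtx980    # performance per watt
eff_1080 = crysis3_fps_gtx1080 / system_power_gtx1080

print(f"GTX 980:  {eff_980:.3f} FPS/W")
print(f"GTX 1080: {eff_1080:.3f} FPS/W")
print(f"Efficiency gain: {eff_1080 / eff_980 - 1:+.0%}")
```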

[Charts: Load GPU Temperature - Crysis 3; Load Noise Levels - Crysis 3]

Otherwise, from a design perspective the GTX 1080 Founders Edition carries on from NVIDIA’s high-end GTX 700/900 reference design, allowing NVIDIA to once again offer a superior blower-based solution. NVIDIA’s temperature management technology has not changed relative to Maxwell, so like their other cards, the GTX 1080 tops out in the low 80s for load temperature. More significantly, at 47.5 dB(A) load noise, the card is on par with the GTX 780 and half a dB off of the GTX 980.
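
For those wondering why the card settles in the low 80s rather than simply spinning the fan up further, the behavior comes down to GPU Boost’s temperature target: as the GPU approaches the target, it sheds boost bins instead of adding noise. The toy control loop below is meant only to illustrate that general idea; it is not NVIDIA’s actual algorithm, and the target, step size, and hardware hooks are made-up placeholders.

```python
# A toy temperature-target governor, loosely in the spirit of GPU Boost.
# All constants and the read_gpu_temp()/set_clock_offset() hooks are hypothetical.
import time

TEMP_TARGET_C  = 83   # assumed temperature target
CLOCK_STEP_MHZ = 13   # assumed boost-bin granularity

def read_gpu_temp():
    # Placeholder: a real implementation would query the GPU's thermal sensor.
    return 82.0

def set_clock_offset(offset_mhz):
    # Placeholder: a real implementation would reprogram the GPU clocks.
    print(f"clock offset -> {offset_mhz} MHz")

def govern(duration_s=5.0, interval_s=1.0):
    offset = 0
    end = time.time() + duration_s
    while time.time() < end:
        temp = read_gpu_temp()
        if temp > TEMP_TARGET_C:
            offset -= CLOCK_STEP_MHZ   # over target: drop a boost bin
        elif temp < TEMP_TARGET_C - 2:
            offset += CLOCK_STEP_MHZ   # comfortably under target: reclaim a bin
        set_clock_offset(offset)
        time.sleep(interval_s)

if __name__ == "__main__":
    govern()
```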

Ultimately NVIDIA has designed the GTX 1080 to be a drop-in replacement for the GTX 980, and this data confirms just that, indicating that GTX 1080’s much higher performance comes with only a slight increase in power consumption and no meaningful change in temperatures or acoustics.

262 Comments

  • Beararam - Tuesday, May 17, 2016 - link

    Also, the chart shows the 780 having a 256-bit bus width. It's definitely 384-bit.
  • FMinus - Tuesday, May 17, 2016 - link

    Can we get a table comparing OCed GTX 980Ti to a stock and OCed GTX 1080 in the final review?
  • strafejumper - Tuesday, May 17, 2016 - link

    One more request - later on, update with the new game Overwatch - it comes out May 24.
  • yhselp - Tuesday, May 17, 2016 - link

    ", and in time-honored fashion NVIDIA is starting at the high-end."

    Come on, at least acknowledge the fact that NVIDIA are actually releasing a video card based on a medium-sized GPU - the GTX 1080 - and marketing it as a flagship with a price to match, if not more.

    No need to comment on the fact; no need to criticize NVIDIA for seeking huge margins in the consumer sector ever since Kepler, delaying the real high-end GPU, or any such thing. Just let your readers know from the get-go that the GTX 1080 is based on a mid-sized GPU, not the GP100 flagship to come.

    A true time-honored fashion for NVIDIA would be releasing a new architecture with the biggest GPU and charging $500 from day one. Something that last happened with Fermi.
  • nevcairiel - Tuesday, May 17, 2016 - link

    It beats every other GPU on the market; if that's not high-end... High-end is a moving target, it's whatever is fastest at the time of writing.

    Certainly, it could be faster - it always can be. But GP100 just doesn't have the availability yet. They could wait longer and then launch your true "high-end" first, but instead we get new toys sooner, which is always a good thing.
  • yhselp - Wednesday, May 18, 2016 - link

    On the contrary, nowadays we get more performance late, and we pay double for it. We used to get the large-sized GPU first with a new architecture - just like the GTX 480. Ever since Kepler, however, we've had to wait a year after release. In the meantime, NVIDIA have been charging flagship money for the medium-sized GPU - like the GTX 460 - and releasing the vulgar, super-high-margin Titan somewhere in-between. Essentially, by the time the 780 Ti, the 980 Ti, the Titans, and even the very cut-down 780 came out, they were already outdated products as far as technology goes, but still carried a premium price tag. Why is that so hard to understand?

    As far as performance goes - of course a new architecture on a new node will be significantly faster; there's nothing amazing about that. That doesn't mean a video card based on a mid-sized GPU should be marketed as a flagship, as the best thing since sliced bread, and carry such a gruesome price premium - $700 for "irresponsible performance" - give me a break! - the only irresponsible thing is blind consumers eating this up. That's why we need competition.

    Keep making excuses for big companies, and see how they keep increasing pricing, delaying products, cutting features, and doing whatever the hell they want. Guess who gets screwed as a result of this - that would be you, and me, and every other consumer out there. So keep at it.
  • yhselp - Wednesday, May 18, 2016 - link

    Just to clarify a bit more: going into Kepler, NVIDIA were quite nervous about how consumers would react to all this. And although journalists, including Anandtech, noted that the GTX 680 was not a direct successor to the GTX 580, but rather the new GTX 560 Ti, and as such was essentially twice as expensive, it didn't seem to bother consumers - perhaps because, as you say, it's so new and fast. Whether it's really because consumers are misinformed, don't care, or a combination of both is irrelevant - it's now history. NVIDIA managed to get away with it, and it has been that way ever since. And now, with Pascal, they're looking to expand on it all and charge even more - up to $150 extra, as noted at the end of this article. They might be looking to establish a price premium for overclocking capabilities as well - a sort of Intel K-series, but on top of a product that is already very expensive.

    The Titan-class cards are just the other side of this story. After a successful GTX 680 launch, NVIDIA decided to try to do the same with the large-sized Kepler GPU. On top of delaying the flagship product - the GK110 - they decided to, again, charge essentially double. And thus the original Titan was born. They were so nervous about it that they decided to enable serious compute performance on it, so that if it failed in the consumer sector, it'd sell in the compute world. It exceeded their wildest dreams - apparently, people were not only willing to throw money at them, but didn't know any better either. And so we put the writing on the wall, and we've been reaping the "benefits" ever since. It looks like we'll do the same again.
  • lashek37 - Tuesday, May 17, 2016 - link

    I'm selling my 980 Ti and buying this beast. Anybody on board? 😂😉
  • Lolimaster - Tuesday, May 17, 2016 - link

    If you've got a Ti, there's no reason to "upgrade" to this card.

    Wait for Vega or the 1080 Ti.
  • Iamthebst87 - Tuesday, May 17, 2016 - link

    If you do, I'd wait till AIB cards become available. The reference 1080 OCs like crap compared to Maxwell. The reference 980 Ti got about a 20%-25% performance gain from overclocking; the 1080 gets about 10%-12%. If you have an AIB 980 Ti you might be getting even more from the OC. So to sum it up, an overclocked AIB 980 Ti is only slightly slower (15%-20%) than an overclocked 1080.
