Power, Temperature, and Noise

With a larger chip, more transistors, and more frames to push, questions inevitably turn to the efficiency of the card: how it fares on overall power consumption, against the thermal limits of the default coolers, and on the noise of its fans under load. Users buying these cards will be expected to push some pixels, which has knock-on effects inside a case, so for our testing we use a case to get the best real-world results in these metrics.

Power

All of our graphics cards hover around the 83-86W level when idle, though they noticeably fall into sets: the 2080 sits below the 1080, the 2080 Ti above the 1080 Ti, and the Vega 64 consumes the most.

Idle Power Consumption

When we crank up a real-world title, all of the RTX 20-series cards draw more power. The 2080 consumes 10W more than the previous generation flagship, the 1080 Ti, and the new 2080 Ti flagship adds another 50W of system power on top of that. Still not as much as the Vega 64, however.

Load Power Consumption - Battlefield 1

For a synthetic test like FurMark, the RTX 2080 consumes less than the GTX 1080 Ti, although the GTX 1080 is some 50W lower still. The margin between the RTX 2080 FE and RTX 2080 Ti FE is around 40W, which is indicative of the difference in official TDPs. At the top end, the RTX 2080 Ti FE and RX Vega 64 consume equal power, although the RTX 2080 Ti FE is pushing through more work.

Load Power Consumption - FurMark

On power, the overall picture is quite clear: the RTX 2080 Ti is a step up from the RTX 2080, while the RTX 2080 draws a similar amount to the previous generation GTX 1080/1080 Ti.
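
As a quick sanity check on these numbers, the at-wall margin between two cards should roughly track the gap in their official TDPs once the rest of the system is held constant. Below is a minimal sketch of that arithmetic in Python, using placeholder wall-power readings rather than our exact logged figures; the only hard numbers are NVIDIA's official Founders Edition TDPs of 225W and 260W.

    # Illustrative arithmetic only: the system power figures below are
    # placeholders standing in for at-wall measurements, not our data.
    tdp = {"RTX 2080 FE": 225, "RTX 2080 Ti FE": 260}  # official FE TDPs (W)

    # Hypothetical total-system power under FurMark load, in watts.
    system_load = {"RTX 2080 FE": 310, "RTX 2080 Ti FE": 350}

    margin = system_load["RTX 2080 Ti FE"] - system_load["RTX 2080 FE"]
    tdp_gap = tdp["RTX 2080 Ti FE"] - tdp["RTX 2080 FE"]

    print(f"System power margin: {margin} W")   # ~40 W, as measured above
    print(f"Official TDP gap:    {tdp_gap} W")  # 35 W

The two figures will never match exactly, as PSU efficiency losses and extra CPU load scale the at-wall delta slightly above the card-level difference.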

Temperature

Straight off the bat, moving from the blower cooler to the dual-fan coolers, we see that the RTX 2080 holds its temperatures a lot better than the previous generation GTX 1080 and GTX 1080 Ti.

Idle GPU Temperature

Load GPU Temperature - Battlefield 1

Load GPU Temperature - FurMark

In each load scenario, the RTX 2080 runs several degrees cooler than both the previous generation and the RTX 2080 Ti. The 2080 Ti fares well in FurMark, coming in at a lower temperature than the 10-series, but trades blows with it in Battlefield 1. This is a win for the dual-fan cooler over the blower.
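
For readers who want to replicate these temperature trends at home, the GPU's own sensors can be polled while a game or FurMark runs. Below is a minimal sketch, assuming an NVIDIA card and the stock nvidia-smi utility on the PATH; the one-second sampling loop and the choice of fields are ours for illustration, not part of our review methodology.

    # Poll GPU 0's temperature, board power, and fan speed once per second.
    import subprocess
    import time

    def read_gpu_stats():
        """Return (temperature_C, power_W, fan_percent) for GPU 0."""
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=temperature.gpu,power.draw,fan.speed",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        # nvidia-smi prints one line per GPU; take the first card only.
        temp, power, fan = out.strip().splitlines()[0].split(", ")
        return float(temp), float(power), float(fan)

    for _ in range(60):  # sample for one minute while the load runs
        temp, power, fan = read_gpu_stats()
        print(f"{temp:.0f} C, {power:.1f} W, fan {fan:.0f}%")
        time.sleep(1)

Note that power.draw here is the card's own reading, not at-wall system power, so it will sit well below the figures in our power charts.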

Noise

Similar to the temperatures, having two larger fans rather than a single blower means that the new RTX cards can be quieter than the previous generation. The RTX 2080 wins here, coming in 3-5 dB(A) lower than the 10-series while performing similarly. The added power needed for the RTX 2080 Ti means that it still competes with the GTX 1080 on noise, but it consistently beats the GTX 1080 Ti.

Idle Noise Levels

Load Noise Levels - Battlefield 1

Load Noise Levels - FurMark
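
To put those decibel figures in perspective, the scale is logarithmic: acoustic power scales as 10^(dB/10), so the 3-5 dB(A) advantage noted above corresponds to roughly one half to one third of the sound power. A short worked sketch of the conversion; the formula is standard acoustics, not review data.

    # Convert a dB(A) delta into the implied acoustic power ratio.
    def power_ratio(delta_db: float) -> float:
        """Acoustic power ratio implied by a decibel difference."""
        return 10 ** (delta_db / 10)

    for delta in (3, 5):
        # -3 dB(A) -> 0.50x the sound power; -5 dB(A) -> 0.32x
        print(f"-{delta} dB(A) -> {1 / power_ratio(delta):.2f}x the sound power")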

Comments

  • Holliday75 - Friday, September 21, 2018 - link

    Good thing there are cops around to keep me honest. If they weren't I'd go on a murder spree and blame them for it.
  • Yojimbo - Wednesday, September 19, 2018 - link

    It's NVIDIA making a conscious decision to spend its engineering resources on innovating and implementing new technologies that will shift the future of gaming, instead of spending that energy and die space on increasing performance as much as it can in today's titles. If NVIDIA had left out the RT cores and other new technologies, they could have easily increased performance 50 or 60% in legacy workloads by building chips bigger than Pascal but smaller than Turing, while increasing prices only moderately. Then everyone would be happy getting a card that would be leading them into a gaming torpor. In a few years, when everyone is capable of running at 4K and over 60 fps, they'd get bored and wonder why the industry was going nowhere.
  • NikosD - Wednesday, September 19, 2018 - link

    nVidia has done the same thing in the past, introducing new technologies and platforms like tessellation, PhysX, HairWorks, GameWorks, GPP, etc.
    All of these proved to be just tricks to kill the competition, as always, which nowadays means killing AMD.
    Pseudo-raytracing is not an innovation or something mandatory for gaming.
    It's just another premature technology that the opponent doesn't have, so that nVidia can be unique again, at huge cost to the consumer and with a performance regression.

    I repeat.

    Skip that Turing fraud.
  • maximumGPU - Thursday, September 20, 2018 - link

    I don't think it's fair to compare ray tracing to HairWorks...
    Ray tracing is a superior way to render graphics compared to rasterisation; there's no question about this.
  • Lolimaster - Saturday, September 22, 2018 - link

    But with what? Nvidia RTX only does it on a small part of a frame, in selected scenes, on tensor cores repurposed for that.

    You would need tensor cores in the hundreds to make Nvidia's implementation more "wowish", and in the thousands to actually talk about raytracing being a thing.

    Consoles dictate gaming progress, and AMD holds that.
  • Lolimaster - Saturday, September 22, 2018 - link

    Exactly. To start talking about actual raytracing, or at least ray tracing most parts of a scene, we need 10-100x the current GPU performance.
  • Yojimbo - Saturday, September 22, 2018 - link

    GPP was a partner promotion program. HairWorks is part of GameWorks. PhysX is part of GameWorks. GameWorks is not a trick, and neither is the PhysX part of it. But neither of them compares to ray tracing. Maybe you should look up what the word "pseudo" means, because you're using it wrong.

    In a year or a year and a half, AMD will have their own ray tracing acceleration hardware, and then you'll be all in on it.

    As for killing AMD, NVIDIA are not interested in it. It wouldn't be good for them, anyway. NVIDIA are, however, interested in building their platform and market dominance.
  • Eris_Floralia - Thursday, September 20, 2018 - link

    I've read all your comments and still struggle to find any consistent logic.
  • eva02langley - Thursday, September 20, 2018 - link

    Nvidia is shoving Ray Tracing development down gamers' throats. We are paying for something that we didn't even want in the first place.

    You didn't even know about Ray Tracing and DLSS before they were announced. You are just drinking the Kool-Aid, unlike many of us who stand up and rage against these INDECENT prices.
