Power, Temperature, and Noise

With a larger chip, more transistors, and more frames being pushed, the questions inevitably turn to efficiency: overall power consumption, the thermal limits of the stock coolers, and how loud the fans get under load. Users buying these cards will be expected to push plenty of pixels, which has knock-on effects inside a case. For our testing, we use a case to get the most representative real-world results for these metrics.

Power

All of our graphics cards hover around the 83-86W level when idle, though they noticeably fall into groups: the 2080 sits below the 1080, the 2080 Ti sits above the 1080 Ti, and the Vega 64 consumes the most.

Idle Power Consumption

When we crank up a real-world title, all of the RTX 20-series cards draw more power. The RTX 2080 consumes around 10W more than the previous generation flagship, the GTX 1080 Ti, and the new RTX 2080 Ti flagship adds another ~50W of system power on top of that. It is still not as much as the Vega 64, however.

Load Power Consumption - Battlefield 1

In a synthetic test like FurMark, the RTX 2080 consumes less than the GTX 1080 Ti, although the GTX 1080 draws some 50W less again. The margin between the RTX 2080 FE and the RTX 2080 Ti FE is around 40W, which is in line with the official TDP difference. At the top end, the RTX 2080 Ti FE and the RX Vega 64 consume roughly equal power, though the RTX 2080 Ti FE is pushing through more work.

Load Power Consumption - FurMark
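As a rough sanity check on that ~40W gap, the short sketch below compares it against the official board power ratings. The figures used (approximately 225W for the RTX 2080 FE, 260W for the RTX 2080 Ti FE, and a 90% PSU efficiency) are assumed round numbers for illustration, not measurements from our testing.

```python
# Rough check: does the ~40W wall-power gap between the RTX 2080 FE and
# RTX 2080 Ti FE line up with their official board power ratings?
# Both ratings and the PSU efficiency below are assumed round figures
# for illustration, not measurements from this review.

TDP_2080_FE = 225       # W, assumed board power rating for the RTX 2080 FE
TDP_2080_TI_FE = 260    # W, assumed board power rating for the RTX 2080 Ti FE
PSU_EFFICIENCY = 0.90   # assumed PSU efficiency at this load level

board_delta = TDP_2080_TI_FE - TDP_2080_FE   # 35 W difference at the board
wall_delta = board_delta / PSU_EFFICIENCY    # what that difference looks like at the wall

print(f"Board power delta: {board_delta} W")       # 35 W
print(f"Expected wall delta: {wall_delta:.0f} W")  # ~39 W, close to the ~40 W we measure
```

Measured at the wall, the board-level difference gets inflated slightly by PSU losses, which is why the observed gap lands a touch above the rated 35W.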

For power, the overall picture is quite clear: the RTX 2080 Ti is a step above the RTX 2080, while the RTX 2080 itself lands in similar territory to the previous generation GTX 1080/1080 Ti.

Temperature

Straight off the bat, moving from a blower cooler to the dual-fan coolers, we see that the RTX 2080 holds its temperatures a lot better than the previous generation GTX 1080 and GTX 1080 Ti.

Idle GPU Temperature

Load GPU Temperature - Battlefield 1

Load GPU Temperature - FurMark

In every load scenario, the RTX 2080 is several degrees cooler than both the previous generation and the RTX 2080 Ti. The RTX 2080 Ti fares well in FurMark, coming in at a lower temperature than the 10-series cards, but trades blows with them in Battlefield 1. Overall this is a win for the dual-fan cooler over the blower.

Noise

As with temperatures, the noise profile of two larger fans rather than a single blower means the new RTX cards can be quieter than the previous generation: the RTX 2080 wins here, coming in 3-5 dB(A) lower than the 10-series while performing similarly. The added power draw of the RTX 2080 Ti means it is still trading places with the GTX 1080, but it consistently beats the GTX 1080 Ti.

Idle Noise Levels

Load Noise Levels - Battlefield 1

Load Noise Levels - FurMark
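For context on what a 3-5 dB(A) gap means, the sketch below converts a dB difference into a ratio of radiated sound power. Decibels are logarithmic, so this is generic acoustics arithmetic for perspective rather than anything measured here.

```python
# Convert a dB(A) difference into a ratio of radiated sound power.
# Decibels are logarithmic: ratio = 10^(delta_dB / 10).
# Generic acoustics arithmetic for context, not data from this review.

def sound_power_ratio(delta_db: float) -> float:
    """Sound power ratio implied by a dB difference."""
    return 10 ** (delta_db / 10)

for delta in (3, 5):
    quieter = 1 / sound_power_ratio(delta)
    print(f"{delta} dB(A) lower ≈ {quieter:.2f}x the radiated sound power")
# 3 dB(A) lower ≈ 0.50x; 5 dB(A) lower ≈ 0.32x
```

As a rule of thumb, a drop of around 10 dB(A) is usually needed for a perceived halving of loudness, so a 3-5 dB(A) advantage is noticeable but not night-and-day.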

Comments

  • Fritzkier - Wednesday, September 19, 2018 - link

    Blame both. Why the f would you blame AMD for NVIDIA's own fault?
    And yes, AMD had competitive offerings in the mid-range, not the high end. But that's before 7mm. Let's see what we get with 7mm. 7mm will be released next year anyway, it's not that far off.
  • PopinFRESH007 - Wednesday, September 19, 2018 - link

    Yep, let's wait for those 7mm processes. Those chips should only be the size of my computer with a couple hundred thousand transistors.
  • Holliday75 - Friday, September 21, 2018 - link

    Haha I was about to question your statement until I paid more attention to the process size he mentioned.
  • Fritzkier - Saturday, September 22, 2018 - link

    We seriously need an edit button. Thanks, autocorrect.
  • Yojimbo - Wednesday, September 19, 2018 - link

    So you are saying that if AMD were competitive then NVIDIA could never have implemented such major innovations in games technology... So, competition is bad?
  • dagnamit - Thursday, September 20, 2018 - link

    Competition can stifle innovation when the market is caught up in a race to see how efficiently it can leverage current technology. The consumer GPU market has been about the core count/core efficiency race for a very long time.

    Because Nvidia has a commanding lead in that department, they are able to add in other technology without falling behind AMD. In fact, they’ve been given the opportunity to start an entirely new market with ray-tracing tech.

    There are a great many more companies developing ray-tracing hardware than rasterization focused hardware at the current moment. With Nvidia throwing their hat in now, it could mean other companies start to bring hardware solutions to the fore that don’t have a Radeon badge. It won’t be Red v. Green anymore, and that’s very exciting.
  • Spunjji - Friday, September 21, 2018 - link

    Your Brave New World would involve someone else magically catching up with AMD and Nvidia's lead in conventional rasterization tech. Spoiler alert: nobody has in the past 2 decades, and the best potential competition, Intel, isn't entering the fray until ~2020.
  • dagnamit - Sunday, September 23, 2018 - link

    No. I’m saying that companies that specialize in ray-tracing technology may have an opportunity to get into the consumer discrete GPU market. They don’t need to catch up with anything.
  • eva02langley - Thursday, September 20, 2018 - link

    Not AMD's fault if Nvidia is asking $1200 US. Stop blaming AMD because you want to purchase Nvidia cards at a better price, BLAME Nvidia!

    It is not AMD who forced ray tracing on us. It is not AMD who pushes GameWorks tools that sabotage the competition and gamers at the same time. It is not AMD charging us the G-Sync tax. It is not AMD that screws gamers over for the wallets of investors.

    It is all Nvidia's fault! Stop defending them! There are no excuses.
  • BurntMyBacon - Thursday, September 20, 2018 - link

    I accept that nVidia's choices are their own and not the "fault" of any third party. On the other hand, nVidia is a business and their primary objective is to make money. Manufacturing GPUs with features and performance that customers find valuable is a tool to meet their objective. So while their decisions are their own responsibility, they are not unexpected. Competition from a third party with the same money making objective limits their ability to make money as they now have to provide at least the perception of more value to the customer. Previous generation hardware also limits their ability to make money as the relative increase in features and performance (and consequently value) are less than if the previous generation didn't exist. If the value isn't perceived to be high enough, customers won't upgrade from existing offerings. However, if nVidia simply stops offering previous generation hardware, new builds may still be a significant source of sales for those without an existing viable product.

    Long story short, since there is no viable competition from AMD or another third party to limit nVidia's prices, it falls to us as consumers to keep the prices in check through waiting or buying previous gen hardware. If, however, consumers in general decide these cards are worth the cost, then those who are discontent simply need to accept that they fit into a lower price category of the market than they previously did. It is unlikely that nVidia will bring prices back down without reason.

    Note: I tend to believe that nVidia got a good idea of how much more the market was willing to pay for their product during the mining push. Though I don't like it (and won't pay for it), I can't really blame them for wanting the extra profits in their own coffers rather than letting it go to retailers.
