Power, Temperature, and Noise

With a larger chip, more transistors, and more frames to render, the questions inevitably turn to the efficiency of the card: its overall power consumption, the thermal limits of the default coolers, and the noise of the fans under load. Users buying these cards will be expected to push a lot of pixels, which has knock-on effects inside a case. For our testing, we test inside a case to get the most representative real-world results for these metrics.

Power

All of our graphics cards idle around the 83-86W level, though it is noticeable that they fall into sets: the 2080 sits below the 1080, the 2080 Ti sits above the 1080 Ti, and the Vega 64 consumes the most.

Idle Power Consumption

When we crank up a real-world title, all of the RTX 20-series cards draw more power. The 2080 consumes 10W more than the previous generation flagship, the 1080 Ti, and the new 2080 Ti flagship adds another 50W of system power on top of that. Still not as much as the Vega 64, however.

Load Power Consumption - Battlefield 1

For a synthetic workload like FurMark, the RTX 2080 consumes less than the GTX 1080 Ti, although the GTX 1080 draws some 50W less still. The margin between the RTX 2080 FE and RTX 2080 Ti FE is some 40W, which is in line with the official TDP differences. At the top end, the RTX 2080 Ti FE and RX Vega 64 consume equal power, however the RTX 2080 Ti FE is pushing through more work.

Load Power Consumption - FurMark

For power, the overall picture is quite clear: the RTX 2080 Ti is a clear step up from the RTX 2080, while the RTX 2080 itself lands in the same range as the previous generation GTX 1080/1080 Ti.
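
One way to make the "more work at the same power" point concrete is a simple performance-per-watt figure; this is only a back-of-the-envelope sketch of the relationship (the symbols are generic, not values from our charts):

```latex
\mathrm{Perf/W} \;=\; \frac{\text{average frame rate (FPS)}}{\text{power draw (W)}}
```

At equal power draw, the card delivering the higher frame rate (or completing more work in the same time) is the more efficient of the two.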

Temperature

Straight off the bat, moving from the blower cooler to the dual-fan coolers, we see that the RTX 2080 holds its temperatures a lot better than the previous generation GTX 1080 and GTX 1080 Ti.

Idle GPU Temperature

Load GPU Temperature - Battlefield 1

Load GPU Temperature - FurMark

In each load scenario, the RTX 2080 is several degrees cooler than both the previous generation and the RTX 2080 Ti. The 2080 Ti fares well in FurMark, coming in at a lower temperature than the 10-series cards, but trades blows with them in Battlefield 1. This is a win for the dual-fan cooler over the blower.

Noise

As with the temperatures, using two larger fans rather than a single blower means that the new RTX cards can be quieter than the previous generation: the RTX 2080 wins here, coming in 3-5 dB(A) lower than the 10-series while performing similarly. The added power draw of the RTX 2080 Ti means it is still competing against the GTX 1080 on noise, but it consistently beats the GTX 1080 Ti.

Idle Noise Levels

Load Noise Levels - Battlefield 1

Load Noise Levels - FurMark
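
As a rough guide to what the 3-5 dB(A) gap noted above means, assuming the standard decibel relation for sound power (a back-of-the-envelope conversion, not a loudness measurement):

```latex
\frac{P_2}{P_1} \;=\; 10^{\Delta L / 10},
\qquad 10^{3/10} \approx 2.0,
\qquad 10^{5/10} \approx 3.2
```

In other words, a 3-5 dB(A) reduction corresponds to roughly one-half to one-third of the radiated sound power, even if the subjective difference is smaller.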

Comments

  • V900 - Thursday, September 20, 2018 - link

    That’s plain false.

    Tomb Raider is a title out now with RTX enabled in the game.

    Battlefield 5 is out in a month or two (though you can play it right now) and will also utilize RTX.

    Sorry to destroy your narrative with the fact that one of the biggest titles this year supports RTX.

    And that's of course just one of a handful of titles that will do so in the next few months.

    Developer support seems to be the last thing that RTX2080 owners need to worry about, considering that there are dozens of titles, many of them big AAA games, scheduled for release just in the first half of 2019.
  • Skiddywinks - Friday, September 21, 2018 - link

    Unless I'm mistaken, TR does not support RTX yet. Obviously, otherwise it would be showing up in reviews everywhere. There is a reason every single reviewer is only benchmarking traditional games; that's all there is right now.
  • Writer's Block - Monday, October 1, 2018 - link

    Exactly.
    "Is supporting" or "enabled", sure.
    However - neither actually has it available now to see, to experience.
  • eva02langley - Thursday, September 20, 2018 - link

    These cards are nothing more than a cheap magic trick show. Nvidia knew the performance gains were lackluster, and based their marketing on a gimmick to outmaneuver the competition, claiming that this will be the future of gaming and that you will be missing out without it.

    Literally, they basically tried to create a need... and if you are defending Nvidia over this, you have just been drinking the Kool-Aid at this point.

    Quote me on this: this will be the next GameWorks feature that devs will not bother touching. Why? Because devs develop games on consoles and port them to PC. The extra development time doesn't bring in any additional profit.
  • Skiddywinks - Friday, September 21, 2018 - link

    Here's the thing though, I don't think the performance is that lacklustre; the issue is we have this huge die and half of it does not do what most people want: give us more frames. If they had made the same size die with nothing but traditional CUDA cores, the 2080 Ti would be an absolute beast. And I'd imagine it would be a lot cheaper as well.

    But nVidia (maybe not mistakenly) have decided to push the raytracing path, and those of us who just want maximum performance for the price (me) and were waiting for the next 1080 Ti are basically left thinking "... oh well, skip".
  • eva02langley - Friday, September 21, 2018 - link

    Don't get me wrong, these cards are a normal generational performance jump; however, they are not the second coming that Nvidia is marketing them as.

    The problem here is that Nvidia wants to corner AMD, and the tactic they chose is RTX. However, RTX is nothing more than a FEATURE. The gamble could cost them a lot.

    If AMD's gaming and 7nm strategy pays off, devs will develop on AMD hardware and port to the PC architecture, leaving them no incentive to put in the extra work for a FEATURE.

    The extra cost of the bigger die should have gone toward gaming performance, but Nvidia's strategy is to disrupt the competition and further their standing as a monopoly as much as they can.

    PhysX didn't work, HairWorks didn't work, and this will not work. As cool as it is, this should have been a feature for pro cards only, not consumer ones.
  • mapesdhs - Thursday, September 27, 2018 - link

    That's the thing though, they aren't a "normal" upgrade performance jump, because the prices make no sense.
  • AnnoyedGrunt - Thursday, September 20, 2018 - link

    This reminds me quite a bit of the original GeForce 256 launch. Not sure how many of you were following Anandtech back then, but it was my go-to site then just as it is now. Here are links to some of the original reviews:

    GeForce256 SDR: https://www.anandtech.com/show/391
    GeForce256 DDR: https://www.anandtech.com/show/429

    Similar to the 20XX series, the GeForce256 was Nvidia's attempt to change the graphics card paradigm, adding hardware transformation and lighting to the graphics card (and relieving the CPU of those tasks). The card was faster than contemporary cards, but also much more expensive, making the value questionable for many.

    At the time I was a young mechanical engineer, and I remember feeling that Nvidia was brilliant for creating this card. It let me run Pro/E R18 on my $1000 home computer, about as fast as I could on my $20,000 HP workstation. That card basically destroyed the market of workstation-centric companies like SGI and Sun, as people could now run CAD packages on a windows PC.

    The 20XX series gives me a similar feeling, but with less obvious benefit to the user. The cards are as fast or faster than the previous generation, but are also much more expensive. The usefulness is likely there for developers and some professionals like industrial designers who would love to have an almost-real-time, high quality, rendered image. For gamers, the value seems to be a stretch.

    While I was extremely excited about the launch of the original GeForce256, I am a bit "meh" about the 20XX series. I am looking to build a new computer and replace my GTX 680/i5-3570K, but this release has not changed the value equation at all.

    If I look at Wolfenstein, then a strong argument could be made for the 2080 being more future proof, but pretty much all other games are a wash. The high price of the 20XX series means that the 1080 prices aren't dropping, and I doubt the 2070 will change things much since it looks like it would be competing with the vanilla 1080, but costing $100 more.

    Looks like I will wait a bit more to see how that price/performance ends up, but I don't see the ray-tracing capabilities bringing immediate value to the general public, so paying extra for it doesn't seem to make a lot of sense. Maybe driver updates will improve performance in today's games, making the 20XX series look better than it does now, but I think like many, I was hoping for a bit more than an actual reduction in the performance/price ratio.

    -AG
  • eddman - Thursday, September 20, 2018 - link

    How much was a 256 at launch? I couldn't find any concrete pricing info, but let's go with $500 to be safe. That's just $750 in today's dollars for something that is arguably the most revolutionary nvidia video card.
  • Ananke - Thursday, September 20, 2018 - link

    Yep, and it also didn't sell well among "gamers" - a novelty that only became popular after falling under $100 a pop years later. Same here: financial analysts say the expected revenue from gaming products will drop in the near future, and Wall Street has already dropped NVidia. The product is good, but expensive; it is not going to sell in volume, and their revenue will drop in the coming quarters.
    Apple's XS phone was the same, but Apple started a buy-one-get-one campaign the very next day, plus upfront discounts and solid iPhone buybacks. Yet it's not clear whether they will achieve the volume and revenue growth already priced into expectations.
    These are public companies - they make money from Wall Street, and NVidia can lose much more, and much faster, on the capital markets than what they would gain in profitability from lower-volume, high-end boutique products. This was a relatively sh**y launch - NVidia actually didn't want to launch anything; they want to sell off their glut of GTX inventory first, but the silicon was already ordered and made at TSMC, and they couldn't just sit on it waiting...
