Power, Temperature, and Noise

As always, we'll take a look at the power, temperature, and noise of the RTX 2060 Founders Edition, though we've seen most of the highlights and trends before with the RTX 2080 Ti, RTX 2080, and RTX 2070 Founders Edition launches. For the most part, the dual axial fan open air design provides straightforward benefits in lower noise and better cooling, which counterbalances the atypically large GPUs and new fixed-function hardware.

As this is a new GPU, we will quickly review the GeForce RTX 2060's stock voltages and clockspeeds as well.

NVIDIA GeForce Video Card Voltages
Model                                      Boost     Idle
GeForce RTX 2060 (6GB) Founders Edition    1.050V    0.725V
GeForce RTX 2070 Founders Edition          1.050V    0.718V
GeForce GTX 1060 6GB Founders Edition      1.062V    0.625V

The voltages are broadly comparable to those of the preceding 16nm GTX 1060. Compared to pre-FinFET generations, these voltages are considerably lower because of the FinFET process used, something we went over in detail in our GTX 1080 and 1070 Founders Edition review. As we noted then, the 16nm FinFET process requires these low voltages as opposed to previous planar nodes, so this can be limiting in scenarios where a lot of power and voltage are needed, i.e. high clockspeeds and overclocking. Of course, Turing (along with Volta, Xavier, and NVSwitch) is built on 12nm "FFN" rather than 16nm, but there is little detail on the exact process tweaks.

Power Consumption

The TDP increase to 160W puts the RTX 2060 (6GB) in between the 180W GTX 1080/1070 Ti and the 150W GTX 1070. In turn, load power consumption is more or less at that level, and nothing dissimilar to what we've seen before. This also means that efficiency relative to performance is around the same, as opposed to the situation with the RTX 2070, 2080, and 2080 Ti.

(Charts: Idle Power Consumption; Load Power Consumption - Battlefield 1; Load Power Consumption - FurMark)

Temperature & Noise

With an open air cooler design with dual axial fans, the results are in line with what we've seen with the other RTX Founders Editions.

(Charts: Idle GPU Temperature; Load GPU Temperature - Battlefield 1; Load GPU Temperature - FurMark)

(Charts: Idle Noise Levels; Load Noise Levels - Battlefield 1; Load Noise Levels - FurMark)

Comments

  • sing_electric - Monday, January 7, 2019 - link

    It's likely that Nvidia has actually done something to restrict the 2060s to 6GB - either through its agreements with board makers or by physically disabling some of the RAM channels on the chip (or both). I agree, it'd be interesting to see how it performs, since I'd suspect it'd be at a decent price/perf point compared to the 2070, but that's also exactly why we're not likely to see it happen.
  • CiccioB - Monday, January 7, 2019 - link

    You can't add memory at will. You need to take the available bus into consideration, and as this is a 192-bit bus, you can install 3, 6, or 12 GB of memory unless you resort to a hybrid configuration through heavily optimized drivers (as NVIDIA did with the 970).
  • nevcairiel - Monday, January 7, 2019 - link

    Even if they wanted to increase it, just adding 2GB more is hard to impossible. The chip has a certain memory interface, in this case 192-bit. That's six 32-bit memory controllers, for six 1GB chips. You cannot just add two more without getting into trouble - like the 970, which had unbalanced memory speeds, which was terrible.
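The bus-width arithmetic in the comment above can be sketched in a few lines of Python. This is purely an illustration (the function name and the list of per-chip densities are assumptions, not anything from the article): with one memory chip per 32-bit controller and the same chip density on every controller, the balanced capacities for a given bus fall out directly.

```python
# Hypothetical sketch: balanced VRAM configurations for a given memory bus,
# assuming one chip per 32-bit controller and equal density on every channel.

def valid_vram_sizes(bus_width_bits, chip_sizes_gb=(0.5, 1, 2)):
    """Return the balanced VRAM capacities (in GB) for a bus width."""
    controllers = bus_width_bits // 32  # one 32-bit channel per chip
    return [controllers * size for size in chip_sizes_gb]

# A 192-bit bus means six controllers, hence 3, 6, or 12 GB:
print(valid_vram_sizes(192))  # [3.0, 6, 12]
```

Anything in between (e.g. 8GB on 192-bit) would require mismatched chip densities or a partially-populated segment, which is exactly the unbalanced-speed situation the GTX 970 ran into.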
  • mkaibear - Tuesday, January 8, 2019 - link

    "terrible" in this case defined as "unnoticeable to anyone not obsessed with benchmark scores"
  • Retycint - Tuesday, January 8, 2019 - link

    It was unnoticeable back then, because even the most intensive games/benchmarks rarely utilized more than 3.5GB of VRAM. The issue, however, comes when newer games inevitably start to consume more and more VRAM - at which point the "terrible" 0.5GB of VRAM will become painfully apparent.
  • mkaibear - Wednesday, January 9, 2019 - link

    So, you agree with my original comment which was that it was not terrible at the time? Four years from launch and it's not yet "painfully apparent"?

    That's not a bad lifespan for a graphics card. Or if you disagree can you tell me which games, now, have noticeable performance issues from using a 970?

    FWIW my 970 has been great at 1440p for me for the last 4 years. No performance issues at all.
  • atragorn - Monday, January 7, 2019 - link

    I am more interested in that comment: "yesterday's announcement of game bundles for RTX cards, as well as 'G-Sync Compatibility', where NVIDIA cards will support VESA Adaptive Sync. That driver is due on the same day of the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD's FreeSync ecosystem advantage." Will ALL nvidia cards support Freesync/Freesync2 or only the RTX series?
  • A5 - Monday, January 7, 2019 - link

    Important to remember that VESA ASync and FreeSync aren't exactly the same.

    I don't *think* it will be instant compatibility with the whole FreeSync range, but it would be nice. The G-sync hardware is too expensive for its marginal benefits - this capitulation has been a loooooong time coming.
  • Devo2007 - Monday, January 7, 2019 - link

    Anandtech's article about this last night mentioned support will be limited to Pascal & Turing cards
  • Ryan Smith - Monday, January 7, 2019 - link

    https://www.anandtech.com/show/13797/nvidia-to-sup...
