Power, Temperature, & Noise

Moving on from performance metrics, we’ll touch upon power, temperature, and noise. This is also normally where we’d discuss voltages, but as Vega is a new chip on a new architecture, none of our usual tools can read voltages from Vega 64 and 56 correctly.

In terms of average game clockspeeds, neither card fully sustains its rated boost clock under prolonged load. Vega 64 tends to stay closer to its boost clock, which is in line with its larger power budget and higher temperature target relative to Vega 56.

Radeon RX Vega Average Clockspeeds
                          Radeon RX Vega 64 (Air)   Radeon RX Vega 56
Boost Clocks              1546MHz                   1471MHz
Max Boost (DPM7)          1630MHz                   1590MHz
Battlefield 1             1512MHz                   1337MHz
Ashes: Escalation         1542MHz                   1354MHz
DOOM                      1479MHz                   1334MHz
Ghost Recon: Wildlands    1547MHz                   1388MHz
Dawn of War III           1526MHz                   1335MHz
Deus Ex: Mankind Divided  1498MHz                   1348MHz
GTA V                     1557MHz                   1404MHz
F1 2016                   1526MHz                   1394MHz
FurMark                   1230MHz (HBM2: 868MHz)    1099MHz (HBM2: 773MHz)

With games, the HBM2 clocks ramp up and stay at their highest clock state. As expected, the strain of FurMark causes both cards to oscillate their memory clocks: between 945MHz and 800MHz for Vega 64, and between 800MHz and 700MHz for Vega 56. On that note, HBM2 comes with an idle power state (167MHz), an improvement over Fiji's HBM1, which had only a single power state. Unfortunately, the direct power savings are somewhat obscured since, as we will soon see, Vega 10 is a particularly power-hungry chip.
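
For readers who want to observe these clock-state transitions on their own cards, the sketch below polls the DPM state tables that the Linux amdgpu driver exposes through sysfs; the active core (sclk) and memory (mclk) entries are flagged with an asterisk. The paths and the card0 index are assumptions for a single-GPU setup rather than anything specific to our test rig.

    #!/usr/bin/env python3
    # Minimal sketch: watch the active core (sclk) and HBM2 (mclk) DPM states on a
    # Linux system running the amdgpu driver. The sysfs paths and the card0 index
    # are assumptions for a single-GPU machine; adjust them to match your setup.
    import time

    SCLK_PATH = "/sys/class/drm/card0/device/pp_dpm_sclk"
    MCLK_PATH = "/sys/class/drm/card0/device/pp_dpm_mclk"

    def active_state(path):
        """Return the DPM table entry marked with a trailing '*' (the active state)."""
        with open(path) as f:
            for line in f:
                if line.strip().endswith("*"):
                    return line.strip().rstrip("*").strip()
        return "unknown"

    def main(interval=1.0):
        try:
            while True:
                # e.g. "7: 1630Mhz" for core, "3: 945Mhz" for memory
                print(f"core: {active_state(SCLK_PATH):<14} memory: {active_state(MCLK_PATH)}")
                time.sleep(interval)
        except KeyboardInterrupt:
            pass

    if __name__ == "__main__":
        main()

Averaging the core-clock samples collected over a full benchmark run yields a sustained-clock figure along the lines of the averages in the table above.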

As mentioned earlier, we used the default out-of-the-box configuration for power: Balanced, with the corresponding 220W GPU power limit. And under load, Vega needs power badly.
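
For readers curious where that 220W limit is actually exposed, the sketch below is one way to inspect it on a Linux system with the amdgpu driver; the hwmon files and the single-GPU card0 path are our assumptions rather than part of this review's test procedure, and on Windows the same limit is adjusted through WattMan.

    #!/usr/bin/env python3
    # Minimal sketch: read the configured GPU power limit and recent average draw
    # through the amdgpu hwmon interface on Linux. Values are in microwatts; the
    # hwmon index varies per boot, so we glob for it. card0 is again an assumption.
    import glob

    def watts(path):
        with open(path) as f:
            return int(f.read().strip()) / 1_000_000  # microwatts -> watts

    for hwmon in glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*"):
        cap = watts(f"{hwmon}/power1_cap")          # configured power limit
        cap_max = watts(f"{hwmon}/power1_cap_max")  # ceiling the driver will accept
        avg = watts(f"{hwmon}/power1_average")      # recent average board draw
        print(f"limit: {cap:.0f} W (max {cap_max:.0f} W), current draw: {avg:.0f} W")

    # Raising the limit is a matter of writing a new microwatt value to power1_cap
    # as root, e.g. 242000000 for +10% over a 220W default. Windows users would do
    # the equivalent through WattMan's power limit slider.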

[Graphs: Idle Power Consumption; Load Power Consumption - Battlefield 1; Load Power Consumption - FurMark]

The performance of both Vega cards comes at a significant power cost. For the RX 500 series, we mused that load consumption is where AMD paid the piper. Here, the piper has taken AMD to the cleaners. In Battlefield 1, Vega 64 draws 150W more system-wide power than the GTX 1080, its direct competitor. To be clear, some additional power draw is expected, since Vega 64 is larger than the GTX 1080 in both shader count (4096 vs. 2560) and die size (486mm² vs. 314mm²). But even measured against the GTX 1080 Ti and its 471mm² GP102, Vega 64 still consumes more power.

As for Vega 64's cut-down sibling, Vega 56's lower temperature target, lower clocks, and lower board power make its consumption look much more reasonable, although it still sits well above the GTX 1070.

In any case, the cooling solutions are able to do the job without severe consequences for temperature or noise. As far as blowers go, the RX Vega 64 and 56 coolers are comparable to the GTX 1080 Ti Founders Edition blower.

[Graphs: Idle GPU Temperature; Load GPU Temperature - Battlefield 1; Load GPU Temperature - FurMark]
Not Graphed: Temperature of the actual Vega (the star): 9,329°C

Our noise-testing equipment and methodology differ from past results: a more sensitive noise meter positioned closer to the graphics card, with readings taken in an open case. As such, the noise levels may appear higher than expected.

[Graphs: Idle Noise Levels; Load Noise Levels - Battlefield 1; Load Noise Levels - FurMark]

 

Comments (213)

  • BOBOSTRUMF - Monday, August 14, 2017 - link

    Well, I was expecting lower performance compared to a GeForce 1080, so this is one of the few pluses. Now NVIDIA only has to bump the base clocks of the GeForce 1080 while still consuming less power. Competition is great, but this is not the best product from AMD; on 14nm the gains should be much higher. Fortunately, AMD is now doing great on CPUs, and that will hopefully bring in income that can be invested in GPU research.
    Good luck, AMD
  • mapesdhs - Monday, August 14, 2017 - link

    NV doesn't have to do anything as long as retail pricing has the 1080 so much cheaper. I look forward to seeing how the 56 fares.
  • webdoctors - Tuesday, August 15, 2017 - link

    It looks like the 1080 MSRP is actually less! Other sites mention that the initial price included a $100 rebate, which has expired :( and the new MSRP has taken effect....

    https://pcgamesn.com/amd/amd-rx-vega-rebates
  • mdriftmeyer - Monday, August 14, 2017 - link

    Remember your last paragraph after game engines adopt AMD's architecture and features, which developers have committed themselves to doing and which is already partially in development. When that happens, I look forward to you asking what the hell went wrong at Nvidia.
  • Yojimbo - Monday, August 14, 2017 - link

    The whole "game engines will adopt AMD's architecture" thesis was made when the Xbox One and PS4 were released in 2013. Since then, AMD's market share among PC gamers has declined considerably and NVIDIA seems to be doing just fine in terms of features and performance in relevant game engines. The XBox One and PS4 architectures account for a significant percentage of total software sales. Vega architecture will account for a minuscule percentage. So why would the thesis hold true for Vega when it didn't hold true for Sea Islands?

    Besides, NVIDIA has had packed FP16 capability since 2015 with the Tegra X1. They also have it in their big GP100 and GV100 GPUs. They can relatively easily implement it in consumer GeForce GPUs whenever they feel it is appropriate. And within 3 months of doing so they will have more FP16-enabled gaming GPUs in the market than Vega will represent over its entire lifespan.
  • Yojimbo - Monday, August 14, 2017 - link

    That means the Nintendo Switch is FP16 capable, by the way.
  • mapesdhs - Monday, August 14, 2017 - link

    Good points, and an extra gazillion for reminding me of an awesome movie. 8)
  • stockolicious - Tuesday, August 15, 2017 - link

    "the Xbox One and PS4 were released in 2013. Since then, AMD's market share among PC gamers has declined considerably "

    The problem AMD had was that they could not play to their advantage, which was having both a CPU and a GPU. The CPU was so awful that nobody (or very few) used them to game. Now that Ryzen is here and successful, they will gain GPU share even though their top cards don't beat Nvidia. This is called "attach rate": when a person buys a computer with an AMD CPU, they get an AMD GPU 55% of the time, vs. 25% of the time with an Intel CPU. AMD had the same issue with their APUs - the CPU side was so bad that nobody cared to build designs around them - but now with Raven Ridge (Ryzen/Vega) coming, they will do very well there as well.
  • Yojimbo - Tuesday, August 15, 2017 - link

    I wouldn't expect Bulldozer (or whatever their latest pre-Zen architecture was called) attach rates to hold true for Ryzen. A significant percentage of Bulldozer sales probably came from AMD fans. If Ryzen is a lot more successful (and by all accounts it looks like it will be), then only a small percentage of Ryzen sales will be to die-hard AMD fans. Most will be to people looking to get the best value. Then you can expect attach rates for AMD GPUs with Ryzen CPUs to be significantly lower than with Bulldozer.
  • nwarawa - Monday, August 14, 2017 - link

    *yawn* Wake me up when the prices return to normal levels. I've had my eye on a few nice'n'cheap FreeSync monitors for a while now, but missed my chance at an affordable RX 470/570.

    Make a 3GB Vega 48 card (still enough RAM for 1080p for me, but it should shoo off the miners) for around $250, and I'll probably bite. And get that power consumption under control while you're at it. I'll undervolt it either way.
