Power, Temperature, & Noise

Moving on from performance metrics, we’ll touch upon power, temperature, and noise. This is also normally where we’d discuss voltages, but as Vega is a new chip on a new architecture, none of our tools are able to read voltages on Vega 64 and 56 correctly.

In terms of average game clockspeeds, neither card sustains its rated boost clock 100% of the time under prolonged load. Vega 64 tends to stay closer to its boost clock, which is in line with its greater power headroom and higher temperature target relative to Vega 56.

Radeon RX Vega Average Clockspeeds
                            Radeon RX Vega 64 (Air)    Radeon RX Vega 56
Boost Clocks                1546MHz                    1471MHz
Max Boost (DPM7)            1630MHz                    1590MHz
Battlefield 1               1512MHz                    1337MHz
Ashes: Escalation           1542MHz                    1354MHz
DOOM                        1479MHz                    1334MHz
Ghost Recon: Wildlands      1547MHz                    1388MHz
Dawn of War III             1526MHz                    1335MHz
Deus Ex: Mankind Divided    1498MHz                    1348MHz
GTA V                       1557MHz                    1404MHz
F1 2016                     1526MHz                    1394MHz
FurMark                     1230MHz (HBM2: 868MHz)     1099MHz (HBM2: 773MHz)

With games, the HBM2 clocks ramp up and stay at their highest clock state. As expected, the strain of FurMark causes the cards to oscillate their memory clocks: between 945MHz and 800MHz for Vega 64, and between 800MHz and 700MHz for Vega 56. On that note, HBM2 comes with an idle power state (167MHz), an improvement over the single power state of Fiji's HBM1. Unfortunately, the direct power savings are somewhat obscured because, as we will soon see, Vega 10 is a particularly power-hungry chip.
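
As a quick sanity check on the boost behavior, here is a minimal Python sketch (not part of our test harness) that recomputes the gaming averages from the table above as a share of each card's rated boost clock:

    # Average in-game clocks from the table above, as a share of rated boost.
    # Illustrative sketch only; FurMark is excluded since it is a worst case.
    boost = {"Vega 64": 1546, "Vega 56": 1471}  # rated boost clocks, MHz

    games = {                       # (Vega 64, Vega 56) average clocks, MHz
        "Battlefield 1":            (1512, 1337),
        "Ashes: Escalation":        (1542, 1354),
        "DOOM":                     (1479, 1334),
        "Ghost Recon: Wildlands":   (1547, 1388),
        "Dawn of War III":          (1526, 1335),
        "Deus Ex: Mankind Divided": (1498, 1348),
        "GTA V":                    (1557, 1404),
        "F1 2016":                  (1526, 1394),
    }

    for idx, card in enumerate(boost):
        avg = sum(clocks[idx] for clocks in games.values()) / len(games)
        print(f"{card}: {avg:.0f}MHz average, "
              f"{100 * avg / boost[card]:.1f}% of rated boost")

Across these eight games, Vega 64 averages roughly 98-99% of its rated boost while Vega 56 sits closer to 92-93%, consistent with the gap in power and temperature limits noted above.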

As mentioned earlier, we used the default out-of-the-box configuration for power: Balanced, with the corresponding 220W GPU power limit. And under load, Vega needs power badly.

[Charts: Idle Power Consumption; Load Power Consumption - Battlefield 1; Load Power Consumption - FurMark]

The performance of both Vega cards comes at a significant power cost. For the RX 500 series, we mused that load consumption is where AMD paid the piper. Here, the piper has taken AMD to the cleaners. In Battlefield 1, the Vega 64 system consumes 150W more power than the GTX 1080 system, its direct competitor. To be clear, some additional power draw is expected, since Vega 64 is larger than the GTX 1080 in both shader count (4096 vs. 2560) and die size (486mm2 vs. 314mm2). But even by that measure, Vega 64 still consumes more power than the GTX 1080 Ti and its 471mm2 GP102.
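
To put that "bigger chip" argument in rough numbers, here is a small sketch using only the figures cited above (keep in mind the 150W delta is measured at the wall, not at the GPU):

    # Relative size of Vega 10 vs. GP104, using the figures cited above.
    vega64  = {"shaders": 4096, "die_mm2": 486}
    gtx1080 = {"shaders": 2560, "die_mm2": 314}

    shader_ratio = vega64["shaders"] / gtx1080["shaders"]    # 1.60x
    area_ratio   = vega64["die_mm2"] / gtx1080["die_mm2"]    # ~1.55x
    print(f"Shaders: {shader_ratio:.2f}x, die area: {area_ratio:.2f}x")

In other words, Vega 64 is roughly 1.5-1.6x the GPU on paper, yet it also out-draws the GTX 1080 Ti and its similarly sized GP102, which is where the efficiency gap shows.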

As for Vega 64's cut-down sibling, Vega 56's lower temperature target, lower clocks, and lower board power make its consumption look much more reasonable, although it is still well above the 1070.

In any case, the cooling solutions are able to do the job without severe effects on temperature and noise. As far as blowers go, RX Vega 64 and 56 are comparable to the 1080 Ti FE blower.

[Charts: Idle GPU Temperature; Load GPU Temperature - Battlefield 1; Load GPU Temperature - FurMark]
Not graphed: the temperature of the actual Vega (the star): 9,329°C

Our noise-testing equipment and methodology differ from past results: we now use a more sensitive noise meter placed closer to the graphics card, and readings are taken with an open case. As a result, the noise levels may appear higher than expected.

[Charts: Idle Noise Levels; Load Noise Levels - Battlefield 1; Load Noise Levels - FurMark]

 

Comments

  • Ryan Smith - Tuesday, August 15, 2017 - link

    3 CUs per array is a maximum, not a fixed amount. Each Hawaii shader engine had a 4/4/3 configuration, for example.

    http://images.anandtech.com/doci/7457/HawaiiDiagra...

    So in the case of Vega 10, it should be a 3/3/3/3/2/2 configuration.
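
    (A quick sanity check, sketched in Python: the per-engine arrays above add up to the familiar CU totals for each GPU.)

        # CU arrays per shader engine, per the comment above;
        # both Hawaii and Vega 10 use 4 shader engines.
        hawaii_engine = [4, 4, 3]           # 11 CUs per shader engine
        vega10_engine = [3, 3, 3, 3, 2, 2]  # 16 CUs per shader engine
        shader_engines = 4

        print("Hawaii CUs:", sum(hawaii_engine) * shader_engines)   # 44
        print("Vega 10 CUs:", sum(vega10_engine) * shader_engines)  # 64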
  • watzupken - Tuesday, August 15, 2017 - link

    I think the performance is in line with recent rumors and my expectations. The fact that AMD beat around the bush before releasing Vega was a telltale sign. Unlike with Ryzen, where they marketed how well it runs in the likes of Cinebench, beating the gong and such, AMD revealed nothing about Vega benchmarks throughout the year, just like they did when they first released Polaris.
    The hardware is no doubt forward looking, but where it matters most, I feel AMD may have fallen short. It seems like the way around is probably to design a new GPU from scratch.
  • Yojimbo - Wednesday, August 16, 2017 - link

    "It seems like the way around is probably to design a new GPU from scratch. "

    Well, perhaps, but I do think that with more money they could be doing better with what they've got. They made the decision to focus their resources on reviving their CPU business, however.

    They probably have been laying the groundwork for an entirely new architecture for some time, though. My belief is that APUs were of primary concern when originally designing GCN. They were hoping to enable heterogeneous computing, but it didn't work out. If that strategy did tie them down somewhat, their next gen architecture should free them from those tethers.
  • Glock24 - Tuesday, August 15, 2017 - link

    Nice review, I'll say the outcome was expected given the Vega FE reviews.

    Other reviews state that the Vega 64 has a switch that sets the power limits, and you have "power saving", "normal" and "turbo" modes. From what I've read, the difference between the lowest and highest power limit is as high as 100W for about 8% more performance.

    It seems AMD did not reach the expected performance levels so they just boosted the clocks and voltage. Vega is like Skylake-X in that sense :P

    As others have mentioned, it would be great to have a comparison of Vega using Ryzen CPUs vs. Intel's CPUs.
  • Vertexgaming - Wednesday, August 16, 2017 - link

    It sucks so much that price drops on GPUs aren't a thing anymore because of miners. I have been upgrading my GPU every year and getting an awesome deal on the newest generation GPU, but now the situation has changed so much, that I will have to skip a generation to justify a $600-$800 (higher than MSRP) price tag for a new graphics card. :-(
  • prateekprakash - Wednesday, August 16, 2017 - link

    In my opinion, it would have been great if Vega 64 had a 16GB VRAM version at $100 more... That would be $599 apiece for the air-cooled version... That would future-proof it to run future 4K games (CF would benefit too)...

    It's too bad we still don't have 16GB consumer gaming cards, the Vega pro cards being not strictly for gamers...
  • Dosi - Wednesday, August 16, 2017 - link

    So the system consumes 91W more with Vega 64; I can't imagine with the LC V64... could it be 140W more? Actually, what you saved on the GPU (V64 instead of a 1080) you've already spent on the electricity bill...
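
    (A rough sketch of that math; the 91W figure comes from the comment, while the electricity price and usage hours are assumptions purely for illustration.)

        # Extra running cost of a 91W higher system draw.
        extra_watts   = 91     # system-level delta cited above
        price_per_kwh = 0.12   # assumed electricity price, $/kWh
        hours_per_day = 4      # assumed daily gaming time

        extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
        print(f"~{extra_kwh_per_year:.0f} kWh/year extra, "
              f"~${extra_kwh_per_year * price_per_kwh:.0f}/year at the assumed rate")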
  • versesuvius - Wednesday, August 16, 2017 - link

    NVIDIA obviously knows how to break GPU tasks down into chunks, process those chunks, and send them out the door better than AMD does. And more ROPs could certainly help AMD's cards a lot.
  • peevee - Thursday, August 17, 2017 - link

    "as electrons can only move so far on a single (ever shortening) clock cycle"

    Seriously? Electrons? You think that how far electrons move matters? Sheesh.
  • FourEyedGeek - Tuesday, August 22, 2017 - link

    You being serious or sarcastic? If serious then you are ignorant.
