Power, Temperature, & Noise

Moving on from performance metrics, we’ll touch upon power, temperature, and noise. This is also normally where we’d discuss voltages, but as Vega is a new chip on a new architecture, none of our monitoring tools seem to read voltages on Vega 64 and 56 correctly.

In terms of average game clockspeeds, neither card fully sustains its boost specification under prolonged use. Vega 64 tends to stay closer to its boost clock, which is in line with its additional power overhead and higher temperature target relative to Vega 56.

Radeon RX Vega Average Clockspeeds

                          Radeon RX Vega 64 (Air)   Radeon RX Vega 56
Boost Clock               1546MHz                   1471MHz
Max Boost (DPM7)          1630MHz                   1590MHz

Battlefield 1             1512MHz                   1337MHz
Ashes: Escalation         1542MHz                   1354MHz
DOOM                      1479MHz                   1334MHz
Ghost Recon: Wildlands    1547MHz                   1388MHz
Dawn of War III           1526MHz                   1335MHz
Deus Ex: Mankind Divided  1498MHz                   1348MHz
GTA V                     1557MHz                   1404MHz
F1 2016                   1526MHz                   1394MHz

FurMark                   1230MHz                   1099MHz
FurMark (HBM2)            868MHz                    773MHz
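
For readers who want to express these averages as a share of each card's rated boost clock, the arithmetic is simple. Below is a minimal Python sketch using only the figures from the table above; the dictionary labels are just illustrative keys, not anything from AMD's tooling.

# Sustained game clocks as a fraction of the rated boost clock.
# All figures (MHz) are taken from the table above.
BOOST = {"Vega 64": 1546, "Vega 56": 1471}

AVERAGES = {
    "Battlefield 1":            {"Vega 64": 1512, "Vega 56": 1337},
    "Ashes: Escalation":        {"Vega 64": 1542, "Vega 56": 1354},
    "DOOM":                     {"Vega 64": 1479, "Vega 56": 1334},
    "Ghost Recon: Wildlands":   {"Vega 64": 1547, "Vega 56": 1388},
    "Dawn of War III":          {"Vega 64": 1526, "Vega 56": 1335},
    "Deus Ex: Mankind Divided": {"Vega 64": 1498, "Vega 56": 1348},
    "GTA V":                    {"Vega 64": 1557, "Vega 56": 1404},
    "F1 2016":                  {"Vega 64": 1526, "Vega 56": 1394},
}

for game, clocks in AVERAGES.items():
    ratios = ", ".join(
        f"{card}: {clock / BOOST[card]:.0%}" for card, clock in clocks.items()
    )
    print(f"{game:26s} {ratios}")

Run over the table, this shows Vega 64 holding roughly 96-101% of its boost rating in games, while Vega 56 sits closer to 91-95%.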

With games, the HBM2 clocks ramp up and stay at their highest clock state. As expected, the strain of FurMark causes the cards to oscillate their memory clocks: between 945MHz and 800MHz for Vega 64, and between 800MHz and 700MHz for Vega 56. On that note, HBM2 comes with an idle power state (167MHz), an improvement on Fiji's HBM1 and its single power state. Unfortunately, the direct power savings are somewhat obscured since, as we will soon see, Vega 10 is a particularly power-hungry chip.

As mentioned earlier, we used the default out-of-the-box configuration for power: Balanced, with the corresponding 220W GPU power limit. And under load, Vega needs power badly.

[Charts: Idle Power Consumption; Load Power Consumption - Battlefield 1; Load Power Consumption - FurMark]

The performance of both Vega cards comes at a significant power cost. For the RX 500 series, we mused that load consumption was where AMD paid the piper; here, the piper has taken AMD to the cleaners. In Battlefield 1, Vega 64 consumes 150W more system-wide power than the GTX 1080, its direct competitor. To be clear, some additional power draw is expected, since Vega 64 is larger than the GTX 1080 in both shader count (4096 vs. 2560) and die size (486mm2 vs. 314mm2). But even with that in mind, Vega 64 still consumes more power than the GTX 1080 Ti and its comparably sized 471mm2 GP102.
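
To put the size argument in rough numbers, here is a quick back-of-the-envelope sketch in Python. It uses only the figures quoted above; nothing here is measured beyond what is already in the text.

# Scaling sanity check using the figures quoted above.
vega64  = {"shaders": 4096, "die_mm2": 486}
gtx1080 = {"shaders": 2560, "die_mm2": 314}

shader_ratio = vega64["shaders"] / gtx1080["shaders"]   # 1.60x
die_ratio    = vega64["die_mm2"] / gtx1080["die_mm2"]   # ~1.55x

print(f"Shader count ratio: {shader_ratio:.2f}x")
print(f"Die size ratio:     {die_ratio:.2f}x")
# Even a ~1.6x larger chip does not by itself explain a 150W
# system-wide gap against a card it only roughly matches in performance.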

As for Vega 64's cut-down sibling, Vega 56's lower temperature target, lower clocks, and lower board power make its consumption look much more reasonable, although it still draws well more than the GTX 1070.

In any case, the cooling solutions are able to do the job without severe effects on temperature and noise. As far as blowers go, RX Vega 64 and 56 are comparable to the 1080 Ti FE blower.

[Charts: Idle GPU Temperature; Load GPU Temperature - Battlefield 1; Load GPU Temperature - FurMark]
Not graphed: temperature of the actual Vega (the star): 9,329°C

Our noise-testing equipment and methodology differ from past reviews: we now use a more sensitive noise meter positioned closer to the graphics card, and readings are taken with an open case. As such, the noise levels will appear higher than in our previous results.
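
As a rough illustration of why a closer microphone reads higher, a free-field point source gains about 6dB per halving of distance. The sketch below assumes idealized free-field conditions, and the distances and the 40dB figure are hypothetical; a real room, an open case, and a physically large cooler will all deviate from this model.

import math

def spl_at_distance(spl_ref_db: float, d_ref_m: float, d_new_m: float) -> float:
    """Free-field point-source estimate: SPL falls ~6 dB per doubling of distance."""
    return spl_ref_db - 20 * math.log10(d_new_m / d_ref_m)

# e.g. a hypothetical 40 dB(A) reading taken at 1 m would register
# roughly 46 dB(A) at 0.5 m, all else being equal.
print(f"{spl_at_distance(40.0, 1.0, 0.5):.1f} dB")  # -> 46.0 dB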

[Charts: Idle Noise Levels; Load Noise Levels - Battlefield 1; Load Noise Levels - FurMark]

 

Comments

  • Stuka87 - Monday, August 14, 2017 - link

    So my question is: can these be under-volted like Polaris for some pretty decent power savings, and what is the power usage like when you enable AMD's Chill mode? AMD has stated you get about 90-95% of the performance but at significantly lower power usage.
  • tamalero - Monday, August 14, 2017 - link

    Does this mean that the future of VEGA 64 will rest entirely in the hands of FINEWINE(tm) optimizations and boosts?

    Because right now Vega 64 is nothing but a disappointment.
  • Chaser - Monday, August 14, 2017 - link

    This is a letdown. I don't understand why AMD chooses to lag behind Nvidia. The market is ripe for a competitive alternative to Nvidia. AMD hasn't been it. I am very pleased with my GTX 1080 purchase in January. Hopefully, come my next GPU upgrade time, AMD will have something better to consider.
  • Stuka87 - Monday, August 14, 2017 - link

    They don't "choose" to. They had the money to either make an amazing CPU, or an amazing GPU. And the CPU market is larger, so they chose to push R&D budget into Ryzen (which has paid off big time).
  • TheinsanegamerN - Monday, August 14, 2017 - link

    They chose to split their resources between two GPUs (Polaris and Vega) rather than focusing on one line of chips. They chose to rebrand and resell the same chips for 5 years.

    AMD isn't rich, but they make quite a few boneheaded decisions.
  • Aldaris - Monday, August 14, 2017 - link

    Actually, that looks like it paid off for them in market share. Also, Polaris was always out of stock (irrelevant as to the reasons why. It's still money in AMD's pocket).
  • mapesdhs - Monday, August 14, 2017 - link

    That's a good point; whatever the buyer, a sale is still a sale. However, perhaps from AMD's POV they'd rather sell to gamers, because when Ethereum finally crashes there will be a huge dump of used AMD cards on the market that will, at least for a time, stifle new card sales, whereas gamers tend to keep their cards for some time. Selling GPUs to miners now is certainly money in the bag, but it builds up a potential future sting.
  • mattcrwi - Monday, August 14, 2017 - link

    I would never buy a used GPU that has been run at full throttle 24/7 for months. I'm sure some people don't understand what miners do with their cards, or will be enticed by the prices anyway.
  • wolfemane - Tuesday, August 15, 2017 - link

    I own a wide range of 290s and 290Xs I picked up at the end of the last mining craze for great prices, all purchased from miners. They all still work to this day with zero issues. I've also purchased and sold 10x that quantity across the 280 - 290X range. Of those, only one failed, and Sapphire replaced it under warranty.

    I look forward to the new craze ending. Will get some great cards for dirt cheap, and a vast majority still under warranty.

    Nothing wrong with buying them.
  • nintendoeats - Tuesday, August 15, 2017 - link

    I have been running Folding @ Home on the GPU for several years now. I have yet to find any reason to believe that running a card 24/7 is a problem. What I would be more concerned about is heat cycles, which aren't an issue when you just run the card hot all the time.
