Power, Temperature, and Noise

As always, we'll take a look at the power, temperature, and noise of the Radeon VII. While we would customarily also look at voltages and clockspeeds, the SMU changes meant that was not possible this first time around.

Idle Power Consumption

Load Power Consumption - Battlefield 1

Load Power Consumption - FurMark

Idle GPU Temperature

Load GPU Temperature - Battlefield 1

Load GPU Temperature - FurMark

The noise levels of the card look surprising at first blush. Ultimately, what's happening here is the consequence of a very aggressive fan curve, one that trades away the potential acoustic benefits of an open-air, triple-fan design in exchange for cooling capability. Going this route leaves the card's fan noise comparable to the RX Vega 64's blower.
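To put that fan curve trade-off in more concrete terms, below is a minimal, hypothetical Python sketch of how a fan curve maps GPU temperature to fan duty cycle. The breakpoint values and the fan_speed_percent helper are illustrative assumptions, not AMD's actual control logic or measured Radeon VII behavior; the point is simply that an aggressive curve ramps the fans much harder at any given temperature than a quieter curve would, buying lower temperatures at the cost of noise.

def fan_speed_percent(temp_c, aggressive=True):
    """Return a hypothetical fan duty cycle (%) for a given GPU temperature."""
    # (temperature in C, fan duty in %) breakpoints -- illustrative values only
    curve = ([(40, 30), (60, 55), (75, 85), (85, 100)] if aggressive
             else [(40, 20), (60, 35), (75, 55), (85, 75)])

    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two surrounding breakpoints
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return float(curve[-1][1])


if __name__ == "__main__":
    for t in (50, 65, 80):
        print(f"{t} C -> aggressive: {fan_speed_percent(t, True):.0f}%, "
              f"quiet: {fan_speed_percent(t, False):.0f}%")

In the Radeon VII's case, the curve is tuned toward the aggressive end of that spectrum, which is why the triple-fan cooler ends up no quieter than a reference blower under load.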

Idle Noise Levels

Load Noise Levels - Battlefield 1

Load Noise Levels - FurMark

Comments

  • tipoo - Sunday, February 10, 2019

    It's MI50
  • vanilla_gorilla - Thursday, February 7, 2019

    As a Linux prosumer who does light gaming, this card is a slam dunk for me.
  • LogitechFan - Friday, February 8, 2019

    and a noisy one at that
  • BaneSilvermoon - Thursday, February 7, 2019

    Meh, I went looking for a 16GB card about a week before they announced the Radeon VII, because gaming was using up all 8GB of VRAM and 14GB of system RAM. This card is a no-brainer upgrade from my Vega 64.
  • LogitechFan - Friday, February 8, 2019

    lemme guess, you're playing sandstorm?
  • Gastec - Tuesday, February 12, 2019

    I was beginning to think that the "money" was in cryptocurrency mining with video cards, but I guess after the €1500+ RTX 2080 Ti I should reconsider :)
  • eddman - Thursday, February 7, 2019

    Perhaps, but Turing is also a new architecture, so it's probable it'd get better with newer drivers too.

    Maxwell is from 2014 and still performs as it should.

    As for GPU-accelerated GameWorks, obviously NVIDIA is optimizing it for their own cards only, but that doesn't mean they actively modify the code to make it perform worse on AMD cards; not to mention that would be illegal. (GPU-only GameWorks effects can be disabled in the game options if need be.)

    Many (most?) games just utilize the CPU-only GameWorks modules, so there's no performance difference between cards.
  • ccfly - Tuesday, February 12, 2019

    You're joking, right?
    The first game where they did just that was Crysis: they hid models under the water so ATI cards would render them too and be slower.
    And after that they kept cheating full-time ...
  • eddman - Tuesday, February 12, 2019

    No, I'm not.

    There was no proof of misconduct in Crysis 2's case, just baseless rumors.

    For all we know, it was an oversight on Crytek's part. Also, DX11 was an optional feature, meaning it wasn't part of the game's main code, as I've stated.
  • eddman - Tuesday, February 12, 2019

    ... I mean an optional toggle for Crysis 2. The game could be run in DX9 mode.
