Power, Temperature, & Noise

Given AMD’s focus on power efficiency with Polaris – not to mention the overall benefits of the move to 14nm FinFET – there is a lot of interest in just how the RX 480 stacks up when it comes to power, temperature, and noise. So without further ado…

Idle Power Consumption

When it comes to idle power consumption I'm posting the results I've measured as-is, but I want to note that I have low confidence in these results for the AMD cards. Ever since the GPU testbed was updated from Windows 8.1 to Windows 10, AMD cards have idled 3-5W higher than they used to under Windows 8.1. I believe that this is an AMD driver bug – NVIDIA’s cards clearly have no problem – possibly related to the GPU testbed being an Ivy Bridge-E system. In this case I don’t believe the RX 480’s idle power consumption is any higher than the GTX 960’s, but for the moment the testbed is unable to prove it.

Load Power Consumption - FurMark

Traditionally we start with gaming load power before moving on to FurMark, but in this instance I want to flip that. As a power virus type workload, FurMark’s power requirements are greater than any game. But because it’s synthetic, it gives us a cleaner look at just GPU power consumption.

Among AMD’s cards, the RX 480 is second only to the Radeon HD 7850 in power consumption. Even then, as a GCN 1.0 card, the 7850 is one of the last AMD cards without fine-grained power states, so this isn’t a true apples-to-apples comparison. Instead, a better point of reference is the GCN 1.2 based R9 Nano, which has a 175W TBP. Compared to the R9 Nano we find that the RX 480 draws about 30W less at the wall, which, once PSU efficiency is accounted for, lines up almost perfectly with the 25W difference in TBP. As a result we can see first-hand the progress AMD has made on containing power consumption with Polaris.
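To put that wall-to-card translation in concrete terms, here is a rough back-of-the-envelope sketch. The ~85% PSU efficiency figure is an assumption chosen for illustration, not a measured value for this testbed's power supply.

```python
# Rough sketch: translating a wall-power (AC) delta into an approximate
# card-power (DC) delta. PSU efficiency of ~85% at this load is an assumed
# figure for illustration only.

PSU_EFFICIENCY = 0.85  # assumed testbed PSU efficiency at this load level

def card_power_delta(wall_delta_watts: float, efficiency: float = PSU_EFFICIENCY) -> float:
    """Estimate the DC-side (card) power difference from an AC-side (wall) difference."""
    return wall_delta_watts * efficiency

if __name__ == "__main__":
    # ~30W measured at the wall between the RX 480 and R9 Nano under FurMark
    print(f"{card_power_delta(30):.1f} W")  # ~25.5 W, in line with the 25W TBP gap
```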

Load Power Consumption - Crysis 3

However, things are a bit more mixed under Crysis 3. The RX 480 is still near the top of our charts, and keeping in mind that higher performing cards draw more power on this test due to the additional CPU workload, the RX 480 compares very favorably to the rest of AMD’s lineup. System power consumption is very close to the R9 280/380 while delivering much improved performance, and against the performance-comparable R9 390, we’re looking at over 110W in savings. Hawaii was a solid chip from a performance standpoint, and Polaris 10 picks up where it left off while bringing power consumption down to much lower levels.

The drawback for AMD here is that power consumption compared to NVIDIA still isn’t great. At the wall, the RX 480 draws only about 10W less than the performance-comparable GTX 970, a last-generation 28nm card. The GTX 1070 Founders Edition further complicates matters, as its performance is well ahead of the RX 480, and yet its power consumption at the wall is within several watts of AMD’s latest card. Given what we saw with FurMark I have little reason to believe that card-level power consumption is this close, so it looks like AMD is losing out elsewhere, possibly to driver-related CPU load.

Idle GPU Temperature

Moving on to idle GPU temperatures, there’s little to remark on. At 31C, the RX 480’s blower-based design is consistent with the other cards in our lineup.

Load GPU Temperature - FurMark

Load GPU Temperature - Crysis 3

Meanwhile with load temperatures, we get to see the full impact of AMD’s new WattMan power management technology. The RX 480 has a temperature target of 80C, and it dutifully ramps up the fan to ensure it doesn’t exceed that temperature.
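To illustrate the general idea of a temperature-target fan curve, here is a minimal sketch. This is not AMD's actual control algorithm; the ramp start point, gain, and fan floor are assumed values, and the real driver logic is considerably more sophisticated (it also juggles power and clock targets).

```python
# Minimal sketch of a temperature-target fan controller, in the spirit of the
# targets WattMan exposes. NOT AMD's actual algorithm; this is a simple
# proportional ramp for illustration, with assumed constants.

TARGET_C = 80        # GPU temperature target the fan curve aims to hold
MIN_FAN_PCT = 20     # assumed idle fan floor
MAX_FAN_PCT = 100
RAMP_START_C = 70    # assumed temperature at which the fan begins ramping
GAIN = 8.0           # % fan speed added per degree C above the ramp start

def fan_speed(temp_c: float) -> float:
    """Return a fan duty cycle (%) that ramps up as temperature nears the target."""
    if temp_c <= RAMP_START_C:
        return MIN_FAN_PCT
    speed = MIN_FAN_PCT + GAIN * (temp_c - RAMP_START_C)
    return min(speed, MAX_FAN_PCT)

if __name__ == "__main__":
    for t in (50, 72, 78, 80, 85):
        print(f"{t}C -> {fan_speed(t):.0f}% fan")
```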

Idle Noise Levels

With idle noise levels RX 480 once again posts a good result. At 37.8dB, it’s in good company, only meaningfully trailing cards that idle silently due to their respective zero fan speed idle implementations.

Load Noise Levels - FurMark

Load Noise Levels - Crysis 3

Finally, with load noise levels, the RX 480 produces middling (but acceptable) results. Given that we have a mix of blowers and open air coolers here, the RX 480 performs similarly to other mainstream blower-based cards. The $199 price tag means that AMD can’t implement any exotic cooling or noise reduction technologies, though strictly speaking it doesn’t need them.

Comments

  • basroil - Thursday, June 30, 2016 - link

    "Or both the mobo and the PSU are supplying the same voltage and the power input is combined into a single bus... y'know... preventing the unlikely scenario you describe from ever possibly happening."

    1) The two do NOT have the same voltage. Ideally they would, but that's not how things actually work in practice.
    2) The folks at tomshardware did a bus-level analysis of power draw and put their results into their review. Their tests of various cards show that power draw can indeed be shifted toward either the PCIe slot or the power cable, and is not 50-50 like you claim.
    3) Even assuming your point were valid (which it most certainly is NOT), it wouldn't change the fact that a single card already draws more power from the PCIe slot than the PCIe specification allows, and that two cards would draw far more than the spec allows (double the spec for PCIe 3.0).
  • schulmaster - Thursday, June 30, 2016 - link

    Lol. The PSU is the source for all board power AND PCIe aux power. The board design and PSU negotiate how much 12V power is reliably sourced from the 24-pin. A 6-pin PCIe aux connector is rated for an additional 75W, and that limit could be down to the cable itself, let alone the card interface and/or the PSU. Even high-end OC boards have a supplemental molex connector for multi-GPU configs to supplement available bus power, which is otherwise the burden of the 24-pin. It is not outlandish to be concerned if a single RX 480 is overdrawing the entire PCIe slot wattage allotted in the spec, especially when the fallback is a PCIe 6-pin that is already being overdrawn as well. Tomshardware was literally unwilling to do further multi-GPU testing due to the numbers they were physically seeing, not paranoia.
  • pats1111 - Thursday, June 30, 2016 - link

    @binarydissonance: Don't confuse these fanboys with the facts, they're NVIDIA goons, it's a waste of time because they are TROLLS
  • AbbieHoffman - Wednesday, June 29, 2016 - link

    Actually most motherboards support CrossFire. There are many that support only CrossFire, because it is cheaper to implement CrossFire support than SLI.
  • Gigaplex - Thursday, June 30, 2016 - link

    But they don't support the excessive power consumption on the PCIe bus, which is a specification violation.
  • jospoortvliet - Monday, July 4, 2016 - link

    Luckily every motherboard, except for cheap ones that are quite old, can easily handle 100+ watts over the PCIe slot, as any overclocking would need that too.
  • beck2050 - Thursday, June 30, 2016 - link

    I just laugh when I see people talking about Crossfire
  • fanofanand - Thursday, June 30, 2016 - link

    "when even 2x1080 wouldn't hit 75W"

    Your post is so full of FUD it should be deleted.
  • basroil - Thursday, June 30, 2016 - link

    "Your post is so full of FUD it should be deleted. "

    I'm not responsible for your ignorance. Check tomshardware /reviews/nvidia-geforce-gtx-1080-pascal,4572-10.html and you'll see I'm right
  • fanofanand - Thursday, June 30, 2016 - link

    I checked, you are wrong. Stop spreading FUD, you Nvidiot.
