Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

So far we’ve seen AMD make a lateral move when it comes to gaming performance, with the R9 285 keeping up with the R9 280 rather consistently. It is clear that AMD specifically intended for the R9 285 to deliver R9 280-like gaming performance, and that is exactly what has happened above the hood.

Under the hood however there are not one but two generations of GCN upgrades to account for, which have the potential to significantly alter the power/temp/noise characteristics of the video card. Compared to the GCN 1.0 based Tahiti GPU, GCN 1.2 introduces not only AMD’s radically improved PowerTune implementation, but also their delta color compression technology, which allows AMD to cut down on the width of the memory bus and the resulting number of RAM chips needed on a completed card. As a result the R9 285 can at times behave quite differently from the R9 280, especially when it comes to power.

Radeon R9 285 Voltages
Sapphire R9 285 DXOC Load   Sapphire R9 285 DXOC Idle   AMD R9 280 Load
1.15V                       0.9V                        1.1V

Starting with voltages, we want to quickly remind everyone that as of GCN 1.1 AMD no longer provides a way of easily reading a GPU’s desired VID, and instead we get the real voltage as reported through the card’s sensors. In this case we’re taking our voltages from LuxMark 2.0, which offers a consistent workload that is strenuous enough to max out the GPU, but light enough that virtually every GPU should be able to boost to its maximum turbo bin. In any case these aren’t going to be the maximum voltages for any given card, but they should be close.

For our Sapphire R9 285 Dual-X OC, we find that our card stabilizes at 1.15V under load and idles at 0.9V. Compared to our R9 280 this appears to be a higher load voltage and idle voltage, but it must be noted that the 280 is reporting its VID instead of its actual voltage.

Meanwhile, as GCN 1.2 cards implement the same fine-grained PowerTune support first introduced with GCN 1.1, we want to take a look at average clockspeeds as well. With all of AMD’s modern GCN 1.1+ cards, AMD and most of their partners advertise the cards by their boost clockspeed, so it’s helpful to see whether these cards can maintain those clockspeeds throughout. In practice the potential for throttling is much greater in thermally constrained situations (blowers, e.g. R9 290) than it is with open air coolers, but there is still the potential for hitting situations where we throttle based on power consumption.

Radeon R9 285 Average Clockspeeds
                  Saph DXOC (Stock)   Saph DXOC (Fact. OC)
Boost Clock       918MHz              965MHz
Metro: LL         918MHz              965MHz
CoH2              918MHz              965MHz
Bioshock          918MHz              963MHz
Battlefield 4     918MHz              965MHz
Crysis 3          918MHz              965MHz
Crysis: Warhead   918MHz              965MHz
TW: Rome 2        918MHz              965MHz
Thief             918MHz              965MHz
GRID 2            918MHz              965MHz

The long and short of it is that the R9 285 Dual-X has no trouble maintaining its 918MHz clockspeed when brought down to reference clockspeeds. Though it doesn’t affect the averages, we do see some very minor fluctuations in clockspeed (an errant 916/917MHz here and there), which is likely due to AMD’s clockspeed governing mechanism rather than any kind of power or temperature throttling. Note that even under FurMark, our worst case (and generally unrealistic) test, the card only falls by less than 20MHz, to a sustained 900MHz.

Otherwise, if we bring the Dual-X back to its factory overclocked speeds, we find that it has no problem maintaining 965MHz, with the sole exception of Bioshock, which fluctuated frequently enough that it averaged a mere 963MHz.

Unfortunately this also means we have been unable to determine the base clockspeed for these cards. Even after restricting cooling and reducing the power target, the R9 285 doesn’t seem to have a hard GPU clockspeed floor, unlike the Hawaii based R9 290 series.
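As an aside, the average clockspeed figures in the table above can be reproduced from per-second sensor samples with nothing more than an arithmetic mean. The sketch below is a hypothetical illustration (the sample counts and log format are assumptions, not our actual tooling) of why the errant 916/917MHz readings don't move the averages.

```python
# Hypothetical sketch: deriving an average clockspeed figure from
# per-second core clock samples. Sample values are illustrative.

def average_clock(samples_mhz):
    """Arithmetic mean of a run of core clock samples, in MHz."""
    return sum(samples_mhz) / len(samples_mhz)

# A mostly-stable 918MHz run with a few errant 916/917MHz samples,
# as described in the text, still rounds to 918MHz on average.
samples = [918] * 57 + [917, 916, 917]
print(round(average_clock(samples)))  # rounds to 918
```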

Idle Power Consumption

At this point, outside of cards that are deficient in their design in some way or another, idle power is unremarkable. Sapphire’s R9 285 Dual-X keeps up with the best of them, with the 2W advantage over the R9 280 likely coming from the reduced VRAM capacity.

Load Power Consumption - Crysis 3

Moving on to load power consumption under Crysis 3, we find that power consumption has been reduced compared to the R9 280, but not remarkably so. Despite the much lower official TBP of 190W versus 250W for the R9 280, the actual difference (for virtually equivalent performance) is just 13W of savings at the wall. What this tells us is that despite the PowerTune changes, the R9 285 sustains power consumption not all that far removed from the R9 280. In practice the R9 280 was unlikely to draw anywhere near 250W under a gaming workload, so the R9 285’s 190W limit represents a smaller real-world reduction than the TBP figures suggest. The remaining difference comes down to the VRAM reduction and some power efficiency gains in Tonga.
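It's worth remembering that "at the wall" measurements include PSU conversion losses, so the DC-side difference between the cards is slightly smaller still. A quick sketch, assuming a 90% efficient PSU for illustration (the efficiency figure is an assumption, not a measured value):

```python
# Hedged sketch: converting a wall-side power difference into an
# estimated DC-side difference. The 90% PSU efficiency is assumed.

def dc_delta(wall_delta_w, psu_efficiency=0.90):
    """Estimate the DC power difference from a wall power difference."""
    return wall_delta_w * psu_efficiency

# The 13W wall savings seen here would be roughly 11.7W at the cards.
print(round(dc_delta(13), 1))
```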

On the other hand, power consumption for the Dual-X when using its factory overclock jumps ahead. The slight increase in performance under Crysis 3 from this overclock will marginally increase the load on the CPU; the rest comes from the power required to hit and sustain the higher clockspeeds of Sapphire’s overclock. As a result we’re looking at power consumption near the level of an R9 280X.

Meanwhile to make a quick GTX 760 comparison, AMD and NVIDIA are virtually tied. At 292W versus 293W, these cards are drawing virtually identical amounts of power. However the GTX 760 ultimately has the efficiency edge, as it delivers better performance under Crysis 3 than the R9 285 does (though in other games the tables could of course turn).

Load Power Consumption - FurMark

Surprisingly, under FurMark the situation is actually reversed. Instead of the R9 285 beating the R9 280, we’re actually seeing it draw 10W more power despite the lower TBP. Though seemingly nonsensical, in practice this is the newer iteration of PowerTune doing a better job of homing in on the card’s 190W limit. This is a situation the coarse PowerTune implementation on the R9 280 would struggle with, forcing it to back off on clockspeeds much more severely and ultimately draw less power than its limit would truly allow.

The end result is something of a wash. The R9 285 is not drawing significantly more or less power than the R9 280, all the while delivering similar performance. In that context we can say that as a result, power efficiency has not meaningfully changed compared to the R9 280.

Finally, to make one more GTX 760 comparison, this illustrates that while AMD can generally beat the GTX 760’s performance, it comes at the cost of higher maximum power consumption. At least when faced with a worst case scenario, the R9 285 is going to be drawing about 20W more at the wall.

Idle GPU Temperature

When it comes to idle temperatures, Sapphire’s Dual-X cooler is among the best. 30C at idle is merely average in the pack only because so many other coolers are equally capable at idle.

Load GPU Temperature - Crysis 3

Earlier we mentioned that the Dual-X cooler is probably a bit overpowered for a 190W card, and here we can see why. Under Crysis 3 our card maxes out at a relatively chilly 65C, and even with the factory overclock only pushes to 70C. Sapphire’s card clearly has no problem keeping itself cool.

Load GPU Temperature - FurMark

The greater load from FurMark causes temperatures to rise a bit more, but not exceptionally so. Even under this most strenuous of tests we’re topping out at 70C with reference clockspeeds, or 72C with the factory overclock. So long as Sapphire can hit these temperatures without generating too much noise then they’re golden (or blue, as the case may be).

I would also point out at this time that while the R9 285 Dual-X is significantly cooler than the GTX 760, we’re comparing an open air cooler to a blower. All things considered this is exactly the situation where the open air cooler will be the stronger performer. But it comes at the tradeoff of not being able to directly expel all of its waste heat.

Idle Noise Levels

Much like idle temperatures, idle noise levels are looking quite good for Sapphire’s Dual-X cooler. There are a handful of cards that can drop below even 38.5dB, but at this point we’re quickly approaching the overall noise floor.

Load Noise Levels - Crysis 3

Already doing very well for themselves when it comes to load temperatures, load noise only makes Sapphire’s R9 285 Dual-X look even better. When we bring it down to reference clockspeeds we measure just 45.6dB under load, quieter than any Tahiti, Hawaii, Pitcairn, or Kepler card. Only the old GTX 560 Ti (which was impressively overbuilt) can sustain lower load noise levels.

Load Noise Levels - FurMark

As was the case with temperatures, FurMark also drives up the load noise levels, but not especially so. Even with this additional heat the R9 285 tops out at 48.3dB, staying comfortably under the 50dB level and trailing only the much less powerful GTX 660 and GTX 560 Ti.

Meanwhile, when looking at the R9 285 Dual-X with its factory overclock enabled, we unsurprisingly see an increase in noise from the additional heat generated by the overclock. The total penalty for the overclock is 3-4dB, which is a not-insignificant increase in noise. I feel like Sapphire really hit their sweet spot for balancing noise with performance at stock, so the factory overclock deviates from that somewhat. Overall these noise levels are still well within reason, but they’re now middle of the pack instead of near the head of it.
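To put that 3-4dB penalty in physical terms: sound pressure scales as 10^(ΔdB/20), so a 3-4dB increase corresponds to roughly 1.4-1.6x the sound pressure. A minimal sketch of the conversion:

```python
# Sketch: converting a dB(SPL) difference into a sound pressure ratio.
# Sound pressure level is 20*log10(p/p0), so the inverse is 10^(dB/20).

def pressure_ratio(delta_db):
    """Convert a difference in dB(SPL) to a sound pressure ratio."""
    return 10 ** (delta_db / 20)

print(round(pressure_ratio(3), 2))  # ~1.41x
print(round(pressure_ratio(4), 2))  # ~1.58x
```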

Speaking of Sapphire cards, it’s interesting to compare and contrast the R9 285 with our R9 280, which is also a Sapphire card using an identical cooler. For the R9 285, Sapphire has found a better balance between temperature and noise than they did on the R9 280; the R9 280 could pull off slightly better temperatures, but as a result it was always above 52dB of noise under load.

Ultimately, excluding the vendor-specific factors, our look at power, temperature, and noise tells us that much like the R9 285’s gaming performance, its power/temp/noise performance is a lateral move for AMD. Performance hasn’t significantly changed and neither has power, which really helps to distill the essence of the R9 285 down to its improved GCN 1.2 feature set. In this case that particularly means features such as the much finer-grained clockspeed control offered by PowerTune.

86 Comments

  • felaki - Wednesday, September 10, 2014 - link

    The article says that the Sapphire card has "1x DL-DVI-I, 1x DL-DVI-D, 1x HDMI, and 1x DisplayPort". Can you be more precise as to which versions of the spec are supported? Is it HDMI 1.4 or HDMI 2.0? I believe since this refers to MST, it's only HDMI 1.4 and a DisplayPort connection is required in MST mode for 4K@60Hz output?

    Reading the recent GPU articles, I'm very puzzled why HDMI 2.0 adoption is still lacking in GPUs and displays, even though the spec has been out there for about a year now. Is the PC industry reluctant to adopt HDMI 2.0 for some (political(?), business(?)) reason? I have heard only bad things about DisplayPort 1.2 MST to carry a 4K@60Hz signal, and I'm thinking it's a buggy hack for a transitional tech period.

    If the AMD newest next-gen graphics card only supports HDMI 1.4, that is mind-boggling. Please tell me I'm confused and this is a HDMI 2.0-capable release?
  • Ryan Smith - Wednesday, September 10, 2014 - link

    DisplayPort 1.2 and HDMI 1.4. Tonga does not add new I/O options.
  • felaki - Wednesday, September 10, 2014 - link

    Thanks for clarifying this!
  • Penti - Wednesday, September 10, 2014 - link

You can do 4K SST on both NVIDIA and AMD cards as long as they are DisplayPort 1.2 capable. It depends on your screen. There is no 600MHz HDMI on any graphics processor, and neither is there much support from monitors or TVs, as most don't do 600MHz.
  • felaki - Wednesday, September 10, 2014 - link

    Thanks! I was not actually aware that SST existed. I see here http://community.amd.com/community/amd-blogs/amd-g... that AMD is referring to SST as being the thing to fix up the 4K issue, although the people in the comments on that link refer that the setup is not working properly.

    How do people generally see SST? Should one defer buying a new system now until proper HDMI 2.0 support comes along, or is SST+DisplayPort 1.2 already a glitch-free user experience for 4K@60Hz?
  • Kjella - Wednesday, September 10, 2014 - link

    Got 3840x2160x60Hz using SST/DP and it's been fine, except UHD gaming is trying to kill my graphics card.
  • mczak - Wednesday, September 10, 2014 - link

    DP SST 4k/60Hz should be every bit as glitch free as proper hdmi 2.0 (be careful though with the latter since some 4k TVs claiming to accept 60Hz 4k resolutions over hdmi will only do so with ycbcr 4:2:0). DP SST has the advantage that actually even "old" gear on the graphic card side can do it (such as radeons from the HD 6xxx series - from the hw side, if it could do DP MST 4k/60Hz it should most likely be able to do the same with SST too, the reason why MST hack was needed in the first place is entirely on the display side).
    But if you're planning to attach your 4k TV to your graphic card a DP port might not be of much use since very few TVs have that.
  • Solid State Brain - Wednesday, September 10, 2014 - link

    I won't get another AMD video card until idle multimonitor consumption gets fixed. According to other websites, power consumption in such case increases substantially whereas NVidia video cards have almost the same consumption as when using a single display. In the case of the Sapphire 285 Dual-X it increases by almost 30W just by having a second display connected!!

I think Anandtech should start measuring idle power consumption when more than one display is connected to the video card / in multimonitor configurations. It's important information for the many users who not only game but also have productivity needs.
  • Solid State Brain - Wednesday, September 10, 2014 - link

    And of course, a comment editing function would be useful too.
  • shing3232 - Wednesday, September 10, 2014 - link

well, AMD video cards have to run at higher frequencies with multiple monitors than with a single monitor
