Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

So far we’ve seen AMD make a lateral move when it comes to gaming performance, with the R9 285 keeping up with the R9 280 rather consistently. It is clear that AMD specifically intended for the R9 285 to deliver R9 280-like gaming performance, and that is exactly what has happened above the hood.

Under the hood however there are not one but two generations of GCN upgrades to account for, which have the potential to significantly alter the power/temp/noise characteristics of the video card. Compared to the GCN 1.0 based Tahiti GPU, GCN 1.2 introduces not only AMD’s radically improved PowerTune implementation, but also their delta color compression technology, which allows AMD to get away with a narrower memory bus and fewer RAM chips on a completed card. As a result the R9 285 can at times behave quite differently from the R9 280, especially when it comes to power.
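
AMD hasn’t published the exact on-chip format, but the basic idea behind delta color compression can be sketched in a few lines: store an anchor value per block of pixels plus small per-pixel differences, which need far fewer bits whenever neighboring pixels are similar. The block below is a deliberately simplified toy model (the block size, bit packing, and fallback rule are assumptions for illustration, not AMD’s actual hardware encoding):

```python
# Toy illustration of delta color compression: one anchor value per block
# plus small signed deltas for the remaining pixels. Not AMD's real format.
def encode_block(pixels):
    """Encode a block of 8-bit channel values as (anchor, deltas, bits_per_delta)."""
    anchor = pixels[0]
    deltas = [p - anchor for p in pixels[1:]]
    max_mag = max((abs(d) for d in deltas), default=0)
    bits_per_delta = max(max_mag.bit_length() + 1, 1)  # +1 bit for the sign
    return anchor, deltas, bits_per_delta

# A smooth gradient (typical of real frames) compresses well...
smooth = [120, 121, 121, 122, 123, 123, 124, 125]
# ...while noisy data does not, so real hardware would fall back to raw storage.
noisy = [12, 240, 97, 3, 188, 55, 230, 140]

for name, block in (("smooth", smooth), ("noisy", noisy)):
    anchor, deltas, bits = encode_block(block)
    compressed_bits = 8 + bits * len(deltas)  # 8-bit anchor + packed deltas
    print(f"{name:6s}: {len(block) * 8} raw bits -> {compressed_bits} compressed bits "
          f"({bits} bits per delta)")
```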

Radeon R9 285 Voltages
Saph. 285 DXOC Load:  1.15V
Saph. 285 DXOC Idle:  0.9V
AMD R9 280 Load:      1.1V

Starting with voltages, we want to quickly remind everyone that as of GCN 1.1 AMD no longer provides a way of easily reading a GPU’s desired VID, and instead we get the real voltage as reported through the card’s sensors. In this case we’re taking our voltages from LuxMark 2.0, which offers a consistent workload that is strenuous enough to max out the GPU, but light enough that virtually every GPU should be able to boost to its maximum turbo bin. In any case these aren’t going to be the maximum voltages for any given card, but they should be close.

For our Sapphire R9 285 Dual-X OC, we find that our card stabilizes at 1.15V under load and idles at 0.9V. Compared to our R9 280 these appear to be higher voltages, but it must be noted that the 280 is reporting its VID rather than its actual voltage, so the two readings aren’t directly comparable.
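
For readers who want to poke at these sensors themselves, the same “real voltage from the card’s sensors” idea can be reproduced on a modern Linux system, where the amdgpu driver exposes voltage, temperature, and board power through the standard hwmon interface. This is a minimal sketch assuming such a setup (sysfs paths and attribute names like in0_input vary by driver and kernel version), not the Windows-based tooling used for this review:

```python
# Illustrative only: reads GPU sensors the way a Linux monitoring tool might,
# via the amdgpu driver's hwmon interface. Paths and attribute names
# (in0_input, temp1_input, power1_average) are assumptions about a typical
# setup, not the tooling used in this review.
import glob

def read_sensor(path):
    """Return the integer value of a hwmon attribute, or None if unavailable."""
    try:
        with open(path) as f:
            return int(f.read().strip())
    except (OSError, ValueError):
        return None

for hwmon in glob.glob("/sys/class/drm/card*/device/hwmon/hwmon*"):
    mv = read_sensor(f"{hwmon}/in0_input")        # GPU core voltage, millivolts
    mc = read_sensor(f"{hwmon}/temp1_input")      # GPU temperature, millidegrees C
    uw = read_sensor(f"{hwmon}/power1_average")   # board power, microwatts
    print(hwmon)
    if mv is not None:
        print(f"  voltage: {mv / 1000:.3f} V")
    if mc is not None:
        print(f"  temperature: {mc / 1000:.1f} C")
    if uw is not None:
        print(f"  power: {uw / 1_000_000:.1f} W")
```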

Meanwhile, as GCN 1.2 cards implement the same fine-grained PowerTune support pioneered in GCN 1.1, we want to take a look at average clockspeeds as well. AMD and most of their partners advertise all of their modern GCN 1.1+ cards by their boost clockspeed, so it’s helpful to see whether these cards can maintain those clockspeeds throughout. In practice the potential for throttling is much greater in thermally constrained situations (blowers, e.g. the R9 290) than it is for open air coolers, but it is still possible to hit situations where the card throttles based on power consumption.

Radeon R9 285 Average Clockspeeds
                   Saph DXOC (Stock)   Saph DXOC (Fact. OC)
Boost Clock        918MHz              965MHz
Metro: LL          918MHz              965MHz
CoH2               918MHz              965MHz
Bioshock           918MHz              963MHz
Battlefield 4      918MHz              965MHz
Crysis 3           918MHz              965MHz
Crysis: Warhead    918MHz              965MHz
TW: Rome 2         918MHz              965MHz
Thief              918MHz              965MHz
GRID 2             918MHz              965MHz

The long and short of it is that when clocked down to AMD’s reference specifications, the R9 285 Dual-X has no trouble maintaining its 918MHz boost clock. Though it doesn’t affect the averages, we do see some very minor fluctuations in clockspeed (an errant 916MHz/917MHz here and there), which is likely due to AMD’s clockspeed governing mechanism rather than any kind of power or temperature throttling. Note that even under FurMark, our worst case (and generally unrealistic) test, the card only falls by less than 20MHz to a sustained 900MHz.
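
For reference, the per-game averages in the table above boil down to logging the GPU’s instantaneous clockspeed throughout a benchmark run and averaging the samples. A minimal sketch of that idea, assuming a Linux system where the amdgpu driver marks the active DPM state in pp_dpm_sclk (the path and polling interval are assumptions, and the review itself used its own logging tools):

```python
# Illustrative only: averages GPU clockspeed samples over a benchmark run,
# which is the general idea behind the per-game averages above. Assumes a
# Linux box with the amdgpu driver exposing pp_dpm_sclk.
import re
import time

SCLK_PATH = "/sys/class/drm/card0/device/pp_dpm_sclk"  # assumed location

def current_sclk_mhz():
    """Parse the currently active engine clock (the line marked with '*') in MHz."""
    with open(SCLK_PATH) as f:
        for line in f:
            if "*" in line:
                match = re.search(r"(\d+)\s*Mhz", line, re.IGNORECASE)
                if match:
                    return int(match.group(1))
    return None

def log_average(duration_s=60, interval_s=0.5):
    """Poll the clockspeed for duration_s seconds and return the average in MHz."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        mhz = current_sclk_mhz()
        if mhz is not None:
            samples.append(mhz)
        time.sleep(interval_s)
    return sum(samples) / len(samples) if samples else None

if __name__ == "__main__":
    print(f"average clockspeed: {log_average(duration_s=10):.0f} MHz")
```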

Otherwise, if we bring the Dual-X back to its factory overclocked speeds, we find that it has no problem maintaining 965MHz either, with the sole exception of Bioshock, which fluctuated frequently enough that it averaged 963MHz.

Unfortunately this means we have also been unable to determine the base clockspeed for these cards. Even holding back cooling and reducing the power target, the R9 285 doesn’t seem to have a GPU clockspeed floor, unlike the Hawaii based R9 290 series.

Idle Power Consumption

At this point, outside of cards that are deficient in their design in some way or another, idle power is unremarkable. Sapphire’s R9 285 Dual-X is right up there with the best of them, with the 2W savings over the R9 280 likely coming from the reduced VRAM capacity.

Load Power Consumption - Crysis 3

Moving on to load power consumption under Crysis 3, we find that power consumption has come down compared to the R9 280, but not remarkably so. Despite the much lower official TBP of 190W versus 250W for the R9 280, the actual difference at the wall (for virtually equivalent performance) is only 13W. What this tells us is that despite the PowerTune changes, the R9 285 sustains power consumption not all that far removed from the R9 280. In practice the R9 280 was unlikely to be drawing anywhere near 250W under a gaming workload, so the R9 285’s 190W limit isn’t much of a practical reduction; the remaining difference comes down to the VRAM reduction and some power efficiency gains in Tonga.

On the other hand, power consumption for the Dual-X jumps ahead when using its factory overclock. The slight increase in performance under Crysis 3 from the overclock will increase the load on the CPU, but only slightly; the rest comes from the power required to hit and sustain Sapphire’s higher clockspeeds. As a result we’re looking at power consumption near the level of an R9 280X.

Meanwhile, to make a quick GTX 760 comparison, AMD and NVIDIA are virtually tied. At 292W versus 293W, the two cards are drawing nearly identical amounts of power at the wall. However the GTX 760 ultimately has the efficiency edge, as it delivers better performance under Crysis 3 than the R9 285 does (though in other games the tables could of course turn).

Load Power Consumption - FurMark

Surprisingly, under FurMark the situation is actually reversed. Instead of the R9 285 beating the R9 280, we’re seeing it draw 10W more power despite the lower TBP. Though seemingly nonsensical, in practice this is the newer iteration of PowerTune doing a better job of homing in on the card’s 190W limit. This is a situation the coarser PowerTune implementation on the R9 280 struggles with, forcing it to back off on clockspeeds much more severely and ultimately draw less power than its limit would truly allow.
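
The behavior is easier to see with a toy model of the two governors. The sketch below is emphatically not AMD’s actual PowerTune algorithm; it just assumes power scales a bit faster than clockspeed and shows how a governor with many small DPM steps can settle right at a 190W cap while one with only a few coarse states has to settle well below it:

```python
# Toy model (not AMD's real PowerTune algorithm) of why a fine-grained power
# limiter sits closer to its cap than a coarse one. Power is modeled as
# scaling somewhat faster than clockspeed; the numbers are made up.
def settle(limit_w, clock_steps_mhz, base_power_w=240.0, base_clock_mhz=918.0):
    """Return the highest (clock, power) pair the governor can pick under the limit."""
    lowest = min(clock_steps_mhz)
    for clock in sorted(clock_steps_mhz, reverse=True):
        # Crude scaling: power falls somewhat faster than clock (voltage drops too).
        power = base_power_w * (clock / base_clock_mhz) ** 1.5
        if power <= limit_w:
            return clock, power
    return lowest, base_power_w * (lowest / base_clock_mhz) ** 1.5

# Fine-grained governor: many closely spaced clock states to choose from.
fine_steps = list(range(500, 919, 5))
# Coarse governor: only a handful of widely spaced states.
coarse_steps = [500, 650, 800, 918]

for name, steps in (("fine-grained", fine_steps), ("coarse", coarse_steps)):
    clock, power = settle(limit_w=190.0, clock_steps_mhz=steps)
    print(f"{name:12s}: settles at {clock} MHz, ~{power:.0f} W against a 190 W cap")
```

Run as-is, the fine-grained governor settles within a watt or two of the cap while the coarse one ends up roughly 45W under it, which mirrors the R9 285 versus R9 280 behavior described above.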

The end result is something of a wash. The R9 285 is not drawing significantly more or less power than the R9 280, all the while delivering similar performance. In that context, power efficiency has not meaningfully changed compared to the R9 280.

Finally, to make one more GTX 760 comparison, this illustrates that while AMD can generally beat the GTX 760’s performance, doing so comes at the cost of higher maximum power consumption. At least when faced with a worst case scenario, the R9 285 is going to be drawing about 20W more at the wall.

Idle GPU Temperature

When it comes to idle temperatures, Sapphire’s Dual-X cooler is among the best. 30C at idle is merely average in the pack only because so many other coolers are equally capable at idle.

Load GPU Temperature - Crysis 3

Earlier we mentioned that the Dual-X cooler is probably a bit overpowered for a 190W card, and here we can see why. Under Crysis 3 our card maxes out at a relatively chilly 65C, and even with the factory overclock only pushes to 70C. Sapphire’s card clearly has no problem keeping itself cool.

Load GPU Temperature - FurMark

The greater load from FurMark causes temperatures to rise a bit more, but not exceptionally so. Even under this most strenuous of tests we’re topping out at 70C with reference clockspeeds, or 72C with the factory overclock. So long as Sapphire can hit these temperatures without generating too much noise then they’re golden (or blue, as the case may be).

I would also point out at this time that while the R9 285 Dual-X runs significantly cooler than the GTX 760, we’re comparing an open air cooler to a blower. All things considered this is exactly the situation where the open air cooler will be the stronger performer, but that comes with the tradeoff of not being able to directly exhaust all of its waste heat.

Idle Noise Levels

Much like idle temperatures, idle noise levels are looking quite good for Sapphire’s Dual-X cooler. A handful of cards can drop below even the 38.5dB we see here, but at this point we’re quickly approaching the overall noise floor.

Load Noise Levels - Crysis 3

Already doing very well for itself when it comes to load temperatures, Sapphire’s R9 285 Dual-X looks even better when it comes to load noise. When we underclock it to stock speeds we measure just 45.6dB under load, quieter than any Tahiti, Hawaii, Pitcairn, or Kepler card. Only the old GTX 560 Ti (which was impressively overbuilt) can sustain load noise levels lower than 45.6dB.

Load Noise Levels - FurMark

As was the case with temperatures, FurMark also drives up the load noise levels, but not especially so. Even with this additional heat the R9 285 tops out at 48.3dB, staying comfortably under the 50dB level and trailing only the much less powerful GTX 660 and GTX 560 Ti.

Meanwhile, when looking at the R9 285 Dual-X with its factory overclock enabled, we unsurprisingly see an increase in noise from the additional heat generated by the overclock. The total penalty for the overclock is 3-4dB, which is a not-insignificant increase in noise. Sapphire really hit their sweet spot for balancing noise against performance at stock clockspeeds, so the factory overclock deviates from that somewhat. Overall these noise levels are still well within reason, but they’re now middle of the pack rather than near the head of it.
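
To put that 3-4dB penalty in rough perspective, decibels are logarithmic, so even small deltas correspond to noticeable ratios. The quick arithmetic below (ignoring A-weighting, and setting aside the usual rule of thumb that roughly 10dB reads as “twice as loud”) is just for illustration:

```python
# Rough arithmetic on what a 3-4 dB increase means. dB(SPL) is logarithmic:
# sound pressure scales by 10^(delta/20), acoustic power by 10^(delta/10).
for delta_db in (3.0, 4.0):
    pressure_ratio = 10 ** (delta_db / 20)
    power_ratio = 10 ** (delta_db / 10)
    print(f"+{delta_db:.0f} dB -> {pressure_ratio:.2f}x sound pressure, "
          f"{power_ratio:.2f}x acoustic power")
```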

Speaking of Sapphire cards, it’s interesting to compare and contrast the R9 285 with our R9 280, which is also a Sapphire card using an identical cooler. With the R9 285, Sapphire has found a better balance between temperature and noise; the R9 280 could pull off slightly better temperatures, but as a result it was always above 52dB of noise under load.

Ultimately, excluding the vendor-specific factors, our look at power, temperature, and noise tells us that much like the R9 285’s gaming performance, its power/temp/noise performance is a lateral move for AMD. Performance hasn’t significantly changed and neither has power, which really helps to distill the essence of the R9 285 down to its improved GCN 1.2 feature set; in this particular case that means features such as the much finer-grained clockspeed control offered by the latest iteration of PowerTune.

Comments

  • TiGr1982 - Thursday, September 11, 2014 - link

    BTW, is Tonga the only new GPU AMD has to offer in 2014?
    (if I'm not mistaken, the previous one from AMD, Hawaii, was released back in October 2013, almost a year ago)
    Does anybody know?
  • HisDivineOrder - Thursday, September 11, 2014 - link

    The thing is the moment I heard AMD explaining how Tonga was too new for current Mantle applications, I was like, "And there the other shoe is dropping."

    The promise of low level API is that you get low level access and the developer gets more of the burden of carrying the optimizations for the game instead of a driver team. This is great for the initial release of the game and great for the company that wants to have less of a (or no) driver team, but it's not so great for the end user who is going to wind up getting new cards and needing that Mantle version to work properly on games no longer supported by their developer.

    It's hard enough getting publishers and/or developers to work on a game a year or more after release to fix bugs that creep in and in some cases hard to get them to bother with resolution switches, aspect ratio switches, the option to turn off FXAA, the option to choose a software-based AA of your choice, or any of a thousand more doohickeys we should have by now as bog-standard. Can you imagine now relying on that developer--many of whom go completely out of business after finishing said title if they happen to work for Activision or EA--to fix all the problems?

    This is why a driver team is better working on it. Even though the driver team may be somewhat removed from the development of the game, the driver team continues to have an incentive to want to fix that game going forward, even if it's a game no longer under development at the publisher. You're going to be hard pressed to convince Bobby Kotick at Activision that it's worth it to keep updating versions of games older than six months (or a year for Call of Duty) because at a certain point, they WANT you to move on to another game. But nVidia and AMD (and I guess Intel?) want to make that game run well on next gen cards to help you move.

    This is where Mantle is flawed and where Mantle will never recover. Every time they change GCN, it's going to wind up with a similar problem. And every time they'll wind up saying, "Just switch to the DX version." If Mantle cannot be relied upon for the future, then it is Glide 2.0.

    And why even bother at all? Just stick with DirectX from the get-go, optimize for it (as nVidia has shown there is plenty of room for improvement), and stop wasting any money at all on Mantle since it's a temporary version that'll rapidly be out of date and unusable on future hardware.
  • The-Sponge - Thursday, September 11, 2014 - link

    I do not understand how they got their R9 270x temperatures, my OC'd R9 270x never even comes close to the temps they got....
  • mac2j - Friday, September 12, 2014 - link

    It's great that they've caught up with H.264 on hardware and the card otherwise looks fine. The bottom line for me, though, is that I don't see the point of buying card now without H.265 on hardware and an HDMI 2.0 port - 2 things Maxwell will bring this year. I haven't heard what AMDs timetable is there though.
  • P39Airacobra - Friday, October 17, 2014 - link

    It really irritates me that they are making these cards throttle to keep power and temps down! That is pathetic! If you can't make the thing right just don't make it! Even if it throttles .1mhz it should not be tolerated! We pay good money for this stuff and we should get what we pay for! It looks like the only AMD cards worth anything are the 270's and under. It stinks you have to go Nvidia to get more power! Because Nvidia really rapes people with their prices! But I must say the GTX 970 is priced great if it is still around $320. But AMD should have never even tried with this R9 285! First of all when you pay that much you should get more than 2GB. And another thing the card is pretty much limited to the performance of the R9 270's because of the V-Ram count! Yeah the 285 has more power than the 270's, But whats the point when you do not have enough V-Ram to take the extra power were you need a card like that to be? In other words if you are limited to 1080p anyway, Why pay the extra money when a R7 265 will handle anything at 1080p beautifully? This R9 285 is a pointless product! It is like buying a rusted out Ford Pinto with a V-8 engine! Yeah the engine is nice! But the car is a pos!
  • P39Airacobra - Friday, January 9, 2015 - link

    (QUOTE) So a 2GB card is somewhat behind the times as far as cutting edge RAM goes, but it also means that such a card only has ¼ of the RAM capacity of the current-gen consoles, which is a potential problem for playing console ports on the PC (at least without sacrificing asset quality).

    (SIGH) So now even reviewers are pretending the consoles can outperform a mid range GPU! WOW! How about telling the truth like you did before you got paid off! The only reason a mid range card has problems with console ports is because they are no longer optimized! They just basically make it run on PC and say xxxx you customers here it is! And no the 8GB on the consoles are used for everything not for only V-Ram! We are not stupid idiots that fall for anything like the idiots in Germany back in the 1930's!
