Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. With the reference 290, the card's high performance came at the cost of significant noise, so the arrival of customized cards gives board partners a chance to offer better, quieter cooling solutions than the reference 290 was capable of. In the case of Sapphire's Tri-X cooler, it has already proven to be a very capable solution on the heavily overclocked 280X Toxic, which bodes well for Hawaii based cards. Simply put, if Sapphire can match the 280X Toxic's cooling performance on the 290 Tri-X OC, they will have solved the reference 290's biggest drawback.

Radeon R9 290 Series Voltages (VDDC/GPU-Z)
Ref. 290X Boost Voltage    Ref. 290 Boost Voltage    Sapphire 290 Boost Voltage
1.11v                      1.18v                     1.18v

Looking briefly at voltages, our 290 Tri-X OC is indistinguishable from our reference 290. Sapphire is doing some degree of binning here to identify boards capable of hitting their higher factory overclocks, but it doesn't look like they're picking chips for low power usage, nor are they making any voltage adjustments to help hit those clockspeeds.

Radeon R9 290 Series Average Clockspeeds
                 Ref. 290    Sapphire 290    Ref. 290X (Quiet)
Boost Clock      947MHz      1000MHz         1000MHz
Metro: LL        947MHz      1000MHz         923MHz
CoH2             930MHz      1000MHz         970MHz
Bioshock         947MHz      1000MHz         985MHz
Battlefield 3    947MHz      1000MHz         980MHz
Crysis 3         947MHz      1000MHz         925MHz
Crysis: Warhead  947MHz      1000MHz         910MHz
TW: Rome 2       947MHz      1000MHz         907MHz
Hitman           947MHz      1000MHz         990MHz
GRID 2           947MHz      1000MHz         930MHz

Moving on to average clockspeeds, to no great surprise the 290 Tri-X OC has absolutely no problem hitting and sustaining 1GHz across all of our games. The 290 series is not seriously power limited except in the case of FurMark, so being able to sustain the GPU’s maximum boost clocks is solely a function of cooling, an easy task for the Tri-X cooler to accomplish. Company of Heroes 2 is of course the sole outlier here for the reference 290, but the 290 Tri-X OC had no problem sustaining 1GHz even though it’s based on an AMD reference board.
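
Average clockspeed tables like this one generally come down to polling the GPU's reported clock during each benchmark run and averaging the samples. The sketch below is a minimal, hypothetical example of that process; it assumes a CSV sensor log (e.g. a GPU-Z sensor export) with a clock column we've named "GPU Clock [MHz]", so the header and idle cutoff may need adjusting for your tool and version.

```python
# Minimal sketch: averaging sustained clockspeeds from a monitoring tool's CSV
# sensor log. The column name and idle cutoff are assumptions, not a spec of
# any particular tool's output format.
import csv

def average_load_clock(path, clock_col="GPU Clock [MHz]", idle_cutoff_mhz=500):
    """Average the GPU clock over samples above an idle cutoff."""
    samples = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                mhz = float(row[clock_col])
            except (KeyError, ValueError):
                continue  # skip malformed or missing entries
            if mhz > idle_cutoff_mhz:  # ignore idle/menu samples
                samples.append(mhz)
    return sum(samples) / len(samples) if samples else 0.0

# e.g. print(f"{average_load_clock('crysis3_log.csv'):.0f}MHz average")
```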

Finally, for our power/temperature/noise testing, along with our standard data we're also going to throw in our results from the various open air cooled Tahiti cards we've tested over the past two months. Since most of our reference cards are blowers, this will give us a better baseline for open air coolers, especially since some of those Tahiti cards have proven to be rather impressive.

Idle Power Consumption

Curiously, the power consumption of the 290 Tri-X OC is notably lower than the reference 290. This isn't a fluke in our results, and we don't have a solid explanation for it at this time, but for some reason Sapphire's card idles at lower power consumption than our reference card does. There is a difference in fans due to the different coolers, but based on what we've seen in the past we don't believe the fans are responsible for this.

Load Power Consumption - Crysis 3

Load Power Consumption - FurMark

As for load power consumption, we have two different scenarios going on. Under our gaming workload the reference 290 was already able to hit its maximum boost clock, so with the 290 Tri-X OC operating at a similar voltage and only a slightly higher clockspeed, we're not seeing a meaningful increase in power consumption. That holds even with the GPU's lower temperatures, which would normally offer at least some savings due to reduced leakage. What this means is that custom coolers will not be able to do anything about the 290's lesser weakness: its power consumption relative to the GTX 780.

Meanwhile for FurMark, the reference 290 would throttle here based on both thermal and power limitations, whereas the Sapphire 290 is only limited by power. As a result it’s able to maintain higher clockspeeds and hence higher power consumption levels than the reference 290. Or in other words, the reference 290 was held back by its cooler here, while the 290 Tri-X OC is held back by its board power limits.
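
To make the cooler-limited versus power-limited distinction concrete, here is a deliberately simplified toy model of a board power limit capping sustained clocks. This is not AMD's PowerTune algorithm and the wattages are hypothetical; the only assumption is that dynamic power scales roughly in proportion to clockspeed at a fixed voltage.

```python
# Toy model of a board power limit -- purely illustrative, not PowerTune.
def sustained_clock(max_boost_mhz, demanded_power_w, board_limit_w):
    """Return the clock the card can hold without exceeding its power limit."""
    if demanded_power_w <= board_limit_w:
        return max_boost_mhz                    # enough headroom: full boost
    return max_boost_mhz * board_limit_w / demanded_power_w  # throttle to fit

# Hypothetical numbers: a FurMark-style load that would want ~320W against a
# ~290W board limit gets pulled down, while a ~250W gaming load does not.
print(sustained_clock(1000, 320, 290))  # ~906 MHz, power limited
print(sustained_clock(1000, 250, 290))  # 1000 MHz, cooler is the only limit
```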

Idle GPU Temperature

Moving on to idle temperatures, Sapphire's 290 performs as we'd expect it to. AMD's blowers were somewhat hobbled here, but with an open air cooler in the mix, idle temperatures for just about every card are going to bottom out at around 30C.

Load GPU Temperature - Crysis 3

Load GPU Temperature - FurMark

With our load temperatures we get our first sign of how well Sapphire's Tri-X cooler can handle a Hawaii GPU, and so far things are looking good. Under Crysis 3 the Sapphire 290 tops out at 70C, right in the range we'd expect for an open air cooler. Meanwhile in our worst case scenario of FurMark the Sapphire card only warms up a few more degrees to 74C. Compared to AMD's reference card this is of course a huge difference, representing a 24C and 20C improvement respectively. As we saw in our review of the 290X, high temperatures aren't necessarily a problem so long as they've been planned for in the design phase, but for the reference 290 series cards they went hand-in-hand with thermal throttling. Sapphire's card will have no such problem, as it clearly has plenty of thermal headroom to work with.

Ultimately the fact that we’re in the 70s tells us that (by our metrics) Sapphire has done a good job balancing temperatures and noise on their fan curve. It means the Sapphire 290 doesn’t deliver the coolest temperatures, particularly under our gaming workloads, but it also means the cooler isn’t working harder than it really needs to (and generating more noise in the process). For a non-overclocking card there’s little benefit to having temperatures below 70C.
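
For reference, a fan curve in this context is simply the firmware's mapping from GPU temperature to fan duty, and its shape is what trades temperature against noise. The curve below is purely illustrative with made-up points; Sapphire's actual curve lives in the card's BIOS and is not published here.

```python
# Illustrative fan curve: interpolate fan duty (%) from GPU temperature (C).
# The points are invented for illustration only.
import bisect

CURVE = [(40, 20), (60, 30), (75, 45), (85, 65), (94, 100)]  # (temp C, duty %)

def fan_duty(temp_c):
    """Linearly interpolate fan duty for a given GPU temperature."""
    temps = [t for t, _ in CURVE]
    if temp_c <= temps[0]:
        return CURVE[0][1]
    if temp_c >= temps[-1]:
        return CURVE[-1][1]
    i = bisect.bisect_left(temps, temp_c)
    (t0, d0), (t1, d1) = CURVE[i - 1], CURVE[i]
    return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(70))  # a 70C load sits on the quiet part of this made-up curve
```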

Idle Noise Levels

Last but not least are of course our noise benchmarks, starting with idle noise levels. Given the reference 290's weaknesses and the 290 Tri-X OC's strengths, Sapphire's card is in a good position to resolve the 290's biggest drawback, making our noise testing by far the most interesting aspect of this review in our eyes.

Anyhow, idle noise testing starts off well enough. At 37.8dB(A) the 290 Tri-X OC is near the top of our charts, tightly clustered among other open air coolers and tying the other Tri-X card in our roundup, the 280X Toxic.

Load Noise Levels - Crysis 3

Over the last few months we've seen some very impressive open air cooled cards come through our labs, but as 2013 comes to a close it may very well be Sapphire's 290 Tri-X OC that's the most impressive of them all. Despite the high power load presented by a 290 card, and in contrast to the problems AMD's reference cooler encountered, Sapphire's 290 Tri-X OC completely sweeps the field here. We fully expected it to beat all of our blower based cards and perhaps tie some of our other open air cooled cards, but as we can see it has gone above and beyond every other card when it comes to noise levels under our Crysis 3 gaming workload.

At 41.1dB(A) the 290 Tri-X OC beats a number of Tahiti based cards, including our 7970GE and Sapphire's own 280X Toxic, and it even edges out Asus's impressive 280X DirectCU II. This not only makes Sapphire's 290 the quietest high-end card we have, but it also means Sapphire's cooler is dissipating an estimated 250W of heat while generating only a very limited amount of noise in the process. Or to put this another way, Sapphire's 290 is 16dB quieter than the reference 290, nullifying our earlier noise concerns and then some.
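
To put a 16dB gap in perspective, the decibel scale is logarithmic, so the level difference converts to a ratio of radiated sound power as 10^(ΔdB/10). This is the standard acoustics relationship rather than anything taken from our charts, and it says nothing about perceived loudness, which follows a different rule of thumb.

```python
# Convert a dB level difference into a sound power ratio: ratio = 10^(dB/10).
delta_db = 16.0                      # reference 290 vs. 290 Tri-X OC, Crysis 3
power_ratio = 10 ** (delta_db / 10)  # ~39.8
print(f"The reference cooler radiates roughly {power_ratio:.0f}x the sound power")
```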

Load Noise Levels - FurMark

Under FurMark Sapphire's extremely impressive cooling performance continues unabated. Even with the additional thermal load imposed by FurMark, our noise levels only rise to 43dB(A), which by our standards would still be considered very quiet under Crysis 3, never mind a pathological program like FurMark. With FurMark being the worst case scenario for a card's cooler, this is clear evidence that the Tri-X cooler can handle everything Hawaii can throw at it, and do so with ease.

Ultimately, at the risk of reducing an entire review down to a few numbers, these noise values are the metrics we came into this review most interested in measuring, and on that front Sapphire has completely exceeded our expectations. Just being an "average" open air cooled card would have been a significant improvement over the reference 290, but cooling a 290 this well makes the Tri-X a genuine game changer.

Comments

  • ShieTar - Tuesday, December 24, 2013 - link

    "Curiously, the [idle] power consumption of the 290 Tri-X OC is notably lower than the reference 290."

    Well, it runs about 10°C cooler, and silicon does have a negative temperature coefficient of electrical resistance. That 10°C should lead to a resistance increase of a few %, and thus to a lower current of a few %. Here's a nice article about the same phenomenon observed going from a stock 480 to a Zotac AMP! 480:

    http://www.techpowerup.com/reviews/Zotac/GeForce_G...

    The author over there was also initially very surprised. Apparently kids these days just don't pay attention in physics class anymore ...
  • EarthwormJim - Tuesday, December 24, 2013 - link

    It's mainly the leakage current which decreases as temperature decreases, which can lead to the reductions in power consumption.
  • Ryan Smith - Tuesday, December 24, 2013 - link

    I had considered leakage, but that doesn't explain such a (relatively) massive difference. Hawaii is not a leaky chip, meanwhile if we take the difference at the wall to be entirely due to the GPU (after accounting for PSU efficiency), it's hard to buy that 10C of leakage alone is increasing idle power consumption by one-third.
  • The Von Matrices - Wednesday, December 25, 2013 - link

    In your 290 review you said that the release drivers had a power leak. Could this have been fixed and account for the difference?
  • Samus - Wednesday, December 25, 2013 - link

    Quality vrms and circuitry optimizations will have an impact on power consumption, too. Lots of factors here...
  • madwolfa - Wednesday, December 25, 2013 - link

    This card is based on reference design.
  • RazberyBandit - Friday, December 27, 2013 - link

    And based does not mean an exact copy -- it means similar. Some components (caps, chokes, resistors, etc.) could be upgraded and still fill the bill for the base design. Some components could even be downgraded, yet the card would still fit the definition of "based on AMD reference design."
  • Khenglish - Wednesday, December 25, 2013 - link

    Yes power draw does decrease with temperature, but not because resistance drops. Resistance dropping has zero effect on power draw. Why? Because processors are all about pushing current to charge and discharge wire and gate capacitance. Lower resistance just means that happens faster.

    The real reason power draw drops is due to lower leakage. Leakage current is completely unnecessary and is just wasted power.

    Also an added tidbit. The reason performance increases while temperature decreases is mainly due to the wire resistance dropping, not an improvement in the transistor itself. Lower temperature decreases the number of carriers in a semiconductor but improves carrier mobility. There is a small net benefit to how much current the transistor can pass due to temperature's effect on silicon, but the main improvement is from the resistance of the copper interconnects dropping as temperature drops.
  • Totally - Wednesday, December 25, 2013 - link

    Resistance increases with temperature -> Power draw increases P=(I^2)*R.
  • ShieTar - Thursday, December 26, 2013 - link

    The current isn't stabilized generally, the voltage is: P = U^2/R.

    " Because processors are all about pushing current to charge and discharge wire and gate capacitance. Lower resistance just means that happens faster."

    Basically correct; nevertheless, capacitor charging happens asymptotically, and any IC optimised for speed will not wait for a "full" charge. The design baseline is probably to get the lowest charging required for operation at the highest qualified temperature. Since decreasing temperature will increase charging speed, as you pointed out, you will get to a higher charging ratio, and thus use more power.

    On top of that, the GPU is not exclusively transistors. There is power electronics, there are interconnects, there are caches, and who knows what else (not me). Now when the transistors pull a little more charge due to the higher temperature, and the interconnects which deliver the current have a higher resistance, then you get additional transmission losses. And that's on top of higher leakage rates.

    Of course the equation gets even more fun if you start considering the time constants of the interconnects themselves, which have gotten quite relevant since we got to 32nm structures, hence the high-K materials. Though I have honestly no clue how this contribution is linked to temperature.

    But hey, here's hoping that Ryan will go and investigate the power drop with his equipment and provide us with a full explanation. As I personally don't own a GPU which gets hot at idle (I can't force the fan below 30% by software and won't stop it by hand), I cannot test idle power behavior on my own, but I can and did repeat the FurMark test described in the link above, and I also see a power saving of about 0.5W per °C with my GTX 660. And that's based on internal power monitoring, so the mainboard/PCIe slot and the PSU should add a bit more to that:

    https://www.dropbox.com/s/javq0dg75u40357/Screensh...
