Power, Temperature, & Noise

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the GTX 590. With a 365W TDP, expect everything here to be quite extreme.

GeForce GTX 500 Series Voltage
| GTX 570 Load | GTX 590 Load | GTX 590 OC | GTX 590 Idle |
| 0.987v       | 0.912v       | 0.987v     | 0.875v       |

In order to fit two GF110 GPUs onto a single card at a reasonable TDP, NVIDIA clearly had to do a lot of binning to get chips that would run at a low enough voltage. Our card runs at 0.912v for each GPU, which is only 0.012v more than the idle voltage of the rest of the GF110 cards. We really can’t emphasize enough just how low a load voltage this is; it’s absolutely minuscule for GF110. This is also reflected in the idle voltage, which at 0.875v is 0.025v lower than the normal GF110 idle voltage, meaning the GTX 590 should also idle better than the average GTX 580/570. NVIDIA’s best and lowest-leakage chips are clearly necessary to build the GTX 590.
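As a rough illustration of why the binned voltage matters (a back-of-the-envelope sketch, not NVIDIA's figures): dynamic power in CMOS logic scales approximately with frequency times voltage squared, so even a modest voltage reduction compounds across two GPUs. The 1.037v reference below is an assumed typical GF110 load voltage, used purely for comparison:

```python
# Back-of-the-envelope sketch: dynamic power scales ~ f * V^2 (CMOS approximation).
# 0.912v is our measured GTX 590 load voltage; 1.037v is an assumed
# typical GF110 load voltage used only as a reference point.

def relative_dynamic_power(v, v_ref):
    """Dynamic power relative to a reference voltage, at fixed clocks."""
    return (v / v_ref) ** 2

v_ref, v_590 = 1.037, 0.912
ratio = relative_dynamic_power(v_590, v_ref)
print(f"Per-GPU dynamic power at 0.912v: {ratio:.0%} of the {v_ref}v figure")
```

Under this approximation the binned voltage alone trims per-GPU dynamic power by over 20%, before any clock reductions are counted.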

Before we get too far, we wanted to mention OCP again. NVIDIA’s OCP changes have once again locked out our ability to disable OCP, and this time quite possibly for good, as the generic OCP mechanism now catches even our usual fallback programs. Because our existing body of results for NVIDIA cards was collected with OCP disabled, we can’t reach parity with our existing Furmark results, thanks in large part to NVIDIA throttling Furmark far more than necessary. We’re going to go ahead and publish our results, but it’s not a level playing field.

As a result we’ve thrown in another game instead: HAWX. It’s not a very graphically complex game, but it’s actually one of the most power intensive games in our test suite, making it the best candidate for a Furmark replacement on such short notice.

At idle things don’t look too bad for the GTX 590. With NVIDIA’s voltage binning and the efficiency gains of a single card, our total power consumption is 10W lower than the GTX 570 in SLI and 12W lower than the GTX 295. However, even binned chips can’t completely erase GF110’s generally mediocre idle power consumption or its lack of a super low power mode for the slave GPU, two areas where AMD has an advantage. As a result, even with binning, the GTX 590 still draws 13W more than the 6990 at idle.

Power consumption under Crysis generally mirrors our expectations. NVIDIA’s power consumption should be similar to or higher than the 6990, and this is what we see. At 506W for the GTX 590 it’s actually only 10W more than the GTX 560 in SLI, even though performance is notably greater. Or alternatively it’s 50W under the GTX 570 in SLI. However it falls behind the 6990 by 15W here, which is compounded by the fact that the 6990 gets better performance in this game.

Meanwhile our OC results are quite a bit higher. Even though we’re still using a core voltage (0.987v) below that of any GTX 580 we have, our GTX 590 reaches GTX 580 SLI power consumption numbers. The good news is that the card can handle that much power; the bad news is that it’s not possible to match the GTX 580 SLI’s performance even at this level of power consumption.

Our first instance of HAWX has the GTX 590 once again falling behind the 6990 by about 10W. EVGA’s factory overclock adds another 11W, and our own overclock brings that up to 588W. Unlike Crysis this is still well below the GTX 580 SLI, this time only missing the 6990OC by a few watts. Also worthy of note is that our HAWX overclock power draw is 28W lower than our Crysis overclock power draw, in contrast to both the stock and EVGA clocks drawing 30-35W more with HAWX. Again, this indicates the OCP has come into play, this time in a regular game.

This is probably the best graph for illustrating just how hard OCP throttles Furmark. Whereas AMD’s PowerTune does a very good job of keeping power consumption near the true power limit on the 6990 (in this case 375W), OCP is far more aggressive. This is why the GTX 590 consumes nearly 100W less, and why Furmark’s status as a worst-case scenario test is compromised with overly aggressive OCP. Even the GTX 590 OC with its voltage bump is throttled to the point where it consumes less power than the 6990.
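The behavioral difference between the two limiters can be sketched in a few lines. This is a simplified illustration, not either vendor's actual algorithm: a PowerTune-style limiter scales clocks just enough to sit at the cap, while an aggressive OCP-style response drops to a fixed low clock state (the 50% floor here is hypothetical) the moment the limit trips:

```python
# Simplified sketch of two power-limiting strategies; the 50% OCP floor
# and the 450W demand figure are hypothetical illustrations.

POWER_LIMIT = 375.0  # watts -- the 6990's limit cited above

def powertune_style(draw_watts):
    """Scale clocks proportionally so power lands right at the cap."""
    if draw_watts <= POWER_LIMIT:
        return 1.0                        # under the cap: full clocks
    return POWER_LIMIT / draw_watts       # e.g. 450W demand -> ~83% clocks

def aggressive_ocp_style(draw_watts):
    """Drop to a fixed low clock state as soon as the limit trips."""
    return 1.0 if draw_watts <= POWER_LIMIT else 0.5

demand = 450.0  # watts the workload would draw if unthrottled
print(powertune_style(demand))      # power stays pinned near 375W
print(aggressive_ocp_style(demand)) # power falls well under the cap
```

This is why a proportional limiter keeps Furmark meaningful as a worst-case test, while a blunt clock cut leaves the card drawing far less than its stated limit.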

Dual-GPU cards generally do poorly at idle temperatures, though a center-mounted fan improves the situation somewhat, which is the biggest reason temperatures are down from the GTX 295. However, such a fan configuration doesn’t cure all ills; at 45C our idle temperature is a bit on the warm side, but nothing problematic.

Not surprisingly, the GTX 590 is near the top of our Crysis temperature chart. Although we don’t publish the individual GPU temperatures, the hotter GPU in all cases on the GTX 590 was the GPU which exhausts externally, in this case incurring the penalty of having half that vent blocked by a DVI port. As a result the GTX 590 is always going to run a bit hotter than the 6990. We’re also seeing why 0.987v is about as high as you want to go on the GTX 590 OC—it’s within 5C of the thermal throttle.

HAWX largely mirrors Crysis here. The GTX 590 ends up being warmer than the 6990, and even the 6990 OC. The 590 OC is also 2C cooler here, thanks to OCP. 90C isn’t any worse than the GTX 580 in SLI, but then that’s about as warm as we want to allow things to get.

Again with Furmark being throttled, the GTX 590 looks unnaturally good here. Temperatures are below what we see in games.

We haven’t quite decided why the GTX 590 breaks 46dB here. It’s probably the use of a fan as opposed to a blower, but it could just as well be the fact that the GTX 590 effectively exhausts in an uneven fashion due to the half-blocked vent. In any case 46.8dB is by no means loud, but this isn’t a whisper-silent card at idle.

These are the noise results collected during our Crysis temperature runs. Remember how we said NVIDIA was using the fact that they launched after AMD in order to claim that they had a quieter cooler? This is the proof. The GTX 590 simply embarrasses the 6990 here; it’s not even a contest. Make no mistake: 57.9dB is not a quiet card; we’re still looking at a dual-GPU monster, but it’s not the roaring monster that the 6990 is. On a subjective level I’d say things are even better than the numbers show—the GTX 590 is lower pitched than the 6990, which improves the perceived noise. Note that if we start overclocking + overvolting however, we largely erase the difference.

HAWX doesn’t make the GTX 590 look quite as good, but the difference is still there. The GTX 590 manages to stay just south of 60dB versus 65dB for the 6990. Perhaps the more impressive outcome however is that the GTX 590 is quieter than the GTX 580 in SLI, with the latter having the advantage of being two separate cards that can be independently cooled. We didn’t have time to grab the GTX 570 SLI or the 6870 in CrossFire, however I suspect the GTX 590 is louder than either of those. It’s also going to be louder than any single card setup (except perhaps the GTX 480)—even NVIDIA will tell you that the GTX 590 is louder than the GTX 580.
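For context on what a gap like the roughly 5dB between the two cards under HAWX means, the standard acoustics relations can be applied directly (the "+10dB is twice as loud" rule is the usual psychoacoustic rule of thumb, not a measurement of ours):

```python
# Standard acoustics relations applied to a dB gap between two cards.

def sound_power_ratio(delta_db):
    """Ratio of acoustic power for a given dB difference."""
    return 10 ** (delta_db / 10)

def perceived_loudness_ratio(delta_db):
    """Rule of thumb: +10dB is roughly twice as loud to the ear."""
    return 2 ** (delta_db / 10)

gap = 5.0  # approximate HAWX gap: ~60dB (GTX 590) vs 65dB (6990)
print(f"{sound_power_ratio(gap):.1f}x the acoustic power")
print(f"{perceived_loudness_ratio(gap):.2f}x as loud subjectively")
```

So a 5dB gap is over three times the acoustic power, which is why the difference between the two cards is immediately audible even though both are loud in absolute terms.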

Finally we have our Furmark noise values. With the extreme throttling, everything is different for the GTX 590, rendering these results of little use.

Overall our power, temperature, and noise data proved to be quite interesting. On the one hand the GTX 590’s power consumption is a bit higher and temperatures a bit hotter than the comparable 6990. However the noise results are nothing short of remarkable—if NVIDIA can dissipate 350W+ of heat while at the same time making 5-7dB less noise, then it starts to become clear that AMD’s design has a serious weakness. The ultimate question is what did NVIDIA do right that AMD did not?


