Power, Temperature, & Noise

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the GTX 590. With a 365W TDP, expect everything here to be quite extreme.

GeForce GTX 500 Series Voltage
  GTX 570 Load:  0.987v
  GTX 590 Load:  0.912v
  GTX 590 OC:    0.987v
  GTX 590 Idle:  0.875v

In order to fit two GF110 GPUs onto a single card at a reasonable TDP, NVIDIA clearly had to do a lot of binning to get chips that would run at a low enough voltage. Our card runs at 0.912v for each GPU, which is only 0.012v more than the idle voltage of the rest of the GF110 cards. We really can't emphasize enough just how low this load voltage is; it's absolutely minuscule for GF110. This is also reflected in the idle voltage, which at 0.875v is 0.025v lower than the normal GF110 idle voltage, meaning the GTX 590 should also idle better than the average GTX 580/570. NVIDIA's best and lowest-leakage chips are clearly necessary in order to build the GTX 590.
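To put that binning in perspective, here's a quick back-of-the-envelope sketch of why such a low load voltage matters, assuming dynamic power scales roughly with frequency × voltage². The GTX 580 reference figures (a ~1.0v load voltage and a ~200W GPU-only share of its 244W TDP) are our own assumptions for illustration, not NVIDIA's numbers.

```python
# Back-of-the-envelope estimate: why binned, low-voltage GF110s are needed
# to fit two GPUs under a 365W board TDP. First-order model only:
# dynamic power ~ frequency * voltage^2. The GTX 580 reference figures are
# assumptions for illustration, not measured or vendor-supplied numbers.

GTX580_CLOCK_MHZ = 772      # GTX 580 core clock
GTX580_VOLTAGE   = 1.000    # assumed typical GTX 580 load voltage
GTX580_GPU_WATTS = 200.0    # assumed GPU-only share of the 244W board TDP

GTX590_CLOCK_MHZ = 607      # GTX 590 core clock
GTX590_VOLTAGE   = 0.912    # load voltage measured on our card

def scaled_gpu_power(base_watts, base_mhz, base_volts, new_mhz, new_volts):
    """Scale dynamic power by (f_new / f_base) * (V_new / V_base)^2."""
    return base_watts * (new_mhz / base_mhz) * (new_volts / base_volts) ** 2

per_gpu = scaled_gpu_power(GTX580_GPU_WATTS, GTX580_CLOCK_MHZ, GTX580_VOLTAGE,
                           GTX590_CLOCK_MHZ, GTX590_VOLTAGE)
print(f"Estimated power per GF110 at 607MHz/0.912v: {per_gpu:.0f}W")
print(f"Two GPUs: {2 * per_gpu:.0f}W, leaving headroom for memory, VRMs, and the fan")
print("within the 365W board TDP (ignoring static/leakage power).")
```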

Before we get too far, we want to mention OCP again. With NVIDIA's OCP changes they have once again locked out our ability to disable OCP, and this time quite possibly for good, as the generic OCP mechanism now even catches our usual fallback programs. Because our existing body of work for NVIDIA cards was collected with OCP disabled, we can't reach parity with our existing Furmark results, thanks in large part to NVIDIA throttling Furmark far more than necessary. We're going to go ahead and publish our results, but keep in mind that they're not on the same playing field.

As a result we’ve thrown in another game instead: HAWX. It’s not a very graphically complex game, but it’s actually one of the most power intensive games in our test suite, making it the best candidate for a Furmark replacement on such short notice.

At idle things don't look too bad for the GTX 590. With NVIDIA's voltage binning and the efficiency gains of a single card, our total power consumption is 10W lower than the GTX 570 in SLI and 12W lower than the GTX 295. However even binned chips can't completely erase GF110's generally mediocre idle power consumption or the lack of a super low power mode for the slave GPU, two areas where AMD has an advantage. As a result, even with binning the GTX 590 still draws 13W more than the 6990 at idle.

Power consumption under Crysis generally mirrors our expectations: NVIDIA's power consumption should be similar to or higher than the 6990's, and this is what we see. At 506W the GTX 590 actually draws only 10W more than the GTX 560 in SLI, even though its performance is notably greater; alternatively, it comes in 50W under the GTX 570 in SLI. However it draws 15W more than the 6990 here, which is compounded by the fact that the 6990 gets better performance in this game.

Meanwhile our OC results are quite a bit higher. Even though we're still using a core voltage below that of any GTX 580 we have, at 0.987v our GTX 590 reaches GTX 580 SLI power consumption numbers. The good news is that the card can handle that much power; the bad news is that even at this level of power consumption it can't match the GTX 580 SLI's performance.

Our first HAWX run has the GTX 590 once again drawing about 10W more than the 6990. EVGA's factory overclock adds another 11W, and our own overclock brings that up to 588W. Unlike Crysis this is still well below the GTX 580 SLI, this time missing the 6990 OC by only a few watts. Also worth noting is that our overclocked HAWX power draw is 28W lower than our overclocked Crysis power draw, whereas at the stock and EVGA clocks HAWX draws 30-35W more than Crysis. Again, this indicates that OCP has come into play, this time in a regular game.

This is probably the best graph for illustrating just how hard OCP throttles Furmark. Whereas AMD’s PowerTune does a very good job of keeping power consumption near the true power limit on the 6990 (in this case 375W), OCP is far more aggressive. This is why the GTX 590 consumes nearly 100W less, and why Furmark’s status as a worst-case scenario test is compromised with overly aggressive OCP. Even the GTX 590 OC with its voltage bump is throttled to the point where it consumes less power than the 6990.
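To illustrate the behavioral difference we're describing, here's a minimal sketch contrasting a PowerTune-style limiter, which trims clocks just enough to hold power near a 375W cap, with an OCP-style response that clamps clocks hard once a known power virus is detected. The control logic and figures below are simplified assumptions of ours; neither AMD nor NVIDIA documents their actual algorithms.

```python
# Illustrative-only simulation of two power-limiting philosophies.
# Neither loop reflects AMD's or NVIDIA's actual firmware; the point is
# the behavioral difference: fine-grained capping vs. heavy throttling.

POWER_CAP_W = 375.0          # board power limit (the 6990's limit in this case)
UNTHROTTLED_DRAW_W = 450.0   # assumed draw Furmark would cause with no limiter

def powertune_style(draw_w, cap_w, steps=20):
    """Trim the clock multiplier a little each step until draw sits near the cap."""
    clock_scale = 1.0
    for _ in range(steps):
        if draw_w * clock_scale > cap_w:
            clock_scale *= 0.98          # small, repeated adjustments
        else:
            break
    return clock_scale, draw_w * clock_scale

def ocp_style(draw_w, virus_detected=True):
    """Clamp clocks hard as soon as a known power virus is detected."""
    clock_scale = 0.6 if virus_detected else 1.0   # assumed heavy-handed clamp
    return clock_scale, draw_w * clock_scale

pt_scale, pt_power = powertune_style(UNTHROTTLED_DRAW_W, POWER_CAP_W)
ocp_scale, ocp_power = ocp_style(UNTHROTTLED_DRAW_W)

print(f"PowerTune-style: clocks at {pt_scale:.0%}, power ~{pt_power:.0f}W (near the cap)")
print(f"OCP-style:       clocks at {ocp_scale:.0%}, power ~{ocp_power:.0f}W (well under the cap)")
```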

Dual-GPU cards generally do poorly at idle temperatures, though a center-mounted fan improves the situation somewhat, which is the biggest reason temperatures are down from the GTX 295. Such a fan configuration doesn't cure all ills, however; at 45C idle we're a bit on the warm side, but it's nothing problematic.

Not surprisingly, the GTX 590 is near the top of our Crysis temperature chart. Although we don’t publish the individual GPU temperatures, the hotter GPU in all cases on the GTX 590 was the GPU which exhausts externally, in this case incurring the penalty of having half that vent blocked by a DVI port. As a result the GTX 590 is always going to run a bit hotter than the 6990. We’re also seeing why 0.987v is about as high as you want to go on the GTX 590 OC—it’s within 5C of the thermal throttle.

HAWX largely mirrors Crysis here. The GTX 590 ends up being warmer than the 6990, and even the 6990 OC. The 590 OC is also 2C cooler here, thanks to OCP. 90C isn’t any worse than the GTX 580 in SLI, but then that’s about as warm as we want to allow things to get.

Again with Furmark being throttled, the GTX 590 looks unnaturally good here. Temperatures are below what we see in games.

We haven't quite figured out why the GTX 590 breaks 46dB here. It's probably the use of a fan as opposed to a blower, but it could just as well be the fact that the GTX 590 effectively exhausts in an uneven fashion due to the half-blocked vent. In any case 46.8dB is by no means loud, but this isn't a whisper-silent card at idle.

These are the noise results collected during our Crysis temperature runs. Remember how we said NVIDIA was using the fact that they launched after AMD to claim they had the quieter cooler? This is the proof. The GTX 590 simply embarrasses the 6990 here; it's not even a contest. Make no mistake: at 57.9dB this is not a quiet card; we're still looking at a dual-GPU monster, but it's not the roaring monster that the 6990 is. On a subjective level I'd say things are even better than the numbers show: the GTX 590 is lower pitched than the 6990, which improves the perceived noise. Note, however, that once we start overclocking and overvolting, we largely erase the difference.

HAWX doesn't make the GTX 590 look quite as good, but the difference is still there. The GTX 590 manages to stay just south of 60dB, versus 65dB for the 6990. Perhaps the more impressive outcome, however, is that the GTX 590 is quieter than the GTX 580 in SLI, even though the latter has the advantage of being two separate cards that can be cooled independently. We didn't have time to grab the GTX 570 SLI or the 6870 in CrossFire, but I suspect the GTX 590 is louder than either of those. It's also going to be louder than any single-card setup (except perhaps the GTX 480); even NVIDIA will tell you that the GTX 590 is louder than the GTX 580.

Finally we have our Furmark noise values. With extreme throttling, everything is different for the GTX 590, leaving these results with little practical usefulness.

Overall our power, temperature, and noise data proved to be quite interesting. On the one hand, the GTX 590's power consumption is a bit higher and its temperatures a bit warmer than the comparable 6990's. On the other hand, the noise results are nothing short of remarkable: if NVIDIA can dissipate 350W+ of heat while making 5-7dB less noise, then it starts to become clear that AMD's design has a serious weakness. The ultimate question is what NVIDIA did right that AMD did not.
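As a rough guide to what that 5-7dB gap means, sound power follows the standard decibel relationship (a ratio of 10^(ΔdB/10)), so the quick calculation below, using the range from our charts, works out to the 6990 emitting roughly 3-5 times the acoustic power of the GTX 590. Perceived loudness is another matter, as it also depends on pitch, as noted earlier.

```python
# Convert a dB(A) gap into a sound-power ratio using the standard decibel
# relationship: ratio = 10 ** (difference_in_dB / 10). This says nothing about
# perceived loudness, which also depends on pitch, as noted above.

def sound_power_ratio(db_difference):
    """How many times more acoustic power the louder source emits."""
    return 10 ** (db_difference / 10)

# The 5-7dB range from our conclusion; the ~5dB case matches the HAWX gap
# (just under 60dB for the GTX 590 versus 65dB for the 6990).
for gap in (5.0, 6.0, 7.0):
    print(f"A {gap:.0f}dB gap means the louder card emits "
          f"~{sound_power_ratio(gap):.1f}x the acoustic power")
```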

Comments

  • valenti - Thursday, March 24, 2011 - link

    Ryan, I commented last week on the 550 review. Just to echo that comment here: how are you getting the "nodes per day" numbers? Have you considered switching to a points per day metric? Very few people can explain what nodes per day are, and they aren't a very good measure for real world folding performance.

    (also, it seems like you should double the number for this review, since I'm guessing it was just ignoring the second GPU)
  • Ryan Smith - Thursday, March 24, 2011 - link

    Last year NVIDIA worked with the F@H group to provide a special version of the client for benchmark purposes. Nodes per day is how the client reports its results. Since points are arbitrary based on how the F@H group is scoring things, I can't really make a conversion.
  • poohbear - Thursday, March 24, 2011 - link

    Good to see that a $700 card finally has a decent cooler! Why would somebody spend $700 and then have to go spend another $40 for an aftermarket cooler??? NVIDIA & AMD really need to just charge $750 and have an ultra quiet card; the people in this price range aren't gonna squabble over an extra $50, for Pete's sake!!!! It makes no sense that they skimp on the cooler at this price range! This is the top of the line, where money isn't the issue!
  • Guspaz - Thursday, March 24, 2011 - link

    Let's get this straight, nVidia. Slapping two of your existing GPUs together does not make this a "next-generation card". Saying that you've been working on it for two years is also misleading; I doubt it took two years just to lay out the PCB to get two GPUs on a single board.

    SLI and Crossfire still feel like kludges. Take Crysis 2 for example. The game comes out, and I try to play it on my 295. It runs, but only on one GPU. So I go looking online; it turns out that there's an SLI profile update for the game, but only for the latest beta drivers. If you install those drivers *and* the profile update, you'll get the speed boost, but also various graphical corruption issues involving flickering of certain types of effects (that seem universal rather than isolated).

    After two goes at SLI (first dual 285s, next a 295), I've come to the conclusion that SLI is just not worth the headache. You'll end up dealing with constant compatibility issues.
  • strikeback03 - Thursday, March 24, 2011 - link

    And that is why people still buy the 6970/580, rather than having 2 cheaper cards in SLI like so many recommend.
  • JarredWalton - Thursday, March 24, 2011 - link

    For the record, I've had three goes at CrossFire (2 x 3870, 4870X2, and now 2 x 5850). I'm equally disappointed with day-of-release gaming results. But, if you stick to titles that are 2-3 months old, it's a lot better. (Yeah, spend $600 on GPUs just so you can wait two months after a game release before buying....)
  • Guspaz - Friday, March 25, 2011 - link

    I don't know about that, the original Crysis still has a lot of issues with SLI.
  • Nentor - Thursday, March 24, 2011 - link

    "For the GTX 590 launch, NVIDIA once again sampled partner cards rather than sampling reference cards directly to the press. Even with this, all of the cards launching today are more-or-less reference with a few cosmetic changes, so everything we’re describing here applies to all other GTX 590 cards unless otherwise noted.

    With that out of the way, the card we were sampled is the EVGA GeForce GTX 590 Classified, a premium GTX 590 offering from EVGA. The important difference from the reference GTX 590 is that GTX 590 Classified ships at slightly higher clocks—630/864 vs. 607/853.5—and comes with a premium package, which we will get into later. The GTX 590 Classified also commands a premium price of $729."

    Are we calling overclocked cards "more-or-less reference" cards now? That's a nice way to put it, I'll use it the next time I get stopped by a police officer. Sir, I was going more or less 100mph.

    Reference is ONE THING. It is the basis and does not waver. Anything that is not it is either overclocked or underclocked.
  • strikeback03 - Thursday, March 24, 2011 - link

    Bad example, as in the US at least your speedometer is only required to be accurate within 10%, meaning you can't get ticketed at less than 10% over the speed limit. This card is only overclocked by 4%. More importantly, they a) weren't sent a reference card, and b) included full tests at stock clocks. Would you rather they not review it since it isn't a reference card?
  • Nentor - Thursday, March 24, 2011 - link

    That is a good point actually, I didn't think of that.

    Maybe they should reject the card, yes, but that is not going to happen. Nvidia is just showing who is boss by sending a non-reference card. AT will have to swallow whatever Nvidia feeds them if they want to keep bringing the news.
