Power, Temperature, & Noise

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the GTX 590. With a 365W TDP, expect everything here to be quite extreme.

GeForce GTX 500 Series Voltage
GTX 570 Load    GTX 590 Load    GTX 590 OC    GTX 590 Idle
0.987v          0.912v          0.987v        0.875v

In order to fit two GF110 GPUs onto a single card at a reasonable TDP, NVIDIA clearly had to do a lot of binning to get chips that would run at a low enough voltage. Our card runs at 0.912v for each GPU, only 0.012v more than the idle voltage of the rest of the GF110 cards. We really can’t emphasize enough just how low a load voltage this is; it’s absolutely minuscule for GF110. This is also reflected in the idle voltage, which at 0.875v is 0.025v lower than the normal GF110 idle voltage, meaning the GTX 590 should also idle better than the average GTX 580/570. NVIDIA’s best and lowest-leakage chips are clearly necessary to build the GTX 590.
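The payoff from binning for low voltage follows from how CMOS dynamic power scales, roughly P ∝ C·V²·f. As a rough illustration (a sketch using the load voltages from the table above; capacitance and clocks are normalized placeholders, not NVIDIA's actual figures, and in practice the GTX 590 also runs lower clocks than the GTX 570):

```python
# Illustrative only: CMOS dynamic power scales roughly as C * V^2 * f.
# Voltages are from the table above; capacitance and frequency are
# normalized placeholders, not NVIDIA's real numbers.

def relative_dynamic_power(voltage, frequency=1.0, capacitance=1.0):
    """Relative dynamic power, P ~ C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency

gtx570_load = relative_dynamic_power(0.987)
gtx590_load = relative_dynamic_power(0.912)

# Per-GPU savings from the lower voltage bin, all else held equal
savings = 1 - gtx590_load / gtx570_load
print(f"~{savings:.1%} less dynamic power per GPU")  # ~14.6%
```

Even before any clock reduction, the lower voltage bin alone buys back roughly 15% of each GPU's dynamic power, which is why binning is so critical for a dual-GF110 card.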

Before we get too far, we wanted to mention OCP again. With NVIDIA’s OCP changes, the company has once again locked out our ability to disable OCP, and this time quite possibly for good, as the generic OCP mechanism even catches our usual fallback programs. Because our existing body of results for NVIDIA cards was collected with OCP disabled, we can’t reach parity with our existing Furmark results, thanks in large part to NVIDIA throttling Furmark far more than necessary. We’re going to go ahead and publish our results, but keep in mind that it’s not a level playing field.

As a result we’ve thrown in another game instead: HAWX. It’s not a very graphically complex game, but it’s actually one of the most power intensive games in our test suite, making it the best candidate for a Furmark replacement on such short notice.

At idle things don’t look too bad for the GTX 590. With NVIDIA’s voltage binning and efficiency gains from a single card, our total power consumption is 10W lower than the GTX 570 in SLI and 12W lower than the GTX 295. However even binned chips can’t completely erase GF110’s generally mediocre idle power consumption or lack of a super low power mode for the slave GPU, two areas where AMD has an advantage. As a result even with binning GTX 590 still draws 13W more than the 6990 at idle.

Power consumption under Crysis generally mirrors our expectations: NVIDIA’s power consumption should be similar to or higher than the 6990’s, and that is what we see. At 506W the GTX 590 draws only 10W more than the GTX 560 in SLI, even though its performance is notably greater; alternatively, it’s 50W under the GTX 570 in SLI. However it draws 15W more than the 6990 here, which is compounded by the fact that the 6990 also delivers better performance in this game.

Meanwhile our OC results are quite a bit higher. Even though we’re still using a core voltage below that of any GTX 580 we have, at 0.987v, our GTX 590 reaches GTX 580 SLI power consumption numbers. The good news is that the card can handle that much power; the bad news is that it can’t match the GTX 580 SLI’s performance even at this level of power consumption.

Our first HAWX run has the GTX 590 once again falling behind the 6990 by about 10W. EVGA’s factory overclock adds another 11W, and our own overclock brings that up to 588W. Unlike Crysis this is still well below the GTX 580 SLI, this time missing the 6990 OC by only a few watts. Also worthy of note is that our HAWX overclock power draw is 28W lower than our Crysis overclock power draw, in contrast to both the stock and EVGA clocks drawing 30-35W more under HAWX. Again, this indicates that OCP has come into play, this time in a regular game.

This is probably the best graph for illustrating just how hard OCP throttles Furmark. Whereas AMD’s PowerTune does a very good job of keeping power consumption near the true power limit on the 6990 (in this case 375W), OCP is far more aggressive. This is why the GTX 590 consumes nearly 100W less, and why Furmark’s status as a worst-case scenario test is compromised with overly aggressive OCP. Even the GTX 590 OC with its voltage bump is throttled to the point where it consumes less power than the 6990.
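Neither vendor documents its exact control loop, but the difference between the two approaches can be sketched as a feedback controller: a PowerTune-style limiter makes small, frequent clock adjustments to hold measured power right at the cap, while a cruder scheme overshoots downward. A minimal sketch, with all names and constants ours rather than AMD's or NVIDIA's:

```python
# Hypothetical sketch of a PowerTune-style power limiter. The names,
# step sizes, and clock range are illustrative placeholders, not
# AMD's or NVIDIA's actual implementation.

POWER_CAP_W = 375.0   # the 6990's power limit, per the text above
STEP_MHZ = 10.0       # small steps keep power pinned near the cap

def adjust_clock(clock_mhz, measured_power_w, min_mhz=250.0, max_mhz=830.0):
    """One iteration of the feedback loop: trim clocks over the cap,
    reclaim them when there is headroom (3% hysteresis band)."""
    if measured_power_w > POWER_CAP_W:
        clock_mhz -= STEP_MHZ
    elif measured_power_w < POWER_CAP_W * 0.97:
        clock_mhz += STEP_MHZ
    return min(max(clock_mhz, min_mhz), max_mhz)
```

A fine-grained loop like this keeps a worst-case load like Furmark sitting right at 375W; an aggressive scheme that takes big steps or assumes pessimistic power estimates lands well below the cap instead, consistent with the GTX 590 drawing nearly 100W less here.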

Dual-GPU cards generally do poorly at idle temperatures, though a center-mounted fan improves the situation somewhat, which is the biggest reason temperatures are down from the GTX 295. Such a fan configuration doesn’t cure all ills, however. As a result, at 45C idle we’re a bit on the warm side, but it’s nothing problematic.

Not surprisingly, the GTX 590 is near the top of our Crysis temperature chart. Although we don’t publish the individual GPU temperatures, the hotter GPU in all cases on the GTX 590 was the GPU which exhausts externally, in this case incurring the penalty of having half that vent blocked by a DVI port. As a result the GTX 590 is always going to run a bit hotter than the 6990. We’re also seeing why 0.987v is about as high as you want to go on the GTX 590 OC—it’s within 5C of the thermal throttle.

HAWX largely mirrors Crysis here. The GTX 590 ends up being warmer than the 6990, and even the 6990 OC. The 590 OC is also 2C cooler here, thanks to OCP. 90C isn’t any worse than the GTX 580 in SLI, but then that’s about as warm as we want to allow things to get.

Again with Furmark being throttled, the GTX 590 looks unnaturally good here. Temperatures are below what we see in games.

We haven’t quite pinned down why the GTX 590 breaks 46dB here. It’s probably the use of a fan as opposed to a blower, but it could just as well be the fact that the GTX 590 effectively exhausts in an uneven fashion due to the half-blocked vent. In any case 46.8dB is by no means loud, but this isn’t a whisper-silent card at idle.

These are the noise results collected during our Crysis temperature runs. Remember how we said NVIDIA was using the fact that they launched after AMD in order to claim that they had a quieter cooler? This is the proof. The GTX 590 simply embarrasses the 6990 here; it’s not even a contest. Make no mistake: 57.9dB is not a quiet card; we’re still looking at a dual-GPU monster, but it’s not the roaring monster that the 6990 is. On a subjective level I’d say things are even better than the numbers show—the GTX 590 is lower pitched than the 6990, which improves the perceived noise. Note that if we start overclocking + overvolting however, we largely erase the difference.

HAWX doesn’t make the GTX 590 look quite as good, but the difference is still there. The GTX 590 manages to stay just south of 60dB versus 65dB for the 6990. Perhaps the more impressive outcome however is that the GTX 590 is quieter than the GTX 580 in SLI, with the latter having the advantage of being two separate cards that can be independently cooled. We didn’t have time to grab the GTX 570 SLI or the 6870 in CrossFire, however I suspect the GTX 590 is louder than either of those. It’s also going to be louder than any single card setup (except perhaps the GTX 480)—even NVIDIA will tell you that the GTX 590 is louder than the GTX 580.
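To put these deltas in perspective: a level difference in dB converts to a sound-pressure ratio via 10^(Δ/20), and a common rule of thumb is that perceived loudness roughly doubles per 10dB. As noted above, pitch and spectrum matter too, so this is only a first-order comparison:

```python
# Standard decibel arithmetic; the ~2x-per-10dB loudness rule is a
# rough psychoacoustic approximation, not an exact law.

def spl_ratio(delta_db):
    """Sound-pressure ratio for a level difference in dB."""
    return 10 ** (delta_db / 20)

def loudness_ratio(delta_db):
    """Approximate perceived-loudness ratio (~2x per 10dB)."""
    return 2 ** (delta_db / 10)

# GTX 590 vs. 6990 under HAWX: ~60dB vs. ~65dB
delta = 65.0 - 60.0
print(f"{spl_ratio(delta):.2f}x the sound pressure")       # ~1.78x
print(f"{loudness_ratio(delta):.2f}x as loud, perceived")  # ~1.41x
```

In other words, a 5dB gap means the 6990 is pushing nearly 80% more sound pressure and sounds roughly 40% louder, which is why the difference between the two cards is so obvious in person.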

Finally we have our Furmark noise values. With extreme throttling everything is different for the GTX 590, leaving these results of little practical use.

Overall our power, temperature, and noise data proved to be quite interesting. On the one hand the GTX 590’s power consumption is a bit higher and temperatures a bit hotter than the comparable 6990. However the noise results are nothing short of remarkable—if NVIDIA can dissipate 350W+ of heat while at the same time making 5-7dB less noise, then it starts to become clear that AMD’s design has a serious weakness. The ultimate question is what did NVIDIA do right that AMD did not?

