Power, Temperature, and Noise

Last but not least, as always, is our look at the power consumption, temperatures, and acoustics of the GTX 580. NVIDIA’s performance improvements were half of the GTX 580 story, and this is the other half.

Starting quickly with voltage: as we only have one card we can’t draw too much from what we know, but there are still some important nuggets. NVIDIA is still using multiple VIDs, so your mileage may vary. What’s clear from the start, though, is that NVIDIA’s operating voltages are higher than the GTX 480’s at both idle and load. This is the biggest hint that leakage has been seriously dealt with, as lowering voltages is a common countermeasure against leakage; that NVIDIA can afford to raise them suggests leakage is now under control. Even with these higher voltages running on a chip similar to GF100, overall power usage is still going to be lower. And on that note, while the voltages have changed the idle clocks have not; idle remains at 50.6MHz for the core.

GeForce GTX 480/580 Voltages
              Idle     Load
Ref GTX 480   0.875v   0.959v
Ref GTX 580   0.962v   1.037v
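For quick comparison, the deltas in the table work out as follows; a minimal sketch using only the reference-sample VIDs quoted above (other samples will carry different VIDs):

```python
# Voltage deltas between the reference GTX 480 and GTX 580,
# using the reference-sample VIDs from the table above.
gtx480 = {"idle": 0.875, "load": 0.959}
gtx580 = {"idle": 0.962, "load": 1.037}

for state in ("idle", "load"):
    delta = gtx580[state] - gtx480[state]
    print(f"{state}: +{delta:.3f}v")  # idle: +0.087v, load: +0.078v
```

In other words, the GTX 580 runs roughly 0.08v higher across the board on our sample.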

Beginning with idle power, we see the second-biggest sign that NVIDIA has been tweaking things specifically to combat leakage. Idle power consumption has dropped by 17W on our test system, even though the idle clocks are the same and the idle voltage is higher. NVIDIA doesn’t provide an idle power specification, but based on neighboring cards idle power consumption can’t be far off from 30-35W. Amusingly, it still ends up being higher than the 6870 CF, thanks to the combination of AMD’s smaller GPUs and the ULPS power saving mode for the slave GPU.

Looking at Crysis, we begin to see the full advantage of NVIDIA’s optimizations, and where a single GPU is more advantageous than multiple GPUs. Compared to the GTX 480, power consumption is down 10% (never mind the 15% performance improvement), and it comes in under all similar multi-GPU configurations. Interestingly the 5970 still draws less power here, a reminder that we’re still looking at cards near the limits of the PCIe specification.

As for FurMark, NVIDIA’s power throttling has forced us to get a bit creative. FurMark is throttled to the point where the GTX 580 registers 360W, thanks to a roughly 40% reduction in performance under FurMark. As a result, for the GTX 580 we’ve swapped out FurMark for another program that generates a comparable load, Program X. At this point we’re going to decline to name the program; should NVIDIA throttle it too, we may be hard-pressed to determine if and when that happened.

In any case, under FurMark & X we can see that once again NVIDIA’s power consumption has dropped versus the GTX 480, this time by 27W, or around 6%. NVIDIA’s worst-case scenario has notably improved, and in the process the GTX 580 is back under the Radeon HD 5970 in terms of power consumption. Still, while NVIDIA has definitely improved power consumption, the GTX 580 remains a large, power-hungry GPU.
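As a sanity check on those numbers, a 27W drop that equals roughly 6% implies a GTX 480 baseline of about 450W at the wall for our test system. A back-of-envelope sketch; the wattages below are derived from the deltas stated above, not separate measurements:

```python
# Back-of-envelope check: a 27W drop described as "around 6%"
# implies the GTX 480 system-level baseline under FurMark.
drop_watts = 27
drop_fraction = 0.06  # "around 6%" per the text

implied_gtx480_watts = drop_watts / drop_fraction
implied_gtx580_watts = implied_gtx480_watts - drop_watts

print(f"implied GTX 480 load: ~{implied_gtx480_watts:.0f}W")  # ~450W
print(f"implied GTX 580 load: ~{implied_gtx580_watts:.0f}W")  # ~423W
```

Note these are total system draws at the wall, not the card’s own board power.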

With NVIDIA’s improvements in cooling and in idle power consumption, no result is more dramatic than idle GPU temperatures. The GTX 580 isn’t just cooler, it’s cool, period. 37C is one of the best results out of any of our midrange and high-end GPUs, and a massive departure from the GTX 480, which was at least warm all the time. As we’ll see, however, this kind of idle temperature does come at a small price.

The story under load is much the same as at idle: compared to the GTX 480, the GTX 580’s temperatures have dropped dramatically. At 79C it’s in the middle of the pack, beating a number of single- and multi-GPU setups, and really only losing to mainstream-class GPUs and the 6870 CF. While we’ve always worried about the GTX 480 at its load temperatures, the GTX 580 leaves us with no such concerns.

Meanwhile, under FurMark and Program X the gap has closed, though the GTX 580 remains in the middle of the pack. 87C is certainly toasty, but it’s still well below the thermal threshold and below the point where we’d be worried. Interestingly, however, the GTX 580 is actually a bit closer to its thermal threshold than the GTX 480 was: NVIDIA rated the 480 for 105C, while the 580 is rated for 97C. We’d like to say this vindicates our concerns about the GTX 480’s temperatures, but it’s more likely a result of the transistors NVIDIA is using.

It’s also worth noting that NVIDIA seems to have done away with the delayed fan ramp-up found on the GTX 480. Fan ramping on the GTX 580 is, as near as we can tell, much more traditional, with the fan immediately spinning up as temperatures rise. For the purposes of our tests, this keeps temperatures from spiking as badly.

Remember where we said there was a small price to pay for such low idle temperatures? This is it. At 44.4dB, the GTX 580 is ever so slightly (and we do mean slightly) louder than the GTX 480; it also ends up being a bit louder than the 5970 or the 6870 CF. 44.4dB is by no means loud, but if you want a card that’s whisper-silent at idle, the GTX 580 isn’t going to deliver.

And last but not least is load noise. Between their improvements to power consumption and to cooling, NVIDIA put a lot of effort into reducing the amount of noise the GTX 580 generates. Where the GTX 480 set new records for a single-GPU card, the GTX 580 is quieter than the GTX 285, the GTX 470, and even the Radeon HD 5870. In fact it’s only a dB off of the 5850, a card we’d call, under most circumstances, the epitome of balance between performance and noise. Graphs alone cannot demonstrate just how much of a difference there is between the GTX 480 and GTX 580: the GTX 580 is not whisper quiet, but at no point in our testing did it ever get “loud”. It’s a truly remarkable difference, albeit one that comes at the price of pointing out just how lousy the GTX 480 was.

Often the mark of a good card is a balance between power, temperature, and noise, and NVIDIA seems to have finally found their mark. As the GTX 580 is a high-end card the power consumption is still high, but it’s no longer the abnormality that was the GTX 480. Meanwhile GPU temperatures have left our self-proclaimed danger zone, and at the same time the GTX 580 has become a much quieter card under load than the GTX 480. If you had asked us what NVIDIA needed to work on after the GTX 480, we would have said noise, temperature, and power consumption, in that order; the GTX 580 delivers on just what we would have wanted.

159 Comments

  • dtham - Tuesday, November 09, 2010

    Anyone know if aftermarket cooling for the GTX 480 will work on the GTX 580? It would be great to be able to reuse a waterblock from a GTX 480 on the new 580s. Looking at the picture, the layout looks similar.
  • mac2j - Tuesday, November 09, 2010

    In Europe the GTX 580 launched at 399 Euros, and in response ATI has lowered the 5970 to 389 Euros (if you believe the rumors).

    This can only bode well for holiday prices of the 6970 vs 580.
  • samspqr - Tuesday, November 09, 2010

    it's already listed and in stock at alternate.de, but the cheapest one is 480eur

    the only 5970 still in stock there is 540eur
  • yzkbug - Tuesday, November 09, 2010

    I moved all my gaming to the living room, on a big screen TV and HTPC (a next-next-gen console in a sense). But Optimus would be the only way to use this card in an HTPC.
  • slatr - Tuesday, November 09, 2010

    Ryan,

    Would you be able to test with Octane Renderer?

    I am interested to see if Octane gets throttled.

    Thanks
  • Andyburgos - Tuesday, November 09, 2010

    Ryan:

    I hold you in the most absolute respect. Actually, in my first post a while ago I praised your work, and I think you're quite didactic and fun to read. On that note, thanks for the review.

    However, I need to ask you: W.T.F. is wrong with you? Aren't you pissed off by the fact that the GTX 480 was a half-baked chip (wouldn't say the same about the GTX 460), and now that we get the real version they decided to call it the 580? Why isn't there a single complaint about that in the article?

    If, as I understand, you think that the new power / temperature / noise / performance balance has improved dramatically from the 480, I think you are smart enough to see that it's because the 480 was a very, very unpolished chip. This renaming takes us for stupid, and is even worse than what AMD did.

    /rant

    AT & staff, I think you have a duty to call out lousy tactics such as the Barts being renamed 68x0, or the 8800 becoming the 9800 and then the GTS 250, as you always did. You have failed so badly to do that here that you look really biased. For me, a loyal Argentinian reader since 2001, that should be impossible, but with the GTX 460 and this you are accomplishing it.

    +1 for this card deserving an indifferent thumbs up, as Ryan graciously said, not for the card itself (which is great) but for NVIDIA's tactics and the half-baked 480 they gave us. Remember the FX 5800 (as bad or worse than the 480) becoming the 5900... gosh, I thought those days were over. Maybe that's why I stick with my 7300 GT, haha.

    I respectfully dissent from your opinion, but thanks for the great review.

    Best regards,
    Andy
  • ViRGE - Tuesday, November 09, 2010

    Huh, are we reading the same article? See page 4.
  • chizow - Tuesday, November 09, 2010

    I'd have to agree; he probably didn't read the article thoroughly. Besides explicitly saying this is the second-worst excuse for a new naming denomination, Ryan takes jabs at the 480 throughout, repeatedly hinting that the 580 is what Fermi should've been to begin with.

    Sounds like just another short-sighted rant about renaming that conveniently forgets all the renaming ATI has done in the past. See how many times ATI renamed their R200 and R300 designs; even R600 and RV670 fall into the exact same vein as the G92 renaming he bemoans.
  • Haydyn323 - Tuesday, November 09, 2010

    Nvidia has done nothing different from ATI as far as naming their new cards goes. They simply jumped on the naming bandwagon for marketing and competitive purposes, since ATI had already done so... at least the 580 is actually faster than the 480. ATI releasing a 6870 that is far inferior to the 5870 is worse in my mind.

    It should indeed have been a 485, but since ATI calls their new card a 6870 when it really should be a 5860 or something, it only seems fair.
  • spigzone - Tuesday, November 09, 2010

    Any 'bandwagon' here belongs to Nvidia.
