Power, Temperature, and Noise

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the GTX 580. NVIDIA’s performance improvements were half of the GTX 580 story, and this is the other half.

Starting quickly with voltage, as we only have one card we can’t draw too much from what we know, but there are still some important nuggets. NVIDIA is still using multiple VIDs, so your mileage may vary. What’s clear from the start though is that NVIDIA’s operating voltages are higher than the GTX 480’s at both idle and load. This is the biggest hint that leakage has been seriously dealt with: lowering voltages is a common step to combat leakage, so the fact that NVIDIA can afford to raise them and still come out ahead on power speaks well of the new transistors. Even with these higher voltages running on a chip similar to GF100, overall power usage is still going to be lower. And on that note, while the voltages have changed the idle clocks have not; the core still idles at 50.6MHz.
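As a rough aside, a standard first-order CMOS power model (our illustration, not anything NVIDIA has published) shows why this is telling: dynamic power rises with the square of voltage, so if total power fell even as voltages rose, the leakage term must have shrunk considerably.

```latex
% First-order CMOS power model (illustrative, not NVIDIA data):
P_{\text{total}} \approx \underbrace{C_{\text{eff}}\, V^2 f}_{\text{dynamic}}
                 + \underbrace{V\, I_{\text{leak}}}_{\text{leakage}},
\qquad I_{\text{leak}} \propto e^{-V_{th}/(n\,kT/q)}
```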

GeForce GTX 480/580 Voltages
  Ref 480 Load    Ref 480 Idle    Ref 580 Load    Ref 580 Idle
  0.959v          0.875v          1.037v          0.962v

Beginning with idle power, we’re seeing our second biggest sign that NVIDIA has been tweaking things specifically to combat leakage. Idle power consumption has dropped by 17W on our test system even though the idle clocks are the same and the idle voltage is higher. NVIDIA doesn’t provide an idle power specification, but based on neighboring cards, idle power consumption can’t be far off from 30-35W. Amusingly it still ends up being more than the 6870 CF however, thanks to the combination of AMD’s smaller GPUs and the ULPS power saving mode for the slave GPU.

Looking at Crysis, we begin to see the full advantage of NVIDIA’s optimizations and where a single GPU is more advantageous than multiple GPUs. Compared to the GTX 480, NVIDIA’s power consumption is down 10% (and that’s alongside a 15% performance improvement), and power consumption comes in under all similar multi-GPU configurations. Interestingly the 5970 still draws less power here, a reminder that we’re still looking at cards near the peak of the PCIe specifications.

As for FurMark, due to NVIDIA’s power throttling we’ve had to get a bit creative. FurMark is throttled to the point where the GTX 580 registers only 360W, the result of a roughly 40% reduction in FurMark performance. As a result, for the GTX 580 we’ve swapped out FurMark for another program that generates a comparable load, Program X. At this point we’re going to decline to name the program; if NVIDIA were to throttle it as well, we may be hard-pressed to determine if and when that happened.

In any case, under FurMark & Program X we can see that once again NVIDIA’s power consumption has dropped versus the GTX 480, this time by 27W or around 6%. NVIDIA’s worst case scenario has notably improved, and in the process the GTX 580 is back under the Radeon HD 5970 in terms of power consumption. Still, while NVIDIA has definitely improved power consumption, the GTX 580 remains a large, power-hungry GPU.

With NVIDIA’s improvements in cooling and in idle power consumption, there’s not a result more dramatic than idle GPU temperatures. The GTX 580 isn’t just cooler, it’s cool period. 37C is one of the best results out of any of our midrange and high-end GPUs, and is a massive departure from the GTX 480, which was at least warm all the time. As we’ll see however, this kind of idle temperature does come with a small price.

The story under load is much the same as idle: compared to the GTX 480 the GTX 580’s temperatures have dramatically dropped. At 79C it’s in the middle of the pack, beating a number of single and multi GPU setups, and really only losing to mainstream-class GPUs and the 6870 CF. While we’ve always worried about the GTX 480 at its load temperatures, the GTX 580 leaves us with no such concerns.

Meanwhile under FurMark and Program X, the gap has closed, though the GTX 580 remains in the middle of the pack. 87C is certainly toasty, but it’s still well below the thermal threshold and below the point where we’d be worried about it. Interestingly however, the GTX 580 is actually just a bit closer to its thermal threshold than the GTX 480 is; NVIDIA rated the 480 for 105C, while the 580 is rated for 97C. We’d like to say this vindicates our concerns about the GTX 480’s temperatures, but it’s more likely that this is a result of the transistors NVIDIA is using.

It’s also worth noting that NVIDIA seems to have done away with the delayed fan ramp-up found on the GTX 480. The fan ramping on the GTX 580 is, as near as we can tell, much more traditional, with the fan immediately ramping up with higher temperatures. For the purposes of our tests, this keeps the temperatures from spiking as badly.

Remember where we said there was a small price to pay for such low idle temperatures? This is it. At 44.4dB, the 580 is ever so slightly (and we do mean slightly) louder than the GTX 480; it also ends up being a bit louder than the 5970 or the 6870 CF. 44.4dB is not by any means loud, but if you want a card that’s whisper silent at idle, the GTX 580 isn’t going to be able to deliver.

And last but not least is load noise. Between their improvements to power consumption and to cooling, NVIDIA put a lot of effort into the amount of noise the GTX 580 generates. Where the GTX 480 set new records for a single-GPU card, the GTX 580 is quieter than the GTX 285, the GTX 470, and even the Radeon HD 5870. In fact it’s only a dB off of the 5850, a card that under most circumstances we’d call the epitome of balance between performance and noise. Graphs alone cannot demonstrate just how much of a difference there is between the GTX 480 and GTX 580 – the GTX 580 is not whisper quiet, but at no point in our testing did it ever get “loud”. It’s a truly remarkable difference, albeit one that comes at the price of pointing out just how lousy the GTX 480 was.

Often the mark of a good card is a balance between power, temperature, and noise, and NVIDIA seems to have finally found their mark. As the GTX 580 is a high-end card the power consumption is still high, but it’s no longer the abnormality that was the GTX 480. Meanwhile GPU temperatures have left our self-proclaimed danger zone, and yet at the same time the GTX 580 has become a much quieter card under load than the GTX 480. If you had asked us what NVIDIA needed to work on with the GTX 480, we would have said noise, temperature, and power consumption, in that order; the GTX 580 delivers on just what we would have wanted.

Comments

  • wtfbbqlol - Thursday, November 11, 2010 - link

    Most likely an anomaly. Just compare the GTX480 to the GTX470 minimum framerate. There's no way the GTX480 is twice as fast as the GTX470.
  • Oxford Guy - Friday, November 12, 2010 - link

    It does not look like an anomaly since at least one of the few minimum frame rate tests posted by Anandtech also showed the 480 beating the 580.

    We need to see Unigine Heaven minimum frame rates, at the bare minimum, from Anandtech, too.
  • Oxford Guy - Saturday, November 13, 2010 - link

    To put it more clearly... Anandtech only posted minimum frame rates for one test: Crysis.

    In those, we see the 480 SLI beating the 580 SLI at 1920x1200. Why is that?

    It seems to fit with the pattern of the 480 being stronger in minimum frame rates in some situations -- especially Unigine -- provided that the resolution is below 2K.

    I do hope someone will clear up this issue.
  • wtfbbqlol - Wednesday, November 10, 2010 - link

    It's really disturbing how the throttling happens without any real indication. I was really excited reading about all the improvements nVidia made to the GTX580, and then I read about this annoying "feature".

    When any piece of hardware in my PC throttles, I want to know about it. Otherwise it just adds another variable when troubleshooting performance problem.

    Is it a valid test to rename, say, crysis.exe to furmark.exe and see if throttling kicks in mid-game?
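The rename experiment proposed above could be evaluated with a simple before/after comparison of frame-time logs. A minimal sketch, assuming a FRAPS-style log of per-frame render times in milliseconds and an (arbitrary) 30% drop threshold for calling a run "throttled":

```python
# Sketch: detect name-based throttling by comparing average FPS between a run
# under the game's real exe name and a run renamed to furmark.exe.
# The log format (one frame time in ms per entry) and the 30% threshold are
# our assumptions, not anything from NVIDIA or the article.

def average_fps(frame_times_ms):
    """Average FPS from a list of per-frame render times in milliseconds."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

def throttling_suspected(baseline_ms, renamed_ms, threshold=0.30):
    """Flag the renamed run as throttled if average FPS drops by more than
    `threshold` (30% here), far beyond normal run-to-run noise."""
    base = average_fps(baseline_ms)
    renamed = average_fps(renamed_ms)
    return (base - renamed) / base > threshold
```

A 50% clock cut (the figure NVIDIA cites below) would show up as a drop far above any plausible noise floor, so a coarse threshold like this should suffice.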
  • wtfbbqlol - Wednesday, November 10, 2010 - link

    Well it looks like there is *some* official information about the current implementation of the throttling.


    Copy and paste of the message:
    "NVIDIA has implemented a new power monitoring feature on GeForce GTX 580 graphics cards. Similar to our thermal protection mechanisms that protect the GPU and system from overheating, the new power monitoring feature helps protect the graphics card and system from issues caused by excessive power draw.

    The feature works as follows:
    • Dedicated hardware circuitry on the GTX 580 graphics card performs real-time monitoring of current and voltage on each 12V rail (6-pin, 8-pin, and PCI-Express).
    • The graphics driver monitors the power levels and will dynamically adjust performance in certain stress applications such as Furmark 1.8 and OCCT if power levels exceed the card’s spec.
    • Power monitoring adjusts performance only if power specs are exceeded AND if the application is one of the stress apps we have defined in our driver to monitor such as Furmark 1.8 and OCCT.
    - Real world games will not throttle due to power monitoring.
    - When power monitoring adjusts performance, clocks inside the chip are reduced by 50%.

    Note that future drivers may update the power monitoring implementation, including the list of applications affected."
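The behavior the quote describes boils down to a two-condition check. A toy model, where the app list, the 244W TDP used as the power limit, and the 772MHz stock core clock are illustrative stand-ins rather than NVIDIA's actual driver tables:

```python
# Toy model of the GTX 580 power-monitoring logic as NVIDIA describes it:
# clocks are cut 50% only when the app is on the driver's stress-app list
# AND measured power exceeds the card's spec. Values below are illustrative.
MONITORED_APPS = {"furmark.exe", "occt.exe"}   # driver-defined list, per the quote
CARD_SPEC_WATTS = 244                          # GTX 580 TDP, used as the limit here

def adjusted_clock(app_name, measured_watts, base_clock_mhz=772):
    """Return the core clock after power monitoring is applied."""
    over_spec = measured_watts > CARD_SPEC_WATTS
    watched = app_name.lower() in MONITORED_APPS
    if watched and over_spec:
        return base_clock_mhz // 2   # "clocks inside the chip are reduced by 50%"
    return base_clock_mhz            # real games are never throttled
```

Note how a game drawing excessive power is still left alone: both conditions must hold, which is exactly why renaming an executable is an interesting probe of the mechanism.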
  • Sihastru - Wednesday, November 10, 2010 - link

    I never heard anyone from the AMD camp complaining about that "feature" with their cards and all current AMD cards have it. And what would be the purpose of renaming your Crysis exe? Do you have problems with the "Crysis" name? You think the game should be called "Furmark"?

    So this is a non issue.
  • flyck - Wednesday, November 10, 2010 - link

    The use of renaming is that nvidia uses name tags to identify whether it should throttle or not. Suppose person X creates a stressful program and you use an older driver that does not include its name tag; you could break things.
  • Gonemad - Wednesday, November 10, 2010 - link

    Big fat YES. Please do rename the executable from crysis.exe to furmark.exe, and tell us.

    Get furmark and go the other way around: rename it to Crysis.exe, but be sure to have a fire extinguisher on the premises. Caveat emptor.

    Perhaps just renaming is not enough; some checksumming may be involved. It is pretty easy to change an executable's checksum without altering the running code, though. (Source comments won't do it; they're stripped at compile time. But any string or data constant embedded in the binary will: change the data, change the checksum.)

    Or open furmark in a hex editor and change some bytes, ideally in a long sequence of zeros at the end of the file. Compilers usually pad executables out to round sizes with zeros, so it shouldn't harm the running code, but it changes the checksum without changing the byte size.

    If it works, rename it Program X.
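The padding-patch idea above can be sketched in a few lines. This is purely illustrative: it flips one byte in the trailing zero padding so the file's hash changes while its size stays the same, and makes no promise about signature checks or the PE header checksum on a real binary.

```python
# Sketch of the hex-edit trick: flip a byte inside the trailing zero padding
# of a file so its checksum/hash changes while the byte size is unchanged.
# Illustrative only; a real driver could hash something other than the file.
import hashlib

def patch_trailing_padding(data: bytes, new_byte: int = 0x41) -> bytes:
    """Replace the final byte of trailing zero padding, if any exists."""
    if not data.endswith(b"\x00"):
        raise ValueError("no trailing zero padding to patch")
    return data[:-1] + bytes([new_byte])

original = b"MZ...code...\x00\x00\x00\x00"   # stand-in for an executable
patched = patch_trailing_padding(original)
assert len(patched) == len(original)         # same size
assert hashlib.md5(patched).digest() != hashlib.md5(original).digest()
```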

  • iwodo - Wednesday, November 10, 2010 - link

    The good thing about GPUs is that they scale very well (if not linearly) with transistors. One node die shrink, double the transistor count, double the performance.

    Combined with the fact that memory is not a bottleneck (GDDR5 still has lots of headroom), we are limited by the process, not the design.
  • techcurious - Wednesday, November 10, 2010 - link

    I didn't read through ALL the comments, so maybe this was already suggested. But can't the idle sound level be reduced simply by lowering the fan speed and compromising idle temperatures a bit? I bet you could sink below 40dB if you are willing to put up with 45C instead of 37C. 45C is still an acceptable idle temp.
