Power, Temperature, and Noise

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the GTX 580. NVIDIA’s performance improvements were half of the GTX 580 story, and this is the other half.

Starting quickly with voltage: as we only have one card we can’t draw too many conclusions, but there are still some important nuggets. NVIDIA is still using multiple VIDs, so your mileage may vary. What’s clear from the start though is that NVIDIA’s operating voltages are higher than the GTX 480’s at both idle and load. This is the biggest hint that leakage has been seriously dealt with: lowering voltages is a common countermeasure against leakage, so the fact that NVIDIA can afford to raise them suggests leakage is no longer the problem it once was. Even with these higher voltages running on a chip so similar to GF100, overall power usage is still going to be lower. And on that note, while the voltages have changed the idle clocks have not; idle remains at 50.6MHz for the core.

GeForce GTX 480/580 Voltages
              Load      Idle
Ref GTX 480   0.959v    0.875v
Ref GTX 580   1.037v    0.962v

Beginning with idle power, we’re seeing our second biggest sign that NVIDIA has been tweaking things specifically to combat leakage. Idle power consumption has dropped by 17W on our test system, even though the idle clocks are the same and the idle voltage is higher. NVIDIA doesn’t provide an idle power specification, but based on neighboring cards idle power consumption can’t be far off from 30-35W. Amusingly, it still ends up being more than the 6870 CF however, thanks to the combination of AMD’s smaller GPUs and the ULPS power saving mode for the slave GPU.

Looking at Crysis, we begin to see the full advantage of NVIDIA’s optimizations, and where a single GPU is more advantageous than multiple GPUs. Compared to the GTX 480, NVIDIA’s power consumption is down 10% (never mind the 15% performance improvement), and it comes in under all similar multi-GPU configurations. Interestingly the 5970 still draws less power here, a reminder that we’re still looking at cards near the limits of the PCIe power specifications.

As for FurMark, NVIDIA’s power throttling has forced us to get a bit creative. FurMark is throttled to the point where our test system registers only 360W, thanks to a roughly 40% reduction in FurMark performance. As a result, for the GTX 580 we’ve swapped out FurMark for another program that generates a comparable load, Program X. At this point we’re going to decline to name the program, as should NVIDIA start throttling it too, we may be hard-pressed to determine if and when that happened.

In any case, under FurMark and Program X we can see that once again NVIDIA’s power consumption has dropped versus the GTX 480, this time by 27W, or around 6%. NVIDIA’s worst-case scenario has notably improved, and in the process the GTX 580 is back under the Radeon HD 5970 in terms of power consumption. That said, while NVIDIA has definitely improved power consumption, the GTX 580 is still a large, power-hungry GPU.
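The percentage math above is simple enough to sanity-check. A minimal sketch, assuming a hypothetical GTX 480 system draw of 479W at the wall (the text cites only the 27W delta and the ~6% figure, not the absolute readings):

```python
# Sanity-check of the load-power delta quoted above. The 479W baseline is
# an illustrative assumption; only the 27W improvement comes from the text.
gtx480_furmark_w = 479                       # assumed system draw, GTX 480
gtx580_furmark_w = gtx480_furmark_w - 27     # the 27W improvement cited above

delta_w = gtx480_furmark_w - gtx580_furmark_w
pct = delta_w / gtx480_furmark_w * 100

print(f"Delta: {delta_w}W ({pct:.1f}%)")     # -> Delta: 27W (5.6%)
```

With a baseline anywhere in the 450-480W range a 27W drop works out to roughly 6%, consistent with the figure above.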

With NVIDIA’s improvements in cooling and in idle power consumption, no result is more dramatic than idle GPU temperatures. The GTX 580 isn’t just cooler, it’s cool, period. 37C is one of the best results among our midrange and high-end GPUs, and a massive departure from the GTX 480, which was at least warm all the time. As we’ll see however, this kind of idle temperature does come with a small price.

The story under load is much the same as idle: compared to the GTX 480 the GTX 580’s temperatures have dramatically dropped. At 79C it’s in the middle of the pack, beating a number of single and multi GPU setups, and really only losing to mainstream-class GPUs and the 6870 CF. While we’ve always worried about the GTX 480 at its load temperatures, the GTX 580 leaves us with no such concerns.

Meanwhile under FurMark and Program X, the gap has closed, though the GTX 580 remains in the middle of the pack. 87C is certainly toasty, but it’s still well below the thermal threshold and below the point where we’d be worried about it. Interestingly however, the GTX 580 is actually just a bit closer to its thermal threshold than the GTX 480 is; NVIDIA rated the 480 for 105C, while the 580 is rated for 97C. We’d like to say this vindicates our concerns about the GTX 480’s temperatures, but it’s more likely that this is a result of the transistors NVIDIA is using.

It’s also worth noting that NVIDIA seems to have done away with the delayed fan ramp-up found on the GTX 480. The fan ramping on the GTX 580 is, as near as we can tell, much more traditional, with the fan immediately ramping up as temperatures climb. For the purposes of our tests, this keeps temperatures from spiking as badly.

Remember where we said there was a small price to pay for such low idle temperatures? This is it. At 44.4dB, the GTX 580 is ever so slightly (and we do mean slightly) louder at idle than the GTX 480; it also ends up being a bit louder than the 5970 or the 6870 CF. 44.4dB is by no means loud, but if you want a card that’s whisper silent at idle, the GTX 580 isn’t going to be able to deliver.

And last but not least is load noise. Between their improvements to power consumption and to cooling, NVIDIA put a lot of effort into the amount of noise the GTX 580 generates. Where the GTX 480 set new records for a single-GPU card, the GTX 580 is quieter than the GTX 285, the GTX 470, and even the Radeon HD 5870. In fact it’s only a dB off of the 5850, a card that under most circumstances we’d call the epitome of balance between performance and noise. Graphs alone cannot demonstrate just how much of a difference there is between the GTX 480 and GTX 580; the GTX 580 is not whisper quiet, but at no point in our testing did it ever get “loud”. It’s a truly remarkable difference, albeit one that comes at the price of pointing out just how lousy the GTX 480 was.

Often the mark of a good card is a balance between power, temperature, and noise, and NVIDIA seems to have finally found that balance. As the GTX 580 is a high-end card, power consumption is still high, but it’s no longer the abnormality that the GTX 480 was. Meanwhile GPU temperatures have left our self-proclaimed danger zone, and at the same time the GTX 580 has become a much quieter card under load than the GTX 480. If you had asked us what NVIDIA needed to work on after the GTX 480, we would have said noise, temperature, and power consumption, in that order; the GTX 580 delivers on just what we would have wanted.


  • Haydyn323 - Tuesday, November 09, 2010 - link

    Nobody seems to be taking into account the fact that the 580 is a PREMIUM level card. It is not meant to be compared to a 6870. Sure 2x 6870s can do more. This card is not, however, geared for that category of buyer.

    It is geared for the enthusiast who intends to buy 2 or 3 580s and completely dominate benchmarks and get 100+ fps in every situation. Your typical gamer will not likely buy a 580, but your insane gamer will likely buy 2 or 3 to play their 2560x1600 monitor at 60fps all the time.

    I fail to see how AMD is destroying anything here. Cost per speed AMD wins, but speed possible, Nvidia clearly wins for the time being. If anyone can come up with something faster than 3x 580s in the AMD camp feel free to post it in response here.
    Reply
  • TemplarGR - Tuesday, November 09, 2010 - link

    Do you own NVIDIA stock, or are you a fanboy? Because really, only one of the two could fail to see how AMD destroys NVIDIA. AMD's architecture is much more efficient.

    How many "insane gamers" exist, that would pay 1200 or 1800 dollars just for gpus, and adding to that an insanely expensive PSU, tower and mainboard needed to support such a thing? And play what? Console ports? On what screens? Maximum resolution is still 2560x1600 and even a single 6870 could do fine in most games in it...

    And just because there may be about 100 rich kids in the whole world with no lives who could create such a machine, does it make 580 a success?

    Do YOU intend to create such a beast? Or would you buy a mainstream NVIDIA card, just because the possibility of 3x 580s exists? Come on...
    Reply
  • Haydyn323 - Tuesday, November 09, 2010 - link

    So, the answer is no; you cannot come up with something faster. Also, as shown right here on Anandtech:

    http://www.anandtech.com/show/3987/amds-radeon-687...

    A single 6870 cannot play most modern games at anywhere near 60fps at 2560x1600. Even the 580 needs to be SLI'd to guarantee it.

    That is all.
    Reply
  • Haydyn323 - Tuesday, November 09, 2010 - link

    Oh and yes I do intend to buy a couple of them in a few months. One at first and add another later. I also love when fanboys call other fanboys, "fanboys." It doesn't get anyone anywhere. Reply
  • smookyolo - Tuesday, November 09, 2010 - link

    PC games are not simply console ports; the fact that you need a top-of-the-line PC to get even close to 60 FPS in most cases, and not even at maximum graphics settings, is proof of this.

    PC "ports" of console games have been tweaked and souped up to have much better graphics, and can take advantage of current gen hardware, instead of the ancient hardware in consoles.

    The "next gen" consoles will, of course, be worse than PCs of the time.

    And game companies will continue to alter their games so that they look better on PCs.

    It's a fact, live with it.
    Reply
  • mapesdhs - Tuesday, November 09, 2010 - link


    'How many "insane gamers" exist, that would pay 1200 or 1800 dollars just for gpus, ...'

    Actually the market for this is surprisingly strong in some areas, especially
    CA I was told. I suspect it's a bit like other components such as top-spec
    hard drives and high-end CPUs: the volumes are smaller but the margins
    are significantly higher for the seller.

    Some sellers even take a loss on low-end items just to retain the custom,
    making their money on more expensive models.

    Ian.
    Reply
  • QuagmireLXIX - Sunday, November 14, 2010 - link

    "Maximum resolution is still 2560x1600 and even a single 6870 could do fine in most games in it..."

    Multiple monitors (surround, eyefinity) resolutions get much larger.
    Reply
  • 7Enigma - Tuesday, November 09, 2010 - link

    Just to clarify your incorrect (or misleading) statement: 2 6870s in CF use significantly more power than a single 580, but also perform significantly better in most games (minimum frame rate issue noted, however). Reply
  • TemplarGR - Tuesday, November 09, 2010 - link

    True. I made a mistake on this one. Only at idle does it consume slightly less. My bad. Reply
  • cjb110 - Tuesday, November 09, 2010 - link

    "The thermal pads connecting the memory to the shroud have once again wiped out the chip markets", wow powerful adhesive that! Bet Intel's pissed. Reply
