Power, Temperature, & Noise

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the GTX 570. While NVIDIA’s optimizations in these areas were largely focused on bringing the GTX 480’s immediate successor, the GTX 580, down to more reasonable levels, the GTX 570 stands to benefit from them as well.

Starting with VIDs, we once again have only one card, so there’s not much data to draw from. Our GTX 570 sample has a lower VID than our GTX 580, but as we know NVIDIA uses a range of VIDs for each card. Based on what we saw with the GTX 470, we’d expect the average GTX 570 VID to be higher than the average GTX 580 VID, as NVIDIA is likely using ASICs with damaged ROPs/SMs for the GTX 570, along with ASICs that wouldn’t make the cut at a suitable voltage with all functional units enabled. As a result the full power advantage of lower clockspeeds and fewer functional units will not always be realized here.

GeForce GTX 500 Series Voltages
Ref 580 Load    Asus 580 Load    Ref 570 Load    Ref 570 Idle
1.037v          1.000v           1.025v          0.912v
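
To put rough numbers on that binning tradeoff, here’s a minimal back-of-the-envelope sketch using the textbook first-order dynamic power relation (P ≈ C·V²·f). The GTX 580/570 clocks and SM counts are the cards’ published specs and the 1.037v/1.025v figures come from the table above, but the 1.075v entry is a purely hypothetical high-VID sample and the model ignores leakage entirely, so treat the output as an illustration rather than a power estimate.

```python
# First-order CMOS dynamic power approximation: P_dyn ~ C * V^2 * f.
# Illustrative only; ignores leakage, memory power, VRM losses, etc.

def relative_dynamic_power(units_frac, volts, clock_mhz,
                           ref_volts=1.037, ref_clock_mhz=772):
    """Dynamic power relative to a reference GTX 580 (772MHz core, 1.037v load VID)."""
    # Fewer enabled SMs/ROPs roughly scales the switched-capacitance term.
    return units_frac * (volts / ref_volts) ** 2 * (clock_mhz / ref_clock_mhz)

# GTX 570: 15 of 16 SMs enabled, 732MHz core clock.
low_vid = relative_dynamic_power(15 / 16, 1.025, 732)   # our sample's load VID
high_vid = relative_dynamic_power(15 / 16, 1.075, 732)  # hypothetical leakier ASIC

print(f"Low-VID GTX 570 vs. GTX 580:  {low_vid:.2f}x dynamic power")
print(f"High-VID GTX 570 vs. GTX 580: {high_vid:.2f}x dynamic power")
```

In this crude model a low-VID sample comes out around 13% below the GTX 580, while the hypothetical high-VID sample claws back much of that difference, which is why an average retail GTX 570 may not see the full benefit of its lower clocks and disabled units.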

With fewer functional units than the GTX 580 and a less leaky manufacturing process than the GTX 400 series, the GTX 570 is the first GF1x0 card to bring our test rig under 170W at idle. NVIDIA’s larger chips still do worse than AMD’s smaller chips here, a point driven home by the fact that the 6850 CF draws only 1 more watt at idle. Nevertheless this is a 4W improvement over the GTX 470 and a 21W improvement over the GTX 480; the former in particular showcases the process improvements, as there are more functional units enabled and yet power consumption has dropped.

Under load we once again see NVIDIA’s design and TSMC’s process improvements in action. Even though the GTX 570 is over 20% faster than the GTX 470 in Crysis, power consumption has dropped by 5W. The comparison to the GTX 480 is even more striking: a 60W reduction in power consumption at the same level of performance, a number well in excess of the savings from removing 2 GDDR5 memory chips. It’s remarkable how a collection of minor changes can add up to so much.

On the flipside we have the newly reinstated FurMark, which we can once again use thanks to TechPowerUp’s W1zzard successfully disabling NVIDIA’s power throttling on the GTX 500 series. We’ve rebenched the GTX 580 and GTX 580 SLI along with the GTX 570, and now have fully comparable numbers once more.

It’s quite interesting that while the GTX 570 does so well under Crysis, it does worse here than the GTX 470. The official TDP difference is in the GTX 470’s favor, but not to this degree. I put more stock in the Crysis numbers than the FurMark numbers, but it’s worth noting that the GTX 570 can at times draw more power than the GTX 470. At the same time, the GTX 570 is still beating the GTX 480 by 33W, and we’ve already established that the two offer similar performance.

Of course the standouts here are AMD’s cards, which benefit from AMD’s smaller, more power-efficient chip designs. The Radeon HD 5870 draws 50W+ less than the GTX 570 in all situations, and even the 6850 CF wavers between slightly worse and slightly better than the GTX 570. With the architecture and process improvements the GTX 570’s power consumption is in line with historically similar cards, but NVIDIA still cannot top AMD on power efficiency.

Up next are idle temperatures. With nearly identical GPUs on an identical platform it should come as little surprise that the GTX 570 performs just like a GTX 580 here: 37C. The GTX 570 joins the ranks of our coolest-idling high-end cards, managing to edge out the 5870 while clobbering the SLI and CF setups.

With the same cooling apparatus as the GTX 580 and lower power consumption, we originally expected the GTX 570 to edge out the GTX 580 in all cases, but as it turns out it’s not so simple. Under Crysis the GTX 570 hits 82C, 3C warmer than the GTX 580, bringing it in line with other cards like the GTX 285. More importantly, it’s 11-12C cooler than the GTX 480/470, driving home the importance of the card’s lower power consumption along with the vapor chamber cooler. It does not end up being quite as competitive with the 5870 and SLI/CF cards however, as those setups run between 0C and 5C cooler.

As for FurMark, temperatures approach the mid-to-upper 80s, putting the GTX 570 in good company with most other high-end cards. The 5870 edges out the GTX 570 by 2C, while everything else is as warm or warmer. The GTX 480/470 in particular end up 4C warmer, and as we’ll see, a good bit louder.

Once we look at our noise data it becomes clear that NVIDIA has been tinkering with the fan parameters of the GTX 570, as the card doesn’t quite match the GTX 580 here. When it comes to idle noise for example, the GTX 570 manages to just about hit the noise floor of our rig, emitting 1.5dB less noise than the GTX 580/480/470. For all practical purposes it’s effectively a silent card at idle in our test rig.

Under load, it’s finally apparent that NVIDIA has tuned the GTX 570 to be quieter rather than cooler. In exchange for the slightly higher temperatures we saw earlier, the GTX 570 is 2dB quieter than the GTX 580, 3.6dB quieter than the GTX 470, and 6.2dB quieter than the similarly performing GTX 480. For its performance level the GTX 570 effectively tops the charts here; the next-quietest cards are the 5850 and 6870, a good pedigree to be compared to. On the other side of the chart we have the 5870 at 1.4dB louder, and the SLI/CF setups anywhere between 1.2dB and 2.4dB louder.
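
For some context on what those gaps mean, the quick sketch below converts the measured dB differences into approximate sound power ratios using the standard 10·log10 relationship; it’s only a rough guide, since perceived loudness also depends heavily on the pitch and character of the fan noise.

```python
# Measured load-noise deltas from our testing (GTX 570 relative to each card).
deltas_db = {
    "GTX 580": 2.0,
    "GTX 470": 3.6,
    "GTX 480": 6.2,
}

for card, delta in deltas_db.items():
    # A difference of X dB corresponds to a 10^(X/10) ratio in sound power.
    ratio = 10 ** (delta / 10)
    print(f"GTX 570 vs. {card}: {delta}dB quieter ~= {ratio:.1f}x less sound power")
```

The 6.2dB gap to the GTX 480, for instance, works out to a bit more than a 4x difference in raw sound power.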

Compute & Normalized Numbers Final Thoughts
Comments Locked

54 Comments

View All Comments

  • TheHolyLancer - Tuesday, December 7, 2010 - link

    Likely because when the 6870s came out they included an FTW edition of the 460 and were hammered for it? Not to mention in their own guidelines they said no OCing in launch articles.

    If they do an OC comparison, it will most likely be in a special article, possibly with retail-bought samples rather than the demos sent out for review...
  • Ryan Smith - Tuesday, December 7, 2010 - link

    As a rule of thumb I don't do overclock testing with a single card, as overclocking is too variable. I always wait until I have at least 2 cards to provide some validation to our results.
  • CurseTheSky - Tuesday, December 7, 2010 - link

    I don't understand why so many cards still cling to DVI. Seeing that Nvidia is at least including native HDMI on their recent generations of cards is nice, but why, in 2010, on an enthusiast-level graphics card, are they not pushing the envelope with newer standards?

    The fact that AMD includes DVI, HDMI, and DisplayPort natively on their newer lines of cards is probably what's going to sway my purchasing decision this holiday season. Something about having all of these small, elegant, plug-in connectors and then one massive screw-in connector just irks me.
  • Vepsa - Tuesday, December 7, 2010 - link

    It's because most people still have DVI on their desktop monitors.
  • ninjaquick - Tuesday, December 7, 2010 - link

    DVI is a very good plug, man. I don't see why you're hating on it.
  • ninjaquick - Tuesday, December 7, 2010 - link

    I meant to reply to OP.
  • DanNeely - Tuesday, December 7, 2010 - link

    Aside from Apple, almost no one uses DP. Assuming it wasn't too late in the life cycle to do so, I suspect the new GPU used in next year's 6xx series of cards will have DP support so NVIDIA can offer multi-display gaming on a single card, but only because a single DP clockgen (shared by all DP displays) is cheaper to add than 4 more legacy clockgens (one needed per VGA/DVI/HDMI display).
  • Taft12 - Tuesday, December 7, 2010 - link

    Market penetration is just a bit more important than your "elegant connector" for an input nobody's monitor has. What a poorly thought-out comment.
  • CurseTheSky - Tuesday, December 7, 2010 - link

    Market penetration starts with companies supporting the "cutting edge" of technology. DisplayPort has a number of advantages over DVI, most of which would benefit Nvidia in the long run, especially considering that they're pushing the multi-monitor / combined resolution envelope just like AMD.

    Perhaps if you only hold on to a graphics card for 12-18 months, or keep a monitor for many years before finally retiring it, the connectors your new $300 piece of technology provides won't matter to you. If you're like me and tend to keep a card for 2+ years while jumping on great monitor deals every few years as they come up, it's a different ballgame. I've had DisplayPort-capable monitors for about 2 years now.
  • Dracusis - Tuesday, December 7, 2010 - link

    I invested just under $1000 in a 30" professional 8-bit PVA LCD back in 2006 that is still better than 98% of the crappy 6-bit TN panels on the market. It has been used with 4 different video cards and supports DVI, VGA, Component HD and Composite SD. It has an ultra-wide color gamut (113%), great contrast, and a matte screen with super deep blacks and perfectly uniform backlighting, along with memory card readers and USB ports.

    DisplayPort, and every other monitor on the market for that matter, offers me absolutely nothing new or better in terms of visual quality or features.

    If you honestly see an improvement in quality from spending $300 every 18 months on new "value" displays then I feel sorry for you; you've made some poorly informed choices and wasted a lot of money.
