Power, Temperature, & Noise

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the GTX 570. While NVIDIA chose to optimize for both performance and power on the GTX 570/580, the GTX 560 Ti is almost exclusively optimized for performance. As a result NVIDIA’s official TDP has gone up by 10W, and as we’ll see in practice the difference is even greater than that.

GeForce 400/500 Series Load Voltage
Ref GTX 560 Ti: 0.950v
Ref GTX 460 1GB: 1.025v
Ref GTX 460 768MB: 0.987v

Starting with VIDs, we once again have only one card, so there's not much data we can draw from. Our GTX 560 Ti sample has a VID of 0.95v, which is actually lower than our reference GTX 460 1GB card, which had a VID of 1.025v. NVIDIA's transistor optimizations allow them to run their cards at lower voltages than the earlier designs, but at the same time, based on our data, a lower VID appears to be critical for keeping the GTX 560 Ti's power consumption from growing significantly versus the GTX 460. This is not wholly unexpected – GF104 never suffered from leakage nearly as badly as GF100 did.

Idle power is effectively a wash here. The GTX 560 Ti does have additional functional units that now consume power even when the GPU is idling, but at the same time the transistor changes keep idle power in check. At 161W it's as good as the rest of our mid-range cards.

Under Crysis we get our first glimpse that while NVIDIA’s TDP may only be 10W higher, the real world difference is greater. While we cannot isolate just the power consumption of the video card, and we can explain at least some of the difference as being a result of a greater workload on the CPU due to a higher framerate, we can’t fully explain away the difference between the GTX 460 1GB and the GTX 560 Ti. With a 26W delta, we’re confident the extra power draw we’re seeing with the GTX 560 Ti is well over 10W – and keep in mind this is a game test where our numbers are similar to how NVIDIA defines their TDP. Performance per watt is always increasing, but for a mid-range card the GTX 560 Ti doesn’t do so hot here, and as a result it’s a bit worse than even the quite similar Radeon HD 5870.
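As a rough sanity check on the reasoning above, the wall delta can be decomposed into its likely parts. Only the 26W figure comes from our measurements; the CPU and PSU numbers below are hypothetical assumptions for illustration:

```python
# Back-of-the-envelope decomposition of the 26W at-the-wall delta between the
# GTX 460 1GB and GTX 560 Ti under Crysis. Only system_delta_w comes from our
# measurements; the other figures are assumed for illustration.
system_delta_w = 26    # measured difference in total system draw (W)
cpu_extra_w = 6        # assumed extra CPU draw from the higher framerate (W)
psu_efficiency = 0.85  # assumed PSU efficiency at this load

# Convert the remaining wall-side delta to DC power drawn by the card itself
card_delta_w = (system_delta_w - cpu_extra_w) * psu_efficiency
print(f"Estimated extra card power: {card_delta_w:.1f}W")
```

Even with generous assumptions for the CPU's share, the card-side delta stays well above the 10W official TDP increase.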

Under FurMark the gap between the GTX 460 1GB and GTX 560 Ti shrinks some, even after we disable the GTX 560 Ti's overcurrent protection. It's now a 21W gap, which is still more than we'd expect given NVIDIA's official TDP numbers. Furthermore AMD's PowerTune technology makes NVIDIA look quite bad here – the 200W-capped 6950 1GB rig is drawing 44W less, and even the 6970 rig is drawing only 9W more. As we said when AMD first introduced PowerTune, we're not against TDP-limiting technologies so long as they're done on a truly application-wide basis – so hopefully at some point we'll see NVIDIA's overcurrent protection technology evolve into something closer to PowerTune instead of something that merely targets two of many stress-testing applications.
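The distinction between the two approaches is easy to sketch. This is a conceptual illustration only – the function names, clocks, and wattages below are our assumptions, not actual NVIDIA or AMD driver logic:

```python
# Conceptual sketch contrasting the two throttling approaches discussed above.
# All names, clocks, and wattages are illustrative assumptions, not actual
# NVIDIA or AMD driver behavior.

def app_detection_throttle(app_name: str, clock_mhz: int) -> int:
    """Clamp clocks only for specifically recognized stress tests."""
    blacklist = {"furmark", "occt"}  # the "two of many" applications targeted
    return clock_mhz // 2 if app_name.lower() in blacklist else clock_mhz

def power_cap_throttle(estimated_power_w: float, clock_mhz: int,
                       tdp_limit_w: float = 200) -> int:
    """PowerTune-style: scale clocks whenever estimated board power exceeds the cap."""
    if estimated_power_w <= tdp_limit_w:
        return clock_mhz
    return int(clock_mhz * tdp_limit_w / estimated_power_w)

# An unrecognized stress test slips past application detection entirely...
print(app_detection_throttle("new_stress_test", 822))  # stays at full clocks
# ...but a power cap catches it regardless of which application is running.
print(power_cap_throttle(250, 822))                    # throttled below 822MHz
```

The application-wide approach is what makes PowerTune's cap a hard guarantee, whereas a blacklist only protects against the stress tests its authors anticipated.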

The ultimate lesson here once more reiterates the importance of good case cooling when using open-air cards that don't fully exhaust their hot air, such as the GTX 460 and GTX 560 Ti. Our airy GPU test bed has no problem with these cards, but at the end of the day you're looking at roughly 200W of heat, only around half of which is getting blown out of the case, leaving another 100W for the case to deal with. It's a reasonable design choice on NVIDIA's part, but it means you need to use the right case for the job. The GTX 560 Ti makes this all the more important, and I suspect we may be at the limits of what's practical for a non-blower card.
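The heat budget above is simple to put in numbers; the 50% exhaust fraction is our rough estimate rather than a measured figure:

```python
# Rough heat budget for an open-air cooler like the GTX 560 Ti's. The 200W
# figure comes from the discussion above; the exhaust fraction is an estimate.
total_heat_w = 200      # approximate card heat output under load (W)
exhaust_fraction = 0.5  # roughly half is blown out the rear bracket
case_heat_w = total_heat_w * (1 - exhaust_fraction)
print(f"Heat left for the case to remove: {case_heat_w:.0f}W")
```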

What do you get when you combine a large, open-air video card with an equally large and airy GPU test rig? Very, very low idle temperatures. Last year the GTX 460 set a new mark on our rig at 34C idle, but with a bigger cooler and similar idle power consumption, the GTX 560 Ti takes this one step further. At 28C not only is the GTX 560 Ti several degrees cooler than the aforementioned GTX 460s, but it's also only several degrees off of room temperature. This is something a blower design such as the GTX 570 or Radeon HD 6950 simply cannot match, even if idle power consumption is similar.

Compared to idle, things end up being a little less rosy for the GTX 560 Ti, but they still look good. Even with a bigger cooler the GTX 560 Ti cannot match the GTX 460s, but it's doing fairly well against everything else. The open-air design still gives it an advantage versus blowers, but not by quite as much – the 6950 is only 4C off. Given that we figure the actual power consumption of the GTX 560 Ti is around 20W more than the GTX 460 1GB, it looks like the GTX 560 Ti's cooler can't fully make up for the additional heat its GPU puts off.

Under FurMark the story is quite similar. The GTX 560 Ti is again several degrees warmer than the GTX 460, and AMD's blowers do start catching up thanks to PowerTune. Otherwise at 79C the card still runs quite cool, merely not as cool as the GTX 460 before it. When we get to our noise data, however, we'll see that NVIDIA may have prioritized noise over temperatures more on the GTX 560 Ti than they did on the GTX 460.

With that said, based on our numbers and TDP estimates, we're all the more curious just how much case cooling is necessary for the GTX 560 Ti, and whether it's going to be as flexible as the GTX 460. It may be worth building a secondary GPU test rig with poor ventilation to see if the GTX 560 Ti is still suitable under sub-optimal conditions.

At idle virtually all of our cards run up against the noise floor, so there’s little to be surprised about here. The GTX 560 Ti is effectively as good as everything else.

It’s our load noise values that make us reconsider our earlier temperature data. While the GTX 560 Ti may run hotter than a GTX 460, NVIDIA clearly didn’t lose their knack when it comes to noise. The GTX 460 768MB set a new record for a mid-range card, and the GTX 560 Ti is the second-best, besting even the GTX 460 1GB by a bit over 2dB. On our GPU testbed the GTX 560 Ti is just shy of silent, which is quite an accomplishment for as much power as it consumes. This also handily illustrates why we don’t consider the Radeon HD 6870 to be much competition for the GTX 560 Ti – it may be cheaper, but it’s also a heck of a lot louder. It takes a 6950 to find an AMD card with similar performance that has acoustic qualities in the same neighborhood.

Comments

  • auhgnist - Tuesday, January 25, 2011 - link

    The 1920x1080 graph is wrong – it looks like the 2560x1600 one was mistakenly used.
  • Ryan Smith - Tuesday, January 25, 2011 - link

    Fixed. Thanks.
  • Marlin1975 - Tuesday, January 25, 2011 - link

    The 6950 1GB looks good.

    I am guessing the 560 will either drop in price very quickly or the 6950 will sell better.
  • Lolimaster - Tuesday, January 25, 2011 - link

    Not impressive at all, the 560. The 6950 1GB is a good value over the 2GB 6950. I think if you prefer 1GB, the 6870 offers more bang for the buck.
  • cactusdog - Tuesday, January 25, 2011 - link

    Wow, plenty of good options from AMD and Nvidia. Since the introduction of Eyefinity and 3D Surround, we don't need to spend a fortune to play the latest games. For most users with one monitor, a $250 card gives excellent performance.
  • tech6 - Tuesday, January 25, 2011 - link

    Like top end desktop CPUs, the high end GPU really seems to be increasingly irrelevant for most gamers as the mid-range provides plenty of performance for a fraction of the cost.
  • Nimiz99 - Tuesday, January 25, 2011 - link

    I was just curious about the 2.8 FPS on Crysis by the Radeon HD 5970 - is that reproducible/consistent?
    I am just curious, b/c on the first graph of average frame-rate it leads the pack; if it fluctuates that badly I would definitely like a little bit more background on it.

    'Preciate the response,
    Nimiz
  • Ryan Smith - Tuesday, January 25, 2011 - link

    No, it's highly variable. With only 1GB of effective VRAM, the Radeon cards are forced to texture swap - the minimum framerate is chaotic at best and generally marks how long the worst texture swap took. With swapping under the control of AMD's drivers, the resulting minimum framerate ends up being quite variable.
  • Shadowmaster625 - Tuesday, January 25, 2011 - link

    Can somebody explain why 1GB is not enough, when 1GB is enough memory to store over 160 frames at 24 bits at 1920x1080? At 60fps, 1GB should be able to supply a constant uncompressed stream of frames for almost 3 whole seconds. Seems like more than enough memory to me. Sounds like somebody is just haphazardly wasting vast amounts of space for no reason at all. Sort of like Windows with its WinSXS folder. Let's just waste a bunch of space because we can!
  • ciukacz - Tuesday, January 25, 2011 - link

    Are you streaming your benchmark video through YouTube?
    Because I'm rendering mine in realtime, which requires loading all the textures, geometry, etc.
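Shadowmaster625's framebuffer arithmetic above does check out, even though, as ciukacz points out, finished frames are only a tiny fraction of what VRAM actually holds (textures, geometry, render targets):

```python
# Verify the framebuffer math from the comment above: how many raw
# 24-bit 1920x1080 frames fit in 1GB of VRAM?
width, height, bytes_per_pixel = 1920, 1080, 3  # 24bpp
frame_bytes = width * height * bytes_per_pixel  # 6,220,800 bytes per frame
frames_per_gb = (1024 ** 3) // frame_bytes

print(frames_per_gb)       # 172 frames - "over 160", as claimed
print(frames_per_gb / 60)  # ~2.9 seconds of uncompressed 60fps output
```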
