Power, Temperature, & Noise

Last but not least as always is our look at the power consumption, temperatures, and acoustics of the GTX 570. While NVIDIA chose to optimize for both performance and power on the GTX 570/580, the GTX 560 Ti is almost exclusively optimized for performance. As a result NVIDIA’s official TDP has gone up by 10W, and as we’ll see in practice the difference is even greater than that.

GeForce 400/500 Series Load Voltage
Ref GTX 560 Ti:    0.950v
Ref GTX 460 1GB:   1.025v
Ref GTX 460 768MB: 0.987v

Starting with VIDs, we once again have only one card, so there's not much data to draw from. Our GTX 560 Ti sample has a VID of 0.95v, which is actually lower than our reference GTX 460 1GB card, which had a VID of 1.025v. NVIDIA's transistor optimizations allow them to run their cards at lower voltages than the earlier designs, but at the same time, based on our data, a lower VID appears to be critical for keeping the GTX 560 Ti's power consumption from growing significantly versus the GTX 460. This is not wholly unexpected – GF104 never suffered from leakage nearly as badly as GF100 did.
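As a rough sanity check on what that lower VID buys, dynamic switching power scales approximately with V²f. Here's a toy Python sketch using the VIDs from the table above; everything else (equal clocks, zero leakage) is a simplifying assumption, so treat it as illustrative only:

```python
# Toy estimate of what a lower core voltage does to dynamic power.
# Dynamic power scales roughly as C * V^2 * f; leakage is ignored
# and equal clocks are assumed, so this is illustrative only.

def dynamic_power_ratio(v_new, v_old, f_new=1.0, f_old=1.0):
    """Relative dynamic power of (v_new, f_new) versus (v_old, f_old)."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# VIDs from the table above: GTX 560 Ti at 0.950v, GTX 460 1GB at 1.025v
ratio = dynamic_power_ratio(0.950, 1.025)
print(f"Relative dynamic power: {ratio:.1%}")  # ~85.9%
```

In other words, at equal clocks the lower VID alone is worth roughly a 14% cut in dynamic power – which goes a long way toward explaining how NVIDIA can add functional units without power ballooning.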

Idle power is effectively a wash here. The GTX 560 Ti does have additional functional units which are now consuming power even when the GPU is idling, but at the same time the transistor changes are keeping idle power in check. At 161W it’s as good as the rest of our mid-range cards.

Under Crysis we get our first glimpse that while NVIDIA's TDP may only be 10W higher, the real-world difference is greater. While we cannot isolate just the power consumption of the video card, and we can explain at least some of the difference as the result of a greater workload on the CPU due to a higher framerate, we can't fully explain away the difference between the GTX 460 1GB and the GTX 560 Ti. With a 26W delta, we're confident the extra power draw we're seeing with the GTX 560 Ti is well over 10W – and keep in mind this is a game test where our numbers are similar to how NVIDIA defines their TDP. Performance per watt is always increasing, but for a mid-range card the GTX 560 Ti doesn't do so hot here, and as a result it's a bit worse than even the quite similar Radeon HD 5870.
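To make the napkin math explicit, here is a sketch of how an at-the-wall delta translates into a card-level delta. The PSU efficiency and the CPU-side allowance are our assumptions for illustration, not measured values:

```python
# Rough back-of-envelope for why a 26W wall delta implies a card-level
# delta well above NVIDIA's 10W TDP increase. The PSU efficiency and
# CPU-side allowance below are assumptions, not measured values.

def card_power_delta(wall_delta_w, psu_efficiency=0.88, cpu_allowance_w=5.0):
    """Estimate the video card's share of a measured at-the-wall delta (W)."""
    dc_delta = wall_delta_w * psu_efficiency  # power actually delivered to the system
    return dc_delta - cpu_allowance_w         # subtract the extra CPU-side load

print(f"Estimated card delta: {card_power_delta(26.0):.1f}W")
```

Even after generously crediting the CPU for some of the increase, the card's share of the 26W wall delta lands well above the official 10W TDP bump.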

Under FurMark the gap between the GTX 460 1GB and GTX 560 Ti shrinks some, even after we disable the GTX 560 Ti's overcurrent protection. It's now a 21W gap, which is still more than we'd expect given NVIDIA's official TDP numbers. Furthermore AMD's PowerTune technology makes NVIDIA look quite bad here – the 200W-capped 6950 1GB rig is drawing 44W less, and even the 6970 rig is only drawing 9W more. As we said when AMD first introduced PowerTune, we're not against TDP-limited technologies so long as they're done on a truly application-wide basis – so hopefully at some point we'll see NVIDIA's overcurrent protection technology evolve into something closer to PowerTune instead of something that merely targets 2 of many stress testing applications.

The ultimate lesson here, however, is that it once more reiterates the importance of good case cooling when using open-air cards that don't fully exhaust their hot air, such as the GTX 460 and GTX 560 Ti. Our airy GPU test bed has no problem with these cards, but at the end of the day you're looking at around 200W of heat, only about half of which is getting blown out of the case, leaving another 100W or so for the case to deal with. It's a reasonable design choice on NVIDIA's part, but it means you need to use the right case for the job. The GTX 560 Ti makes this all the more important, and I suspect we may be at the limits of what's practical for a non-blower card.
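That case-cooling budget can be sketched the same way; the 200W figure and the 50% exhaust fraction are the rough estimates from the paragraph above, not measurements:

```python
# Sketch of the case-cooling budget for an open-air (non-blower) cooler.
# The 200W heat figure and 50% exhaust fraction are rough estimates,
# not measured values.

def heat_left_in_case(card_heat_w, exhaust_fraction=0.5):
    """Heat (W) the case fans must remove for a non-blower card."""
    return card_heat_w * (1.0 - exhaust_fraction)

print(f"Heat dumped inside the case: {heat_left_in_case(200.0):.0f}W")
```

A blower, by contrast, pushes nearly all of that heat out the rear bracket, which is why blower cards are so much more forgiving of poorly ventilated cases.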

What do you get when you combine a large, open-air video card with an equally large and airy GPU test rig? Very, very low idle temperatures. Last year the GTX 460 set a new mark on our rig at 34C idle, but with a bigger cooler and similar idle power consumption numbers, the GTX 560 Ti takes this one step further. At 28C, not only is the GTX 560 Ti several degrees cooler than the aforementioned GTX 460s, but it's also only several degrees off of room temperature. This is something a blower design such as the GTX 570 or Radeon HD 6950 simply cannot match, even if idle power consumption is similar.

Compared to idle, things end up being a little less rosy for the GTX 560 Ti, but they still look good. Even with a bigger cooler the GTX 560 Ti cannot match the GTX 460s, but it's doing fairly well against everything else. The open-air design still gives it an advantage versus blowers, but not by quite as much – the 6950 is only 4C off. Given that we figure the actual power consumption of the GTX 560 Ti is around 20W more than the GTX 460 1GB's, it looks like the GTX 560 Ti's cooler can't fully make up for the additional heat its GPU puts off.

Under FurMark the story is quite similar. The GTX 560 Ti is again several degrees warmer than the GTX 460, and AMD's blowers do start catching up thanks to PowerTune. Otherwise, at 79C the card still runs quite cool, merely not as cool as the GTX 460 before it. When we get to our noise data, however, we'll see that NVIDIA may have optimized the GTX 560 Ti for noise ahead of temperatures more than they did the GTX 460.

With that said, based on our numbers and TDP estimations, we're all the more curious just how much case cooling is necessary for the GTX 560 Ti, and whether it will be as flexible as the GTX 460. It may be worth building a secondary GPU test rig with poor ventilation to see if the GTX 560 Ti is still suitable under sub-optimal conditions.

At idle virtually all of our cards run up against the noise floor, so there’s little to be surprised about here. The GTX 560 Ti is effectively as good as everything else.

It’s our load noise values that make us reconsider our earlier temperature data. While the GTX 560 Ti may run hotter than a GTX 460, NVIDIA clearly didn’t lose their knack when it comes to noise. The GTX 460 768MB set a new record for a mid-range card, and the GTX 560 Ti is the second-best, besting even the GTX 460 1GB by a bit over 2dB. On our GPU testbed the GTX 560 Ti is just shy of silent, which is quite an accomplishment given how much power it consumes. This also handily illustrates why we don’t consider the Radeon HD 6870 to be much competition for the GTX 560 Ti – it may be cheaper, but it’s also a heck of a lot louder. It takes a 6950 to find an AMD card with similar performance and acoustic qualities in the same neighborhood.

87 Comments


  • Nimiz99 - Tuesday, January 25, 2011 - link

    One of my buddies has a C2D 8500 system OC'd to 3.5, I think. He got himself a 5870 (overclocked) to game. The problem we ran into was that the C2D is too slow to handle games like Civ5 that heavily rely on the CPU to keep up (you can still play the game, but it's literally wasting the 5870, with noticeable lag from the chip). Basically, he is upgrading now to Sandy Bridge. I'd wager some of the older i7s or maybe even a Thuban (OC'd to 3.8 with a good HT overclock) could manage, but why bother when a new architecture is out from Intel (or AMD later in the year).
    So enjoy your new build ;),
    Nimiz
  • Beenthere - Tuesday, January 25, 2011 - link

    Over the last couple years Nvidia has really struggled and they may be on the ropes at this point. They have created a lot of their own problems with their arrogance so we'll see how it all plays out.
  • kilkennycat - Tuesday, January 25, 2011 - link

    eVGA GTX 560 Ti "Superclocked": Core 900MHz, Shader 1800MHz, Memory 4212MHz – $279.99

    ~ 10% factory-overclock for $20 extra, together with a lifetime warranty (if you register within 30 days) ain't too shabby....
  • Belard - Tuesday, January 25, 2011 - link

    Sure, the name shouldn't be a big deal... but every year, or more often, Nvidia comes up with a new marketing product name that is meaningless and confusing.

    Here is the full product name:

    GeForce GTX 560 Ti

    But in reality, the only part that is needed or makes ANY sense is:
    GeForce 560

    GTX / GT / GTs are worthless. Unless there were GTX 560, GTS 560 and GT 560. Much like the older 8800 series.

    Ti only adds to this idiotic mess. Might as well add Ultra, Pro or MX.... so perhaps Nvidia will come out with the "GT 520 MX"?

    The product itself is solid, why turn it into something stupid with your marketing department?

    AMD does it right (mostly): the "Radeon 6870", that's it. DUH.
  • omelet - Tuesday, January 25, 2011 - link

    Yeah. Not that it really matters. And while this might be what you meant by "mostly" note that AMD's naming was pretty retarded this generation with the 68xx having lower performance than 58xx.

    But I don't see why they readopted the Ti moniker.
  • Sufo - Wednesday, January 26, 2011 - link

    No, that's only a result of the 5xxx series being stupidly named. Using 5970 for a dual-chip part was the error. Use an x2 suffix or something. AMD is back on track with the 6xxx naming convention... well, until we see what they do with the 6-series dual-chip card.
  • Belard - Thursday, January 27, 2011 - link

    The model numbers of:

    x600, x800, etc have been consistent since the 3000 series.

    x800 is top
    x700 is high-end mid range ($200 sub)
    x600 is mid-range ($150 sub)
    x400~500 low-end ($50~60)
    x200~300 Desktop or HTPC cards.

    AMD said they changed because they didn't want people to confuse the 5750/5770 cards with the 6000 series. Which is completely stupid... so instead they confuse everyone with all the cards.

    If the 6800s were called 6700s - they would have been easily faster than any of the 5700s and at least somewhat equal to the 5800s (sometimes slower, others faster). Instead, we have "6850" that is slower than the 5850.

    The prices are still a bit high, yet far cheaper than the 5800 series, in which a 5850 was $300+ or $400 for the 5870. But by all means, I'd rather spend $220 on a 6870 than $370 on today's 5870s.

    Anyways, I'm still using a 4670 in my main computer. When I do my next upgrade, I'll spend about $200 at the most and want at least 6870-level performance, which is still about 4x faster than what I have now. Noise & heat are very high on my list; my 4670 was $15 extra for the quieter cooler. Perhaps in 6 months, the AMD 7000 or GeForce 700 series will be out.
  • marraco - Tuesday, January 25, 2011 - link

    It's the first time I've seen a radiator geometrically aligned with the direction of the airflow coming off the fan.

    Obviously it increases the efficiency of the fan, increasing the flow of air across the radiator and reducing noise.

    It's such an obvious enhancement in air cooling that I don't understand why CPU coolers don't use it.
  • strikeback03 - Tuesday, January 25, 2011 - link

    I wouldn't be surprised if in some cases the increase in fin surface area (from having a bunch of straight fins packed more closely together) produces better cooling than a cleaner air path would.
  • MeanBruce - Wednesday, January 26, 2011 - link

    You should check out the four Asus DirectCU II three-slot radiators that came out today on the GTX 580, 570, and the HD 6970 and 6950, each using two 100mm fans, five heatpipes, and three slots of pure metal. They claim you can easily fit two of them on ATX for SLI and CB?
