Power, Temperature, & Noise

Last but not least as always is our look at power consumption, temperatures, and acoustics. As there isn’t a reference GTX 560, we’re working with what we have: the ASUS GTX 560 DirectCU II Top. For the sake of completeness we’re including our power/temp/noise results at both the Base (810MHz) and Mid (850MHz) clocks on the ASUS card. However, if ASUS is goosing the voltage a bit to hit 925MHz, then it’s likely we’re drawing a bit more power here than a card specifically targeted at those performance levels would.

GeForce GTX 460/560 Series Load Voltage
  GTX 460: 1.025v
  GTX 560 Ti: 1.0v
  ASUS GTX 560: 1.037v
  ASUS GTX 560 OC: 1.062v

Looking at voltage quickly, ASUS is running the GTX 560 Top at 1.037v. This is a bit more than any other GF114/GF104 card that we’ve seen, but not by a great deal. The voltage difference between the GTX 560 Top and the reference GTX 560 Ti does mean that any power benefits of having a SM disabled are wiped out. In other words, the GTX 560 Top can offer GTX 560 Ti-like performance, but at GTX 560 Ti-like power consumption.
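
To put some rough numbers behind that, below is a back-of-the-envelope sketch of relative dynamic power (P ~ C × V² × f), using the load voltages from the table above along with the cards’ public core counts and clocks (384 CUDA cores at 822MHz for the GTX 560 Ti, 336 cores at 925MHz for the ASUS GTX 560 Top). Treating core count as a stand-in for switched capacitance and ignoring leakage are simplifying assumptions on our part, so this is an illustration rather than a measurement.

    # Back-of-the-envelope dynamic power estimate: P ~ C * V^2 * f, leakage ignored.
    # Core count is used as a crude proxy for switched capacitance (an assumption).

    def relative_dynamic_power(cores, volts, mhz):
        """Unitless figure proportional to dynamic power, for comparison only."""
        return cores * volts ** 2 * mhz

    gtx_560_ti = relative_dynamic_power(384, 1.000, 822)  # reference GTX 560 Ti
    asus_560   = relative_dynamic_power(336, 1.037, 925)  # ASUS GTX 560 Top

    print(asus_560 / gtx_560_ti)  # ~1.06: roughly GTX 560 Ti-like dynamic power

On this crude estimate the savings from the disabled SM are roughly cancelled out by the extra voltage and clockspeed, which is consistent with what we see at the wall.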

Idle power consumption looks very good here. The GTX 560 Ti already did well, and now the GTX 560 does even better. The difference ultimately comes down to the power savings realized by disabling a SM.

Starting with our sample card, the ASUS GTX 560, we’ve already hinted at the fact that power consumption between these heavily factory overclocked cards and the GTX 560 Ti will end up being very similar, in accordance with their similar performance. The results deliver on that concept, with the ASUS GTX 560 and the GTX 560 Ti separated by only 7W in the ASUS GTX 560’s favor. Overclocking doesn’t bring the expected ramp-up in power consumption, however: even with the slightly higher clocks and higher voltage, power consumption only rises by 10W for the whole system.

As for our simulated GTX 560 Base and Mid, we can’t say too much. Based on NVIDIA’s specs and the better leakage properties of GF114, there’s no reason why a GTX 560 built for those clocks shouldn’t be able to achieve power consumption similar to (if not better than) the GTX 460 series. We’d get far better data from a suitably lower performing GTX 560 card.

One thing that is clear however is that unless power consumption on lower-clocked GTX 560s comes in more than 20W lower, AMD’s general advantage in power consumption goes unchallenged. The same can be said for the Radeon HD 6950, which consumes nearly 18W less than the ASUS GTX 560, even though the latter is often the performance laggard.

Under FurMark the ASUS GTX 560 actually does worse than the GTX 560 Ti, likely due to the ASUS card’s lack of overcurrent protection circuitry and the full impact of operating at a higher voltage. The difference isn’t this pronounced in games, but FurMark hits all the right notes. Along the same lines, our overclocked ASUS GTX 560 consumes a further 20W beyond what the card draws at its factory settings. The overclocked ASUS GTX 560 can usually beat the GTX 560 Ti, but the power consumption is a major tradeoff.

As for our simulated GTX 560 Mid and Base cards, the results are similar in nature to our Crysis power results. Power consumption is higher than both the GTX 460 series and AMD’s Radeon HD 6800 series due to the voltages involved.

Idle temperatures are largely a function of the cooler being used. The GTX 560 Ti did exceptionally well here, and it’s nearly impossible to compete with it. At 33C the ASUS GTX 560 is among the coolest cards in our regular charts, and yet it can’t catch the GTX 560 Ti.

When looking at ASUS cards, we often see them favoring aggressive cooling over quiet operation. We’ll get to noise in a bit, but it certainly looks like the DirectCU II cooler is tuned at least as aggressively as the reference GTX 560 Ti cooler. At 71C the ASUS GTX 560 and the GTX 560 Ti are tied, and are both well below a number of other cards in temperature; an impressive feat given the performance we’ve seen earlier. Our simulated cards are a bit cooler, but they would probably be cooler still at a lower voltage.

Unfortunately FurMark doesn’t look as good as Crysis, thanks in part to AMD’s use of PowerTune on the 6900 series, and in part to the higher power consumption from ASUS’s overvolting making itself felt. 84C under FurMark is not at all bad, as it’s not even close to any sort of critical GPU temperature, but it’s not quite chart-topping. It’s also well off what we’ve seen the GTX 560 Ti do, which is 5C cooler at 79C. Further overclocking and overvolting on the ASUS GTX 560 does drive the temperature up to 89C, which means at 1.062v we’re probably applying as much voltage as we can reasonably get away with.

As for our simulated cards, both the GTX 560 Base and GTX 560 Mid are well above their GTX 460 counterparts. Part of this goes back to power consumption, but it also involves just how different their respective coolers are.

It’s rare to see a card not bottom out on our idle noise testing, and the ASUS GTX 560 doesn’t disappoint. At idle it’s whisper quiet, and it can’t be distinguished from our noise floor.

Last but not least is our look at load noise, where ASUS’s generally aggressive cooling finally shows up in the numbers. At 54.7dB the ASUS GTX 560 sits around the middle of the pack, and actually beats the Radeon HD 6870. But we’ve seen the reference GTX 560 Ti complete the same test at 8dB less; even if we could equalize the power consumption, the GTX 560 Ti reference cooler seems to have an edge over ASUS’s DirectCU II cooler when it comes to noise under load. Cutting down the clocks to Base or Mid levels helps this some, but then rendering performance falls behind the GTX 560 Ti.
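
To put that 8dB gap in perspective, here’s a quick sketch converting decibel differences into sound-intensity ratios. The 54.7dB figure is from our testing, the GTX 560 Ti value is simply inferred from the 8dB difference, and the "roughly twice as loud per 10dB" line is a common psychoacoustic rule of thumb rather than something we measured.

    # A minimal sketch of what an 8 dB(A) gap means in sound-intensity terms.
    # 54.7 dB is the measured ASUS GTX 560 result; ~46.7 dB for the GTX 560 Ti
    # is inferred from "8dB less" rather than read off a chart.

    def intensity_ratio(delta_db):
        """Sound-intensity (power) ratio corresponding to a dB difference."""
        return 10 ** (delta_db / 10)

    print(intensity_ratio(54.7 - 46.7))  # ~6.3x the radiated sound power
    # Rule of thumb: ~10 dB reads as roughly "twice as loud", so an 8 dB gap is
    # clearly audible, though a bit short of a perceived doubling in loudness.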

Comments

  • Ryan Smith - Tuesday, May 17, 2011 - link

    We always include MIRs in our pricing, given their prevalence. With MIRs, there are no fewer than 6 6870s at Newegg below $180 (and a 7th at $183).
  • C'DaleRider - Wednesday, May 18, 2011 - link

    "I don't know where you guys are getting this information, but the Radeon HD 6870 IS NOT at $180."

    Actually, I don't know where YOU are getting your misinformed information.

    Right now, on Newegg, 6870's are as low as $162 after rebate, $182 before rebate. (An XFX card, btw.)

    Take a look......

    http://www.newegg.com/Product/ProductList.aspx?Sub...

    Now, maybe you'll get your facts correct before posting drivel.
  • L. - Thursday, May 19, 2011 - link

    Don't feed teh trollz ;)
  • Stas - Wednesday, May 18, 2011 - link

    Idk, I paid $160 for my Sapphire HD6870 with dual fans 2 months ago o.O
  • araczynski - Tuesday, May 17, 2011 - link

    ...would those 'in the know' know whether games like the witcher2 and skyrim would be better off with the nvidia or amd line?
  • TheJian - Wednesday, May 18, 2011 - link

    Witcher 2 doesn't use Aurora engine, all new now so wait for it to be put in some sites benchmarking. Same with skyrim, it uses a new Creation engine, and also 100% dynamic lighting with lots of snow and cliffs (which this engine is designed for).

    Sorry. Not much worth extrapolating other than we have no idea who will win later :) If we're being honest anyway. I suppose a quick google might get a hit on the devs opinions.
  • Sunsmasher - Tuesday, May 17, 2011 - link

    Your comments about confusing naming are very valid.
    I've long ago come to the conclusion that the confusing naming issue is a deliberate strategy by Nvidia.
    They WANT to create confusion to make it more difficult for less sophisticated buyers
    to compare cards head to head and Nvidia can thereby pick up a few more sales than they would otherwise.
    The proof that this is deliberate is the fact that they Keep Doing It.
    Otherwise, they would have very straightforward, very easily compared naming conventions:
    Higher numbers = more power, GT not as powerful as GTX, etc.
    An unfortunate state of affairs, but not about to change even with writers often complaining about it.
    Fortunately, there are resources on the web that compare cards head to head.
  • TheJian - Wednesday, May 18, 2011 - link

    They're not trying to confuse you. They're just trying to sell every chip they can. A lot of dies have defects etc that cause them to release a plethora of cards at different speeds, features disabled (possibly due to defects in dies) etc. Die shrinks cause problems too. Sometimes they save enough in power/heat to warrant a new release # or model. Take the GTX 260. The core216 came out, fixed heat issues and was a good 10% faster. People would want to identify the faster/cooler cards and not get screwed. I hate Motherboard makers not listing the REV prominently on the box, or in ads. It's tough to buy online when I'm after a specific rev. This is more a tech issue than a company deliberately ticking us off.

    If you don't mind paying MUCH higher prices, they can go ahead and toss all defective dies and get back to 3 product lines with easily seen performance advantages between the 3. AMD, Intel, Nvidia, etc they all have this problem. Of course progress would really slow down if they take this route. A person going into the store and seeing a 6750 card, might find a 5850 sitting next to it for $200 and wonder what the heck is going on...LOL. I could almost say the same about the 6850. That 6850 should blow away a 5850, I mean its a whole 1000 higher right? Confusing yes? But that 5850 beats the 6850 by about 10% in everything. There are a lot of these examples. Heck this time NV let the manufacturers decide everything (clock/memory/ref design).

    In an age of small margins, just about everything in your PC being a commodity, and shareholders demanding every last dollar they can get from company X, you should just get used to tons of products not performing too differently. Really, I can make up my mind in one night of reading reviews on 3-4 websites. By the end of the night I can decide how to spend my money and be fairly certain I'm not making a big mistake. But yeah, if you're not willing to do some homework, get ready to buy something that's completely disappointing on occasion. But you're already here, no worries :) We have hardware review sites, because stores shelves and floor reps at fry's don't help us at all... :) I pity marketing dept's trying to work with all these dies/re-launches/binning etc that probably cause them nightmares...LOL Could they do better here and there? Probably. Would I like to try to make us all happy? HECK NO. :) I take that back, I wouldn't mind taking a crack at intel naming. :)
  • sysdawg - Tuesday, May 17, 2011 - link

    Ryan,

    Thanks for your reviews. On page 3 you write "on the ATI side we’re using the Catalyst 11.5a hotfix"...but is that the case for all the AMD cards? The same page lists three drivers being used: Catalyst 10.10e, 11.4, and 11.5a. And for the Nvidia cards, you also list three drivers: 262.99, 270.51 beta, and 275.20 beta. If you could help, I'd specifically like to know which drivers were used for the GTX 580 and the Radeon 6970. And since I'm going to be running at 2560x1600, it would also help to know which drivers were used for those 2 cards in your March 24 review (of the GTX 590), since that review included that resolution. My thinking is that if the drivers are reasonably current for both cards, then it is closer to being 'apples to apples'.

    Thanks in advance, and thank you again for your reviews.
  • Ryan Smith - Wednesday, May 18, 2011 - link

    Cat 10.10e: Radeon 3xxx/4xxx.
    Cat 11.4: Radeon 5xxx/6850/6970
    Cat 11.5: Radeon 6870/6950

    262.99: GeForce 2xx
    270.51: GeForce 4xx, 580/570/550
    275.20: GeForce GTX 560 Ti, GTX 460, GTX 560

    As for the March 24th review of the GTX 590, all the high end cards were on 266/267 drivers or the Catalyst 11.4 preview drivers respectively. Those were the newest drivers at the time of that publication.

    And I apologize for the somewhat chaotic nature of the driver selection. We benchmark many different cards, redoing them for every single driver revision simply isn't practical. The relevant cards will be updated for any given article, and many (if not all) of the cards are updated if there's a major driver release that significantly impacts performance.
