Power, Temperature, & Noise

Last but not least as always is our look at power consumption, temperatures, and acoustics. As there isn’t a reference GTX 560, we’re working with what we have: the ASUS GTX 560 DirectCU II Top. For the sake of completeness we’re including our results for power/temp/noise at both the Base (810MHz) and Mid (850MHz) clocks on the ASUS card. However, if ASUS is goosing the voltage a bit to hit 925MHz, then it’s likely we’re drawing a bit more power here than a card specifically targeted for those performance levels would.

GeForce GTX 460/560 Series Load Voltage

| GTX 460 | GTX 560 Ti | ASUS GTX 560 | ASUS GTX 560 OC |
|---------|------------|--------------|-----------------|
| 1.025v  | 1.0v       | 1.037v       | 1.062v          |

Looking at voltage quickly, ASUS is running the GTX 560 Top at 1.037v. This is a bit more than we’ve seen on any other GF114/GF104 card, but not by a great deal. The voltage difference between the GTX 560 Top and the reference GTX 560 Ti does mean that any power benefit of having an SM disabled is wiped out. In other words, the GTX 560 Top can offer GTX 560 Ti-like performance, but at GTX 560 Ti-like power consumption.
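To put a rough number on that claim, here’s a back-of-the-envelope sketch (our simplification, not anything from NVIDIA): CMOS dynamic power scales approximately with active units × frequency × voltage². Treating all GPU power as SM power and ignoring leakage, and using NVIDIA’s published SM counts and clocks:

```python
# Back-of-the-envelope estimate only: CMOS dynamic power scales roughly
# with active_units * frequency * voltage^2. This ignores leakage, memory,
# and board power, and crudely treats all GPU power as SM power.

def power_vs_560ti(sms, clock_mhz, volts):
    """Estimated dynamic power relative to the reference GTX 560 Ti
    (GF114 with all 8 SMs enabled, 822MHz core, 1.0v)."""
    return (sms / 8) * (clock_mhz / 822) * (volts / 1.000) ** 2

# The GTX 560 is GF114 with 7 of its 8 SMs enabled.
print(power_vs_560ti(7, 925, 1.037))  # ASUS GTX 560 Top: ~1.06x -> near parity
print(power_vs_560ti(7, 810, 1.037))  # simulated Base at Top's voltage: ~0.93x
print(power_vs_560ti(7, 850, 1.037))  # simulated Mid at Top's voltage: ~0.97x
```

The ~1.06x figure for the Top is well within the error of an estimate this crude, which lines up with what we actually measure: GTX 560 Ti-like performance at GTX 560 Ti-like power.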

Idle power consumption looks very good here. The GTX 560 Ti already did well, and the GTX 560 does even better. The difference ultimately comes down to the power savings realized by disabling an SM.

Starting with our sample card, the ASUS GTX 560, we’ve already hinted at the fact that power consumption between these heavily factory-overclocked cards and the GTX 560 Ti will end up being very similar, in accordance with their similar performance. The results deliver on that concept, with the ASUS GTX 560 and the GTX 560 Ti separated by only 7W in the ASUS GTX 560’s favor. Overclocking doesn’t bring the expected ramp-up in power consumption, however: even with the slightly higher clocks and higher voltage, power consumption only rises by 10W for the whole system.

As for our simulated GTX 560 Base and Mid, we can’t say too much. Based on NVIDIA’s specs and the better leakage properties of GF114, there’s no reason why a GTX 560 built for those clocks shouldn’t be able to achieve power consumption similar to (if not better than) the GTX 460 series. We’d get far better data from a suitably lower-performing GTX 560 card.

One thing that is clear, however, is that unless power consumption on lower-clocked GTX 560s were more than 20W lower, AMD’s general advantage in power consumption is unchallenged. The same can be said for the Radeon HD 6950, which consumes nearly 18W less than the ASUS GTX 560, even though the latter is often the performance laggard.

Under FurMark the ASUS GTX 560 actually does worse than the GTX 560 Ti, likely due to the ASUS card’s lack of overcurrent protection circuitry and the full impact of operating at a higher voltage making itself felt. The difference isn’t as pronounced in games, but FurMark hits all the right notes. Along the same lines, our overclocked ASUS GTX 560 consumes a further 20W beyond what the card draws at factory settings. The overclocked ASUS GTX 560 can usually beat the GTX 560 Ti, but the power consumption is a major tradeoff.
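For a sense of scale, the overvolt alone accounts for a noticeable slice of that extra 20W. Under the same rough f·V² approximation as above (again our simplification, not a measurement):

```python
# Same crude CMOS scaling assumption as above: dynamic power ~ f * V^2.
# Contribution of the overvolt alone, ignoring the accompanying clock bump:
voltage_scale = (1.062 / 1.037) ** 2
print(f"~{(voltage_scale - 1) * 100:.0f}% more dynamic power from voltage alone")
# -> ~5%; the rest of the increase comes from the higher clocks, plus extra
#    leakage at the higher voltage, which this approximation doesn't capture.
```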

As for our simulated GTX 560 Mid and Base cards, the results are of a similar nature to our Crysis power results. Power consumption is higher than both the GTX 460 series and AMD’s Radeon HD 6800 series due to the voltages involved.

Idle temperatures are largely a function of the cooler being used. The GTX 560 Ti did exceptionally well here, and it’s nearly impossible to compete with it. At 33C the ASUS GTX 560 is among the coolest cards in our regular charts, and yet it can’t catch the GTX 560 Ti.

When looking at ASUS cards, we often see them favoring aggressive cooling over low noise. We’ll get to noise in a bit, but it certainly looks like their cooling is as aggressive as, if not more aggressive than, the reference GTX 560 Ti’s. At 71C the ASUS GTX 560 and the GTX 560 Ti are tied, and both are well below a number of other cards in temperature; an impressive feat given the performance we’ve seen earlier. Our simulated cards are a bit cooler, but they would probably do even better at a lower voltage.

Unfortunately FurMark doesn’t look as good as Crysis, thanks in part to AMD’s use of PowerTune on the 6900 series, and in part to the higher power consumption from ASUS’s overvolting making itself felt. 84C under FurMark is not at all bad, as it’s nowhere near any sort of critical GPU temperature, but it’s not quite chart-topping. It’s also well off what we’ve seen the GTX 560 Ti do, which is 5C cooler at 79C. Further overclocking and overvolting on the ASUS GTX 560 does drive the temperature up to 89C, which suggests that at 1.062v we’re applying about as much voltage as we can reasonably get away with.

As for our simulated cards, both the GTX 560 Base and GTX 560 Mid are well above their GTX 460 counterparts. Part of this goes back to power consumption, but it also involves just how different their respective coolers are.

It’s rare to see a card fail to bottom out in our idle noise testing, and the ASUS GTX 560 doesn’t disappoint. At idle it’s whisper quiet, and it can’t be distinguished from our noise floor.

Last but not least is our look at load noise, where the cost of ASUS’s generally aggressive cooling finally becomes quantifiable. At 54.7dB the ASUS GTX 560 sits around the middle of the pack, and actually beats the Radeon HD 6870. But we’ve seen the reference GTX 560 Ti complete the same test at 8dB less; even if we could equalize the power consumption, the GTX 560 Ti reference cooler seems to have an edge over ASUS’s DirectCU II cooler when it comes to noise under load. Cutting the clocks down to Base or Mid levels helps somewhat, but then rendering performance falls away from the GTX 560 Ti.
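As an aside, it’s worth spelling out what an 8dB gap means; these are the standard decibel relationships and general acoustics rules of thumb, nothing specific to our test rig:

```python
# Standard decibel relationships; general acoustics rules of thumb,
# not specific to our meter or test environment.
delta_db = 8.0  # ASUS GTX 560 at 54.7dB vs. the reference GTX 560 Ti

pressure_ratio = 10 ** (delta_db / 20)  # sound pressure ratio: ~2.5x
power_ratio    = 10 ** (delta_db / 10)  # acoustic power ratio: ~6.3x
loudness_ratio = 2 ** (delta_db / 10)   # "10dB ~ twice as loud" rule: ~1.7x

print(f"{pressure_ratio:.1f}x sound pressure, {power_ratio:.1f}x acoustic power,")
print(f"~{loudness_ratio:.1f}x perceived loudness")
```

In short, 8dB is not a subtle difference; the reference cooler is appreciably quieter to the ear, not just on the meter.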

Comments
  • DanNeely - Wednesday, May 18, 2011 - link

    Where exactly are you finding a $300ish 2560x monitor? IIRC even the best sales Dell had on refurb 3007's only dropped as low as $800ish, and with the potential inventory of off-lease 3007's mostly gone by now, and the 3008's and 3011's being significantly more expensive, deals that good aren't likely to repeat themselves in the future.
  • L. - Thursday, May 19, 2011 - link

    My mistake you two ;)

    I was thinking about an LCD panel from idunnowho that had 2xyz * something resolution and that was dirt cheap... obviously 2560*1440 panels aren't common at all and are overpriced.

    On the other hand, you could make the argument for a dual monitor setup below 400 bucks that spans more pixels and thus makes more use of the gfx.
  • Stas - Wednesday, May 18, 2011 - link

    You fail once again. You plan on keeping this card until 20" screens hit 2560x1600/1440? That will probably only take... oh, idk... 10 years?
    And $330 for a decent 2560 screen? Links plz.
  • L. - Thursday, May 19, 2011 - link

    Yes sir, I would like to be aggressive to you too?

    On the other hand ... 10 years ??

    My AMOLED screen on my N900 has a pixel pitch small enough to cram more than 4x full HD onto a 20" panel, something that will happen soon enough as the OLED processes mature.

    Again, my mistake on the monitor price, memory error.
  • L. - Thursday, May 19, 2011 - link

    Who would buy a 200-buck card to play on a single 150-buck monitor when the whole config costs 700+ bucks?

    200 bucks is A_DECENT_AMOUNT_OF_MONEY for a GFX; it means you're a gamer (maybe a poor one, though), and it means you might be interested in dual screens (meh, you spent 700 bucks on the tower, why not 2*150 for dual 22" 1080p monitors?).
  • L. - Tuesday, May 17, 2011 - link

    I'm seeing quite a trend of AMD stuff getting better scores (relatively) on more recent and demanding games, and I'm wondering whether it's time to weight games differently for a better comparison.

    For example here, on the important/demanding/modern games (let's take Metro 2033 and Crysis to have indisputable arguments), the 560 never comes close to a 6950, and even the best version only barely beats the 6870.

    If somebody buys a GFX today, it might be to use it for at least another year, and in that sense the weight of less important games should be diminished a lot, including the HAWX 120fps-fest, other 100+ fps titles, and the clearly nVidia-favoring Civ 5.

    What is important here is that NO ONE has any interest in buying a GTX 560 today, because of the following extremely important points:

    -> AMD offerings do better in more demanding games, and will thus do better in future games
    -> AMD offerings (the 6950 for example) have more memory, which WILL be used in next-gen games for sure, as with every gen
    -> No one cares whether they have 110 or 120 fps in HAWX, which is a console game anyway

    I believe the use of any PC component for gamers can be summarized to this in the end:

    -> can it play this game? 30+
    -> can it play it well? 60+

    Because any of those components will, for most people, still be in use 2 years from now, the fact that older / less-demanding games get 110 fps is completely irrelevant; you might as well show 557 fps in Quake 3 as a benchmark...

    As a summary, could you AnandTech guys please tweak your test list / weighting in order to better inform the less-informed readers of your website?

    It is utter nonsense to state today that a 560 Ti "trades blows" with a 6950, or that a factory-OC'd 560 "comes close" to a 6950 either.

    The one real, true fact is that the 6950 gets a huge win in all demanding titles, has 1GB more vRAM, and can even unlock+OC very easily to levels none of the 560 versions can reach.

    nVidia has done some great stuff in the past, but one has to admit that outside of quad-SLI GTX 580s there is no use in buying anything nVidia this round, as AMD offers better performance + performance/watt at every price point this time around.

    There is one argument for nVidia, and that argument (no, not the drivers, because you do NOT play on Linux) is the nVidia goodies like 3D gaming and other minor stuff.
  • crimson117 - Tuesday, May 17, 2011 - link

    I half agree with you... some of your commentary is good (HAWX lol) but one particular conclusion is not tenable:

    "AMD offerings do better in more demanding games, and will thus do better in future games"

    When Mass Effect 3 comes out, I expect that like Mass Effect 2 it will strongly favor nVidia GPUs - unless they rewrite the entire engine.

    New games cannot be classified into demanding vs. non-demanding - each game engine has its favorite factors, be it clock speed, memory bandwidth, stream processors, ROPs, etc., so I expect each game will have its favorite card.
  • formulav8 - Tuesday, May 17, 2011 - link

    The thing is, in games where people can actually use the horsepower, the Radeon is the best card.

    If you care about getting 500fps on Quake3 instead of 450fps, then the GTX is the better card.
  • lowlymarine - Tuesday, May 17, 2011 - link

    The problem is that if they DON'T completely rewrite the entire engine, Mass Effect 3 will continue to be a festival of even mid-range cards breaking 60 FPS. While there's nothing wrong with that per se - ME2 is one of the better-looking games out there despite being not particularly intensive, after all - it still means that nVidia's slight advantage over AMD in that game is meaningless. Compare that to Crysis, where even the 6970 falls short of 60 FPS at WUXGA, and the sizable lead AMD carries over the competition there has a real, noticeable impact on the game.
  • Stas - Wednesday, May 18, 2011 - link

    Correction:
    New games cannot be classified into demanding vs non-demanding - each game engine has its favorite chip developer. Money and politics will decide performance in certain games.
