Power, Temperature, & Noise

Last but not least as always is our look at power consumption, temperatures, and acoustics. As there isn’t a reference GTX 560, we’re working with what we have: the ASUS GTX 560 DirectCU II Top. For the sake of completeness we’re including our power/temp/noise results at both the Base (810MHz) and Mid (850MHz) clocks on the ASUS card. However, if ASUS is goosing the voltage a bit to hit 925MHz, then it’s likely we’re drawing a bit more power here than a card specifically targeted at those performance levels would.

GeForce GTX 460/560 Series Load Voltage

Card                Load Voltage
GTX 460             1.025v
GTX 560 Ti          1.0v
ASUS GTX 560        1.037v
ASUS GTX 560 OC     1.062v

Looking at voltage quickly, ASUS is running the GTX 560 Top at 1.037v. This is a bit more than any other GF114/GF104 card we’ve seen, but not by a great deal. The voltage difference between the GTX 560 Top and the reference GTX 560 Ti does mean that any power benefit of having an SM disabled is wiped out. In other words, the GTX 560 Top can offer GTX 560 Ti-like performance, but at GTX 560 Ti-like power consumption.
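
As a rough sanity check on that point, a first-order dynamic power estimate bears it out. The sketch below assumes power scales with the number of active SMs, the core clock, and the square of the voltage; the GTX 560 Ti’s 822MHz reference clock and the 8-vs-7 SM counts are our own assumptions rather than anything measured for this article, and leakage and memory power are ignored entirely.

```python
# Back-of-the-envelope dynamic power comparison: P ~ active_SMs * clock * V^2.
# Assumed figures (not from our measurements): GTX 560 Ti at its 822MHz
# reference clock with 8 SMs; GTX 560 Top at 925MHz with 7 SMs. Leakage and
# memory power are ignored, so treat this as a rough estimate only.

def relative_dynamic_power(sms: int, clock_mhz: float, voltage: float) -> float:
    return sms * clock_mhz * voltage ** 2

gtx_560_ti = relative_dynamic_power(8, 822, 1.000)   # reference GTX 560 Ti
gtx_560_top = relative_dynamic_power(7, 925, 1.037)  # ASUS GTX 560 DirectCU II Top

# Roughly 1.06x: the extra clock and voltage cancel out the disabled SM.
print(f"GTX 560 Top vs GTX 560 Ti: {gtx_560_top / gtx_560_ti:.2f}x")
```

On that first-order math, the disabled SM buys back roughly what the higher clocks and voltage cost, which is consistent with the measured results below.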

Idle power consumption looks very good here. The GTX 560 Ti already did well, and now the GTX 560 does even better. The difference ultimately comes down to the power savings realized by disabling an SM.

Starting with our sample card, the ASUS GTX 560, we’ve already hinted at the fact that power consumption between these heavily factory overclocked cards and the GTX 560 Ti will end up being very similar, in accordance with their similar performance. The results deliver on that concept, with the ASUS GTX 560 and the GTX 560 Ti separated by only 7W in the ASUS GTX 560’s favor. Overclocking doesn’t bring the expected ramp-up in power consumption, however; even with the slightly higher clocks and higher voltage, power consumption only rises by 10W for the whole system.

As for our simulated GTX 560 Base and Mid, we can’t say too much. Based on NVIDIA’s specs and the better leakage properties of GF114, there’s no reason why a GTX 560 built for those clocks shouldn’t be able to achieve power consumption similar to (if not better than) the GTX 460 series. We’d get far better data from a suitably lower performing GTX 560 card.

One thing that is clear, however, is that unless power consumption on lower clocked GTX 560s is more than 20W lower, AMD’s general advantage in power consumption goes unchallenged. The same can be said for the Radeon HD 6950, which consumes nearly 18W less than the ASUS GTX 560, even though the latter is often the performance laggard.

Under FurMark the ASUS GTX 560 actually does worse than the GTX 560 Ti, likely due to the ASUS card’s lack of overcurrent protection circuitry and the full impact of operating at a higher voltage. The difference isn’t this pronounced in games, but FurMark hits all the right notes. Along this same train of thought we see our overclocked ASUS GTX 560 consuming a further 20W beyond what the card consumes at factory settings. The overclocked ASUS GTX 560 can usually beat the GTX 560 Ti, but the power consumption is a major tradeoff.

As for our simulated GTX 560 Mid and Base cards, the results are of a similar nature as our Crysis power results. Power consumption is higher than both the GTX 460 series and AMD’s Radeon HD 6800 series due to the voltages involved.

Idle temperatures are largely a function of the cooler being used. The GTX 560 Ti did exceptionally well here, and it’s nearly impossible to compete with it. At 33C the ASUS GTX 560 is among the coolest cards in our regular charts, and yet it can’t catch the GTX 560 Ti.

When looking at ASUS cards, we often see them favoring aggressive cooling over low noise. We’ll get to noise in a bit, but it certainly looks like ASUS’s cooling is as aggressive as, if not more aggressive than, the reference GTX 560 Ti’s. At 71C the ASUS GTX 560 and the GTX 560 Ti are tied, and are both well below a number of other cards in temperature; an impressive feat given the performance we’ve seen earlier. Our simulated cards are a bit cooler, but they would probably be even better if they were at a lower voltage.

Unfortunately FurMark doesn’t look as good as Crysis, thanks in part to AMD’s use of PowerTune on the 6900 series, and the higher power consumption from ASUS’s overvolting making itself felt. 84C under FurMark is not at all bad, as it’s not even close to any sort of critical GPU temperature, but it’s not quite chart topping. It’s also well off what we’ve seen the GTX 560 Ti do, which is 5C cooler at 79C. Further overclocking and overvolting on the ASUS GTX 560 does drive the temperature up to 89C, which means at 1.062v we’re probably applying as much voltage as we can reasonably get away with.

As for our simulated cards, both the GTX 560 Base and GTX 560 Mid are well above their GTX 460 counterparts. Part of this goes back to power consumption, but it also involves just how different their respective coolers are.

It’s rare to see a card not bottom out on our idle noise testing, and the ASUS GTX 560 doesn’t disappoint. At idle it’s whisper quiet, and it can’t be distinguished from our noise floor.

Last but not least is our look at load noise, where ASUS’s generally aggressive cooling finally becomes quantified. At 54.7dB the ASUS GTX 560 is about the middle of the pack, and actually beats the Radeon HD 6870. But we’ve seen the reference GTX 560 Ti complete the same test at 8dB less; even if we could equalize power consumption, the GTX 560 Ti reference cooler seems to have an edge over ASUS’s DirectCU II cooler when it comes to noise under load. Cutting down the clocks to Base or Mid levels helps some, but then rendering performance falls further behind the GTX 560 Ti.
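
To put that 8dB gap in rough perspective, the quick conversion below uses the standard decibel relationships; the GTX 560 Ti figure is simply inferred from the 54.7dB result and the 8dB difference quoted above rather than re-measured, and the loudness approximation (doubling roughly every 10dB) is a textbook rule of thumb, not something derived from our testbed.

```python
# Rough illustration of what an 8dB load noise gap means. Sound pressure
# scales as 10^(delta/20); perceived loudness is commonly approximated as
# doubling every ~10dB, i.e. 2^(delta/10). These are textbook rules of
# thumb, not measurements from our testbed.

asus_gtx_560_db = 54.7                 # measured load noise, ASUS GTX 560
gtx_560_ti_db = asus_gtx_560_db - 8.0  # inferred from the ~8dB gap noted above

delta = asus_gtx_560_db - gtx_560_ti_db
pressure_ratio = 10 ** (delta / 20)    # ~2.5x the sound pressure
loudness_ratio = 2 ** (delta / 10)     # ~1.7x the perceived loudness

print(f"{delta:.1f}dB gap: ~{pressure_ratio:.1f}x sound pressure, "
      f"~{loudness_ratio:.1f}x perceived loudness")
```

In other words, an 8dB gap under load is clearly audible, which is why the DirectCU II cooler’s acoustics stand out despite its strong temperatures.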

Comments

  • L. - Thursday, May 19, 2011

    You'll have some trouble doing an apples to apples comparison between a 580 and a 6970 ...

    The 580 tdp goes through the roof when you OC it, not so much with the 6970.

    The 580 is a stronger gpu than the 6970 by a fair margin (2% @ 2560 to 10+%@1920), does not depend much on drivers.

    The 580 costs enough to make you consider a 6950 crossfire as a better alternative, or even 6970 CF ...

    The day drivers will be comparable is still a few months from now, as both cards are relatively fresh.
  • mosox - Tuesday, May 17, 2011

    And of course the only factory OCed card in there is from Nvidia.

    Can you show me ONE review in which you did the same for AMD? Including a factory OCed card in a general review and compare it to stock Nvidia cards?

    Are you trying to inform your readers or to pander to Nvidia by following their "review guides" to the letter? No transparency in the game selection (again that TWIMTBP-heavy list), OCed cards, what's next? Changing the site's color from blue to green? Letting the people at Nvidia do your reviews and just posting them here?

    :(
  • TheJian - Wednesday, May 18, 2011

    The heavily overclocked card is from ASUS. :)

    NV didn't send them a card. There is no ref design for this card (reference clocks, but not a card). They tested at 3 speeds, giving us a pretty good idea of multiple variants you'd see in the stores. What more do you want?

    Nvidia didn't have anything to do with the article. They put NV's name on the SLOWER speeds in the charts, but NV didn't send them a card. Get it? They down-clocked the card ASUS sent to show what NV would assume they'd be clocked at on the shelves. AT smartly took what they had to work with (a single 560 from ASUS - mentioned as why they have no SLI benchmarks in with 560), clocked it at the speeds you'd buy (based on checking specs online) and gave us the best idea they could of what we'd expect on a shelf or from oem.

    Or am I just missing something in this article?

    Is it Anandtech's problem NV has spent money on devs trying to get them to pay attention to their tech? AMD could do this stuff too if they weren't losing 6Bil in 3 years time (more?). I'm sure they do it some, but obviously a PROFITABLE company (for many more years than AMD - AMD hasn't made a dime since existence as a whole), with cash in the bank and no debt, can blow money on game devs and give them some engineers to help with drivers etc.

    http://moneycentral.msn.com/investor/invsub/result...
    That link should work..(does in FFox). If you look at a 10 year summary, AMD has lost about 6bil over 10yrs. That's NOT good. It's not easy coming up with a top games list that doesn't include TWIMTBP games.

    I tend to agree with the link below. We'd have far more console ports if PC companies (Intel,AMD,Nvidia) didn't hand some money over to devs in some way shape or form. For example, engineers working with said game companies etc to optimize for new tech etc. We should thank them (any company that practices this). This makes better PC games.

    Not a fan of fud, but they mention Dirt2, Hawx, battleforge & Halflife2 were all ATI enhanced games. Assassins Creed for NV and a ton more now of course.
    http://www.fudzilla.com/graphics/item/11037-battle...

    http://www.bit-tech.net/news/gaming/2009/10/03/wit...

    There are many more sites covering both sides' "enhancements" to games by working with devs. It's not straight cash they give, but usually stuff like engineers, help with promotions etc. With Batman, NV engineers wrote the AA part for their cards in the game. It looks better too. AMD was offered the same (probably couldn't afford it, so just complained saying "they made it not like our cards"). Whatever. They paid, you didn't, so it runs better on their cards with AA. More on NV's side due to more money, but AMD does this too.
  • bill4 - Tuesday, May 17, 2011

    Crysis 1, but not Crysis 2? Where are the Witcher 2 benches (ok, that one may have been time constraints)? Doesn't LA Noire have a PC version you could bench? Maybe even Homefront?

    It's the same old ancient tired PC bench staples that most sites use. I can only guess this is because of laziness.
  • Ryan Smith - Tuesday, May 17, 2011

    I expect to be using Crysis 1 for quite a bit longer. It's still the de facto outdoor video card killer. The fact that it still takes more than $500 in GPUs to run it at 1920 with full Enthusiast settings and AA means it's still a very challenging game.

    Crysis 2 I'm not even looking at until the DX11 update comes out. We want to include more games that fully utilize the features of today's GPUs, not fewer.

    LA Noire isn't out on the PC. In fact based on Rockstar's history with their last game, I'm not sure we'll ever see it.

    In any case, the GPU test suite is due for a refresh in the next month. We cycle it roughly every 6 months, though we don't replace every single game every time.
  • mosox - Wednesday, May 18, 2011

    "In any case, the GPU test suite is due for a refresh in the next month."

    Make sure you don't slip in any game that Nvidia doesn't like or they might cut you off from the goodies. 100% TWIMTBP please, no matter how obscure the games are.
  • TheJian - Wednesday, May 18, 2011

    Ignore what's on the box. Go to Metacritic and pick top scoring games from the last 12 months up to now. If the game doesn't get 80+/100 you pass; not enough people like or play anything below that. You could toss in a beta of Duke Forever or something like that if you can find a popular game that's about to come out and has a benchmark built in. There's only a few games that need to go anyway (mostly because newer ones are out in the same series - Crysis 2 w/dx11 update when out).

    Unfortunately mosox, you can't make an AMD list (not a long one) as they aren't too friendly with devs (no money or free manpower, duh), and devs will spend more time optimizing for the people that give them the most help. Plain and simple. If you reversed the balance sheets, AMD would be doing the same thing (more of it than now anyway).

    In 2009 when this cropped up Nvidia had 220 people in a dept. that was purely losing money (more now?). It was a joke that they never made nvidia any money, but they were supplying devs with people that would create physx effects, performance enhancements etc to get games to take advantage of Nvidia's features. I don't have any problem with that until AMD doesn't have the option to send over people to do the same. AFAIK they are always offered, but can't afford it, decline and then whine about Nvidia. Now if NV says "mr gamemaker you can't let AMD optimize because you signed with us"...ok. Lawsuit.
  • mosox - Wednesday, May 18, 2011

    I don't want "AMD games" that would be the same thing. I just don't want obscure games that are fishy and biased as well.

    Games in which a GTX 460/768 is better than a HD 6870 AND they're not even popular - but are in there to skew the final results exactly like in this review. Take out HAWX 2 and LP2 and do the performance summary again.

    Lately in every single review you can see some Nvidia cards beating their AMD counterparts by 2-5% ONLY because of the biased game selection.

    A HW site has to choose between being fair and unbiased and serve its readers or sell out to some company and become a shill for that company.

    HAWX 2 is only present because Nvidia (not Ubisoft!) demanded that. It's a shame.
  • Spoelie - Wednesday, May 18, 2011

    Both HAWX 2 and Lost Planet 2 are not in this review?
  • mosox - Wednesday, May 18, 2011

    I was talking in general. HAWX 2 isn't, but HAWX is. And Civ 5, in which the AMD cards are lumped together at the bottom and there's no difference whatsoever between a HD 6870 and a HD 6950.
