Power, Temperature, & Noise

Last but not least as always is our look at power consumption, temperatures, and acoustics. As there isn't a reference GTX 560, we're working with what we have: the ASUS GTX 560 DirectCU II Top. For the sake of completeness we're including our power/temperature/noise results at both the Base (810MHz) and Mid (850MHz) clocks on the ASUS card. However, if ASUS is goosing the voltage a bit to hit 925MHz, then we're likely drawing a bit more power here than a card specifically targeted for those performance levels would.

GeForce GTX 460/560 Series Load Voltage
  GTX 460:          1.025v
  GTX 560 Ti:       1.0v
  ASUS GTX 560:     1.037v
  ASUS GTX 560 OC:  1.062v

Looking quickly at voltage, ASUS is running the GTX 560 Top at 1.037v. This is a bit more than any other GF114/GF104 card we've seen, but not by a great deal. The voltage difference between the GTX 560 Top and the reference GTX 560 Ti does mean that any power benefit from having an SM disabled is wiped out. In other words, the GTX 560 Top can offer GTX 560 Ti-like performance, but at GTX 560 Ti-like power consumption.
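As a rough sanity check on those numbers, first-order CMOS dynamic power scales with clock frequency and the square of core voltage (P ∝ f·V²). A minimal sketch using only the voltages from the table above; the helper function and its defaults are our illustration, not anything from NVIDIA or ASUS:

```python
def dynamic_power_scale(v_old, v_new, f_old=1.0, f_new=1.0):
    """Relative change in dynamic power under the first-order
    CMOS approximation P ~ f * V^2."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Going from the GTX 560 Top's 1.037v to the 1.062v overclock,
# holding clocks equal to isolate the voltage contribution:
factor = dynamic_power_scale(1.037, 1.062)
print(f"Voltage alone raises dynamic power by ~{(factor - 1) * 100:.1f}%")
```

That works out to roughly 5% from voltage alone before the higher clocks are factored in, which lines up with the modest measured increases we see below.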

Idle power consumption looks very good here. The GTX 560 Ti already did well, and now the GTX 560 does even better. The difference ultimately comes down to the power savings realized by disabling an SM.

Starting with our sample card, the ASUS GTX 560, we've already hinted at the fact that power consumption between these heavily factory-overclocked cards and the GTX 560 Ti will end up being very similar, in accordance with their similar performance. The results deliver on that concept, with the ASUS GTX 560 and the GTX 560 Ti separated by only 7W in the ASUS GTX 560's favor. Overclocking doesn't bring the expected ramp-up in power consumption, however; even with the slightly higher clocks and higher voltage, power consumption only rises by 10W for the whole system.

As for our simulated GTX 560 Base and Mid, we can’t say too much. Based on NVIDIA’s specs and the better leakage properties of GF114, there’s no reason why a GTX 560 built for those clocks shouldn’t be able to achieve power consumption similar to (if not better than) the GTX 460 series. We’d get far better data from a suitably lower performing GTX 560 card.

One thing that is clear, however, is that unless power consumption on lower-clocked GTX 560s is more than 20W lower, AMD's general advantage in power consumption goes unchallenged. The same can be said for the Radeon HD 6950, which consumes nearly 18W less than the ASUS GTX 560, even though the latter is often the performance laggard.

Under FurMark the ASUS GTX 560 actually does worse than the GTX 560 Ti, likely due to the ASUS card's lack of overcurrent protection circuitry and the full impact of operating at a higher voltage. The difference isn't this pronounced in games, but FurMark hits all the right notes. Along this same train of thought we see our overclocked ASUS GTX 560 consuming a further 20W beyond what the card draws at factory settings. The overclocked ASUS GTX 560 can usually beat the GTX 560 Ti, but the power consumption is a major tradeoff.

As for our simulated GTX 560 Mid and Base cards, the results are of a similar nature as our Crysis power results. Power consumption is higher than both the GTX 460 series and AMD’s Radeon HD 6800 series due to the voltages involved.

Idle temperatures are largely a function of the cooler being used. The GTX 560 Ti did exceptionally well here, and it’s nearly impossible to compete with it. At 33C the ASUS GTX 560 is among the coolest cards in our regular charts, and yet it can’t catch the GTX 560 Ti.

When looking at ASUS cards, we often see them favoring aggressive cooling over low noise. We'll get to noise in a bit, but it certainly looks like ASUS's cooling is as aggressive as, if not more aggressive than, the reference GTX 560 Ti's. At 71C the ASUS GTX 560 and the GTX 560 Ti are tied, and both are well below a number of other cards in temperature; an impressive feat given the performance we've seen earlier. Our simulated cards are a bit cooler, but they would probably do even better at a lower voltage.

Unfortunately FurMark doesn't look as good as Crysis, thanks in part to AMD's use of PowerTune on the 6900 series, and to the higher power consumption from ASUS's overvolting making itself felt. 84C under FurMark is not at all bad, as it's not even close to any sort of critical GPU temperature, but it's not quite chart-topping. It's also well off what we've seen the GTX 560 Ti do, which is 5C cooler at 79C. Further overclocking and overvolting on the ASUS GTX 560 does drive the temperature up to 89C, which means at 1.062v we're probably applying as much voltage as we can reasonably get away with.

As for our simulated cards, both the GTX 560 Base and GTX 560 Mid are well above their GTX 460 counterparts. Part of this goes back to power consumption, but it also involves just how different their respective coolers are.

It’s rare to see a card not bottom out on our idle noise testing, and the ASUS GTX 560 doesn’t disappoint. At idle it’s whisper quiet, and it can’t be distinguished from our noise floor.

Last but not least is our look at load noise, where ASUS’s generally aggressive cooling finally becomes quantified. At 54.7dB it’s about the middle of the pack, and actually beats the Radeon HD 6870. But we’ve seen the reference GTX 560 Ti complete the same test at 8dB less – even if we could equalize the power consumption the GTX 560 Ti reference cooler seems to have an edge over ASUS’s DirectCU II cooler when it comes to noise under load. Cutting down the clocks to Base or Mid levels helps this some, but then rendering performance shrinks away from the GTX 560 Ti.
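For context on what that 8dB gap means, decibels are logarithmic: every 10dB is 10x the sound power, and is commonly taken as roughly a doubling of perceived loudness. A quick back-of-the-envelope conversion; these rules of thumb are standard acoustics, not figures from our testing:

```python
# Interpreting the ~8dB(A) load-noise gap between the ASUS GTX 560 Top
# and the reference GTX 560 Ti. dB is logarithmic: +10dB = 10x sound
# power, and roughly 2x perceived loudness as a common rule of thumb.
delta_db = 8.0
power_ratio = 10 ** (delta_db / 10)      # ratio of sound power
loudness_ratio = 2 ** (delta_db / 10)    # rough perceived-loudness ratio
print(f"{power_ratio:.1f}x the sound power, ~{loudness_ratio:.2f}x as loud")
```

In other words, the ASUS card emits over six times the sound power under load and comes across as roughly three-quarters again as loud, which is why the gap is so noticeable in practice.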

66 Comments
  • L. - Thursday, May 19, 2011 - link

    You're almost there:

    You need an n-level classification with multipliers on scores i.e. :

    Game A, favors nVidia (score * .9), is demanding (score * 1) -> score = base score *.9

    Game B favors AMD (score * .9), is not demanding (score * .5) -> score = base score * .45

    And so on.

    Of course money and politics play, but that has no importance to gamers, they just want the best out of what they pay, and if some company plays better politics to offer a better result, so be it.
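The n-level classification the commenter proposes could be sketched as follows. This is a hypothetical illustration; the multiplier values are the ones suggested in the comment above, not anything the site actually uses:

```python
# Sketch of the proposed weighted-score idea: each game gets a
# multiplier for architecture bias and one for how demanding it is;
# the final score is the base benchmark score times both.

WEIGHTS = {
    # game: (vendor_bias_multiplier, demand_multiplier) -- example values
    "Game A": (0.9, 1.0),   # favors nVidia, demanding
    "Game B": (0.9, 0.5),   # favors AMD, not demanding
}

def weighted_score(game, base_score):
    bias, demand = WEIGHTS[game]
    return base_score * bias * demand

print(f"{weighted_score('Game A', 100):.1f}")  # base score * .9
print(f"{weighted_score('Game B', 100):.1f}")  # base score * .45
```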
  • Ryan Smith - Tuesday, May 17, 2011 - link

    The test suite is due for a refresh, and will either be updated at the end of this month or next month once we build our new SNB testbed.
  • L. - Wednesday, May 18, 2011 - link

    Thanks :)
  • TheJian - Wednesday, May 18, 2011 - link

    OK. I get it. If a game runs better on Nvidia we should just consider it old trash and never benchmark it. Regardless of the fact that it's arguably one of the most important franchises in history. CIVILIZATION 5 (released Sept 21, 2010, expansion packs released even later I'm sure, always usually 2 packs released). Even though it was released 2 years after your crysis warhead (2008). Civ5 will continue to be relevant for another year or two (if not more, people play these for years).

    Sid Meier didn't make Nvidia's cards run better on it either. Nvidia let out MULTITHREADED RENDERING. Umm, that's a DirectX 11 feature isn't it? But isn't Civ5 an old junk game we shouldn't benchmark? LOL.
    http://forums.anandtech.com/showpost.php?p=3152067...
    Ryan Smith (from here at AT) explaining just a month ago (exactly today) how Nvidia sped up Civ5. It doesn't favor them. They are just making their cards RUN THE WAY THEY SHOULD. Full performance, with multithreading. NOTE: Ryan says it's NOT a sweet cheat they came up with. The cards should run like this!

    "For example here, on the important/demanding/modern games". Again I guess we have a different definition of modern. 2008 vs. 2010. I'd say you got it backwards, they didn't test crysis 2 here. Read the comments from Ryan. This game can harness 12 threads from the cpu. Firaxis thinks they're gpu limited! Check this:

    "At this point in time we appear to be GPU limited, but we may also be CPU limited. Firaxis says Civ V can scale to 12 threads; this would be a hex-core CPU with hyperthreading. Our testbed is only a quad-core CPU with HT, meaning we probably aren't maxing out Civ V on the CPU side. And even with HT, it's likely that 12 real cores would improve on performance relative to 6 cores + HT. Firaxis believes they're GPU limited, but it's hard to definitively tell which it is."

    Nope not very modern. Dump this game AT. ;) Obviously hawx is old (Tom's tests hawx2, but the game sucks). Personally I'd rather benchmark based on popularity. How many ever played hawx or hawx2? If a large portion of us are playing it, you should benchmark it (of course you can only bench so many, the point is don't bench it if nobody plays it). I agree you should just NOT benchmark anything over 200fps (draw the line where you want). At this speed nobody has to worry about playing on even a $100 card most likely so nobody cares (can you even call yourself a gamer if you don't have at least discrete card? $50 $100 card?).

    Metro 2033 is 6 months OLDER than Civ 5 and scored an avg of 10pts less at metacritic than Civ5. Nuff said? Will you still be complaining about Civ5 when AMD turns on Multithreading on AMD's cards? LOL.

    Maybe they should test crysis 2? Head over to Tomshardware where the 560 beats your beloved 6870 by a good 10% in 1920x1200, even more when you look at the MINIMUM FPS. Which to me is far more important than any other number as games suck below 30fps (AMD's will dip here where Nvidia usually dips less), but everyone can play fine at 200fps.

    Crimson117 pretty much said it best describing a games favorite features and why they run better on one architecture or another. IF memory serves, mass effect/2/3 are based off unreal 3 engine (unreal 4 coming 2012 I think, but totally aimed at Xbox720? whatever they call it, so says Sweeney). Many games in development are based on that engine. It's a good indicator of unreal 3 engine games, current and future releases.

    The 6950 is not the direct competition for the 560 (oh, and are you completely unaware of the fact that half of the cards for sale on newegg only have 1GB of memory just like the 560?...whatever). You do have to pay an extra amount for 2GB, which start at about $240. Meanwhile newegg has a good 1/2 dozen 560's for under $200. Apple's/Oranges? Oh, everyone that buys a new card hopes it will still be relevant at least a year later...I thought that went without saying?

    "The one real true fact is the 6950 gets a huge win in all demanding titles, has 1GB more vRAM and can even unlock+OC very easily to levels none of the 560 versions can reach."

    Umm, correct, because it's another $40, I'd expect that. That's 20% more cost. Does it perform 20% better in every game? Must not, or AT wouldn't be saying it "trades blows with 6950" correct? Dirt 2 from Dec2009 (just a 3 months older than metro2033) is dominated by nvidia. Sorry. Suspect dirt3 will too. Wolfenstein (representing Id's Tech4 engine, until RAGE) is again dominated by Nvidia (no surprise, it's OpenGL based which Nvidia has always been good at). Rage could change all that, but currently (meaning NOW and RELEVANT) it's tech4. Again wolf's not much older (Aug2009) than metro 2033 (about 7 months?). DirectCompute for DirectX 11 looks like Nvidia dominates a 6970 there too (even base model beats 6970). Hmm. They don't have DX12 yet right? At idle the card is great power wise (leads all), under crysis it looks fine at load (overclocking only raises it 10w, very nice - AMD obviously still has a watts advantage in my mind - but nice improvement over 460 perf/watt wise). Great idle noise (80% of my pc time isn't in games), leads all. Load noise blows away my 5850 and it's nowhere near driving me out of my room. I love the 5850. So I'd happily take a 560 at 3-5 less DB's of noise if I were in the market now.

    Have you seen Physx in Alice? Very cool. I don't think AMD has that (wish NV would give or lic it to them). Watch E3, looks like some of Nvidia's stuff isn't bad... :) DUKE in 3D? Nvidia pretty cool?

    FYI: as noted I own a radeon 5850 from Amazon.com :) (took them 7-8 months to fill my backorder but $260 was still $40 less at the time). I own it purely because noise/heat was a winner at the time, price awesome, perf was a wash and really depended on your favorite games. Both sides have great tech. It just depends on what you play. I'll be buying something in Dec, but honestly have no idea which side will get my money NOW. It will depend on what I'm playing THEN...LOL. OH, some people do use linux. :) I'm more than happy about the fact that AT sees fit not to make their benchmarking picks based on making AMD a winner every time...Thanks AT. Between here, toms and a few others I cover about 40 games and always know which card is best for me at the time I buy. :) That's the point of AT etc right? ahh....Trolls...Fanboys... :(
  • L. - Wednesday, May 18, 2011 - link

    Interesting reply, but I am no AMD fanboy, i'm a bang for the buck fanboy that's all.
    Quick reply to your stuff :
    Civ 5 is older tech than Metro 2033, nobody cares about the release date.
    Civ 5 is also NOT a very relevant game, as there are others in the RTS genre which have much much more success (SC2 anyone ? ).

    Dirt 2 is also irrelevant as it gets the "lots-of-fps" syndrome which does not help any card make a big difference at all.

    Wolfenstein is as relevant as it is on my hdd, sitting there not being played.

    Where I cite the 6950 as competition (I can buy the 2gig version for 215 bucks, thanks), I assume we are talking about the upper range 560OC and 560 TI which are the cards I talk about when comparing apples to apples.

    AS a summary, your pricing argument does NOT stand (1/2 cards on newegg = all shit cards, no 560ti and no asus 560OC).

    While it is important to keep an eye on games that favor one architecture or another, these 'wins' should definitely weight less, especially when they are temporary (as you said about civ5, nvidia released a special patch just for that game, maybe amd will do the same, who knows), be they amd or nvidia wins.

    I like nVidia, I like their fermi idea, I like a lot of the stuff they do, but that does not change a thing about the current gamer gfx market : nvidia = more watts and more dollars for the same average performance in modern games.

    And no, civ5 will not be relevant a year from now, it's part of those games that die quick, it's not made by blizzard and it's not named sc2, the next few decent rts will wash it out, as for all past civ's and "normal" rts's.
  • Iketh - Wednesday, May 18, 2011 - link

    hey now, respect to Age of Empires baby!!!
  • Stas - Wednesday, May 18, 2011 - link

    AoE is the shit
  • TheJian - Wednesday, May 18, 2011 - link

    Bang for buck boy wouldn't make an across the board statement about Nvidia's product line being all trash. There are more than a few great buys on either side and not a lot of competition in the $200 range for the reviewed card here (so say AT in this article). Steampowered shows stats, and Civ5 is in the top 10 EVERY NIGHT. Just checked, 16,000 playing NOW, and that usually doubles or so at night (last I checked that is, months ago - but day stats look pretty much the same. As soon as an expansion comes out these numbers will swell again. If people can be proved to be playing it NOW (every night), I think it's relevant. Note METRO 2033 isn't even there, oops my mistake....500 peak players (that's it down near bottom in like 98th place)...ROFL. Really? Civ4 Beyond the sword has more people playing right now. People just quit playing though, because Civ5 sucks. /sarcasm/

    Civ5's re-playability is awesome. People don't quit playing it quickly. Expansion packs add even more to it. Agreed, bench SC2 also. We're clear on Civ5 not being a forgotten game now right? Top 10 played games every night on steam. Nuff said? It's not Starcraft 2, you're right, won't be played for 10 years probably. But I'd venture to guess they'll play it at least as long as Civ4 (yep, still playing it MORE than metro2033...ROFL). Its first expansion isn't even out yet.

    Wolfenstein (like it or not) is the latest OpenGL game out. So it's a good indicator of OpenGL performance (the only indicator we can get now, until Tech5, or another OpenGL game comes out). This is it. It's not really about the game, so much as the engine under it. When possible they test games based on engines that are being used now and in many games in the future. Prey2 and Brink use it too. Prey 2 coming 2012, Brink of course just out. Tech 4 still relevant eh? Games still being made with it. We shouldn't bench games that will give us an idea of a future games performance though...This sites for benching games you like. Test Brink then. You're just not making a good argument yet :)

    Please provide the LINK for the $215 2GB 6950 radeon? I can't get it off newegg for less than $239 after rebates.
    "AS a summary, your pricing argument does NOT stand (1/2 cards on newegg = all shit cards, no 560ti and no asus 560OC)." I wonder about the clocks on your card if you actually give me a link to it.

    Um, I quoted the "$hit cards" as you call them because that's what AT is reviewing here today. It's a review of the GTX 560, not the 560TI (never compared the TI, it's not the reviewed product). The ASUS TOP card is only $220 at newegg. ASUS also has a higher than tested here MID card that's only $199! 850/4104 (GTX 560 Mid), but this card is 850/4200. Not much faster (100mhz on the memory) but not a BASE card by any means.

    Uh, it's completely UNIMPORTANT to keep an eye on games that favor one architecture over the other and then weight them. LOL. Your problem is as a fanboy you just can't get over any game running better on your enemies product. It's IMPORTANT to be testing games that are using today engines (that will be used tomorrow in games based on said engines), or at least the top games out now that are heavily played. I don't care who's better at what. And if one wins over the other after benching it, we don't just throw it out because NV happens to win it.

    NO, I Didn't say nvidia released a special patch. Nvidia FIXED their drivers to work properly, AMD is currently trying to do the same. They're late. Sorry. Nvidia got there first (and should affect many other games, it's not a patch for this game, it's fully in the DX driver info box now). AMD is failing to produce a multithreading driver for DX currently so NV looks good in Civ5 (for now). Jeez did you read Ryan's post? It's NOT a cheat patch or something.

    I guess you should just make up a list of games for them that run great on AMD. You'll be happy then :) I won't care as long as you can prove they're very popular and played a lot now, or based on an engine that will be in games coming up shortly. FYI: DIRT2 is a DirectX 11 racer. The EGO engine is used in RaceDriver:GRID,F1 2010 & Flashpoint:Dragon Rising as well. Oh, Dirt3 and GRID2, F1 2011 and Flashpoint:Red River will be running EGO 2.0. Yep, games not out yet. So by testing one game we have a decent idea of play in 8 EGO engine based games. Though F1 2010 is probably better to indicate this as it runs EGO 1.5. But still all in the family here and very relevant. GTX 580 gets just over 110fps. The tested card is around 72. I wouldn't call this a 500fps waste game. Don't forget they aren't posting MINIMUM's here (occasionally, but I wish they ALWAYS included them as it affects gameplay below 30fps).
  • L. - Thursday, May 19, 2011 - link

    There are no great buys on the nVidia side at the moment, just go read all the relevant reviews and compare.

    16.000 people, double at night ? lol ???

    When I played EvE Online we were over 40.000 at night, and it's surely grown bigger since.. your game is irrelevant, thanks.

    Metro 2033 being not popular because it's not a multi game makes no difference : it IS a demanding game and a GOOD benchmark, noone cares if it sucks or not. It's like far cry .. it wasn't a very good game but it was a perfect benchmark.

    A game cannot be considered relevant or "played" when you have less than 5.000 people playing it online.. otherwise you could go ahead and say Command and Conquer Generals : Zero Hour is still alive ... and i can assure you it's totally dead, swarmed by cheaters and unpatched flaws.

    Wolfenstein didn't seem all that beautiful to me .

    The fact that you are unable to find good suppliers is your problem not mine, I'm not going to give you everything I searched for just because you're flaming around.

    Err .. it's a gigabyte 6950 2gb .. clocks and hardware is of course perfect.

    Well, base cards, shit cards, whatever cards, all cards but the top one don't ever come close to a hd6950 that costs at most 20 bucks more. and if you're looking on the cheap side, the 6870 eats the lowish 560's .. done.

    Top games being played has nothing to do with benchmarking (cfr far cry, crysis ...).
    Do you play unigine a lot ? ^^
    Games favoring one architecture strongly should be weighted less because they are not representing the gfx but rather the driver / gfx calls picked for the engine.

    The fact that nVidia patched is great, AMD will too, and thus it is not very relevant to review a game w/ and w/o patched drivers to compare two HARDWARE pieces, even if it's clear AMD tends to be late on driver patches.
  • YukaKun - Tuesday, May 17, 2011 - link

    I know that card ain't aimed towards the "enthusiast" crowd, but doing a lil' tandem on multi display config would have been very nice. At least, let 'em be on SLI for larger resolutions.

    Cheers!
