Temperature, Power, & Noise: Hot and Loud, but Not in the Good Way

For all of the gaming and compute performance data we have seen so far, we’ve only seen half of the story. With a 500mm2+ die and a TDP over 200W, there’s a second story to be told about the power, temperature, and noise characteristics of the GTX 400 series.

Idle GPU Temperature

Starting with idle temperatures, we can quickly see some distinct trends among our cards. The top of the chart is occupied solely by AMD’s Radeon 5000 series, whose small dies and low idle power usage let these cards idle at very cool temperatures. It’s not until halfway down the chart that we find our first GTX 400 card, the GTX 470 at 46C. Truth be told we were expecting something a bit better out of it, given that its 33W idle draw is only a few watts over the 5870’s and it has a fairly large cooler to work with. Farther down the chart is the GTX 480, which joins the over-50 club at 51C idle. This is where NVIDIA has to pay the piper on their die size – even the amazingly low idle clockspeeds of 50MHz core/101MHz shader/67.5MHz RAM aren't enough to drop it any further.

Load GPU Temperature - Crysis

Load GPU Temperature - Furmark

For our load temperatures, we have gone ahead and added Crysis to our temperature testing so that we can see both the worst-case temperatures of FurMark and a more normal gameplay temperature.

At this point the GTX 400 series is in a pretty exclusive club of hot cards – under Crysis the only other single-GPU card above 90C is the 3870, and the GTX 480 SLI is the hottest of any configuration we have tested. Even the dual-GPU cards don’t get quite this hot. In fact it’s quite interesting that, unlike FurMark, there’s a much larger spread among card temperatures here, which only makes the GTX 400 series stand out more.

While we’re on the subject of temperatures, we should note that NVIDIA has changed the fan ramp-up behavior from the GTX 200 series. Rather than reacting immediately, the GTX 400 series fans have a ramp-up delay of a few seconds when responding to high temperatures, meaning you’ll actually see those cards get hotter than our sustained temperatures. This won’t have any significant impact on the card, but if you’re like us your eyes will pop out of your head at least once when you see a GTX 480 hitting 98C on FurMark.
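NVIDIA hasn’t documented the control logic behind this behavior, but the effect we observed can be sketched as a simple controller that waits for the temperature to stay high before spinning up. The threshold, delay, and fan speeds below are all assumed values for illustration, not NVIDIA’s actual parameters:

```python
# Hypothetical sketch of a fan controller with a ramp-up delay, matching
# the behavior we observed on the GTX 400 series. All numbers here are
# illustrative assumptions, not NVIDIA's real control parameters.

def fan_speed(temp_history, threshold=90, delay_samples=3,
              low_speed=40, high_speed=100):
    """Return a fan speed (%) given recent 1 Hz temperature samples.

    The fan only ramps up once the temperature has stayed above the
    threshold for `delay_samples` consecutive samples, so a short spike
    briefly overshoots before the fan responds - which is why peak
    temperatures can exceed the sustained readings in our charts.
    """
    recent = temp_history[-delay_samples:]
    if len(recent) == delay_samples and all(t > threshold for t in recent):
        return high_speed
    return low_speed
```

With a delay like this, a card that jumps from 85C to 95C keeps its fan at the low speed for a few seconds, letting the temperature overshoot before the ramp kicks in.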

Idle Power Consumption

Up next is power consumption. As we’ve already discussed, the GTX 480 and GTX 470 have an idle power consumption of 47W and 33W respectively, putting them out of the running for the least power hungry of the high-end cards. Furthermore the 1200W PSU we switched to for this review has driven up our idle power load a bit, which serves to suppress some of the differences in idle power draw between cards.

With that said, the GTX 400 series either does decently or poorly, depending on your point of view. The GTX 480 is below our poorly-idling Radeon 4000 series cards, but well above the 5000 series. Meanwhile the GTX 470 is in the middle of the pack, sharing space with most of the GTX 200 series. The lone outlier here is the GTX 480 SLI: without an equivalent to the power saving mode AMD uses for idle Crossfire cards, the GTX 480 SLI sits all alone at a total power draw of 260W when idle.

Load Power Consumption - Crysis

Load Power Consumption - Furmark

For load power we have Crysis and FurMark, the results of which are quite interesting. Under Crysis not only is the GTX 480 SLI the most demanding card setup, as we would expect, but the GTX 480 itself isn’t too far behind. As a single-GPU card it draws more power than either the GTX 295 or the Radeon 5970, both of which are dual-GPU cards. Farther up the chart is the GTX 470, which is the second most power-hungry of our single-GPU cards.

Under FurMark our results change ever so slightly. The GTX 480 manages to get under the GTX 295, while the GTX 470 falls in the middle of the GTX 200 series pack. A special mention goes out to the GTX 480 SLI here, which at 851W under load is the greatest power draw we have ever seen for a pair of GPUs.

Idle Noise Levels

Idle noise doesn’t contain any particular surprises since virtually every card can reduce its fan speed to near-silent levels and still stay cool enough. The GTX 400 series is within a few dB of our noise floor here.

Load Noise Levels

Hot, power hungry things are often loud things, and there are no disappointments here. At 70dB the GTX 480 SLI is the loudest card configuration we have ever tested, while at 64.1dB the GTX 480 is the loudest single-GPU card, beating out even our unreasonably loud 4890. Meanwhile the GTX 470 is in the middle of the pack at 61.5dB, coming in amidst some of our louder single-GPU cards and our dual-GPU cards.

Finally, with this data in hand we went to NVIDIA to ask about the longevity of their cards at these temperatures, as seeing the GTX 480 hitting 94C sustained in a game left us worried. In response NVIDIA told us that they have done significant testing of the cards at high temperatures to validate their longevity, and their models predict a lifetime of years even at temperatures approaching 105C (the throttle point for GF100). Furthermore, as they note, they have shipped other cards that run roughly this hot, such as the GTX 295, and those cards have held up just fine.
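NVIDIA hasn’t shared the model behind these lifetime predictions. Reliability engineers commonly relate temperature to aging rate with an Arrhenius-style acceleration factor; as a rough sketch (the 0.7eV activation energy is a generic assumption, not NVIDIA’s figure):

```python
import math

# Rough Arrhenius-style illustration of how temperature accelerates
# component aging. The activation energy is a generic assumption and
# this is not NVIDIA's actual reliability model.

def acceleration_factor(t_use_c, t_stress_c, ea_ev=0.7):
    """Relative aging rate at t_stress_c versus t_use_c (Arrhenius model)."""
    k = 8.617e-5  # Boltzmann constant in eV/K
    t_use = t_use_c + 273.15      # convert Celsius to Kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / k) * (1.0 / t_use - 1.0 / t_stress))
```

Under these assumed numbers, running at 94C instead of 70C speeds up aging by a mid-single-digit factor, which is why a card validated for years at high temperatures can still ship with confidence.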

At this point we don’t have any reason to doubt NVIDIA’s word on this matter, but that wouldn’t stop us from taking appropriate precautions. Heat does impact longevity to some degree, so we would strongly consider getting a GTX 480 with a lifetime warranty to hedge our bets.


  • Saiko Kila - Sunday, March 28, 2010 - link

    These MSRPs are not entirely, I mean historically correct... The first MSRP (list price) for HD 5850 was $259, and that was price you had to pay when buying on sites like newegg (there were some rebates, and some differences depending on manufacturer, but still you had to have a very potent hunting sense to get a card of any manufacturer, I got lucky twice). Shortly after launch (about one month, it was October) the MSRP (set by AMD) hiked to $279 and problems with supply not only continued but even worsened. Now, since November 2009, it's $299. HD 5870 followed generally similar path, though HD 5850 hiked more, which is no wonder. Note that this is for reference design only, some manufacturers had higher MSRPs, after all AMD or nvidia sell only chips and not gaming cards.

    If you believe anandtech, here you've got a link, the day the cards were announced:
    http://www.anandtech.com/video/showdoc.aspx?i=3643

    The whole pricing things with HD 5xxx series is quite unusual (though not unexpected) since normally you'd anticipate the street price to be quite lower than MSRP, and then to drop even further, and you would be right. I remember buying EVGA GTX260 just after its launch and the price was good $20 lower than suggested price. That's why we need more competition, and for now the outlook isn't very bright, with nvidia not quite delivering...


    And these European prices - most if not all European countries have a heavy tax (VAT), this tax is always included and you have to pay it, there are other taxes too. In the US the sales tax is not included in the street price, and usually you can evade it after all (harder for Californians). Europeans usually get higher prices. Comparing US prices is thereby better, particularly in us dollars (most electronics deliveries are calculated in dollars in Europe). So the prices in the rest of the world were also boosted, even in Europe, despite weak dollar and other factors :)

    One note - HD5xxx cards are really very big and most of them have very unfriendly location of power sockets, so you'd expect to pay more for a proper, huge case. Also note that if you have a 600 W PSU or so you'd be smarter to keep it and not upgrade, unless REALLY necessary. The lower load means lower efficiency, especially when plugged to 115V/60Hz grid. So if you have a bigger PSU you pay more for electricity. And it seems that more gamers are concerned with that bill than in any time before... You couldn't blame them for that and it's sad in its own way.
  • LuxZg - Tuesday, March 30, 2010 - link

    Well, current MSRP is like I wrote it above. If there is no competition and/or demand is very high, prices always tend to go up. We're just lucky it's not happening often because in IT competition is usually very good.

    As for European prices, what do taxes have to do with it? We've got 23% taxes here, but it's included in all prices, so if nVidia goes up 23% so do AMD cards as well. If I'm looking at prices in the same country (and city, and sometimes store as well), and if nVidia is 300$ and ATI is 100 and 500, then I just can't compare them and say "hey, nVidia is faster than this 100$ ATI card, I'll buy that"... no, you can't compare like that. Only thing you can do in that case is say something like "OK, so I have 300$ and the fastest I can afford is nVidia" .. or "I want the fastest there is, and I don't mind the cost" and you'll take the HD5970 then. Or you can't afford any of those. So again, I don't get why cards in this review are so rigidly compared one to another as if they have the exact same price (or +/- 10$ difference). And on one hand they compare a MORE expensive nVidia card to a QUITE CHEAPER AMD card, but won't compare that same nVidia card to a more expensive AMD card.. WHY?

    And AMD cards are no bigger than nVidia ones, and last time I checked a bigger case is way way cheaper than a new PSU. And I'm running my computer on, get this, a 450W PSU, so I'm not wasting any excessive power on inefficiencies at low loads ;) And since this PSU keeps an overclocked HD4890 running, it should work just fine with a non-overclocked HD5870. While I'm pretty sure that a GTX470 would already mean a new PSU, a new PSU that costs ~100$/80€ .. So I'd pay more $ in total, and get a slower card.

    Again, I'm not getting why there's such a rigid idea of GTX470=HD5850 & GTX480=HD5870 ..
  • LuxZg - Saturday, March 27, 2010 - link

    Just re-read the conclusion.. something lacks in this sentence:
    "If you need the fastest thing you can get then the choice is clear, .."
    Shouldn't it finish with "... choice is clear, HD5970..."? That's what I'm saying, the HD5970 wasn't mentioned in the entire conclusion. Past are the days of the "single-GPU crown" .. That's just for nVidia to feel better. ATI doesn't want the "single GPU crown", they want the fastest graphics CARD. And they have it.. Serious lack in this article, serious.. And again, there is the exact same amount of money dividing the GTX480 and HD5870 as between the GTX480 and HD5970..
  • blindbox - Saturday, March 27, 2010 - link

    I know this is going to take quite a bit of work, but can't you colour up the main cards and its competition in this review? By main cards, I mean GTX 470, 480 and 5850 and 5870. It's giving me a hard time to make comparison. I'm sure you guys did this before.. I think.

    It's funny how you guys only coloured the 480.

    PS: I'm sorry for the spam, my comments are not appearing, and I'm sorry for replying to this guy when it is completely off topic, lol.
  • JarredWalton - Saturday, March 27, 2010 - link

    Yes, it did take a bit of work, but I did it for Ryan. The HD 5870/5970 results are in orange and the 5850 is in red. It makes more of a difference on crowded graphs, but it should help pick out the new parts and their competition. I'm guessing Ryan did it to save time, because frankly the graphing engine is a pain in the butt. Thankfully, the new engine should be up and running in the near future. :-)
  • Finally - Saturday, March 27, 2010 - link

    Further improvement idea:
    Give the dual-chip/SLI cards also another colour tone.
  • lemonadesoda - Sunday, March 28, 2010 - link

    No. Keep colouring simple. Just 3 or 4 colours max. More creates noise. If you need to highlight other results, colour the label, or circle or drop shadow or put a red * at the end.

    Just NO rainbow charts!
  • IceDread - Tuesday, March 30, 2010 - link

    The article does not contain the hd 5970 in CF. The article does not mention the hd 5970 at all under the conclusion. This is really weird. It is my belief that anandtech has become pro nvidia and is no longer an objective site. Objectivity is looking at performance + functionality / price. The HD 5970 is a clear winner here. After all, who cares if a card has 1, 2 or 20 gpus? It's the performance / price that matters.
  • Kegetys - Tuesday, March 30, 2010 - link

    According to a test on legitreviews.com, having two monitors attached to the card causes the idle power use to rise quite a bit. I guess the anand test is done with just one monitor attached? It would be nice to see power consumption numbers for dual monitor use as well. I don't mind high power use during load, but if the card does not idle properly (with two monitors) then that is quite a showstopper.
  • Ryan Smith - Wednesday, March 31, 2010 - link

    I have a second monitor (albeit 1680) however I don't use it for anything except 3D Vision reviews. But if dual monitor power usage is going to become an issue, it may be prudent to start including that.
