Temperature, Power, & Noise: Hot and Loud, but Not in the Good Way

For all of the gaming and compute performance data we’ve examined so far, we’ve still only covered half of the story. With a 500mm²+ die and a TDP over 200W, there’s a second story to be told about the power, temperature, and noise characteristics of the GTX 400 series.

Idle GPU Temperature

Starting with idle temperatures, we can quickly see some distinct groupings among our cards. The top of the chart is occupied solely by AMD’s Radeon 5000 series, whose small dies and low idle power usage let these cards idle at very cool temperatures. It’s not until halfway down the chart that we find our first GTX 400 card, the GTX 470 at 46C. Truth be told, we were expecting something a bit better out of it, given that its 33W idle power draw is only a few watts over the 5870’s and it has a fairly large cooler to work with. Farther down the chart is the GTX 480, which joins the over-50 club at 51C idle. This is where NVIDIA has to pay the piper on their die size – even the amazingly low idle clockspeeds of 50MHz core, 101MHz shader, and 67.5MHz RAM aren’t enough to drop it any further.
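
To put some rough numbers to why such aggressive downclocking matters, dynamic (switching) power in CMOS logic scales approximately as C·V²·f. The sketch below uses entirely hypothetical capacitance and voltage figures – NVIDIA publishes neither – purely to illustrate the scaling, and it deliberately ignores static leakage, which grows with die size and is a big part of why GF100 still idles at 47W.

```python
# Illustrative CMOS dynamic-power scaling: P_dyn ~ C * V^2 * f.
# The capacitance and voltage values are hypothetical placeholders,
# not NVIDIA specifications; static leakage is deliberately ignored.

def dynamic_power(c_farads, volts, hertz):
    """Switching power of CMOS logic: P = C * V^2 * f."""
    return c_farads * volts ** 2 * hertz

C_EFF = 250e-9  # effective switched capacitance, hypothetical

load = dynamic_power(C_EFF, 1.00, 700e6)  # ~700MHz core under load (GTX 480)
idle = dynamic_power(C_EFF, 0.90, 50e6)   # 50MHz idle core clock

print(f"load: {load:.0f}W, idle: {idle:.1f}W ({idle / load:.1%} of load)")
# -> load: 175W, idle: 10.1W (5.8% of load)
```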

Load GPU Temperature - Crysis

Load GPU Temperature - Furmark

For our load temperatures, we have added Crysis to our temperature testing so that we can see both FurMark’s worst-case temperatures and a more typical gameplay temperature.

At this point the GTX 400 series is in a pretty exclusive club of hot cards – under Crysis the only other single-GPU card above 90C is the 3870, and the GTX 480 SLI is the hottest of any configuration we have tested. Even the dual-GPU cards don’t get quite this hot. In fact it’s quite interesting that there’s a much larger spread among card temperatures here than under FurMark, which only makes the GTX 400 series stand out more.

While we’re on the subject of temperatures, we should note that NVIDIA has changed the fan ramp-up behavior from the GTX 200 series. Rather than reacting immediately, the GTX 400 series fans have a ramp-up delay of a few seconds when responding to high temperatures, meaning you’ll actually see these cards get hotter than our sustained temperatures before the fans catch up. This won’t have any significant impact on the cards, but if you’re like us, your eyes will pop out of your head at least once when you see a GTX 480 hitting 98C in FurMark.
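
We can only see this behavior from the outside, but what we observed is consistent with a controller that waits for a sustained excursion before ramping. Below is a minimal sketch of that idea – the target temperature, delay, and step size are all guesses, not NVIDIA’s actual values or algorithm.

```python
import time

class DelayedFanRamp:
    """Hypothetical reconstruction of the observed ramp-up delay;
    not NVIDIA's actual fan-control algorithm."""

    TARGET_C = 90       # temperature to hold (assumed)
    RAMP_DELAY_S = 3.0  # seconds over target before responding (assumed)
    STEP_PCT = 5        # fan duty increase per update once ramping (assumed)

    def __init__(self):
        self.over_since = None  # when the current temperature excursion began

    def update(self, temp_c, duty_pct):
        """Return the new fan duty cycle for the current GPU temperature."""
        if temp_c <= self.TARGET_C:
            self.over_since = None              # back under target: reset
            return duty_pct
        if self.over_since is None:
            self.over_since = time.monotonic()  # excursion just started
        if time.monotonic() - self.over_since < self.RAMP_DELAY_S:
            return duty_pct                     # delay window: die overshoots here
        return min(100, duty_pct + self.STEP_PCT)  # finally ramp the fan
```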

Idle Power Consumption

Up next is power consumption. As we’ve already discussed, the GTX 480 and GTX 470 have idle power draws of 47W and 33W respectively, putting them out of the running for the least power-hungry of the high-end cards. Furthermore the 1200W PSU we switched to for this review has driven up our measured idle power load a bit – a PSU this large is relatively inefficient at such low loads – which serves to suppress some of the differences in idle power draw between cards.

With that said, the GTX 400 series either does decently or poorly, depending on your point of view. The GTX 480 is below our poorly-idling Radeon 4000 series cards, but well above the 5000 series. Meanwhile the GTX 470 is in the middle of the pack, sharing space with most of the GTX 200 series. The lone outlier here is the GTX 480 SLI: AMD’s power saving mode for Crossfire cards means that the GTX 480 SLI is left all alone at a total power draw of 260W at idle.
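
To illustrate the PSU effect: efficiency falls off sharply at small load fractions, so the same DC load reads higher at the wall on an oversized unit. The efficiency curve below is a generic piecewise approximation for illustration, not measurements of the PSU we actually used.

```python
# Why an oversized PSU inflates idle wall readings. The efficiency
# curve is a generic illustration, not our 1200W unit's measured curve.

def wall_draw(dc_load_w, psu_capacity_w):
    """Wall power for a given DC load, with efficiency sagging at low load."""
    frac = dc_load_w / psu_capacity_w
    eff = 0.70 + frac if frac < 0.20 else 0.88  # crude piecewise model
    return dc_load_w / eff

DC_IDLE_W = 140  # hypothetical DC-side idle load of a testbed

for capacity in (750, 1200):
    print(f"{capacity:>4}W PSU: {wall_draw(DC_IDLE_W, capacity):.0f}W at the wall")
# ->  750W PSU: 158W at the wall
# -> 1200W PSU: 171W at the wall
```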

Load Power Consumption - Crysis

Load Power Consumption - Furmark

For load power we have Crysis and FurMark, and the results are quite interesting. Under Crysis not only is the GTX 480 SLI the most demanding card setup, as we would expect, but the GTX 480 itself isn’t too far behind. As a single-GPU card it draws more power than either the GTX 295 or the Radeon 5970, both of which are dual-GPU cards. Farther up the chart is the GTX 470, which is the second most power-hungry of our single-GPU cards.

Under FurMark our results change ever so slightly. The GTX 480 manages to get under the GTX 295, while the GTX 470 falls in the middle of the GTX 200 series pack. A special mention goes out to the GTX 480 SLI here, which at 851W under load is the greatest power draw we have ever seen for a pair of GPUs.
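
One practical note for anyone sizing a power supply around these numbers: the 851W figure is measured at the wall, so the DC load the PSU actually has to deliver is lower by the efficiency factor. A rough conversion follows, with the 85% efficiency and 20% headroom being ballpark assumptions rather than measured values.

```python
# Rough PSU sizing for a GTX 480 SLI system from our measured wall draw.
# The efficiency and headroom figures are ballpark assumptions.

WALL_W = 851          # measured at the wall under FurMark
EFFICIENCY = 0.85     # assumed PSU efficiency at this load
HEADROOM = 1.20       # conventional 20% sizing margin (assumed)

dc_load_w = WALL_W * EFFICIENCY  # ~723W actually delivered to the system
print(f"DC load ~{dc_load_w:.0f}W -> look for a quality PSU of "
      f"{dc_load_w * HEADROOM:.0f}W or more")
# -> DC load ~723W -> look for a quality PSU of 868W or more
```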

Idle Noise Levels

Idle noise doesn’t contain any particular surprises since virtually every card can reduce its fan speed to near-silent levels and still stay cool enough. The GTX 400 series is within a few dB of our noise floor here.

Load Noise Levels

Hot, power-hungry things are often loud things, and there are no disappointments here. At 70dB the GTX 480 SLI is the loudest card configuration we have ever tested, while at 64.1dB the GTX 480 is the loudest single-GPU card, beating out even our unreasonably loud 4890. Meanwhile the GTX 470 is in the middle of the pack at 61.5dB, coming in amidst some of our louder single-GPU cards and our dual-GPU cards.
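
Since decibels are logarithmic, these figures can’t be compared by simple subtraction; incoherent sources combine as a power sum. A quick sketch of the standard formula shows that two cards at the single-GTX 480 level would only account for about 67dB, so the extra headroom in the 70dB SLI result also reflects the fans working harder.

```python
import math

def combine_db(*levels_db):
    """Combine incoherent sound sources: L = 10*log10(sum(10^(Li/10)))."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

# Two GTX 480s each at the single-card 64.1dB would combine to ~67.1dB;
# the measured 70dB for SLI implies the fans are also spinning faster.
print(f"{combine_db(64.1, 64.1):.1f} dB")  # -> 67.1 dB
```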

Finally, with this data in hand we went to NVIDIA to ask about the longevity of their cards at these temperatures, as seeing the GTX 480 hit 94C sustained in a game left us worried. In response, NVIDIA told us that they have done significant testing of the cards at high temperatures to validate their longevity, and that their models predict a lifetime of years even at temperatures approaching 105C (the throttle point for GF100). Furthermore, as they note, they have shipped other cards that run roughly this hot, such as the GTX 295, and those cards have held up just fine.

At this point we don’t have any reason to doubt NVIDIA’s word on the matter, but that wouldn’t stop us from taking the appropriate precautions. Heat does impact longevity to some degree, so we would strongly consider getting a card with a lifetime warranty for the GTX 480 to hedge our bets.
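
For some context on why temperature and lifetime are linked, reliability models of the sort NVIDIA describes are commonly built on the Arrhenius relation, under which wear mechanisms accelerate exponentially with absolute temperature. The sketch below is a generic back-of-the-envelope, not NVIDIA’s model; the 0.7eV activation energy is a textbook placeholder.

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K
EA_EV = 0.7                # activation energy; a textbook placeholder,
                           # not an NVIDIA figure

def acceleration_factor(t_cool_c, t_hot_c, ea_ev=EA_EV):
    """Arrhenius acceleration: how much faster wear proceeds at t_hot_c."""
    t1_k = t_cool_c + 273.15
    t2_k = t_hot_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1 / t1_k - 1 / t2_k))

# Wear rate at the GTX 480's 94C sustained vs. a cooler-running 75C card:
print(f"~{acceleration_factor(75, 94):.1f}x faster aging")  # -> ~3.3x
```

Even a ~3x acceleration can still leave a card with a useful life of years if the baseline is long enough, which squares with NVIDIA’s claim – but it is also why a lifetime warranty is a comfortable hedge.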

Comments

  • arjunp2085 - Friday, March 26, 2010 - link

    For dealing with suck fake geometry, Fermi has several new tricks.

    is that supposed to be such??

    850 Watts for SLI.. man Air Conditioning for my room does not consume that much electricity

    Might have to go for industrial connections to use such high Electricity consumptions lol

    Green Team NOT GREEN....
  • Leyawiin - Friday, March 26, 2010 - link

    Guess I'll keep my GTX 260 for a year or so more and hope for better days.
  • hangfirew8 - Friday, March 26, 2010 - link

    Launch FAIL.

    All this waiting and a paper launch. They couldn't even manage the 1/2 dozen cards per vendor at Newegg of some previous soft launches.

    All this waiting and a small incremental increase over existing card performance. High power draw and temps. High prices; at least they had the sense not to price it like the 8800 Ultra, which was a game changer. It had a big leap in performance plus brought us a new DX level, DX10.

    I've been holding off buying until this launch, I really wanted nVidia to pull something off here. Oh, well.

  • softdrinkviking - Friday, March 26, 2010 - link

    so by the time a "full" gf100 is available, how close will we be to the next gen AMD card?
    and how low will the prices on the 58XX series be?

    this article never made an explicit buying recommendation, but how many people out there are still waiting to buy a gf100?
    6 months is a long time.
    after xmas and the post holiday season, anybody on the fence about it (i.e. not loyal nvidia fans) probably just went for an amd card.
    so the question (for a majority of potential buyers?) isn't "which card do i buy?", it's "do i need/want to upgrade from my 58xx amd card to a gf100?"


    also, i'm curious to find out if fermi can be scaled down into a low profile card and offer superior performance in a form factor that relies so heavily on low temps and low power consumption.
    the htpc market is a big money maker, and a bad showing for nvidia there could really hurt them.
    maybe they won't even try?

  • shin0bi272 - Friday, March 26, 2010 - link

    great review as usual here at Anandtech. I would have thought in your conclusions you would have mentioned that, in light of the rather lackluster 5% performance crown that they now hold, it wasn't the best idea for them to disable 6% of their cores on the thing after all.

    Why make a 512 core gpu then disable 32 of them and end up with poorer performance when you're already 6 months behind the competition, sucking up more juice, running higher temps and fan noise, and carrying a higher price tag? That's like making the Bugatti Veyron and then disabling 2 of its 16 cylinders!

    That will probably be what nvidia does when amd releases their super cypress to beat the 480. They'll release the 485 with all 512 cores and better i/o for the ram.
  • blyndy - Saturday, March 27, 2010 - link

    "Fermi is arranged as 16 clusters of 32 shaders, and given that it is turning off 64 shaders, it looks like the minimum granularity it can fuse off is a single cluster of 32. This means it is having problems getting less than two unrecoverable errors per die, not a good sign."

    from: http://www.semiaccurate.com/2009/12/21/nvidia-cast...
  • shin0bi272 - Saturday, March 27, 2010 - link

    don't quote semi accurate to me. If you wanna call 1 in 100 claims being correct as Semi accurate then fine, you can... me, I call it a smear. Especially since the guy who wrote that article is a known liar and hack. If you google for gtx480 and click on the news results and click on semi accurate you will see it's listed as satire.
  • Jamahl - Friday, March 26, 2010 - link

    the same Ryan Smith who panned the 5830 for being a "paper launch" even though it was available one day later?

    What's wrong this time, Ryan? Maybe there are so many bad things to say about Fermi that being "paper launched" was well down the pecking order of complaints?
  • AnandThenMan - Friday, March 26, 2010 - link

    I was thinking the same thing. The 5830 got slammed for being a paper launch even though it wasn't, but Fermi gets a pass? Why? This isn't even a launch at all despite what Nvidia says. Actual cards will be available in what, 17 days? That's assuming the date doesn't change again.
  • jeffrey - Saturday, March 27, 2010 - link

    I'll third that notion.

    Even though Ryan Smith mentioned that Fermi was paper launched today, the tone and the way the article read were much harsher on AMD/ATI. That is ridiculous considering that Ryan had to eat his own words with an "Update" on the 5830's availability.

    Being tougher on AMD/ATI, when they did in fact launch the 5830 that day and have hard-launched, to the best of their ability, the entire 5XX0 stack, gives an impression of bias.

    A paper launch with availability at least two and a half weeks out for a product six months late is absurd!
