Power, Temperature, & Noise

As always, we’re wrapping up our look at a video card’s stock performance with a look at power, temperature, and noise. Officially AMD is holding the 7970GE’s TDP and PowerTune limits at the same 250W level as the 7970. Unofficially, however, with higher voltages, higher clockspeeds, and Digital Temperature Estimation eating into the remaining power headroom, we’re expecting power usage to increase. The question then is “how much?”

Radeon HD 7970 Series Voltages
Ref 7970GE Base Voltage: 1.162v
Ref 7970GE Boost Voltage: 1.218v
Ref 7970 Base Voltage: 1.175v

Because of chip-to-chip variation, the load voltage of 7970 cards varies with the chip and how leaky it is. Short of a large sample size there’s no way to tell what the voltage of an average 7970 or 7970GE is, so we can only look at what we have.

Unlike the 7970, the 7970GE has two distinct voltages: a voltage for its base clock, and a higher voltage for its boost clock. For our 7970GE sample the base clock voltage is 1.162v, which is 0.013v lower than our reference 7970’s base clock voltage (load voltage). On the other hand our 7970GE’s boost clock voltage is 1.218v, which is 0.056v higher than its base clock voltage and 0.043v higher than our reference 7970’s load voltage. In practice this means that even with chip-to-chip variation, we’d expect the 7970GE to consume a bit more power than the reference 7970 when it can boost, but equal to (or less than) the 7970 when it’s stuck at its base clock.
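To put those voltage deltas in rough perspective: per clock cycle, switching power scales approximately with the square of voltage, so even a small voltage bump translates into a measurable power increase before the higher boost clock is factored in. Below is a minimal back-of-the-envelope sketch in Python; it uses only the voltages from the table above and deliberately ignores clockspeed differences, leakage, and PowerTune behavior, so treat it as a ballpark illustration rather than a power model.

```python
# Back-of-the-envelope estimate: per-clock switching power scales roughly with V^2.
# Voltages are the ones listed in the table above; everything else (leakage,
# clockspeed differences, PowerTune behavior) is deliberately ignored.

REF_7970_LOAD_V = 1.175

def relative_switching_power(voltage, ref_voltage=REF_7970_LOAD_V):
    """Per-clock switching power relative to the reference 7970's load voltage."""
    return (voltage / ref_voltage) ** 2

for label, volts in [("7970GE base (1.162v)", 1.162),
                     ("7970GE boost (1.218v)", 1.218)]:
    print(f"{label}: ~{relative_switching_power(volts):.2f}x the 7970's per-clock power")

# Prints ~0.98x at the base voltage and ~1.07x at the boost voltage -- i.e. slightly
# less power per clock than the 7970 at base, and roughly 7% more when boosting,
# before accounting for the 7970GE's higher clockspeeds.
```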

So how does this play out for power, temperature, and noise? Let’s find out.

Starting with idle power, because it’s the same GPU on the same board there are no surprises here. Idle power consumption is actually down by 2W at the wall, but in practice this is such a small difference that it’s almost impossible to separate from other sources of variation. That said, we wouldn’t be surprised if improving TSMC yields combined with AMD’s binning meant that real power consumption has actually decreased a hair.

Similar to idle, long idle power consumption is also slightly down. NVIDIA doesn’t have anything to rival AMD’s ZeroCore Power technology, so the 7970GE draws a full 10W less at the wall than the GTX 680, a difference that will become more pronounced when we compare SLI and CF in the future.

Moving on to our load power we finally see our first 7970GE power results, and while it’s not terrible it’s not great either. Power at the wall has definitely increased, with our testbed pulling 429W with the 7970GE versus 391W with the 7970. Not all of this is due to the GPU – a certain percentage comes from the CPU sleeping less often because it needs to prepare more frames for the faster GPU – but in practice most of the difference is consumed (and exhausted) by the GPU. So the fact that the 7970GE is drawing 67W more than the GTX 680 at the wall is not insignificant.

For a change of perspective we shift over to OCCT, which is our standard pathological workload and almost entirely GPU-driven. Compared to Metro, the power consumption increase from the 7970 to the 7970GE isn’t as great, but it’s definitely still there. Power has increased by 19W at the wall, which is actually more than we would have expected given that the two cards share the same PowerTune limit and that PowerTune should be heavily throttling both of them. Consequently the 7970GE creates an even wider gap between AMD’s top card and the GTX 680, pulling 43W more at the wall.
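For quick reference, the sketch below collects the wall-power deltas quoted above in one place. The GTX 680 Metro figure is back-calculated from the stated 67W gap rather than listed directly, and the PSU efficiency used for the DC-side estimate is an assumption on our part, not a measured figure.

```python
# Total system ("at the wall") power figures and gaps quoted in the text.
# The GTX 680 Metro number is implied by the stated 67W gap, not listed directly.
metro_watts = {"7970GE": 429, "7970": 391, "GTX 680": 429 - 67}  # 362W implied

print("Metro:")
print(f"  7970GE vs 7970:    +{metro_watts['7970GE'] - metro_watts['7970']}W")
print(f"  7970GE vs GTX 680: +{metro_watts['7970GE'] - metro_watts['GTX 680']}W")

# Under OCCT only the gaps are quoted: 19W over the 7970, 43W over the GTX 680.
print("OCCT:")
print("  7970GE vs 7970:    +19W")
print("  7970GE vs GTX 680: +43W")

# Rough DC-side estimate: wall deltas include PSU conversion loss. Assuming a
# ~88% efficient PSU at this load (an assumption, not a measured figure):
psu_efficiency = 0.88
metro_delta = metro_watts["7970GE"] - metro_watts["7970"]
print(f"  ~{metro_delta * psu_efficiency:.0f}W of the Metro delta is actual DC draw")
```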

Moving on to temperatures, we don’t see a major change here. Identical hardware begets identical idle temperatures, which for the 7970GE means a cool 34C, though the GTX 680 is a smidge cooler at 32C.

Since we’ve already seen that GPU power consumption has increased under Metro, we would expect temperatures to also increase under Metro, and that’s exactly what’s happened. In fact temperatures have increased by quite a lot, from 74C on the 7970 to 81C on the 7970GE. Since both 7970 cards share the same cooler, that cooler has to work harder to dissipate the extra power the 7970GE consumes, and even then temperatures still increase some. 81C is still rather typical for a high-end card, but it means there’s less thermal headroom to play with when overclocking compared to the 7970. Furthermore it means the 7970GE is now warmer than the GTX 680.

Thanks to PowerTune throttling the 7970GE doesn’t increase in temperature by nearly as much under OCCT as it does under Metro, but we still see a 4C rise, pushing the 7970GE to 83C. Again this is rather normal for a high-end card, but it’s a sign of what AMD had to sacrifice to reach this level of gaming performance.

Last but not least we have our look at noise. Again with the same hardware we see no shift in idle noise, with the 7970GE registering at a quiet 40.2dBA.

Unfortunately for AMD, this is where the 7970GE starts to come off the rails. It’s not just power consumption and temperatures that have increased for the 7970GE, but load noise too, and by quite a lot. 61.5dBA is without question loud for a video card. In fact the only card in our GPU 12 database that’s louder is the Radeon HD 6990, a dual-GPU card that was notoriously loud. The fact of the matter is that the 7970GE is significantly louder than any other card in our benchmark suite, and in all likelihood the only card that could surpass it would be the GTX 480. As a result the 7970GE isn’t only loud, it’s in a category of its own, exceeding the GTX 680 by nearly 10dBA! Even the vanilla 7970 is 6.3dBA quieter.

Does OCCT end up looking any better? Unfortunately the answer is no. At 63.2dBA it’s still the loudest single-GPU card in our benchmark suite by nearly 3dBA, and far, far louder than either the GTX 680 or the 7970. We’re looking at a 10.7dBA gap between the 7970GE and the GTX 680, and a still sizable 5.9dBA gap between the 7970GE and 7970.
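Because the decibel scale is logarithmic, those gaps are larger than they may look: every 10dB corresponds to a tenfold increase in sound power and, by the usual rule of thumb, roughly a doubling of perceived loudness. The short sketch below translates the quoted gaps; the loudness-doubling rule is an approximation rather than an exact psychoacoustic law, and the “nearly 10dBA” Metro gap is rounded to 10.

```python
# Translate the quoted dBA gaps into sound-power and rough perceived-loudness ratios.
# Sound power: +10dB = 10x. Perceived loudness: ~2x per +10dB (rule of thumb only).

def describe_gap(delta_dba):
    power_ratio = 10 ** (delta_dba / 10)
    loudness_ratio = 2 ** (delta_dba / 10)
    return power_ratio, loudness_ratio

gaps = [("Metro, 7970GE vs GTX 680", 10.0),   # "nearly 10dBA", rounded
        ("Metro, 7970GE vs 7970", 6.3),
        ("OCCT, 7970GE vs GTX 680", 10.7),
        ("OCCT, 7970GE vs 7970", 5.9)]

for label, gap in gaps:
    power, loudness = describe_gap(gap)
    print(f"{label}: ~{power:.0f}x the sound power, ~{loudness:.1f}x as loud subjectively")
```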

From these results it’s clear where AMD has had to make sacrifices to achieve performance that could rival the GTX 680. By using the same card and cooler while letting power consumption increase to feed that speed, they have boxed themselves into a very ugly situation where the only solution is to run their cooler fast and loud. Perhaps with a better cooler they could have kept noise levels similar to the 7970 (which would still be louder than the GTX 680), but that’s not what we’re looking at.

The 7970GE is without question the loudest single-GPU video card we have seen in quite some time, and that’s nothing for AMD to be proud of. Everyone’s limit for noise differs, but when we’re talking about a single-GPU card exceeding 60dBA under Metro, we have to seriously ponder whether it’s something many gamers would be willing to put up with.

Comments

  • piroroadkill - Friday, June 22, 2012 - link

    While the noise is bad - the manufacturers are going to spew out non-reference, quiet designs in moments, so I don't think it's an issue.
  • silverblue - Friday, June 22, 2012 - link

    Toms added a custom cooler (Gelid Icy Vision-A) to theirs which reduced noise and heat noticeably (about 6 degrees C and 7-8 dB). Still, it would be cheaper to get the vanilla 7970, add the same cooling solution, and clock it to the same levels; that way, you'd end up with a GHz Edition-clocked card which is cooler and quieter for about the same price as the real thing, albeit lacking the new boost feature.
  • ZoZo - Friday, June 22, 2012 - link

    Would it be possible to drop the 1920x1200 resolution for tests? 16:10 is dead; 1080p has been the standard for high definition on PC monitors for at least 4 years now, and it's more than time to catch up with reality... Sorry for the rant, I'm probably nitpicking anyway...
  • Reikon - Friday, June 22, 2012 - link

    Uh, no. 16:10 at 1920x1200 is still the standard for high quality IPS 24" monitors, which is a fairly typical choice for enthusiasts.
  • paraffin - Saturday, June 23, 2012 - link

    I haven't been seeing many 16:10 monitors around these days. Besides, since AT even tests iGPU performance at ANYTHING BUT 1080p, your "enthusiast choice" argument is invalid. 16:10 is simply a l33t factor in a market dominated by 16:9. I'll take my cheap 27" 1080p TN's spaciousness and native HD content over your pricey 24" 1200p IPS' "quality" any day.
  • CeriseCogburn - Saturday, June 23, 2012 - link

    I went over this already with the amd fanboys.
    For literally YEARS they have had harpy fits on five and ten dollar card pricing differences, declaring amd the price perf queen.

    Then I pointed out nVidia wins in 1920x1080 by 17+% and only by 10+% in 1920x1200 - so all of a sudden they ALL had 1920x1200 monitors, they were not rare, and they have hundreds of extra dollars of cash to blow on it, and have done so, at no extra cost to themselves and everyone else (who also has those), who of course also chooses such monitors because they all love them the mostest...

    Then I gave them egg counts, might as well call it 100 to 1 on availability if we are to keep to their own hyperactive price perf harpying, and the lowest available higher rez was $50 more, which COST NOTHING because it helps amd, of course....

    I pointed out Anand pointed out in the then prior article it's an ~11% pixel difference, so they were told to calculate the frame rate difference... (that keeps amd up there in scores and winning a few they wouldn't otherwise).

    Dude, MKultra, Svengali, Jim Wand, and mass media, could not, combined, do a better job brainwashing the amd fan boy.

    Here's the link, since I know a thousand red-winged harpies are ready to descend en masse and caw loudly in protest...

    http://translate.google.pl/translate?hl=pl&sl=...

    1920x1080: " GeForce GTX680 is on average 17.61% more efficient than the Radeon 7970.
    Here, the performance difference in favor of the GTX680 are even greater"

    So they ALL have a 1920x1200, and they are easily available, the most common, cheap, and they look great, and most of them have like 2 or 3 of those, and it was no expense, or if it was, they are happy to pay it for the red harpy from hades card.
  • silverblue - Monday, June 25, 2012 - link

    Your comparison article is more than a bit flawed. The PCLab results, in particular, have been massively updated since that article. Looks like they've edited the original article, which is a bit odd. Still, AMD goes from losing badly in a few cases to not losing so badly after all, as the results on this article go to show. They don't displace the 680 as the best gaming card of the moment, but it certainly narrows the gap (even if the GHz Edition didn't exist).

    Also, without a clear idea of specs and settings, how can you just grab results for a given resolution from four or five different sites for each card, add them up and proclaim a winner? I could run a comparison between a 680 and 7970 in a given title with the former using FXAA and the latter using 8xMSAA, doesn't mean it's a good comparison. I could run Crysis 2 without any AA and AF at all at a given resolution on one card and then put every bell and whistle on for the other - without the playing field being even, it's simply invalid. Take each review at its own merits because at least then you can be sure of the test environment.

    As for 1200p monitors... sure, they're more expensive, but it doesn't mean people don't have them. You're just bitter because you got the wrong end of the stick by saying nobody owned 1200p monitors then got slapped down by a bunch of 1200p monitor owners. Regardless, if you're upset that NVIDIA suddenly loses performance as you ramp up the vertical resolution, how is that AMD's fault? Did it also occur to you that people with money to blow on $500 graphics cards might actually own good monitors as well? I bet there are some people here with 680s who are rocking on 1200p monitors - are you going to rag (or shall I say "rage"?) on them, too?

    If you play on a 1080p panel then that's your prerogative, but considering the power of the 670/680/7970, I'd consider that a waste.
  • FMinus - Friday, June 22, 2012 - link

    Simply put; No!

    1080p is the second worst thing that happened to the computer market in the recent years. The first worst thing being phasing out 4:3 monitors.
  • Tegeril - Friday, June 22, 2012 - link

    Yeah seriously, keep your 16:9, bad color reproduction away from these benchmarks.
  • kyuu - Friday, June 22, 2012 - link

    16:10 snobs are seriously getting out-of-touch when they start claiming that their aspect ratio gives better color reproduction. There are plenty of high-quality 1080p IPS monitors on the market -- I'm using one.

    That being said, it's not really important whether it's benchmarked at x1080 or x1200. There is a negligible difference in the number of pixels being drawn (one of the reasons I roll my eyes at 16:10 snobs). If you're using a 1080p monitor, just add anywhere from 0.5 to 2 FPS to the average FPS results from x1200.

    Disclaimer: I have nothing *against* 16:10. All other things being equal, I'd choose 16:10 over 16:9. However, with 16:9 monitors being so much cheaper, I can't justify paying a huge premium for a measly 120 lines of vertical resolution. If you're willing to pay for it, great, but kindly don't pretend that doing so somehow makes you superior.
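As a closing aside on the resolution arithmetic debated in the comments above: 1920x1200 renders roughly 11% more pixels than 1920x1080, and in GPU-bound scenarios frame rate tends to scale very roughly inversely with pixel count. The sketch below works through that back-of-the-envelope math; the inverse-scaling assumption is approximate and varies by game.

```python
# Pixel counts for the two resolutions debated above.
pixels_1200p = 1920 * 1200   # 2,304,000
pixels_1080p = 1920 * 1080   # 2,073,600

extra_pixels = pixels_1200p / pixels_1080p - 1
print(f"1920x1200 renders {extra_pixels:.1%} more pixels than 1920x1080")   # ~11.1%

# Very rough GPU-bound estimate: assume frame rate scales inversely with pixel count.
# A card averaging 60fps at 1920x1200 would then land near 67fps at 1920x1080.
fps_at_1200p = 60.0
estimated_fps_at_1080p = fps_at_1200p * pixels_1200p / pixels_1080p
print(f"~{estimated_fps_at_1080p:.0f}fps estimated at 1920x1080 vs {fps_at_1200p:.0f}fps at 1920x1200")
```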
