OC: Power, Temperature, & Noise

Our final task is a look at the 7970GE's overclocking capabilities. As the 7970GE is based on the existing 7970 we aren't expecting any significant changes; however, it's reasonable to expect that general manufacturing process improvements over the last 6 months will have pushed yields and tolerances a little higher, giving us just a bit more headroom.

At the same time the presence of the boost clock and its associated voltage is going to change overclocking as well. The higher voltage should lend itself to higher overclocks; meanwhile, validating overclocks is also going to be a bit harder, as we now need to make sure that neither the overclocked base clock/voltage combination nor the overclocked boost clock/voltage combination is unstable, similar to the extra effort needed to overclock the GTX 680 series.

Radeon HD 7970 Series Overclocking

                             Ref 7970GE    Ref 7970    XFX 7970 BEDD
Shipping Core Clock          1000MHz       925MHz      1000MHz
Shipping Max Boost Clock     1050MHz       N/A         N/A
Shipping Memory Clock        6GHz          5.5GHz      5.7GHz
Shipping Max Voltage         1.218v        1.175v      1.175v

Overclock Core Clock         1150MHz       1100MHz     1125MHz
Overclock Max Boost Clock    1200MHz       N/A         N/A
Overclock Memory Clock       6.4GHz        6.3GHz      6.3GHz
Overclock Max Boost Voltage  1.218v        1.175v      1.175v

After going through the full validation process we were able to hit an overclock of +150MHz, which pushed our base clock from 1000MHz to 1150MHz, and our boost clock from 1050MHz to 1200MHz. Depending on how you count the boost clock, this is either 25MHz better than our best 7970 card (base clock to base clock) or 75MHz better (boost clock against the 7970's single overclocked core clock). In either case our 7970GE definitely overclocks better than our earlier 7970 cards, but not significantly so, which is in line with our expectations.
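In relative terms these are fairly modest gains. The quick back-of-the-envelope calculation below is our own, using nothing more than the clocks from the table above.

```python
# Relative overclock gains for our 7970GE sample (clocks in MHz, from the table above).
stock_base, stock_boost = 1000, 1050
oc_base, oc_boost = 1150, 1200

def pct_gain(stock, oc):
    """Percentage increase from the stock clock to the overclocked clock."""
    return (oc - stock) / stock * 100

print(f"Base clock:  +{oc_base - stock_base}MHz ({pct_gain(stock_base, oc_base):.1f}%)")
print(f"Boost clock: +{oc_boost - stock_boost}MHz ({pct_gain(stock_boost, oc_boost):.1f}%)")
# Base clock:  +150MHz (15.0%)
# Boost clock: +150MHz (14.3%)
```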

As with any overclocking effort based on a single sample our overclocking results are not going to be representative of what every card can do, but they are reasonable. With AMD now binning chips for the 7970GE we’d expect to see some stratification among the 7970 family such that high overclocking chips that would previously show up in 7970 cards will now show up in 7970GE cards instead. For penny-pinching overclockers this is not good news, but for more hardcore overclockers this is nothing new as AMD’s partners have been doing something similar with their factory overclocked cards for some time now.

Meanwhile our memory overclock isn’t significantly different from what we could pull off with the reference 7970. The limitation is the memory bus or Tahiti’s memory controller, neither of which has changed. After around 6.4GHz errors start catching up and performance gains become performance losses.
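To put that memory overclock in bandwidth terms, here is a rough sketch of the math, assuming Tahiti's 384-bit GDDR5 memory bus; treat it as our own arithmetic rather than an official figure.

```python
# Theoretical peak memory bandwidth, assuming Tahiti's 384-bit GDDR5 memory bus.
# GB/s = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte.
BUS_WIDTH_BITS = 384

def peak_bandwidth_gbs(data_rate_gbps, bus_width_bits=BUS_WIDTH_BITS):
    """Peak memory bandwidth in GB/s for a given per-pin data rate in Gbps."""
    return data_rate_gbps * bus_width_bits / 8

for label, rate in [("7970GE stock (6GHz)", 6.0), ("Our overclock (6.4GHz)", 6.4)]:
    print(f"{label}: {peak_bandwidth_gbs(rate):.1f} GB/s")
# 7970GE stock (6GHz): 288.0 GB/s
# Our overclock (6.4GHz): 307.2 GB/s
```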

Moving on to our performance charts, we’re going to once again start with power, temperature, and noise, before moving on to gaming performance. We’ll be testing our 7970 cards with the PowerTune limit set to +20% in order to avoid any real possibility of being performance limited by PowerTune.

With the 7970GE’s already high load power, overclocking and raising the PowerTune limit isn't doing it any favors here. Far from being a free overclock, power consumption now exceeds even that of the GTX 690 in all situations, and at the card level it is almost certainly in excess of 300W. As we'll see in our gaming performance section we're definitely getting more performance out of the 7970GE, but we're paying for it with power.

With the rise in power consumption comes a rise in temperatures, though to varying degrees. At 83C under Metro the 7970GE has gotten warmer, but not significantly so. The same cannot be said for OCCT: at 89C we're approaching the reasonable limits of this card and cooler.

The 7970GE was already loud at stock and overclocking it doesn’t help. Under Metro noise is now at 63.8dBA, and under OCCT it’s tied with the 6990 for noise at 66dBA. Even if you’re forgiving of noise, this is reaching the point where it’s going to be difficult to ignore. Serious 7970GE overclockers will want to seek other cards and/or aftermarket coolers.

Comments

  • piroroadkill - Friday, June 22, 2012 - link

    While the noise is bad - the manufacturers are going to spew out non-reference, quiet designs in moments, so I don't think it's an issue.
  • silverblue - Friday, June 22, 2012 - link

    Tom's added a custom cooler (Gelid Icy Vision-A) to theirs, which reduced noise and heat noticeably (about 6 degrees C and 7-8 dB). Still, it would be cheaper to get the vanilla 7970, add the same cooling solution, and clock to the same levels; that way, you'd end up with a GHz Edition clocked card which is cooler and quieter for about the same price as the real thing, albeit lacking the new boost feature.
  • ZoZo - Friday, June 22, 2012 - link

    Would it be possible to drop the 1920x1200 resolution for tests? 16:10 is dead; 1080p has been the standard for high definition on PC monitors for at least 4 years now, so it's high time to catch up with reality... Sorry for the rant, I'm probably nitpicking anyway...
  • Reikon - Friday, June 22, 2012 - link

    Uh, no. 16:10 at 1920x1200 is still the standard for high quality IPS 24" monitors, which is a fairly typical choice for enthusiasts.
  • paraffin - Saturday, June 23, 2012 - link

    I haven't been seeing many 16:10 monitors around these days. Besides, since AT even tests iGPU performance at ANYTHING BUT 1080p, your "enthusiast choice" argument is invalid. 16:10 is simply a l33t factor in a market dominated by 16:9. I'll take my cheap 27" 1080p TN's spaciousness and HD content nativeness over your pricey 24" 1200p IPS' "quality" any day.
  • CeriseCogburn - Saturday, June 23, 2012 - link

    I went over this already with the amd fanboys.
    For literally YEARS they have had harpy fits on five and ten dollar card pricing differences, declaring amd the price perf queen.

    Then I pointed out nVidia wins in 1920x1080 by 17+% and only by 10+% in 1920x1200 - so all of a sudden they ALL had 1920x1200 monitors, they were not rare, and they have hundreds of extra dollars of cash to blow on it, and have done so, at no extra cost to themselves and everyone else (who also has those), who of course also chooses such monitors because they all love them the mostest...

    Then I gave them egg counts, might as well call it 100 to 1 on availability if we are to keep to their own hyperactive price perf harpying, and the lowest available higher rez was $50 more, which COST NOTHING because it helps amd, of course....

    I pointed out Anand pointed out in the then prior article it's an ~11% pixel difference, so they were told to calculate the frame rate difference... (that keeps amd up there in scores and winning a few they wouldn't otherwise).

    Dude, MKultra, Svengali, Jim Wand, and mass media, could not, combined, do a better job brainwashing the amd fan boy.

    Here's the link, since I know a thousand red-winged harpies are ready to descend en masse and caw loudly in protest...

    http://translate.google.pl/translate?hl=pl&sl=...

    1920x1080: " GeForce GTX680 is on average 17.61% more efficient than the Radeon 7970.
    Here, the performance difference in favor of the GTX680 are even greater"

    So they ALL have a 1920x1200, and they are easily available, the most common, cheap, and they look great, and most of them have like 2 or 3 of those, and it was no expense, or if it was, they are happy to pay it for the red harpy from hades card.
  • silverblue - Monday, June 25, 2012 - link

    Your comparison article is more than a bit flawed. The PCLab results, in particular, have been massively updated since that article. Looks like they've edited the original article, which is a bit odd. Still, AMD goes from losing badly in a few cases to not losing so badly after all, as the results on this article go to show. They don't displace the 680 as the best gaming card of the moment, but it certainly narrows the gap (even if the GHz Edition didn't exist).

    Also, without a clear idea of specs and settings, how can you just grab results for a given resolution from four or five different sites for each card, add them up and proclaim a winner? I could run a comparison between a 680 and 7970 in a given title with the former using FXAA and the latter using 8xMSAA, doesn't mean it's a good comparison. I could run Crysis 2 without any AA and AF at all at a given resolution on one card and then put every bell and whistle on for the other - without the playing field being even, it's simply invalid. Take each review at its own merits because at least then you can be sure of the test environment.

    As for 1200p monitors... sure, they're more expensive, but it doesn't mean people don't have them. You're just bitter because you got the wrong end of the stick by saying nobody owned 1200p monitors then got slapped down by a bunch of 1200p monitor owners. Regardless, if you're upset that NVIDIA suddenly loses performance as you ramp up the vertical resolution, how is that AMD's fault? Did it also occur to you that people with money to blow on $500 graphics cards might actually own good monitors as well? I bet there are some people here with 680s who are rocking on 1200p monitors - are you going to rag (or shall I say "rage"?) on them, too?

    If you play on a 1080p panel then that's your prerogative, but considering the power of the 670/680/7970, I'd consider that a waste.
  • FMinus - Friday, June 22, 2012 - link

    Simply put; No!

    1080p is the second worst thing that has happened to the computer market in recent years. The first worst thing being the phasing out of 4:3 monitors.
  • Tegeril - Friday, June 22, 2012 - link

    Yeah seriously, keep your 16:9, bad-color-reproduction panels away from these benchmarks.
  • kyuu - Friday, June 22, 2012 - link

    16:10 snobs are seriously getting out of touch when they start claiming that their aspect ratio gives better color reproduction. There are plenty of high-quality 1080p IPS monitors on the market -- I'm using one.

    That being said, it's not really important whether it's benchmarked at x1080 or x1200. There is a negligible difference in the number of pixels being drawn (one of the reasons I roll my eyes at 16:10 snobs). If you're using a 1080p monitor, just add anywhere from 0.5 to 2 FPS to the average FPS results from x1200.

    Disclaimer: I have nothing *against* 16:10. All other things being equal, I'd choose 16:10 over 16:9. However, with 16:9 monitors being so much cheaper, I can't justify paying a huge premium for a measly 120 lines of vertical resolution. If you're willing to pay for it, great, but kindly don't pretend that doing so somehow makes you superior.
