Overclocking GTX 980

One of GTX 750 Ti’s more remarkable features was its overclocking headroom. GM107 could overclock so well that upon initial release, NVIDIA did not program enough overclocking headroom into their drivers to allow many GTX 750 Ti cards to be overclocked to their true limits. This is a legacy we would be glad to see repeated by GTX 980, and one we are going to put to the test.

As with NVIDIA’s Kepler cards, NVIDIA’s Maxwell cards are subject to stringent power and voltage limitations. Overvolting is limited to NVIDIA’s built-in overvoltage function, which isn’t so much a voltage control as it is the ability to unlock 1-2 additional boost bins and their associated voltages. Meanwhile, TDP controls are limited to whatever value NVIDIA believes is safe for that model of card, which can vary depending on its GPU and its power delivery design.
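Conceptually, the boost mechanism can be pictured as a ladder of frequency/voltage pairs, with "overvolting" simply exposing the top rung or two. The sketch below is purely illustrative: only the 1265MHz/1.25v top bin comes from our card's reported figures, while the other bin values are made-up placeholders, not NVIDIA's actual tables.

```python
# Illustrative model of NVIDIA's boost bins: each bin pairs a clock step with
# the voltage needed to sustain it. "Overvolting" does not grant arbitrary
# voltage control; it merely unlocks the top bin(s) of the ladder.
# Only the final (1265, 1.250) entry reflects our card; the rest are placeholders.
boost_bins = [
    (1215, 1.212),
    (1228, 1.225),
    (1240, 1.237),
    (1252, 1.243),
    (1265, 1.250),  # unlocked only via the overvoltage slider
]

def max_boost(overvolt: bool) -> tuple:
    """Return the highest (MHz, volts) bin available."""
    available = boost_bins if overvolt else boost_bins[:-1]
    return available[-1]

print(max_boost(overvolt=False))  # top bin reachable at stock settings
print(max_boost(overvolt=True))   # top bin once overvolted
```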

For GTX 980 we have a 125% TDP limit; meanwhile we are able to overvolt by 1 boost bin to 1265MHz, which utilizes a voltage of 1.25v.

GeForce GTX 980 Overclocking
                     Stock      Overclocked
Core Clock           1126MHz    1377MHz
Boost Clock          1216MHz    1466MHz
Max Boost Clock      1265MHz    1515MHz
Memory Clock         7GHz       7.8GHz
Max Voltage          1.25v      1.25v

GTX 980 does not let us down; like its lower-end Maxwell 1-based counterpart, it turns in an overclocking performance just short of absurd. Even without real voltage controls we were able to push another 250MHz (22%) out of our GM204 GPU, resulting in an overclocked base clock of 1377MHz and, more amazingly, an overclocked maximum boost clock of 1515MHz. That makes this the first NVIDIA card we have tested to surpass both 1.4GHz and 1.5GHz, all in one fell swoop.
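As a quick sanity check of the figures above, the overclock amounts to applying roughly the same fixed MHz offset across the entire boost curve; the numbers below are taken straight from our results table, and the arithmetic is ours:

```python
# Clock figures (MHz) from our GTX 980 overclocking table.
stock = {"base": 1126, "boost": 1216, "max_boost": 1265}
overclocked = {"base": 1377, "boost": 1466, "max_boost": 1515}

# NVIDIA's boost mechanism shifts the whole clock curve by a fixed offset,
# so each point in the curve should move by (roughly) the same amount.
for point in stock:
    offset = overclocked[point] - stock[point]
    pct = 100 * offset / stock[point]
    print(f"{point:>9}: +{offset}MHz ({pct:.0f}%)")
```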

This also leaves us wondering just how much farther GM204 could overclock if we were able to truly overvolt it. At 1.25v I’m not sure too much more voltage is good for the GPU in the long term – that’s already quite a bit of voltage for a TSMC 28nm process – but I suspect there is some untapped headroom left in the GPU at higher voltages.

Memory overclocking, on the other hand, doesn’t end up being quite as extreme, but we’ve known from the start that at a stock memory clock of 7GHz we were already pushing the limits of GDDR5 and NVIDIA’s memory controllers. Still, we were able to squeeze another 800MHz (11%) out of the memory subsystem, for a final memory clock of 7.8GHz.
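For context, here is what that memory overclock works out to in raw bandwidth terms. The 256-bit bus width comes from the GTX 980's published specifications; the back-of-the-envelope arithmetic is ours:

```python
# Peak GDDR5 bandwidth = data rate ("GHz effective", i.e. GT/s) x bus width.
BUS_WIDTH_BITS = 256  # GTX 980's memory bus width

def bandwidth_gbps(data_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s for a given GDDR5 data rate."""
    return data_rate_gtps * BUS_WIDTH_BITS / 8  # bits -> bytes

stock = bandwidth_gbps(7.0)  # stock 7GHz memory clock
oc = bandwidth_gbps(7.8)     # overclocked 7.8GHz memory clock
print(f"stock: {stock} GB/s, overclocked: {oc:.1f} GB/s (+{100 * (oc / stock - 1):.0f}%)")
```

The 11% clock bump therefore translates directly into an 11% bandwidth bump, from 224GB/s to roughly 250GB/s.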

Before we get to our full results, in light of GTX 980’s relatively narrow memory bus and NVIDIA’s color compression improvements, we briefly split our overclock testing into separate core and memory runs to see which overclock has the greater effect. One would presume the memory overclock is more important given the narrow memory bus, but as it turns out that is not necessarily the case.

GeForce GTX 980 Overclocking Performance
                  Core (+22%)   Memory (+11%)   Combined
Metro: LL         +15%          +4%             +20%
CoH2              +19%          +5%             +20%
Bioshock          +9%           +4%             +15%
Battlefield 4     +10%          +6%             +17%
Crysis 3          +12%          +5%             +15%
TW: Rome 2        +16%          +7%             +20%
Thief             +12%          +6%             +16%

While the core overclock is larger to begin with, what we’re also seeing is that the performance gains relative to the size of the overclock consistently favor the core overclock over the memory overclock. With a handful of exceptions, our 11% memory overclock nets us less than a 6% increase in performance, while our 22% core overclock nets us a 12% increase or more. This is despite the fact that when it comes to core overclocking the GTX 980 is TDP limited; in many of these games it could clock higher still if the TDP budget were large enough to accommodate higher sustained clockspeeds.

Memory overclocking is still effective, and it’s clear that GTX 980 spends some of its time memory bandwidth bottlenecked (otherwise we wouldn’t be seeing even these performance gains), but it’s simply not as effective as core overclocking. And since we have more core headroom than memory headroom in the first place, it’s a double win for core overclocking.
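To make that comparison concrete, we can normalize each gain in the results table by the size of the overclock that produced it. The per-game numbers below come from our table; the "efficiency" ratio (gain divided by overclock) is just our own shorthand:

```python
# Per-game performance gains (%) from our results table, split by overclock type.
gains = {
    "Metro: LL":     {"core": 15, "memory": 4},
    "CoH2":          {"core": 19, "memory": 5},
    "Bioshock":      {"core": 9,  "memory": 4},
    "Battlefield 4": {"core": 10, "memory": 6},
    "Crysis 3":      {"core": 12, "memory": 5},
    "TW: Rome 2":    {"core": 16, "memory": 7},
    "Thief":         {"core": 12, "memory": 6},
}
OC_SIZE = {"core": 22, "memory": 11}  # size of each overclock, in %

# Efficiency = fraction of the clock increase converted into performance.
for game, g in gains.items():
    eff = {k: g[k] / OC_SIZE[k] for k in OC_SIZE}
    print(f"{game:<14} core: {eff['core']:.2f}  memory: {eff['memory']:.2f}")
```

In most titles the core overclock converts a noticeably higher fraction of its clock increase into performance; Battlefield 4 is the clearest exception, where the memory overclock scales slightly better.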

To put it simply, the GTX 980 was already topping the charts. Now with overclocking it’s another 15-20% faster yet. With this overclock factored in the GTX 980 is routinely 2x faster than the GTX 680, if not slightly more.

OC: Load Power Consumption - Crysis 3

OC: Load Power Consumption - FurMark

But you do pay for the overclock when it comes to power consumption. NVIDIA allows you to increase the TDP by 25%, and to hit these performance numbers you are going to need every bit of that. So what was once a 165W card is now a 205W card.
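The power arithmetic is straightforward; 165W is the GTX 980's official TDP, and 25% is the maximum TDP increase NVIDIA's limits permit:

```python
STOCK_TDP_W = 165        # GTX 980's official TDP
MAX_TDP_INCREASE = 0.25  # NVIDIA's 125% power target ceiling

max_power = STOCK_TDP_W * (1 + MAX_TDP_INCREASE)
print(f"Maximum overclocked power target: {max_power}W")
```

That works out to 206.25W, which squares with the roughly 205W figure quoted above.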

OC: Load GPU Temperature - Crysis 3

OC: Load GPU Temperature - FurMark

Even though overclocking involves raising the temperature limit to 91C, NVIDIA's fan curve naturally tops out at 84C. So even in the case of overclocking the GTX 980 isn't going to reach temperatures higher than the mid-80s.

OC: Load Noise Levels - Crysis 3

OC: Load Noise Levels - FurMark

The noise penalty for overclocking is also pretty stiff. Since we’re otherwise TDP limited, all of our workloads top out at 53.6dB, some 6.6dB higher than stock. In the big picture this means the overclocked GTX 980 is still in the middle of the pack, but it is noticeably louder than before and louder than a few of NVIDIA’s other cards. Interestingly enough, however, it’s no worse than the original stock GTX 680 in Crysis 3, and still better than said GTX 680 under FurMark. It’s also still quieter than the stock Radeon R9 290X, not to mention that card’s even louder uber mode.

274 Comments

  • mesahusa - Tuesday, September 23, 2014 - link

    Nvidia and AMD haven't moved to 22nm because they don't have the funding. Intel has tens of billions to blow on R&D. Broadwell's going to be released in 2015, and it's 14nm.
  • Hrel - Monday, September 22, 2014 - link

    In light of Nvidia not even trying to reduce manufacturing nodes, it would be really nice to see them go on the offensive in the price war. $300 for the GTX 980, everything lower from there. Probably not now, but, like, spring 2015; that'd be great! Make good and sure to wipe out all the holdouts (like myself) keeping their old cards because they still play everything they play at 1080p. Kinda get all your customers caught up on hardware in the same upgrade time frame.

    Then when they finally do drop nodes they can focus on making every card they sell run games at 8K resolution.
  • Nfarce - Monday, September 22, 2014 - link

    Hate to break the news to you, but if you want to game at high level (above 1080p), you need to pay at high level. There is nothing new about that in the entire history of PC enthusiast building and gaming either for those of us who remember making the "huge leap" from a 15" 1024x768 resolution CRT monitor to a whopping 19" 1600x1200 CRT monitor. At least not in the 20 years since I've been involved with it anyway.

    Besides all that, that's why GPU makers offer cards for different budgets. If you can't afford their top tier products, you can't afford to game top tier. Period and end of discussion.
  • tuxRoller - Monday, September 22, 2014 - link

    It seems as though the big improvement Nvidia has made is to enable cpu-level scheduling/dvfs granularity in their chip. However, once all cores are engaged it ends up using as much power as its predecessor (see tomshardware).
    What I really want to know is how much of this is due to purely driver-level changes.
  • yhselp - Tuesday, September 23, 2014 - link

    Exceptional design. The sad thing is that NVIDIA will take forever to release a 30 SMM Maxwell GPU, and once it finally does, it will cost a ton; even later on, when they release a "budget" version for an unreasonable price of around $650, it will be too late - the great performance potential of today won't be so great tomorrow. Striving for and building amazing GPUs is the right way forward, but not empowering people with them is a crime. Whatever happened to $500 flagship products?
  • Rhodie - Wednesday, September 24, 2014 - link

    Just got a GTX970, and it seems only the latest Nvidia drivers will install for 9xx series cards. Unfortunately the latest drivers totally screw up some programs that use CUDA, and seem to hide its presence from programs like Xillisoft Video Convertor Ultimate :-/ No response of course from either Nvidia or Xillisoft regarding the problem. Wonder how many other programs the drivers break?
  • garadante - Thursday, September 25, 2014 - link

    Geeze, Anandtech, do an updated best value graphics card list, because since the launch of the 970/980 retailers are giving some serious price cuts to 770/780/780 Ti's. Newegg has a 780 for less than $300 after rebate and just a hair over $300 before rebate. I'm seeing 780 Ti's for ~$430 and 770s for ~$240. I am amazed to see price cuts this deep, since I haven't seen them the last several generations and considering how overpriced these cards were. But while supplies last and prices hold/drop, this completely flips price/performance on its head. I feel bad about recommending an AMD 290 Tri-X to a friend a couple months back now. xD
  • garadante - Thursday, September 25, 2014 - link

    Please do an updated best value graphics card list* Where are my manners! D:
  • jman9295 - Friday, September 26, 2014 - link

    Newegg has an Asus DirectCU II GTX 780 selling in the $290 range after a mail-in rebate, promo code, and discount. It also comes with a pre-order copy of the new Borderlands game. That has to be the best value-to-performance GPU out right now. It is almost a full $100 less than the cheapest non-reference R9 290 on Newegg and $40 less than the cheapest reference R9 290, which is crazy since this same Asus GTX 780 was selling for over $550 just last month with no free games (and still is on Amazon for some reason).
  • mixer4x - Thursday, September 25, 2014 - link

    I feel bad for having just bought the 290 tri-x just a month ago! =(
    I bought it because you never know when the new cards will be released and how much they will cost. Unfortunately, the new cards came out too soon!
