Overclocking

Finally, no review of a GTX Titan card would be complete without a look at overclocking performance.

From a design standpoint, GTX Titan X already ships close to its power limits. NVIDIA’s 250W TDP can only be raised another 10%, to 275W, meaning that in TDP limited scenarios there’s not much headroom to play with. On the other hand, with the stock voltage being so low, in clockspeed limited scenarios there’s a lot of room for pushing the performance envelope through overvolting. And neither of these options addresses the most potent aspect of overclocking, which is pushing the entire clockspeed curve higher at the same voltages by increasing the clockspeed offsets.
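
As a rough illustration of how these three knobs interact, the following Python sketch models the arbitration between the clockspeed curve, the voltage cap, and the TDP limit. The voltage/frequency points and limit values in it are illustrative assumptions, not NVIDIA's actual tables.

    # Illustrative sketch of GPU Boost-style clock arbitration. The
    # voltage/frequency points and limits below are made-up placeholders,
    # not NVIDIA's actual tables.

    # (voltage in volts, clockspeed in MHz) points along the stock V/F curve
    VF_CURVE = [(0.95, 1002), (1.05, 1100), (1.162, 1215), (1.218, 1300)]

    def delivered_clock(offset_mhz, voltage_cap, tdp_clock_cap_mhz):
        """Highest clock that fits under both the voltage and TDP limits.

        offset_mhz shifts the whole curve up at the same voltages, which
        is why raising the offset is the most potent overclocking knob;
        overvolting merely unlocks the top entries of the curve.
        """
        candidates = [clock + offset_mhz
                      for volt, clock in VF_CURVE
                      if volt <= voltage_cap]
        # TDP enforcement caps the sustained clock regardless of the curve.
        return min(max(candidates), tdp_clock_cap_mhz)

    # Stock: no offset, 1.162v cap, plenty of TDP headroom in this workload.
    print(delivered_clock(0, 1.162, 1400))    # 1215 (voltage limited)
    # Overclocked: +200MHz offset, overvolted to 1.218v, TDP binds sooner.
    print(delivered_clock(200, 1.218, 1452))  # 1452 (TDP limited)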

GTX 980 ended up being a very capable overclocker, and as we’ll see it’s much the same story for the GTX Titan X.

GeForce GTX Titan X Overclocking
                     Stock      Overclocked
Core Clock           1002MHz    1202MHz
Boost Clock          1076MHz    1276MHz
Max Boost Clock      1215MHz    1452MHz
Memory Clock         7GHz       7.8GHz
Max Voltage          1.162v     1.218v

Even when packing 8B transistors into a 601mm2 die, the GM200 GPU backing the GTX Titan X continues to offer the same kind of excellent overclocking headroom that we’ve come to see from the other Maxwell GPUs. Overall we have been able to increase our GPU clockspeed by 200MHz (20%) and the memory clockspeed by 800MHz (11%). At its peak this leads to the GTX Titan X pushing a maximum boost clock of 1.45GHz, and while TDP restrictions mean it can’t sustain this under most workloads, it’s still an impressive outcome for overclocking such a large GPU.
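
For reference, those percentages can be recomputed directly from the table above with a couple of lines of Python (memory is expressed as the effective 7GHz data rate):

    # Quick arithmetic check of the gains quoted above, using the table values.
    stock_core, oc_core = 1002, 1202   # base core clock, MHz
    stock_mem, oc_mem = 7000, 7800     # effective memory data rate, MHz

    core_gain = (oc_core - stock_core) / stock_core
    mem_gain = (oc_mem - stock_mem) / stock_mem

    print(f"Core:   +{oc_core - stock_core}MHz ({core_gain:.0%})")  # +200MHz (20%)
    print(f"Memory: +{oc_mem - stock_mem}MHz ({mem_gain:.0%})")     # +800MHz (11%)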

OC: Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA

OC: Crysis 3 - 3840x2160 - High Quality + FXAA

OC: Shadow of Mordor - 3840x2160 - Ultra Quality

OC: The Talos Principle - 3840x2160 - Ultra Quality

OC: Total War: Attila - 3840x2160 - Max Quality + Perf Shadows

The performance gains from this overclock are a very consistent 16-19% across all 5 of our sample games at 4K, indicating that we're almost entirely GPU-bound as opposed to memory-bound. Though not quite enough to push the GTX Titan X above 60fps in Shadow of Mordor or Crysis 3, this puts the card even closer to that mark than it was at stock. Meanwhile we do crack 60fps on Battlefield 4 and The Talos Principle.
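
A quick way to quantify that observation is sketched below in Python, with placeholder framerates rather than our exact results: the closer a game's gain sits to the 20% core overclock, the more core-bound it is.

    # Compare each game's observed speedup against the core (+20%) and
    # memory (+11%) overclocks; a gain that hugs the core figure points to
    # a core-bound workload. The framerate pairs are illustrative
    # placeholders, not the exact review data.
    CORE_OC = 0.20

    def scaling_vs_core(stock_fps, oc_fps):
        gain = oc_fps / stock_fps - 1
        return f"gain {gain:.1%}, {gain / CORE_OC:.0%} of the core overclock"

    print(scaling_vs_core(52.0, 61.0))  # ~17% gain: mostly core-clock scaling
    print(scaling_vs_core(48.0, 53.5))  # ~11% gain: other limits in play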

OC: Load Power Consumption - Crysis 3

OC: Load Power Consumption - FurMark

OC: Load GPU Temperature - Crysis 3

OC: Load GPU Temperature - FurMark

OC: Load Noise Levels - Crysis 3

OC: Load Noise Levels - FurMark

The tradeoff for this overclock is of course power and noise, both of which see significant increases. In fact the jump in power consumption with Crysis is a bit unexpected: further research shows that the GTX Titan X shifts from being temperature limited to TDP limited as a result of our overclocking efforts. FurMark's increase, meanwhile, is in line with the 25W increase in TDP. The resulting 55dB noise levels, though not extreme, also mean that GTX Titan X is drifting farther away from being a quiet card. Ultimately it’s a pretty straightforward tradeoff for a further 16%+ increase in performance, but a tradeoff nonetheless.
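
To put a rough number on that tradeoff in TDP-limited workloads, a back-of-the-envelope calculation using the figures above (a sketch of the arithmetic, not a measured result):

    # Back-of-the-envelope efficiency cost of the overclock in TDP-limited
    # workloads: +16% performance (low end of the observed range) against
    # the +10% TDP raise from 250W to 275W.
    stock_tdp, oc_tdp = 250, 275   # watts
    perf_gain = 0.16

    power_gain = oc_tdp / stock_tdp - 1
    perf_per_watt_change = (1 + perf_gain) / (1 + power_gain) - 1
    print(f"Power: +{power_gain:.0%}, perf/W: {perf_per_watt_change:+.1%}")
    # Power: +10%, perf/W: +5.5%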

Comments

  • dragonsqrrl - Tuesday, March 17, 2015 - link

    Had no idea that non-reference Hawaii cards were generally undervolted, resulting in lower power consumption. Source?
  • chizow - Tuesday, March 17, 2015 - link

    There is some science behind it: heat results in higher leakage, which in turn means higher power consumption. But yes, I agree, the reviews show otherwise; in fact, they show the cards that don't throttle and boost unabated draw even more power, closer to 300W. So yes, that increased perf comes at the expense of higher power consumption; not sure why the AMD faithful believe otherwise.
  • FlushedBubblyJock - Saturday, March 21, 2015 - link

    Duh. It's because they hate PhysX.
  • Kutark - Tuesday, March 17, 2015 - link

    Yes, some of the new aftermarket designs are cooler and quieter, but they don't use less power; the GPU is what draws the power, and the aftermarket companies can't alter that. They can only tame the beast, so to speak.
  • Yojimbo - Tuesday, March 17, 2015 - link

    Would be a good point if the performance were the same. But the Titan X is 50% faster. The scores are also total system power usage under gaming load, not card usage. Running at 50% faster frame rates is going to tax other parts of the system more, as well.
  • Kutark - Tuesday, March 17, 2015 - link

    You're kidding, right? Your framerate in no way affects your power usage.
  • nevcairiel - Tuesday, March 17, 2015 - link

    Actually, it might. If the GPU is faster, it might need more CPU power, which in turn can increase power draw from the CPU.
  • DarkXale - Tuesday, March 17, 2015 - link

    Of course. It's the entire point of DX12/Mantle/Vulkan/Metal to reduce per-frame CPU work, and as a consequence per-frame CPU power consumption.
  • Yojimbo - Tuesday, March 17, 2015 - link

    The main point of my post is that Titan X gets 50% more performance/system watt. But yes, your frame rate should affect your power usage if you are GPU-bound. The CPU, for instance, will be working harder maintaining the higher frame rates. How much harder, I have no idea, but it's a variable that needs to be considered before testbug00's antecedent can be considered true.
  • dragonsqrrl - Wednesday, March 18, 2015 - link

    Actually, frame rates have a lot to do with power usage.

    I don't think that needs any further explanation; anyone who's even moderately informed knows this, and even if they didn't, they could probably figure out why in about 10 seconds.
