Overclocking

Finally, no review of a GTX Titan card would be complete without a look at overclocking performance.

From a design standpoint, GTX Titan X already ships close to its power limits. NVIDIA's 250W TDP can only be raised another 10% – to 275W – meaning that in TDP-limited scenarios there isn't much headroom to play with. On the other hand, with the stock voltage being so low, in clockspeed-limited scenarios there's a lot of room for pushing the performance envelope through overvolting. And neither of these options addresses the most potent aspect of overclocking: pushing the entire clockspeed curve higher at the same voltages by increasing the clockspeed offsets.

GTX 980 ended up being a very capable overclocker, and as we’ll see it’s much the same story for the GTX Titan X.

GeForce GTX Titan X Overclocking
                     Stock      Overclocked
Core Clock           1002MHz    1202MHz
Boost Clock          1076MHz    1276MHz
Max Boost Clock      1215MHz    1452MHz
Memory Clock         7GHz       7.8GHz
Max Voltage          1.162v     1.218v

Even when packing 8B transistors into a 601mm² die, the GM200 GPU backing the GTX Titan X continues to offer the same kind of excellent overclocking headroom that we've come to see from the other Maxwell GPUs. Overall we have been able to increase our GPU clockspeed by 200MHz (20%) and the memory clockspeed by 800MHz (11%). At its peak this leads to the GTX Titan X pushing a maximum boost clock of 1.45GHz, and while TDP restrictions mean it can't sustain this under most workloads, it's still an impressive outcome for overclocking such a large GPU.
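As a quick sanity check, those percentages follow directly from the table above (a minimal sketch in Python using the stock and overclocked values from this review):

```python
# Sanity check on the overclocking headroom quoted above, using the
# stock and overclocked values from the table in this review.

def oc_gain(stock_mhz: float, oc_mhz: float) -> float:
    """Return the overclock as a fraction of the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz

print(f"Core clock:   +{oc_gain(1002, 1202):.0%}")  # +200MHz -> ~20%
print(f"Memory clock: +{oc_gain(7000, 7800):.0%}")  # +800MHz -> ~11%
print(f"Max boost:    +{oc_gain(1215, 1452):.0%}")  # 1215MHz -> 1452MHz
```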

[Benchmark charts: overclocked performance at 3840x2160 in Battlefield 4 (Ultra Quality, 0x MSAA), Crysis 3 (High Quality + FXAA), Shadow of Mordor (Ultra Quality), The Talos Principle (Ultra Quality), and Total War: Attila (Max Quality + Perf Shadows)]

The performance gains from this overclock are a very consistent 16-19% across all five of our sample games at 4K, indicating that we're almost entirely GPU-bound as opposed to memory-bound. The overclock isn't quite enough to push the GTX Titan X above 60fps in Shadow of Mordor or Crysis 3, but it brings the card that much closer to that mark, and meanwhile we do crack 60fps in Battlefield 4 and The Talos Principle.
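To put the GPU-bound argument in concrete terms: perfect scaling with the 20% core overclock would mean 20% gains, so the observed 16-19% works out to roughly 80-95% scaling efficiency (a sketch using the aggregate range above, not per-game frame rates):

```python
# Scaling-efficiency check: how much of the 20% core overclock shows up
# as real performance? Inputs are the aggregate 16-19% range quoted
# above, not per-game results.

core_clock_gain = 0.20           # +200MHz on the 1002MHz base clock
for observed in (0.16, 0.19):    # low and high end of the measured range
    efficiency = observed / core_clock_gain
    print(f"{observed:.0%} gain -> {efficiency:.0%} of the core overclock")

# Near-linear scaling with the core clock points to a GPU-bound workload;
# a memory-bound one would track the smaller 11% memory overclock instead.
```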

[Charts: overclocked load power consumption, GPU temperature, and noise levels under Crysis 3 and FurMark]

The tradeoff for this overclock is of course power and noise, both of which see significant increases. In fact the jump in power consumption under Crysis 3 is a bit unexpected – further investigation shows that our overclocking efforts shift the GTX Titan X from being temperature-limited to TDP-limited – while the FurMark increase is in line with the 25W increase in TDP. The resulting 55dB noise levels, though not extreme, also mean that GTX Titan X is drifting farther away from being a quiet card. Ultimately it's a pretty straightforward tradeoff for a further 16%+ increase in performance, but a tradeoff nonetheless.
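As a first-order illustration of why the card shifts to being TDP-limited: dynamic CMOS power scales roughly with frequency times voltage squared, and plugging in the clocks and voltages from our overclock suggests a demand well above the 275W cap (a rough sketch that assumes dynamic power dominates and ignores leakage):

```python
# First-order dynamic power estimate, P ~ f * V^2 (classic CMOS
# approximation, ignoring leakage). Illustrative only, not a measurement.

stock_tdp_w = 250.0               # stock power limit
power_cap_w = 275.0               # +10% maximum power target
f_scale = 1202 / 1002             # ~20% higher core clock
v_scale = (1.218 / 1.162) ** 2    # voltage enters squared

estimated_w = stock_tdp_w * f_scale * v_scale
print(f"Naive demand: ~{estimated_w:.0f}W vs. a {power_cap_w:.0f}W cap")

# Roughly 330W of demand against a 275W limit: the card throttles against
# its TDP rather than sustaining the full 1.45GHz maximum boost clock.
```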

Comments

  • looncraz - Tuesday, March 17, 2015 - link

    If the most recent slides (allegedly leaked from AMD) hold true, the 390X will be at least as fast as the Titan X, though with only 8GB of RAM (but HBM!).

    A straight 4096SP GCN 1.2/3 GPU would be a close match-up already, but any other improvements made along the way will potentially give the 390X a fairly healthy launch-day lead.

    I think nVidia wanted to keep AMD in the dark as much as possible so that they could not position themselves to take more advantage of this, but AMD decided to hold out until May/June (even though they apparently already have some inventory on hand) rather than give nVidia a chance to revise the Titan X before launch.

    nVidia blinked, it seems, after it became apparent AMD was just going to wait out the clock with their current inventory.
  • zepi - Wednesday, March 18, 2015 - link

    Unless AMD has achieved a considerable increase in perf/W, they are going to have a really hard time tuning those 4K shaders to a reasonable frequency without ending up with a 450W card.

    Not that a 500W card is necessarily a deal breaker for everyone, but in practice cooling a 450W card without ear-shattering noise levels is very difficult compared to cooling a 250W card.

    Let us wait and hope, since AMD really would need to get a break and make some money on this one...
  • looncraz - Wednesday, March 18, 2015 - link

    Very true. We know that with HBM there should already be a fairly beefy power savings (~20-30W vs 290X it seems).

    That doesn't buy them room for 1,280 more SPs, of course, but it should get them a healthy 256 of them. Then, GCN 1.3 vs 1.1 should have power advantages as well. GCN 1.2 vs 1.0 (R9 285 vs R9 280) with 1792 SPs showed a 60W improvement; if we assume GCN 1.1 to GCN 1.3 shows a similar trend, the 390X should be pulling only about 15W more than the 290X with the rumored specs, without any other improvements.

    Of course, the same math says the 290X should be drawing 350W, but that's because it assumes all the power is in the SPs... But I do think it reveals that AMD could possibly do it without drawing much, if any, more power, and without making any unprecedented improvements.
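    Roughly sketching that arithmetic (a hedged back-of-envelope only: the 250W figure for the 290X, the rumored 4096-SP count, and the HBM and GCN savings are all assumed or rumored numbers from this thread, not confirmed specs):

```python
# Back-of-envelope version of the estimate above. Every input is an
# assumed, rumored, or approximate figure from this thread, not a
# confirmed specification.

power_290x_w = 250.0                  # assumed 290X board power
sps_290x, sps_390x = 2816, 4096       # actual vs. rumored shader counts
w_per_sp = power_290x_w / sps_290x    # naively attribute all power to SPs

extra_sps_w = (sps_390x - sps_290x) * w_per_sp  # cost of 1,280 more SPs
hbm_savings_w = 25.0                            # midpoint of ~20-30W
gcn_savings_w = (60.0 / 1792) * sps_290x        # 285-vs-280 trend, scaled

estimate_w = power_290x_w + extra_sps_w - hbm_savings_w - gcn_savings_w
print(f"Naive 390X estimate: ~{estimate_w:.0f}W")  # ~244W with these inputs

# As noted above, "all the power is in the SPs" is a big oversimplification,
# so treat the output as directional at best.
```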
  • Braincruser - Wednesday, March 18, 2015 - link

    Yeah, but the question is: how well will the memory survive on top of a 300W GPU?
    Because the first part of a graphics card to die from high temperatures is the VRAM.
  • looncraz - Thursday, March 19, 2015 - link

    It will be to the side, on a 2.5D interposer, I believe.

    GPU thermal energy will move through the path of least resistance (technically, toward the region with the greatest ΔT, moderated by each material's thermal conductivity coefficient), which should be into the heatsink or water block. I'm not sure, but I'd think the memory chips could operate in the same temperature range as the GPU; if not, it may be necessary to keep them thermally isolated, which shouldn't be too difficult – perhaps as simple as not using thermal pads on the memory at all and letting it dissipate heat passively (or through interposer-mounted heatsinks). A rough sense of the scale involved is sketched below.

    It will be interesting to see what they have done to solve the potential issues, that's for sure.
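    As a rough sense of scale for the heat-path argument above, Fourier's law (heat flow q = k·A·ΔT/L) can be sketched with toy numbers; every material value and dimension below is an illustrative assumption, not actual packaging data:

```python
# Toy Fourier's-law comparison of the two heat paths. Thermal conductance
# G = k * A / L, and heat flow q = G * dT. All materials and dimensions
# are illustrative assumptions, not real package data.

def conductance_w_per_k(k_w_mk: float, area_m2: float, length_m: float) -> float:
    """Thermal conductance of a slab: k (W/m-K) * area (m^2) / length (m)."""
    return k_w_mk * area_m2 / length_m

# Upward path: die into the cooler (large contact area, short path, copper-ish).
into_cooler = conductance_w_per_k(400.0, 6.0e-4, 2.0e-3)
# Lateral path: through a thin silicon interposer to the HBM stacks.
through_interposer = conductance_w_per_k(150.0, 2.5e-6, 5.0e-3)

print(f"Into cooler: ~{into_cooler:.0f} W/K")                 # ~120 W/K
print(f"Through interposer: ~{through_interposer:.3f} W/K")   # ~0.075 W/K

# Orders of magnitude apart: nearly all GPU heat should exit through the
# cooler rather than flowing laterally into the memory stacks.
```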
  • Xenonite - Thursday, March 19, 2015 - link

    Yes, I agree that AMD would be able to absolutely destroy NVIDIA on the performance front if they designed a 500W GPU and left the PCB and waterblock design to their AIB partners.

    I would also absolutely love to see what kind of performance a 500W or even a 1kW graphics card would be able to muster; however, since a relatively constant 60fps presented with less than about 100ms of total system latency has been deemed sufficient for a "smooth and responsive" gaming experience, I simply can't imagine such a card ever seeing the light of day.
    And while I can understand everyone likes to pretend that they are saving the planet with their <150W GPUs, the argument that such a TDP would be very difficult to cool does not really hold much water IMHO.

    If, for instance, the card was designed from the ground up to dissipate its heat load over multiple 200W~300W GPUs, connected via a very-high-speed, N-directional data interconnect bus, the card could easily and (most importantly) quietly be cooled with chilled watercooling dissipating into a few "quad-fan" radiators. Practically, 4 GM200-size GPUs could be placed back-to-back on the PCB, with each one rendering a quarter of the current frame via shared, high-speed frame buffers (thereby eliminating SLI-induced microstutter and "frame-pacing" lag). Cooling would then be as simple as installing 4 standard GPU watercooling loops, with each loop's radiator only having to dissipate the TDP of a single GPU module.
  • naxeem - Tuesday, March 24, 2015 - link

    They did solve that problem with a water-cooling solution. 390X WCE is probably what we'll get.
  • ShieTar - Wednesday, March 18, 2015 - link

    Who says they don't allow it? EVGA has already announced two special models, a superclocked one and one with a watercooling block:

    http://eu.evga.com/articles/00918/EVGA-GeForce-GTX...
  • Wreckage - Tuesday, March 17, 2015 - link

    If by fast you mean June or July. I'm more interested in a 980 Ti so I don't need a new power supply.
  • ArmedandDangerous - Saturday, March 21, 2015 - link

    There won't ever be a 980 Ti if you understand Nvidia's naming schemes. Ti's are for unlocked parts, and there's nothing left to unlock on the 980's GM204.
