Overclocking

Finally, no review of a GTX Titan card would be complete without a look at overclocking performance.

From a design standpoint, GTX Titan X already ships close to its power limits. NVIDIA’s 250W TDP can only be raised another 10% – to 275W – meaning that in TDP-limited scenarios there’s not much headroom to play with. On the other hand, with the stock voltage being so low, in clockspeed-limited scenarios there’s a lot of room for pushing the performance envelope through overvolting. And neither of these options addresses the most potent aspect of overclocking, which is pushing the entire clockspeed curve higher at the same voltages by increasing the clockspeed offsets.
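As a back-of-the-envelope check – this is illustrative arithmetic only, not output from our test tooling – the power-target headroom described above works out as follows:

```python
# Illustrative check of the power-limit headroom described above.
# Figures come from the review text: 250W stock TDP, +10% max power target.
tdp_watts = 250                  # stock TDP
power_target_max = 1.10          # the limit slider allows a 10% raise
max_board_power = tdp_watts * power_target_max

print(f"Maximum power limit: {max_board_power:.0f}W")          # 275W
print(f"Extra headroom over stock: {max_board_power - tdp_watts:.0f}W")  # 25W
```

That extra 25W is the figure the power consumption results further down are measured against.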

GTX 980 ended up being a very capable overclocker, and as we’ll see it’s much the same story for the GTX Titan X.

GeForce GTX Titan X Overclocking
                    Stock       Overclocked
Core Clock          1002MHz     1202MHz
Boost Clock         1076MHz     1276MHz
Max Boost Clock     1215MHz     1452MHz
Memory Clock        7GHz        7.8GHz
Max Voltage         1.162v      1.218v

Even when packing 8B transistors into a 601mm2 die, the GM200 GPU backing the GTX Titan X continues to offer the same kind of excellent overclocking headroom that we’ve come to see from the other Maxwell GPUs. Overall we have been able to increase our GPU clockspeed by 200MHz (20%) and the memory clockspeed by 800MHz (11%). At its peak this leads to the GTX Titan X pushing a maximum boost clock of 1.45GHz, and while TDP restrictions mean it can’t sustain this under most workloads, it’s still an impressive outcome for overclocking such a large GPU.
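The headroom percentages above are simple ratios against the stock clocks in the table; a quick sketch (illustrative only, using the review’s own figures) reproduces them:

```python
# Reproduces the overclocking headroom percentages quoted above
# from the stock vs. overclocked clocks in the table. Illustrative only.
stock_core, oc_core = 1002, 1202   # MHz, base core clock
stock_mem, oc_mem = 7000, 7800     # MHz, effective memory clock

core_gain = (oc_core - stock_core) / stock_core * 100
mem_gain = (oc_mem - stock_mem) / stock_mem * 100

print(f"Core offset: +{oc_core - stock_core}MHz ({core_gain:.0f}%)")   # +200MHz (20%)
print(f"Memory offset: +{oc_mem - stock_mem}MHz ({mem_gain:.0f}%)")    # +800MHz (11%)
```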

OC: Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA

OC: Crysis 3 - 3840x2160 - High Quality + FXAA

OC: Shadow of Mordor - 3840x2160 - Ultra Quality

OC: The Talos Principle - 3840x2160 - Ultra Quality

OC: Total War: Attila - 3840x2160 - Max Quality + Perf Shadows

The performance gains from this overclock are a very consistent 16-19% across all 5 of our sample games at 4K, indicating that we're almost entirely GPU-bound rather than memory-bound. Though not quite enough to push the GTX Titan X above 60fps in Shadow of Mordor or Crysis 3, the overclock brings the card even closer than it was at stock. Meanwhile we do crack 60fps in Battlefield 4 and The Talos Principle.

OC: Load Power Consumption - Crysis 3

OC: Load Power Consumption - FurMark

OC: Load GPU Temperature - Crysis 3

OC: Load GPU Temperature - FurMark

OC: Load Noise Levels - Crysis 3

OC: Load Noise Levels - FurMark

The tradeoff for this overclock is of course power and noise, both of which see significant increases. In fact the jump in power consumption under Crysis 3 is a bit unexpected – further investigation shows that the GTX Titan X shifts from being temperature limited to TDP limited as a result of our overclocking efforts – while FurMark is in line with the 25W increase in TDP. The 55dB noise levels that result, though not extreme, also mean that GTX Titan X is drifting farther away from being a quiet card. Ultimately it’s a pretty straightforward tradeoff for a further 16%+ increase in performance, but a tradeoff nonetheless.

Comments

  • nos024 - Wednesday, March 18, 2015 - link

    Well let's see. Even when it launches, will it be readily available and not highly priced like the 290X? If the 290X was readily available when it was launched, I would've bought one.
  • eanazag - Wednesday, March 18, 2015 - link

    Based on leaked slides referencing Battlefield 4 at 4K resolution, the 390X is 1.6x the 290X. In the context of this review's results, we could guess it comes up slightly short at 4K ultra and 10 fps faster than the Titan X at 4K medium. Far Cry 4 came in at 1.55x the 290X.

    290X non-uber 4K ultra - BF4 - 35.5 fps x 1.6 = 56.8. >> Titan 58.3
    290X non-uber 4K medium - BF4 - 65.9 fps x 1.6 = 105.44 >> Titan 94.8

    290X non-uber 4K ultra - FC4 - 31.2 fps x 1.55 = 48.36 >> Titan 42.1
    290X non-uber 4K medium - FC4 - 40.9 fps x 1.55 = 63.395 >> Titan 60.5

    These numbers don't tell the whole story on how AMD arrived with the figures, but it paints the picture of a GPU that goes toe-to-toe with the Titan X. The slides also talk about a water cooler edition. I'm suspecting the wattage will be in the same ball park as the 290X and likely higher.

    With the Titan X's full-breadth compute muscle, I am not sure what the 980 Ti will look like. I suspect Nvidia is holding that back based on whatever AMD releases, so they can unload a smackdown trump card. Rumored $700 for the 390X WCE with 8GB HBM (high bandwidth memory - 4096-bit width) and in Q2 (April-June). With the Titan X and 390X at the same price, given what I know at the moment I would go with the Titan X.

    Stack your GPU $'s for July.
  • FlushedBubblyJock - Thursday, April 2, 2015 - link

    If the R9 390X doesn't come out at $499 months and months from now, it won't be worth it.
  • shing3232 - Tuesday, March 17, 2015 - link

    1/32 FP64? So, this is a big gaming core.
  • Railgun - Tuesday, March 17, 2015 - link

    Exactly why it's not a $999 card.
  • shing3232 - Tuesday, March 17, 2015 - link

    but, it was priced at 999.
  • Railgun - Tuesday, March 17, 2015 - link

    What I mean is that it's not worth being a $999 card. Yes, it's priced at that, but its value doesn't support it.
  • Flunk - Tuesday, March 17, 2015 - link

    Plenty of dolts bought the first Titan as a gaming card so I'm sure someone will buy this. At least there's a bigger performance difference between the Titan X and GTX 980 than there was between the Titan and GTX 780.
  • Kevin G - Tuesday, March 17, 2015 - link

    Except the GTX 780 came after the Titan launched. Rather it was the original Titan compared to the GTX 680 and here we see a similar gap between the Titan X and the GTX 980. It is also widely speculated that we'll see a cut down GM200 to fit between the GTX 980 and the Titan X so history looks like it will repeat itself.
  • chizow - Tuesday, March 17, 2015 - link

    @Railgun, I'd disagree, and I was very vocal against the original Titan for a number of reasons. Mainly because Nvidia used the 7970 launch as an opportunity to promote their 2nd-fastest chip to flagship status. Secondly, because they held back their flagship chip nearly a full year (GTX 680 launched Mar 2012, Titan Feb 2013) while claiming the whole time there was no bigger chip; they tried to justify the higher price point because it was a "compute" card; and lastly because it was a cut-down chip and we knew it.

    Titan X isn't being sold with any of those pretenses, and now that the new pricing/SKU structure has settled in (2nd fastest chip = new $500 flagship), there isn't any of that sticker shock anymore. It's the full chip, there are no complaints about them holding anything back, and 12GB of VRAM is a ridiculous amount to stick on a card, and that costs money. If EVGA wants to release an $800 Classified 980 and people see value in it, then certainly this Titan X does as well.

    At least for me, it is now a more appealing option than getting a 2nd 980 for SLI. Slightly lower performance, but lower heat, no SLI/scaling issues, and no framebuffer VRAM concerns for the foreseeable future. I game at 2560x1440 on an ROG Swift btw, so that is right in this card's wheelhouse.
