Overclocking

Finally, no review of a GTX Titan card would be complete without a look at overclocking performance.

From a design standpoint, GTX Titan X already ships close to its power limits. NVIDIA's 250W TDP can only be raised another 10%, to 275W, meaning that in TDP-limited scenarios there's not much headroom to play with. On the other hand, with the stock voltage being so low, in clockspeed-limited scenarios there's a lot of room for pushing the performance envelope through overvolting. And neither of these options addresses the most potent aspect of overclocking: pushing the entire clockspeed curve higher at the same voltages by increasing the clockspeed offsets.
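To put rough numbers on those three levers, here is a minimal back-of-the-envelope sketch in Python. The figures come from this review (NVIDIA's 250W TDP, the +10% power target cap, and the 200MHz offset we ultimately dialed in, per the table below); the helper functions themselves are illustrative assumptions, not any NVIDIA tuning API.

```python
# Illustrative sketch of the three Maxwell overclocking levers discussed
# above; the helpers are hypothetical, not an NVIDIA API.

STOCK_TDP_W = 250
MAX_POWER_TARGET = 1.10  # NVIDIA caps the power slider at +10%

def power_headroom_w(tdp_w: float = STOCK_TDP_W,
                     target: float = MAX_POWER_TARGET) -> float:
    """Wattage gained by maxing out the power target."""
    return tdp_w * (target - 1.0)

def apply_offset(stock_mhz: int, offset_mhz: int) -> int:
    """A clockspeed offset shifts the entire boost curve by a fixed amount."""
    return stock_mhz + offset_mhz

print(power_headroom_w())       # 25.0 -> a 275W ceiling
print(apply_offset(1002, 200))  # base clock: 1002 -> 1202MHz
print(apply_offset(1076, 200))  # boost clock: 1076 -> 1276MHz
```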

GTX 980 ended up being a very capable overclocker, and as we’ll see it’s much the same story for the GTX Titan X.

GeForce GTX Titan X Overclocking
                   Stock      Overclocked
Core Clock         1002MHz    1202MHz
Boost Clock        1076MHz    1276MHz
Max Boost Clock    1215MHz    1452MHz
Memory Clock       7GHz       7.8GHz
Max Voltage        1.162v     1.218v

Even when packing 8B transistors into a 601mm2 die, the GM200 GPU backing the GTX Titan X continues to offer the same kind of excellent overclocking headroom that we've come to see from the other Maxwell GPUs. Overall we were able to increase our GPU clockspeed by 200MHz (20%) and the memory clockspeed by 800MHz (11%). At its peak this leads to the GTX Titan X pushing a maximum boost clock of 1.45GHz, and while TDP restrictions mean it can't sustain this under most workloads, it's still an impressive outcome for overclocking such a large GPU.

[Charts: OC performance at 3840x2160 in Battlefield 4 (Ultra Quality, 0x MSAA), Crysis 3 (High Quality + FXAA), Shadow of Mordor (Ultra Quality), The Talos Principle (Ultra Quality), and Total War: Attila (Max Quality + Perf Shadows)]

The performance gains from this overclock are a very consistent 16-19% across all 5 of our sample games at 4K, indicating that we're almost entirely GPU-bound as opposed to memory-bound. Though not quite enough to push the GTX Titan X above 60fps in Shadow of Mordor or Crysis 3, the overclock puts the card even closer to that mark than it was at stock. Meanwhile we do crack 60fps on Battlefield 4 and The Talos Principle.
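As a quick sanity check on that reasoning, the short Python sketch below compares the observed gains against the two overclocks; the numbers are taken from the table and results above.

```python
# Sanity check on the scaling argument: a purely GPU-clock-bound game
# should scale with the ~20% core overclock, while a memory-bound one
# should track the ~11% memory overclock.
gpu_oc = (1202 - 1002) / 1002             # core overclock: ~20.0%
mem_oc = (7.8 - 7.0) / 7.0                # memory overclock: ~11.4%
observed_low, observed_high = 0.16, 0.19  # measured 4K gains, five games

print(f"core overclock:   {gpu_oc:.1%}")
print(f"memory overclock: {mem_oc:.1%}")
print(f"observed gains:   {observed_low:.0%}-{observed_high:.0%}")
# 16-19% sits just under the core overclock and well above the memory
# overclock, consistent with games that are mostly GPU-bound (with a
# little TDP throttling shaving off the last few percent).
```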

[Charts: OC load power consumption, load GPU temperature, and load noise levels under Crysis 3 and FurMark]

The tradeoff for this overclock is of course power and noise, both of which see significant increases. In fact the jump in power consumption under Crysis 3 is a bit unexpected; further investigation shows that the GTX Titan X shifts from being temperature limited to TDP limited as a result of our overclocking efforts. FurMark, meanwhile, is in line with the 25W increase in TDP. The resulting 55dB noise levels, though not extreme, also mean that GTX Titan X is drifting farther away from being a quiet card. Ultimately it's a pretty straightforward tradeoff for a further 16%+ increase in performance, but a tradeoff nonetheless.
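For readers who want to verify the same behavior on their own card, here is a minimal sketch using the pynvml bindings to NVIDIA's NVML library; it reports load power draw and whether the driver is currently throttling for power or temperature. It assumes a reasonably recent driver and pynvml build that expose the throttle-reason constants used below.

```python
# Minimal sketch: query whether a GeForce card is power (TDP) limited or
# temperature limited under load, via NVML's clock throttle reasons.
# Run while a game or FurMark is active; assumes pynvml is installed.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerUsage, nvmlDeviceGetTemperature,
    nvmlDeviceGetCurrentClocksThrottleReasons, NVML_TEMPERATURE_GPU,
    nvmlClocksThrottleReasonSwPowerCap,
    nvmlClocksThrottleReasonSwThermalSlowdown,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    watts = nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports milliwatts
    temp_c = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)
    reasons = nvmlDeviceGetCurrentClocksThrottleReasons(gpu)

    print(f"power draw: {watts:.0f}W, GPU temperature: {temp_c}C")
    if reasons & nvmlClocksThrottleReasonSwPowerCap:
        print("currently throttling: power (TDP) limited")
    if reasons & nvmlClocksThrottleReasonSwThermalSlowdown:
        print("currently throttling: temperature limited")
finally:
    nvmlShutdown()
```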

Comments

  • Antronman - Thursday, March 19, 2015

    The Titan has always been marketed as a hybrid between a gaming and graphics development card.
  • H3ld3r - Thursday, March 19, 2015

    Agree 100%
  • H3ld3r - Thursday, March 19, 2015

    http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_Titan...
  • Evarin - Thursday, March 19, 2015

    To the people thinking that much VRAM is unneeded: you must not be heavy into modding. Especially with Fallout 4 and GTA 5 on the horizon, massive amounts of room for texture mods will come in handy.
  • Black Obsidian - Thursday, March 19, 2015

    6-8GB would seem to meet that requirement nicely.

    As is often the case with "doubled RAM" models, by the time that 12GB of VRAM is useful, we'll be a couple of generations down the road, and cards with 12GB of VRAM will be much faster, much cheaper, or both.

    Maybe at that point a Titan X owner could pick up a cheap used card and run them in SLI, but even then they're laying out more money than a user who buys a $500 card every couple of years and has the VRAM he/she needs when it's actually useful.
  • H3ld3r - Thursday, March 19, 2015

    I agree with you, but don't forget how VRAM is used in SLI and CF: the VRAM of GPU 1 mirrors the VRAM of GPU 0, so if you have 2x 4GB you're only taking advantage of 4GB. Anyway, I prefer fast RAM to huge amounts of it.
  • Evarin - Thursday, March 19, 2015

    We've already had a game which called for 6GB of VRAM for an advanced texture pack. Imagine an Elder Scrolls or a Fallout where every single object in the game has a 4K-resolution texture. I think it'd be a challenge even for the Titan.
  • Antronman - Sunday, March 22, 2015

    The way RAM works is that the worse your system is, the more RAM you end up needing.

    There are plateaus, but as GPUs get faster you need less VRAM to store the same amount of information.

    The Titan X is much faster than the Titan BE, and thus needs less VRAM, assuming that the application is the same.

    Then we get into DirectX 12 and Vulkan. They're supposed to increase efficiency all-around, reducing the demand for resources like RAM and cores even more.
  • Death666Angel - Thursday, March 19, 2015

    "the card is generally overpowered for the relatively low maximum resolutions of DL-DVI "
    So I can drive my 1440p 105Hz display with it and get above 105fps? No? So what kind of statement is that then. DL-DVI may be old, but to say that 1440p is a low maximum resolution, especially with 100Hz+ IPS displays which rely on DL-DVI input, is strange to say the least.
  • H3ld3r - Thursday, March 19, 2015

    Based on what I saw in Ryan's review, 4K games aren't that memory demanding. If so, how can anyone explain R9 performance?
