Overclocking

Finally, no review of a GTX Titan card would be complete without a look at overclocking performance.

From a design standpoint, GTX Titan X already ships close to its power limits. NVIDIA's 250W TDP can only be raised another 10%, to 275W, meaning that in TDP-limited scenarios there's not much headroom to play with. On the other hand, with the stock voltage being so low, in clockspeed-limited scenarios there's a lot of room for pushing the performance envelope through overvolting. And neither of these options addresses the most potent aspect of overclocking: pushing the entire clockspeed curve higher at the same voltages by increasing the clockspeed offsets.
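To put the power side of that in concrete terms, here is a minimal arithmetic sketch assuming only the 250W TDP and the 10% power target ceiling described above; the names and structure are ours, not anything from NVIDIA's tools.

```python
# A minimal, illustrative sketch (our own arithmetic, not an NVIDIA tool):
# how much headroom the maximum 110% power target actually buys.

STOCK_TDP_W = 250        # GTX Titan X stock TDP
MAX_POWER_TARGET = 1.10  # power target can be raised by at most 10%

max_power_w = STOCK_TDP_W * MAX_POWER_TARGET
headroom_w = max_power_w - STOCK_TDP_W
print(f"Maximum power limit: {max_power_w:.0f}W ({headroom_w:.0f}W of headroom)")
# Maximum power limit: 275W (25W of headroom)
```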

GTX 980 ended up being a very capable overclocker, and as we’ll see it’s much the same story for the GTX Titan X.

GeForce GTX Titan X Overclocking
                     Stock      Overclocked
Core Clock           1002MHz    1202MHz
Boost Clock          1076MHz    1276MHz
Max Boost Clock      1215MHz    1452MHz
Memory Clock         7GHz       7.8GHz
Max Voltage          1.162v     1.218v

Even when packing 8B transistors into a 601mm2 die, the GM200 GPU backing the GTX Titan X continues to offer the same kind of excellent overclocking headroom that we’ve come to see from the other Maxwell GPUs. Overall we have been able to increase our GPU clockspeed by 200MHz (20%) and the memory clockspeed by 800MHz (11%). At its peak this leads to the GTX Titan X pushing a maximum boost clock of 1.45GHz, and while TDP restrictions mean it can’t sustain this under most workloads, it’s still an impressive outcome for overclocking such a large GPU.
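As a quick sanity check on those percentages, the sketch below simply recomputes the gains from the stock and overclocked values in the table above; it is our own arithmetic, and the dictionary keys are just labels we chose.

```python
# Recompute the overclocking gains from the (stock, overclocked) pairs in the
# table above. Purely illustrative arithmetic; labels are our own.

clocks = {
    "Core Clock (MHz)":      (1002, 1202),
    "Boost Clock (MHz)":     (1076, 1276),
    "Max Boost Clock (MHz)": (1215, 1452),
    "Memory Clock (MHz)":    (7000, 7800),
}

for name, (stock, oc) in clocks.items():
    gain_pct = (oc - stock) / stock * 100
    print(f"{name:<22} +{oc - stock}MHz ({gain_pct:.0f}%)")
# Core Clock: +200MHz (20%), Memory Clock: +800MHz (11%), Max Boost: +237MHz (20%)
```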

OC: Battlefield 4 - 3840x2160 - Ultra Quality - 0x MSAA

OC: Crysis 3 - 3840x2160 - High Quality + FXAA

OC: Shadow of Mordor - 3840x2160 - Ultra Quality

OC: The Talos Principle - 3840x2160 - Ultra Quality

OC: Total War: Attila - 3840x2160 - Max Quality + Perf Shadows

The performance gains from this overclock are a very consistent 16-19% across all 5 of our sample games at 4K, indicating that we're almost entirely GPU-bound as opposed to memory bandwidth-bound. Though not quite enough to push the GTX Titan X above 60fps in Shadow of Mordor or Crysis 3, the overclock brings the card even closer to that mark than it was at stock. Meanwhile we do crack 60fps in Battlefield 4 and The Talos Principle.
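The GPU-bound reasoning can be expressed as a simple scaling comparison. The sketch below uses the 20% core and 11% memory overclocks together with the quoted 16-19% range of gains; these are the approximate figures from this section, not the exact per-game results.

```python
# Rough scaling comparison: if the observed gains track the core clock increase
# far more closely than the memory clock increase, the games are primarily
# GPU-bound rather than memory-bandwidth-bound. Approximate figures only.

core_gain = 0.20   # +20% GPU clocks
mem_gain  = 0.11   # +11% memory clock

for observed in (0.16, 0.19):
    vs_core = observed / core_gain
    vs_mem  = observed / mem_gain
    print(f"{observed:.0%} gain -> {vs_core:.0%} of the core increase, "
          f"{vs_mem:.0%} of the memory increase")
# 16% gain -> 80% of the core increase, 145% of the memory increase
# 19% gain -> 95% of the core increase, 173% of the memory increase
```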

OC: Load Power Consumption - Crysis 3

OC: Load Power Consumption - FurMark

OC: Load GPU Temperature - Crysis 3

OC: Load GPU Temperature - FurMark

OC: Load Noise Levels - Crysis 3

OC: Load Noise Levels - FurMark

The tradeoff for this overclock is of course power and noise, both of which see significant increases. In fact the jump in power consumption under Crysis 3 is a bit larger than expected; further investigation shows that the GTX Titan X shifts from being temperature limited to TDP limited as a result of our overclocking efforts. FurMark, on the other hand, is in line with the 25W increase in TDP. The resulting 55dB noise levels, though not extreme, also mean that the GTX Titan X is drifting farther away from being a quiet card. Ultimately it’s a pretty straightforward tradeoff for a further 16%+ increase in performance, but a tradeoff nonetheless.
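For the FurMark number in particular, the expectation is easy to work out. A minimal sketch, assuming the card runs pinned at its power target under FurMark both before and after overclocking:

```python
# If FurMark pins the card at its power target both before and after the
# overclock, the increase in board power should roughly equal the TDP bump.
# (Wall measurements will show a slightly larger delta due to PSU losses.)

stock_limit_w = 250
oc_limit_w = 250 * 1.10   # power target raised to its 110% maximum

print(f"Expected board power increase under FurMark: ~{oc_limit_w - stock_limit_w:.0f}W")
# Expected board power increase under FurMark: ~25W
```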

276 Comments

  • chizow - Wednesday, March 18, 2015 - link

    And custom-cooled, higher clocked cards should? It took months for AMD to bring those to market and many of them cost more than the original reference cards and are also overclocked.

    http://www.newegg.com/Product/ProductList.aspx?Sub...

    Like I said, AMD fanboys made this bed, time to lie in it.
  • Witchunter - Wednesday, March 18, 2015 - link

    I hope you do realize that calling out AMD fanboys in each and every one of your comments essentially paints you as an Nvidia fanboy in the eyes of other readers. I'm here to read some constructive comments, and all I see is you bitching about fanboys and being one yourself.
  • chizow - Wednesday, March 18, 2015 - link

    @Witchunter, the difference is, I'm not afraid to admit I'm a fan of the best, and I'm going to at least be consistent in my views and opinions. Whereas these AMD fanboys are crying foul over the same thing they threw a tantrum about a few years ago, which ultimately led to this policy to begin with. You don't find that ironic, that what they were crying about 4 years ago is suddenly a problem when the shoe is on the other foot? Maybe that tells you something about yourself and where your own biases reside? :)
  • Crunchy005 - Wednesday, March 18, 2015 - link

    @chizow either way, you don't really offer constructive criticism, and you call people dishonest without proving them wrong in any way or offering facts. You are one of the biggest fanboys out there, and it kind of makes you lose credibility.
  • Crunchy005 - Wednesday, March 18, 2015 - link

    Ok, wanted to add to this: I do like some of the comments you make, but you are so fanboyish that I am unable to put much stock in what you say. If you could offer more facts and stop just bashing AMD and praising Nvidia as better in every way (AMD has its advantages and has outperformed Nvidia in many ways, just as Nvidia has outperformed AMD; they leapfrog each other), then we might all like to hear what you have to say.
  • FlushedBubblyJock - Thursday, April 2, 2015 - link

    I know what the truth is so I greatly enjoy what he says.
    If you can't handle the truth, that should be your problem, not everyone else's, obviously.
  • chizow - Monday, March 23, 2015 - link

    Like I said, I'm not here to sugarcoat things or keep it constructive, I'm here to set the record straight and keep the discussion honest. If that involves bruising some fragile AMD fanboy egos and sensibilities, so be it.

    I'm completely comfortable in my own skin knowing I'm a fan of the best, and that just happens to be Nvidia for graphics cards for the last near-decade since G80, and I'm certainly not afraid to tell you why that's the case backed with my usual facts, references etc. etc. You're free to verify my sources and references if you like to come to your own conclusion, but at the end of the day, that's the whole point of the internet, isn't it? Lay out the facts, let informed people make their own conclusions?

    In any case, read the entire discussion and you can be the judge of whether my take on the topic is fair. As you can clearly see, AMD fanboys caused this dilemma for themselves, and many of them are the ones you see crying in this thread. Cue that Alanis Morissette song....

    http://anandtech.com/comments/3987/amds-radeon-687...
    http://anandtech.com/show/3988/the-use-of-evgas-ge...
  • Phartindust - Wednesday, March 18, 2015 - link

    Um, AMD doesn't manufacture aftermarket cards.
  • dragonsqrrl - Tuesday, March 17, 2015 - link

    "use less power"

    ...right, and why would these non-reference cards consume less power? Just hypothetically speaking, ignoring for a moment all the benchmarks out there that suggest otherwise.
  • squngy - Tuesday, March 17, 2015 - link

    Undervolting?
