The Final Word On Overclocking

Before we jump into our performance breakdown, I wanted to take a few minutes to write a feature follow-up to our overclocking coverage from Tuesday. Since we couldn't reveal performance numbers at the time – and quite honestly we hadn't even finished evaluating Titan – we couldn't give you the complete story. So some clarification is in order.

On Tuesday we discussed how Titan reintroduces overvolting for NVIDIA products. Now, with additional details from NVIDIA along with our own performance data, we have the complete picture, and overclockers will want to pay close attention. NVIDIA may be reintroducing overvolting, but it may not be quite what many of us were first expecting.

First and foremost, Titan still has a hard TDP limit, just like GTX 680 cards. Titan cannot and will not cross this limit, as it’s built into the firmware of the card and essentially enforced by NVIDIA through their agreements with their partners. This TDP limit is 106% of Titan’s base TDP of 250W, or 265W. No matter what you throw at Titan or how you cool it, it will not let itself pull more than 265W sustained.

Compared to the GTX 680 this is both good news and bad news. The good news is that with NVIDIA having done away with the pesky distinction between target power and TDP, the entire process is much simpler; the power target tells you exactly what the card will pull up to on a percentage basis, with no separate power target to keep track of or weigh against the TDP. Furthermore, with the ability to focus on just the TDP, NVIDIA didn't set their power limits on Titan nearly as conservatively as they did on the GTX 680.

The bad news is that while the GTX 680 shipped with a max power target of 132%, Titan's is again only 106%. Once you hit that TDP limit you have only 6% (15W) more to go, and that's it. Titan essentially has more headroom out of the box, but less headroom for making adjustments. So hardcore overclockers dreaming of slamming 400W through Titan will come away disappointed, though it goes without saying that Titan's power delivery system was never designed for that in the first place. All indications are that NVIDIA built Titan's power delivery system for around 265W, and that's exactly what buyers will get.
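
For those who want the arithmetic in one place, here's a minimal sketch of the power limit math in Python. Titan's 250W base TDP and 106% maximum power target come from our data above; the GTX 680's 170W power target and 195W TDP are the commonly cited launch figures, included only to illustrate the old target-power-versus-TDP split, and the function name is our own invention.

```python
# Minimal sketch of the power limit math discussed above. Function and
# variable names are ours, purely for illustration.

def max_sustained_power(base_w: float, max_target_pct: float) -> float:
    """Return the highest sustained board power the firmware will allow."""
    return base_w * (max_target_pct / 100.0)

# Titan: the 106% maximum power target applies directly to the 250W TDP.
titan_limit = max_sustained_power(250, 106)            # 265.0W
print(f"Titan hard limit: {titan_limit:.0f}W "
      f"(+{titan_limit - 250:.0f}W of adjustment headroom)")

# GTX 680 for contrast: its 132% maximum applied to the card's 170W power
# target rather than its 195W TDP - the very split GPU Boost 2.0 removes.
gtx680_limit = max_sustained_power(170, 132)           # 224.4W
print(f"GTX 680 max power target: {gtx680_limit:.1f}W")
```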

Second, let's talk about overvolting. What we didn't realize on Tuesday but do now is that overvolting as implemented on Titan is not overvolting in the traditional sense, and practically speaking I doubt many hardcore overclockers will even recognize it as overvolting. What we mean by this is that overvolting was not implemented as a direct voltage control, as it was on past-generation cards, or even on the NVIDIA-nixed specialty cards like the MSI Lightning or EVGA Classified.

Overvolting is instead a set of two additional turbo clock bins above and beyond Titan's default top bin. On our sample the top bin is 1.1625v, which corresponds to a 992MHz core clock. Overvolting Titan to 1.2v means unlocking two more bins: 1006MHz @ 1.175v, and 1019MHz @ 1.2v. Or put another way, overvolting on Titan unlocks only another 27MHz.

These two bins are in the strictest sense overvolting – NVIDIA doesn't believe voltages over 1.1625v on Titan will meet their longevity standards, so using them is still very much going to reduce the lifespan of a Titan card – but it's probably not the kind of direct-control overvolting hardcore overclockers were expecting. The end result is that with Titan there's simply no option to slap on another 0.05v–0.1v in order to squeeze out another 100MHz or so. You can trade longevity for the potential to get another 27MHz, but that's it.

Ultimately, this means that overvolting as implemented on Titan cannot be used to improve the clockspeeds attainable through the offset clock functionality NVIDIA provides. In the case of our sample it peters out after a +115MHz offset without overvolting, and it peters out after a +115MHz offset with overvolting. The only difference is that we gain access to a further 27MHz when we have the thermal and power headroom available to hit the necessary bins. The full set of bins on our sample, and a sketch of how they interact with the offset, follow below.

GeForce GTX Titan Clockspeed Bins

Clockspeed   Voltage
1019MHz      1.2v
1006MHz      1.175v
992MHz       1.1625v
979MHz       1.15v
966MHz       1.137v
953MHz       1.125v
940MHz       1.112v
927MHz       1.1v
914MHz       1.087v
901MHz       1.075v
888MHz       1.062v
875MHz       1.05v
862MHz       1.037v
849MHz       1.025v
836MHz       1.012v
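
To make the relationship between bins, voltages, and the clock offset concrete, here's a minimal sketch in Python of the behavior we observed. The bin table mirrors our sample card, but the function and its names are our own illustration of how GPU Boost 2.0 behaves, not anything NVIDIA actually exposes.

```python
# A toy model of GPU Boost 2.0's bin behavior as observed on our sample.
# (clock in MHz, voltage in volts), highest bin first.
BINS = [
    (1019, 1.2),   (1006, 1.175), (992, 1.1625), (979, 1.15),
    (966, 1.137),  (953, 1.125),  (940, 1.112),  (927, 1.1),
    (914, 1.087),  (901, 1.075),  (888, 1.062),  (875, 1.05),
    (862, 1.037),  (849, 1.025),  (836, 1.012),
]

DEFAULT_MAX_V = 1.1625  # top bin within NVIDIA's longevity standards
OVERVOLT_MAX_V = 1.2    # unlocking this exposes two extra bins

def available_bins(overvolt: bool, offset_mhz: int = 0):
    """Bins the card may boost into, with any clock offset applied.

    The offset shifts every bin's clock equally. Overvolting merely
    exposes the two bins above 1.1625v; it does not raise the offset
    ceiling (our sample tops out at +115MHz either way).
    """
    vmax = OVERVOLT_MAX_V if overvolt else DEFAULT_MAX_V
    return [(clock + offset_mhz, v) for clock, v in BINS if v <= vmax]

stock_top = available_bins(overvolt=False)[0]    # (992, 1.1625)
overvolt_top = available_bins(overvolt=True)[0]  # (1019, 1.2)
print(f"Overvolting buys {overvolt_top[0] - stock_top[0]}MHz at the top")
```

The takeaway mirrors our testing: the same offset ceiling applies with or without overvolting, and the two extra bins only matter when there's thermal and power headroom to actually reach them.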

Finally, as with the GTX 680 and GTX 690, NVIDIA will be keeping tight control over what Asus, EVGA, and their other partners release. Those partners will have the option to release Titan cards with factory overclocks and Titan cards with different coolers (e.g. water blocks), but they won't be able to expose direct voltage control or ship parts with higher voltages. Nor, for that matter, will they be able to create Titan cards with significantly different designs (e.g. more VRM phases); every Titan card will be a variant on the reference design.

This is essentially no different than how the GTX 690 was handled, but it's something that's important to note before anyone with dreams of big overclocks throws down $999 on a Titan card. To be clear, GPU Boost 2.0 is a significant improvement over GPU Boost 1.0 in the entire power/thermal management process, and this kind of control means that no one needs to be concerned with blowing up their video card (accidentally or otherwise), but it's a system that comes with gains and losses. So overclockers will want to pay close attention to what they're getting into with GPU Boost 2.0 and Titan, and what they can and cannot do with the card.

Comments

  • varg14 - Thursday, February 21, 2013

    I will hang on to my SLI 560 Tis for a while longer. Since I game at 1080p they perform very well.
  • mayankleoboy1 - Thursday, February 21, 2013

    Some video conversion benchmarks please.
  • mayankleoboy1 - Thursday, February 21, 2013

    Oh, and the effect of PCIe 2.0 vs. PCIe 3.0 also. Let's see how much the Titan is gimped by PCIe 2.0.
  • Ryan Smith - Thursday, February 21, 2013

    This isn't something we can do at this second, but it's definitely something we can follow up on once things slow down a bit.
  • mayankleoboy1 - Thursday, February 21, 2013

    Sure. I am looking forward to part three of the Titan review.
  • Hrel - Thursday, February 21, 2013

    The problem with the reasoning they're raising here is that the 7970 is almost as fast and costs a lot less. The Titan is competing, based on performance, with the 7970. Based on that comparison it's a shitty deal.

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    $430. So based on that I'd say the highest price you can justify for this card is $560. We'll round up to $600.

    Nvidia shipping this, at this price, and just saying "it's a luxury product" is bullshit. It's not a luxury product, it's their version of a 7970GHE. But they want to try to get a ridiculous profit to support their PhysX and CUDA projects.

    Nvidia just lost me as a customer. This is the last straw. This card should be pushing the pricing down on the rest of their lineup. They SHOULD be introducing it to compete with the 7970GHE. Even at my price range, compare the GTX 660 to the 7870GHE, or better yet the sub-$200 7850. They just aren't competitive anymore. I'll admit, I was a bit of an Nvidia fanboy. Loved their products. Was disappointed by older ATI cards and issues I had with them (stability, screen fitting properly, audio issues). But ATI has become AMD and they've improved quality a lot, and Nvidia is USING their customers' loyalty; that's just wrong.

    I'm done with Nvidia on the desktop. By the time I need a new laptop AMD will probably have the graphics switching all sorted; so I'm probably done with Nvidia on laptops too.
  • CeriseCogburn - Saturday, February 23, 2013

    LOL - be done, and buy the alternative crap - amd.

    You'll be sorry, and when you have to hold back admitting it, I'll be laughing the whole time.

    Poor baby can't pony up the grand, so he's boycotting the whole line.
    You know you people are the sickest freaks the world has ever seen, and frankly I don't believe you, and consider you insane.

    You're all little crybaby socialist activists. ROFL You're all pathetic.

    nVidia won't listen to you, so go blow on your amd crash monkey, you and two other people will do it before amd disappears into bankruptcy, and then we can laugh at your driver less video cards.

    I've never seen bigger crybaby two-year-olds in my entire life. You all live in your crybaby world together, in solidarity - ROFL

    No one cares if you lying turds claim you aren't buying nVidia - they have billions and are DESTROYING amd because you cheapskate losers cost amd money - LOL

    YOU ARE A BURDEN AND CANNOT PAY FOR THE PRODUCTION OF A VIDEO CARD !

    Enjoy your false amd ghetto loser lifestyle.
  • Soulnibbler - Thursday, February 21, 2013

    Hey, I'm excited about the FP64 performance, but I'm not going to have any time to write code for a bit, so I'll ask the question that would let me justify buying a card like this:

    How much acceleration should I expect using this card with Capture One as compared to AMD/software rendering? I've heard anecdotal evidence that the OpenCL code paths in version 7 make everything much faster, but I'd like a metric before I give up my current setup (Windows in VMware) and dual-boot to get OpenCL support.

    I know OpenCL is not yet ready on this card, but when you revisit it could we see a little Capture One action?

    Preferably the benchmark sets would be high-resolution images at both high and low ISO.
  • Ryan Smith - Monday, February 25, 2013

    I'm afraid I don't know anything about Capture One. Though if you could describe it, that would be helpful.
  • Soulnibbler - Monday, February 25, 2013

    Capture One is a raw developer for digital cameras:
    http://www.phaseone.com/en/Imaging-Software.aspx
    It's notable for medium format digital backs, but it also supports 35mm and APS sensors. It could be considered a competitor to Adobe's Lightroom and ACR software, but the medium format camera support and workflow are the major differentiators.

    The last two releases have had OpenCL support for both previews and exporting, which I've heard has led to reductions in the time it takes to get an image through post.

    I'd imagine that one could benchmark on a large library of photos and determine if this card as a compute card is any improvement over standard gaming cards in this use scenario.

    I'd imagine this is part of the market that NVIDIA is aiming at, as I know at least one user who switched to an AMD FirePro W7000 for OpenCL support with Capture One.
