The Final Word On Overclocking

Before we jump into our performance breakdown, I want to take a few minutes to follow up on our overclocking coverage from Tuesday. Since we couldn’t reveal performance numbers at the time – and quite honestly we hadn’t even finished evaluating Titan – we couldn’t give you the complete story. So some clarification is in order.

On Tuesday we discussed how Titan reintroduces overvolting for NVIDIA products, but now, with additional details from NVIDIA along with our own performance data, we have the complete picture, and overclockers will want to pay close attention. NVIDIA may be reintroducing overvolting, but it may not be quite what many of us were first thinking.

First and foremost, Titan still has a hard TDP limit, just like GTX 680 cards. Titan cannot and will not cross this limit, as it’s built into the firmware of the card and essentially enforced by NVIDIA through their agreements with their partners. This TDP limit is 106% of Titan’s base TDP of 250W, or 265W. No matter what you throw at Titan or how you cool it, it will not let itself pull more than 265W sustained.

Compared to the GTX 680 this is both good news and bad news. The good news is that with NVIDIA having done away with the pesky distinction between target power and TDP, the entire process is much simpler; the power target tells you exactly what the card will pull up to as a percentage of its TDP, with no separate power target to keep track of. Furthermore, with the ability to focus on just TDP, NVIDIA didn’t set their power limits on Titan nearly as conservatively as they did on GTX 680.

The bad news is that while GTX 680 shipped with a max power target of 132%, Titan is again only 106%. Once you do hit that TDP limit you only have 6% (15W) more to go, and that’s it. Titan essentially has more headroom out of the box, but it will have less headroom for making adjustments. So hardcore overclockers dreaming of slamming 400W through Titan will come away disappointed, though it goes without saying that Titan’s power delivery system was never designed for that in the first place. All indications are that NVIDIA built Titan’s power delivery system for around 265W, and that’s exactly what buyers will get.
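
To put numbers to that, here's a minimal sketch of how the sustained power ceiling falls out of the base power figure and the maximum power target. The Titan figures are from above; the GTX 680's 170W base power target is our assumption based on its launch specifications.

    # Sustained power ceiling = base power figure x max power target slider.
    # The GTX 680's 170W base power target is an assumption from its launch
    # specifications, not a figure from this article.
    BASE_POWER_W = {"GTX 680": 170, "GTX Titan": 250}
    MAX_POWER_TARGET = {"GTX 680": 1.32, "GTX Titan": 1.06}

    def max_sustained_watts(card):
        """Most power the card will ever let itself draw, slider maxed out."""
        return BASE_POWER_W[card] * MAX_POWER_TARGET[card]

    for card in ("GTX 680", "GTX Titan"):
        print(card, round(max_sustained_watts(card)), "W")
    # GTX 680 224 W
    # GTX Titan 265 W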

Second, let’s talk about overvolting. What we didn’t realize on Tuesday but do now is that overvolting as implemented on Titan is not overvolting in the traditional sense, and practically speaking I doubt many hardcore overclockers will even recognize it as overvolting. Overvolting here is not implemented as a direct voltage control system as it was on past-generation cards, or even on NVIDIA-nixed cards like the MSI Lightning or EVGA Classified.

Overvolting is instead a set of two additional turbo clock bins, above and beyond Titan’s default top bin. On our sample the top bin is 1.1625v, which corresponds to a 992MHz core clock. Overvolting Titan to 1.2v means unlocking two more bins: 1006MHz @ 1.175v, and 1019MHz @ 1.2v. Or put another way, overvolting on Titan unlocks only another 27MHz of performance.

These two bins are in the strictest sense overvolting – NVIDIA doesn’t believe voltages over 1.1625v on Titan will meet their longevity standards, so using them is still very much going to reduce the lifespan of a Titan card – but it’s probably not the kind of direct control overvolting hardcore overclockers were expecting. The end result is that with Titan there’s simply no option to slap on another 0.05v – 0.1v in order to squeak out another 100MHz or so. You can trade longevity for the potential to get another 27MHz, but that’s it.

Ultimately, this means that overvolting as implemented on Titan cannot be used to improve the clockspeeds attainable through the use of the offset clock functionality NVIDIA provides. In the case of our sample it peters out after +115MHz offset without overvolting, and it peters out after +115MHz offset with overvolting. The only difference is that we gain access to a further 27MHz when we have the thermal and power headroom available to hit the necessary bins.

GeForce GTX Titan Clockspeed Bins
Clockspeed   Voltage
1019MHz      1.2v
1006MHz      1.175v
992MHz       1.1625v
979MHz       1.15v
966MHz       1.137v
953MHz       1.125v
940MHz       1.112v
927MHz       1.1v
914MHz       1.087v
901MHz       1.075v
888MHz       1.062v
875MHz       1.05v
862MHz       1.037v
849MHz       1.025v
836MHz       1.012v
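
As a quick illustration of how the ladder behaves, here's a minimal sketch in Python; the clock/voltage pairs come from the table above, while the selection logic and offset handling are our own assumptions about GPU Boost 2.0's behavior, not NVIDIA's actual firmware logic.

    # Boost bin ladder from the table above, lowest to highest. The two
    # bins past 1.1625v are only reachable with overvolting enabled.
    BINS = [
        (836, 1.012), (849, 1.025), (862, 1.037), (875, 1.05),
        (888, 1.062), (901, 1.075), (914, 1.087), (927, 1.1),
        (940, 1.112), (953, 1.125), (966, 1.137), (979, 1.15),
        (992, 1.1625), (1006, 1.175), (1019, 1.2),
    ]
    DEFAULT_VMAX = 1.1625   # stock voltage ceiling
    OVERVOLT_VMAX = 1.2     # ceiling once overvolting is enabled

    def top_clock(offset_mhz=0, overvolt=False):
        """Highest clock reachable with enough power/thermal headroom.
        Assumption: the clock offset shifts every bin upward while the
        voltage ceiling stays fixed."""
        vmax = OVERVOLT_VMAX if overvolt else DEFAULT_VMAX
        return max(clk + offset_mhz for clk, volts in BINS if volts <= vmax)

    print(top_clock())                     # 992  (MHz, stock)
    print(top_clock(115, overvolt=False))  # 1107
    print(top_clock(115, overvolt=True))   # 1134 -- the extra 27MHz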

Finally, as with the GTX 680 and GTX 690, NVIDIA will be keeping tight control over what Asus, EVGA, and their other partners release. Those partners will have the option to release Titan cards with factory overclocks and Titan cards with different coolers (e.g. water blocks), but they won’t be able to expose direct voltage control or ship parts with higher voltages. Nor for that matter will they be able to create Titan cards with significantly different designs (e.g. more VRM phases); every Titan card will be a variant on the reference design.

This is essentially no different than how the GTX 690 was handled, but I think it’s something that’s important to note before anyone with dreams of big overclocks throws down $999 on a Titan card. To be clear, GPU Boost 2.0 is a significant improvement in the entire power/thermal management process compared to GPU Boost 1.0, and this kind of control means that no one needs to be concerned with blowing up their video card (accidentally or otherwise), but it’s a system that comes with gains and losses. So overclockers will want to pay close attention to what they’re getting into with GPU Boost 2.0 and Titan, and what they can and cannot do with the card.

Comments

  • CeriseCogburn - Saturday, February 23, 2013

    Here you are arac, some places can do things this place claims it cannot.

    See the massive spanking amd suffers.

    http://www.bit-tech.net/hardware/2013/02/21/nvidia...

    That's more than a 40% lead for the nvidia Titan over the amd flagship. LOL

    No problem. No cpu limited crap. I guess some places know how to test.

    TITAN:   110 min / 156 max
    7970GHz: 72 min / 94 max
  • TheJian - Sunday, February 24, 2013

    Jeez, I wish I had read your post before digging up my links. Yours is worse than mine, making my point on skyrim even more valid.

    In your link the GTX670 takes out the 7970ghz even at 2560x1200. I thought all these dumb NV cards were bandwidth limited ;) Clear separation on all cards in this "cpu limited" benchmark on ALL resolutions.

    Hold on, let me wrap my head around this... So with your site, and my 3 links to skyrim benchmarks in my posts (one of them right here at anandtech telling how to add gfx, their 7970ghz article), 3/4 of them show separations according to GPU class... Doesn't that mean they are NOT cpu bound? Am I missing something here? :) Are you wondering if Ryan benched skyrim with the hi-res pack after it came out, found it got smacked around by NV, and dropped it? I mean, he's claiming right above your post that he tested it and found skyrim cpu limited. Is he claiming he didn't think an official HI-RES PACK would add graphical load? This isn't a LOW-RES pack, right?

    http://www.anandtech.com/show/6025/radeon-hd-7970-...
    Isn't that Ryan's article:
    "We may have to look at running extra graphics effects (e.g. TrSSAA/AAA) to thin the herd in the future."... Yep, I think that's his point. PUT IN THE FREAKIN PACK. Skyrim didn't just become worthless as a benchmark; TONS are playing it, unlike Crysis Warhead and Dirt Showdown. Feel free to check the server link I gave: nobody is playing Warhead today either. I don't think anyone ever played Showdown to begin with (unlike Warhead, which actually was fun circa 2008).

    http://www.vgchartz.com/game/23202/crysis-warhead/
    Global sales .01mil...That's a decimal point right?
    http://www.vgchartz.com/game/70754/dirt-showdown/
    It hasn't reached enough sales to post the decimal point. Heck xbox360 only sold 140K units globally. Meanwhile:
    http://www.vgchartz.com/game/49111/the-elder-scrol...
    2.75million sold (that's not a decimal any more)! Which one should be in the new game suite? Mods and ratings are keeping this game relevant for a long time to come. That's the PC sales ONLY (which is all we're counting here anyway).
    http://elderscrolls.wikia.com/wiki/Official_Add-on...
    The high-res patch is an OFFICIAL addon. Can't see why it's wrong to benchmark what EVERYONE who bought the game would download to check out, released Feb 2012. Heck, benchmark Dawnguard or something; it came out Aug 2012. I'm pretty sure it's still selling and being played. PCPer, TechPowerUp, anandtech's review of the 7970ghz, and now this bit-tech.net site. Skyrim's not worth benching, but all 4 links show what to do (up the gfx!), the results come through fine, and 3 of the sites show NV winning (your site of course being the one of the four that ignores the game - hmm, sort of proves my bias comment, doesn't it?). No cpu limit at the 3 other sites that installed the OFFICIAL pack, I guess, but you can't be bothered to test a HI-RES pack that surely stresses a gpu harder than without it? What are we supposed to believe here?

    Looks like you may have a point Cerise.
    Thanks for the link BTW:
    http://www.bit-tech.net/hardware/2013/02/21/nvidia...
    You can consider Witcher 2 a 15th benchmarkable game you left out, Ryan. Just wish they'd turned on ubersampling, as mins are ~55 for Titan here even at 2560x1600. Clearly with it on this would be a NON cpu limited game too (it isn't cpu limited even with it off). Please refrain from benchmarking games with less than 100K units in sales. By definition that means nobody is playing them OR buying them, right? And further we can extrapolate that nobody cares about their performance. Can anyone explain why skyrim with the hi-res pack (and an addon that came after) is excluded, but TWO games with basically ZERO sales are in here as important games that will be hanging with us for a few years?
  • CeriseCogburn - Tuesday, February 26, 2013

    Yes, appreciate it thanks, and your links I'll be checking out now.

    They already floated the poster-vote article for the new game bench lineup, and what was already settled upon was heavily Never Settle flavored, so don't expect anything but the same or worse here.
    That's how it goes: there's a lot of pressure, PC populism, and that great 2-week yearly vacation, and attempting to prop up the dying amd ship that "enables" this whole branch of competition for review sites is certainly not ignored. A hand up, a hand out, give 'em a hand!
    lol

    Did you see where Wiz over at TPU mentioned in the Titan review that nVidia SLI works in 18 of 19 of their in-house game tests while amd CF currently fails in 6 of 19?

    " NVIDIA has done a very good job here in the past, and out of the 19 games in our test suite, SLI only fails in F1 2012. Compare that to 6 out of 19 failed titles with AMD CrossFire. "
    http://www.techpowerup.com/reviews/NVIDIA/GeForce_...

    So the amd fanboys have a real problem recommending a 79xx, or 7xxx/6xxx cards doubled or tripled up, as an alternative with equal or better cost and "some performance wins" when THIRTY-THREE PERCENT OF THE TIME AMD CF FAILS.

    I'm sorry, I was supposed to lie about that and claim all of amd's driver issues are behind it and it's all equal and amd used to have problems and blah blah blah the green troll company has driver issues too and blah blah blah...
  • CeriseCogburn - Tuesday, February 26, 2013

    Oh man, investigative reporting....lol

    " http://www.vgchartz.com/game/23202/crysis-warhead/
    Global sales .01mil...That's a decimal point right?
    http://www.vgchartz.com/game/70754/dirt-showdown/
    It hasn't reached enough sales to post the decimal point. Heck xbox360 only sold 140K units globally. Meanwhile:
    http://www.vgchartz.com/game/49111/the-elder-scrol...
    2.75million sold (that's not a decimal any more)! Which one should be in the new game suite? "

    Well it's just a mad, mad, amd world ain't it.

    You have a MASSIVE point there.

    Excellent link, that's a bookmark.
  • Zingam - Thursday, February 21, 2013

    GeForce Titan "That means 1/3 FP32 performance, or roughly 1.3TFLOPS"
    Playstation 4 "High-end PC GPU (also built by AMD), delivering 1.84TFLOPS of performance"

    Can somebody explain to me how the above could be? The $999 GeForce Titan graphics card has much lower performance than what would be in, basically (if I understand properly), an APU by AMD in a $500 full system??? I doubt that Sony will accept a $1000 or more loss, but what I find even more doubtful is that an APU could have that much performance.

    Please, somebody clarify!
  • chizow - Thursday, February 21, 2013

    1/3 FP32 is double-precision FP64 throughput for Titanic. The PS4 must be quoting single-precision FP32 throughput and 1.84TFlops is nothing impressive in that regard. I believe GT200/RV670 were producing numbers in that range for single-precision FLOPs.
  • Blazorthon - Thursday, February 21, 2013

    You are correct about PS4 quoting single precision and such, but I'm sure that you're wrong about GT200 being anywhere near 1.8TFLOPS in single precision. That number is right around the Radeon 7850.
  • chizow - Saturday, February 23, 2013

    GT200 was around 1TFlop, I was confused because the same gen cards (RV670) were in the 1.2-1.3TFLOP range due to AMD's somewhat overstated VLIW5 theoretical peak numbers. Cypress for example was ~2.5TFlops so I wasn't too far off the mark in quoted TFLOPs.

    But yes if PS4 is GCN the performance would be closer to a 7850 in an apples to apples comparison.
  • frogger4 - Thursday, February 21, 2013

    Yep, the quoted number for the PS4 is the single precision performance. It's just over the single precision FP for the HD7850 at 1.76TFLOPS, and it has one more compute unit, so that makes sense. The double precision for Pitcairn GPUs is 1/16th of that.

    The single precision performance for the Titan is (more than) three times the 1.3Tflop double precision number. Hope that clears it up! (Quick recap of the thread's numbers below.)
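
    Putting those figures side by side, a minimal back-of-the-envelope sketch using only the numbers quoted in this thread (the 1/3 and 1/16 DP:SP ratios for GK110 and Pitcairn respectively):

        # All figures in TFLOPS, taken from the comments above.
        titan_dp = 1.3              # quoted double-precision figure
        titan_sp = titan_dp * 3     # GK110 runs DP at 1/3 of SP -> ~3.9
        ps4_sp = 1.84               # Sony's quoted single-precision figure
        hd7850_sp = 1.76
        hd7850_dp = hd7850_sp / 16  # Pitcairn DP is 1/16 of SP -> ~0.11

        # Titan's single-precision rate is roughly twice the PS4's quoted
        # figure, while its double-precision rate is in a class of its own.
        print(round(titan_sp, 2), ps4_sp, round(hd7850_dp, 2))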
  • StealthGhost - Thursday, February 21, 2013

    Why are the settings/resolution used for at least Battlefield 3 not consistent with those used in previous GPU tests, most directly those in Bench? It makes it harder to compare.

    Bench is such a great tool; it should be constantly updated and completely relevant, not discarded like it seems to be with these tests.
