Overclocking

Finally, let’s spend a bit of time looking at the overclocking prospects for the GTX 780 Ti. Although GTX 780 Ti is now the fastest GK110 part, based on what we've seen with GTX 780 and GTX Titan there should still be some headroom to play with. Meanwhile there's also the matter of memory overclocking, as 7GHz GDDR5 on a 384-bit bus gives us a higher baseline than we've worked from before.

GeForce GTX 780 Ti Overclocking
                   Stock      Overclocked
Core Clock         876MHz     1026MHz
Boost Clock        928MHz     1078MHz
Max Boost Clock    1020MHz    1169MHz
Memory Clock       7GHz       7.6GHz
Max Voltage        1.187v     1.187v

Overall our overclock for the GTX 780 Ti is a bit on the low side compared to the other GTX 780 cards we’ve seen in the past, but not immensely so. With a GPU overclock of 150MHz, we’re able to push the base clock and maximum boost clocks ahead by 17% and 14% respectively, which should further extend NVIDIA’s performance lead by a similar amount.
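As a quick sanity check, the percentage gains quoted above fall straight out of the clock table; a short sketch using the review's numbers (the script itself is just illustrative):

```python
# Overclocking gains for the GTX 780 Ti, computed from the stock and
# overclocked frequencies in the table above (all values in MHz).
stock = {"core": 876, "boost": 928, "max_boost": 1020, "memory": 7000}
oc    = {"core": 1026, "boost": 1078, "max_boost": 1169, "memory": 7600}

for domain in stock:
    delta = oc[domain] - stock[domain]
    gain = delta / stock[domain] * 100
    print(f"{domain}: +{delta} MHz ({gain:.1f}%)")
```

This reproduces the figures in the text: a 150MHz (17.1%) core overclock, a 14.6% gain on the maximum boost clock, and an 8.6% (~9%) memory overclock.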

Meanwhile the inability to unlock a higher boost bin through overvolting is somewhat disappointing. To be clear, GTX 780 Ti does support overvolting – our card offers up to another 75mV of voltage – however on closer examination that 75mV isn't enough to reach the next validated bin. Apparently this is something that can happen given the way NVIDIA bins its chips and implements overvolting, though this is the first time we've seen a card actually suffer from it. The end result is that it limits our ability to boost at the highest bins, as we'd normally have a bin or two unlocked to further increase the maximum boost clock.

As for memory overclocking, we were able to squeeze a bit more out of our 7GHz GDDR5, pushing our memory clock 600MHz (9%) higher to 7.6GHz. Memory overclocking is always something of a roll of the dice, so it's not clear whether this is average for a GK110 setup with 7GHz GDDR5. Given the general drawbacks of a wider memory bus we wouldn't be surprised if it was, but at the same time GK110 cards haven't shown themselves to be as memory bandwidth limited in practice as GK104 cards. So 9%, though a smaller gain than what we've seen on other cards, should still provide GTX 780 Ti with enough to keep the overclocked GPU well fed.
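In bandwidth terms, peak GDDR5 throughput is simply the effective data rate times the bus width in bytes. The review doesn't quote these figures directly, but they follow from the standard calculation; the helper below is our own sketch, not anything from the article:

```python
def gddr5_bandwidth_gbps(data_rate_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s for a GDDR5 interface.

    data_rate_mhz is the effective (quad-pumped) data rate, e.g. 7000
    for "7GHz" GDDR5; bus_width_bits is the interface width, e.g. 384.
    """
    return data_rate_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(gddr5_bandwidth_gbps(7000, 384))  # stock: 336.0 GB/s
print(gddr5_bandwidth_gbps(7600, 384))  # overclocked: 364.8 GB/s
```

So the 9% memory overclock takes the card from 336GB/s to roughly 365GB/s of peak bandwidth.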

Starting as always with power, temperatures, and noise, we can see that overclocking GTX 780 Ti further increases its power consumption, and to roughly the same degree as what we’ve seen with GTX 780 and GTX Titan in the past. With a maximum TDP of just 106% (265W), the change isn’t so much that the card’s power limit has been significantly lifted – as indicated by FurMark – but rather that raising the temperature limit virtually eliminates temperature throttling, allowing the card to spend more time at its highest, most power-hungry boost bins.

Despite the 95C temperature target we use for overclocking, the GTX 780 Ti finds its new equilibrium point at 85C. The fan will ramp up long before it allows us to get into the 90s.

Given the power jump we saw with Crysis 3, the noise ramp-up is surprisingly modest. A 3dB rise in noise is going to be noticeable, but even in these overclocked conditions it avoids being an ear-splitting change. That said, overclocking takes us off of GK110's standard noise efficiency curve just as it does for power, so on a relative basis the cost will almost always outpace the payoff.

Finally, looking at gaming performance, the overall performance gains for overclocking are generally consistent. Across our 6 games we see a 10-14% performance increase, all in excess of the memory overclock and closely tracking the GPU overclock. GTX 780 Ti is already the fastest single-GPU card, so this only further improves its performance lead. But it does so while cutting into whatever is above it, be it the games where the stock 290X has a lead, or multi-GPU setups such as the 7990.

Comments

  • 1Angelreloaded - Thursday, November 7, 2013 - link

    Physically the 780 and 780 Ti are literally the same unit, minus minor things. The difference is the neutered chip and OC'd VRAM, which means you're paying for the same unit at 2 completely different prices. In fact, how much does it cost to disable an SMX? So shouldn't the overhead on the original unit be higher, with more work having to be done? Or, like AMD's tri-cores, are we paying for defective chips again?
  • TheJian - Thursday, November 7, 2013 - link

    Wrong. They have been saving DEFECT FREE GK110 units for months just to be able to launch this with good quantity (probably only started having better results at B1, which all of these are). I doubt that there are many 780's that have fully working units that are disabled. They are failed Tesla chips (you can say DP is disabled on purpose, but not the SMXs). Do you really think 550mm² chips have a ZERO defect rate?...LOL. I would be surprised if the first runs of Titan had any more working SMXs either, as they were directly failed Teslas. Sure, there are probably a few cards with working units that are disabled, but yields and history say that with chips this big there just has to be a pretty high defect rate vs. 100% working chips. It is pretty much the largest chip TSMC can make. That's not easy. Both AMD and NV do this to salvage failed chips (heck, everybody does). You come out with a flagship, then anything that fails you mark as a lower model (many models). It allows you to increase your yield and the number of chips that can be sold. You should be thankful they have the tech to do this, or we'd all be paying FAR higher prices due to chucking chips by the millions in the trash.
  • TheJian - Thursday, November 7, 2013 - link

    http://www.tomshardware.com/reviews/geforce-gtx-78...
    "Not to be caught off-guard, Nvidia was already binning its GK110B GPUs, which have been shipping since this summer on GeForce GTX 780 and Titan cards. The company won’t get specific about what it was looking for, but we have to imagine it set aside flawless processors with the lowest power leakage to create a spiritual successor for GeForce GTX 580. Today, those fully-functional GPUs drop into Nvidia’s GeForce GTX 780 Ti."

    There, don't have to believe me...confirmed I guess ;)
  • beck2448 - Thursday, November 7, 2013 - link

    Great job Nvidia! I think the partners with custom cooling will get another 15 to 20% performance out of it with lower temps and less noise, and that is insane for a single GPU. Can't wait to see the Lightning and Windforce editions.
  • aznjoka - Thursday, November 7, 2013 - link

    The CrossFire scaling on the 290X is much better than the 780 Ti's. If you are running a dual-card setup, getting a 290X is pretty much a no-brainer.
  • beck2448 - Thursday, November 7, 2013 - link

    from Benchmark reviews: In conclusion, GeForce GTX 780 Ti is the gamer’s version of GTX TITAN with a powerful lead ahead of Radeon R9 290X. Even if it were possible for the competition to overclock and reach similar frame rate performance, temperatures and noise would still heavily favor the GTX 780 Ti design. I was shocked at how loud AMD’s R9 290X would roar once it began to heat up midway through a benchmark test, creating a bit of sadness for gamers trying to play with open speakers instead of an insulated headset. There is a modest price difference between them, but quite frankly, the competition doesn’t belong in the same class.
    Read more at http://benchmarkreviews.com/8468/nvidia-geforce-gt...
  • 1Angelreloaded - Thursday, November 7, 2013 - link

    TBH the stock Nvidia cooler isn't that much better either; they tend to run a lot hotter than ACX/Twin Frozr and other such solutions, so both have cooling headroom. Hawaii, though, is just plain ridiculous.
  • deedubs - Thursday, November 7, 2013 - link

    Noticed the graph for ShadowPlay performance has its labels reversed. It makes it look like SP increases performance instead of decreasing it.
  • Ryan Smith - Thursday, November 7, 2013 - link

    Whoops. Thanks for that. The multi-series graph tool is a bit picky...
  • Filiprino - Thursday, November 7, 2013 - link

    I don't see NVIDIA as a real winner here, really. Their margin is very tight, and AMD drivers still have to mature, and when you talk about crossfire, AMD is doing clearly better, for $200 less and 6dB more.
