Overclocking

Finally, let’s spend a bit of time looking at the overclocking prospects for the GTX 780 Ti. Although GTX 780 Ti is now the fastest GK110 part, based on what we've seen with GTX 780 and GTX Titan there should still be some headroom to play with. Meanwhile there will also be the matter of memory overclocking, as 7GHz GDDR5 on a 384-bit bus is a higher memory baseline than we've worked with before.

GeForce GTX 780 Ti Overclocking
                   Stock      Overclocked
Core Clock         876MHz     1026MHz
Boost Clock        928MHz     1078MHz
Max Boost Clock    1020MHz    1169MHz
Memory Clock       7GHz       7.6GHz
Max Voltage        1.187v     1.187v

Overall our overclock for the GTX 780 Ti is a bit on the low side compared to the other GTX 780 cards we’ve seen in the past, but not immensely so. With a GPU overclock of 150MHz, we’re able to push the base clock and maximum boost clocks ahead by 17% and 14% respectively, which should further extend NVIDIA’s performance lead by a similar amount.
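For reference, the percentage gains quoted above fall straight out of the clocks in the table. A quick sketch (using only the stock and overclocked values listed there):

```python
# Percentage gains implied by the overclocking table above (all values in MHz).
stock = {"base": 876, "max_boost": 1020, "memory": 7000}
oc    = {"base": 1026, "max_boost": 1169, "memory": 7600}

for clock in stock:
    delta = oc[clock] - stock[clock]
    gain_pct = delta / stock[clock] * 100
    print(f"{clock}: +{delta}MHz ({gain_pct:.1f}%)")
```

The 150MHz base clock bump works out to a 17% gain over the 876MHz stock base clock, with the maximum boost clock gaining a slightly smaller share of its higher 1020MHz starting point.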

Meanwhile the inability to unlock a higher boost bin through overvolting is somewhat disappointing, as this is the first time we’ve seen it happen. To be clear, GTX 780 Ti does support overvolting – our card offers up to another 75mV of voltage – however on closer examination our card doesn’t have a higher bin within reach; 75mV isn’t enough to reach the next validated bin. Apparently this is something that can happen with the way NVIDIA bins their chips and implements overvolting, though this is the first time we’ve seen a card actually suffer from it. The end result is that it limits our ability to boost at the highest bins, as we’d normally have a bin or two unlocked to further increase the maximum boost clock.

As for memory overclocking, we were able to squeeze out a bit more out of our 7GHz GDDR5, pushing our memory clock 600MHz (9%) higher to 7.6GHz. Memory overclocking is always something of a roll of the dice, so it’s not clear here whether this is average or not for a GK110 setup with 7GHz GDDR5. Given the general drawbacks of a wider memory bus we wouldn’t be surprised if this was average, but at the same time in practice GK110 cards haven’t shown themselves to be as memory bandwidth limited as GK104 cards. So 9%, though a smaller gain than what we’ve seen on other cards, should still provide GTX 780 Ti with enough to keep the overclocked GPU well fed.
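In bandwidth terms, the memory overclock scales directly with the 384-bit bus. A minimal sketch of the standard peak-bandwidth arithmetic (bus width in bytes times the effective GDDR5 data rate):

```python
# Peak memory bandwidth on GTX 780 Ti's 384-bit bus, stock vs. overclocked.
BUS_WIDTH_BITS = 384

def peak_bandwidth_gbps(effective_rate_ghz: float) -> float:
    # "7GHz" GDDR5 is the effective data rate; GB/s = (bus bits / 8) * rate
    return BUS_WIDTH_BITS / 8 * effective_rate_ghz

print(peak_bandwidth_gbps(7.0))  # stock: 336 GB/s
print(peak_bandwidth_gbps(7.6))  # overclocked: ~364.8 GB/s
```

That 9% clock gain translates into roughly 29GB/sec of additional peak bandwidth to feed the overclocked GPU.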

Starting as always with power, temperatures, and noise, we can see that overclocking GTX 780 Ti further increases its power consumption, and to roughly the same degree as what we’ve seen with GTX 780 and GTX Titan in the past. With a maximum TDP of just 106% (265W), the change isn’t so much that the card’s power limit has been significantly lifted – as indicated by FurMark – but rather that raising the temperature limit virtually eliminates temperature throttling, allowing the card to more frequently stay at its highest, most power hungry boost bins.
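The 106% figure maps back to watts straightforwardly, assuming the GTX 780 Ti's 250W stock TDP as the 100% baseline (an assumption consistent with the 265W figure above):

```python
# What the power-limit slider means in watts, assuming a 250W stock TDP.
STOCK_TDP_W = 250

def power_limit_watts(slider_pct: int) -> float:
    return STOCK_TDP_W * slider_pct / 100

print(power_limit_watts(106))  # 265.0 -> the maximum TDP quoted above
```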

Despite the 95C temperature target we use for overclocking, the GTX 780 Ti finds its new equilibrium point at 85C. The fan will ramp up long before it allows us to get into the 90s.

Given the power jump we saw with Crysis 3, the noise ramp-up is surprisingly modest. A 3dB rise in noise is going to be noticeable, but even in these overclocked conditions it avoids being an ear-splitting change. That said, overclocking takes us off of GK110’s standard noise efficiency curve just as it does for power, so on a relative basis the cost will almost always outpace the payoff.

Finally, looking at gaming performance, the overall performance gains from overclocking are generally consistent. Across our 6 games we see a 10-14% performance increase, all in excess of the memory overclock and closely tracking the GPU overclock. GTX 780 Ti is already the fastest single-GPU card, so this only further improves its performance lead. But it does so while cutting into whatever is above it, be it the games where the stock 290X has a lead, or multi-GPU setups such as the 7990.

Comments

  • Wreckage - Thursday, November 07, 2013 - link

    The 290X = Bulldozer. Hot, loud, power hungry and unable to compete with an older architecture.

    Kepler is still king even after being out for over a year.
  • trolledboat - Thursday, November 07, 2013 - link

    Hey look, it's a comment from a user permanently banned from this website for trolling, posted before anyone could even have read the first page.

    Back in reality, very nice card, but sorely overpriced for such a meagre gain over 780. It also is slower than the cheaper 290x in some cases.

    Nvidia needs more price cuts right now. 780 and 780ti are both badly overpriced in the face of 290 and 290x
  • neils58 - Thursday, November 07, 2013 - link

    I think Nvidia probably have the right strategy, G-Sync is around the corner and it's a game changer that justifies the premium for their brand - AMD's only answer to it at this time is going CrossFire to try and ensure >60FPS at all times for V-Sync. Nvidia are basically offering a single card solution that even with the brand premium and G-Sync monitors comes out less expensive than CrossFire. 780 Ti for 1440p gamers, 780 for 1080p gamers.
  • Kamus - Thursday, November 07, 2013 - link

    I agree that G-Sync is a gamechanger, but just what do you mean AMD's only answer is crossfire? Mantle is right up there with g-sync in terms of importance. And from the looks of it, a good deal of AAA developers will be supporting Mantle.

    As a user, it kind of sucks, because I'd love to take advantage of both.
    That said, we still don't know just how much performance we'll get by using mantle, and it's only limited to games that support it, as opposed to G-Sync, which will work with every game right out of the box.

    But on the flip side, you need a new monitor for G-Sync, and at least at first, we know it will only be implemented on 120Hz TN panels. And not everybody is willing to trade their beautiful looking IPS monitor for a TN monitor, especially since they will retail at $400+ for 23" 1080p.
  • Wreckage - Thursday, November 07, 2013 - link

    G-Sync will work with every game past and present. So far Mantle is only confirmed in one game. That's a huge difference.
  • Basstrip - Thursday, November 07, 2013 - link

    TLDR: When considering G-Sync as a competitive advantage, add the cost of a new monitor. When considering Mantle support, think multiplatform and think next-gen consoles having AMD GPUs. Another plus side for NVIDIA is ShadowPlay and SHIELD though (but again, added costs if you consider SHIELD).

    Gsync is not such a game changer as you have yet to see both a monitor with Gsync AND its pricing. The fact that I would have to upgrade my monitor and that that Gsync branding will add another few $$$ on the price tag is something you guys have to consider.

    So to consider Gsync as a competitive advantage when considering a card, add the cost of a monitor to that. Perfect for those that are going to upgrade soon but for those that won't, Gsync is moot.

    Mantle on its plus side will be used on consoles and PC (as both PS4 and Xbox One have AMD processors, developers of games will most probably be using it). You might not care about consoles but they are part of the gaming ecosystem and sadly, we PC users tend to get shafted by developers because of consoles. I remember Frankieonpc mentioning he used to play tons of COD back in the COD4 days and said that development tends to have shifted towards consoles so the tuning was a bit more off for PC (paraphrasing slightly).

    I'm in the market for both a new monitor and maybe a new card so I'm a bit on the fence...
  • Wreckage - Thursday, November 07, 2013 - link

    Mantle will not be used on consoles. AMD already confirmed this.
  • althaz - Thursday, November 07, 2013 - link

    Mantle is not used on consoles...because the consoles already have something very similar.
  • Kamus - Thursday, November 07, 2013 - link

    You are right, consoles use their own API for GCN, guess what mantle is used for?
    *spoiler alert* GCN
  • EJS1980 - Thursday, November 07, 2013 - link

    Mantle is irrefutably NOT coming to consoles, so do your due diligence before trying to make a point. :)
