The NVIDIA GeForce GTX 780 Ti Review
by Ryan Smith on November 7, 2013 9:01 AM EST
Power, Temperature, & Noise
As always, last but not least is our look at power, temperature, and noise. Next to price and performance, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to justify the noise.
GeForce GTX 780 Series Voltages

| GTX 780 Ti Boost Voltage | GTX 780 Boost Voltage | GTX 780 Ti Base Voltage |
|---|---|---|
| | | |
Taking a quick look at voltages, we find that our GTX 780 Ti operates at a slightly higher voltage at its maximum boost bin than the original GTX 780 did. The difference is minor, but the additional voltage may be necessary to hit the slightly higher clockspeeds GTX 780 Ti operates at relative to GTX Titan and GTX 780.
GeForce GTX 780 Ti Average Clockspeeds

| Benchmark | Average Clockspeed |
|---|---|
| Max Boost Clock | 1020MHz |
| TW: Rome 2 | |
Moving on to clockspeeds, we find that the GTX 780 Ti does very well when it comes to boosting. With a maximum boost clock of 1020MHz, we have 2 benchmarks averaging 1000MHz, and another 4 averaging 980MHz or better.
With all of our GK110 cards sharing a common design, at idle there’s very little to differentiate them. Other than GTX Titan’s extra 3GB of VRAM, we’re essentially looking at identical cards when idling.
Moving on to load power, we can see the power/heat ramifications of the slight clockspeed increase coupled with the activation of the 15th SMX. Even with the further optimizations NVIDIA has put into the new revision of GK110, power consumption has gone up in accordance with the higher performance of the card, just as we’d expect. Since NVIDIA doesn’t notably alter their power efficiency here, that increased performance has to come at the cost of increased power consumption. Though in this benchmark it’s worth pointing out that we’re measuring from the wall and that GTX 780 Ti outperforms GTX Titan by 8%, so some of that 29W power difference will come from the higher CPU load caused by the increased framerates.
As for the GTX 780 Ti SLI, here we see power levels off at 556W, 20W more than the GTX 780 SLI. Some (if not most) of that is going to be explained by the increased CPU power consumption from the GTX 780 Ti SLI’s higher framerates. Coupled with that is the fact that in SLI setups these cards get hotter, and hence have to downclock a bit more to maintain equilibrium, which helps to offset the increased power requirements of GTX 780 Ti and keep the SLI results so close to the GTX 780 SLI results.
Switching over to FurMark, we find that power consumption is also up, but only slightly. With GPU Boost 2.0 clamping down on power consumption, all of our GK110 cards should be held to 250W here, and with a difference of under 10W between GTX 780 and GTX 780 Ti, that appears to be exactly what’s happening.
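GPU Boost 2.0’s power limiting can be thought of as a simple feedback mechanism: the card steps down through its boost bins until estimated board power falls under the cap. The sketch below is purely illustrative – the clock/power bins are hypothetical numbers, not NVIDIA’s actual tables or algorithm – but it shows why very different workloads all converge on the same ~250W ceiling.

```python
# Illustrative sketch of a GPU Boost 2.0-style power limiter. This is NOT
# NVIDIA's actual algorithm; the bins below are hypothetical values chosen
# only to demonstrate the clamping behavior.

POWER_LIMIT_W = 250  # GK110's boost power target

# Hypothetical (clock in MHz, estimated board power in W) boost bins,
# ordered from highest clock to lowest.
BOOST_BINS = [
    (1020, 265),
    (1006, 258),
    (993, 251),
    (980, 246),
    (967, 240),
]

def clamp_to_power_limit(bins, limit_w):
    """Return the highest clock whose estimated power fits within the limit."""
    for clock_mhz, power_w in bins:
        if power_w <= limit_w:
            return clock_mhz
    return bins[-1][0]  # floor at the lowest bin if nothing fits

print(clamp_to_power_limit(BOOST_BINS, POWER_LIMIT_W))  # -> 980
```

Under a power virus like FurMark every card ends up pinned at whichever bin fits under the cap, which is why the GTX 780 and GTX 780 Ti land within 10W of each other despite their different shader counts.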
On a side note, it’s interesting to note here that under FurMark we’re seeing the GTX 780 Ti draw more power than the Radeon R9 290X. Despite the fact that the 290X has a higher rated TDP, in the card’s default quiet mode the card can’t actually dissipate as much heat (and thereby consume as much power) as the GTX 780 Ti can.
For idle temperatures we’re once again looking at cards that are for all intents and purposes identical. At 30C the GTX 780 Ti easily stays nice and cool.
As we mentioned in our look at the GTX 780 Ti hardware, NVIDIA has increased their default temperature throttle point from 80C on the GTX Titan/780 to 83C on the GTX 780 Ti. The end result is that in all of our temperature limited tests the GTX 780 Ti will peak at 83C-84C, whereas the older GK110 cards will peak at 80C-81C.
FurMark reiterates what we saw with Crysis 3. The temps are up a bit across the board, while the GK110 cards are holding near their throttle points. The SLI setups meanwhile approach the upper-80s at 88C, reflecting the fact that even with blowers, there’s some impact on neighboring cards in high load situations.
For our last idle scenario, we once again see all of our GK110 cards performing similarly, with idle noise levels in the 38dB-39dB range.
Moving on to our gaming load noise results, we can see the full repercussions of the GTX 780 Ti’s higher average power consumption coupled with the card’s higher temperature throttle point. Moving the throttle point along the same fan curve raises the equilibrium point, and with it the card’s operating noise levels. As the fastest single-GPU card on this chart, the GTX 780 Ti is still doing very well for itself and for a blower based design at 51.7dB, though at 1.5dB louder than GTX Titan and 4.2dB louder than GTX 780 the noise tradeoff for the card’s higher performance is very clear. Meanwhile the fact that it’s tied with the GTX 780 SLI comes with its own bit of irony.
Speaking of the GTX 780 SLI, we can see the noise impact of SLI configurations too. The GTX 780 Ti SLI levels out at 53.7dB, 2dB louder than our single-card configuration and 2dB louder than the GTX 780 SLI. At this point it’s just a bit louder than the 290X and quieter than a number of other 290 series setups.
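That 2dB step from one card to two is actually slightly better than naive acoustics would predict. Incoherent noise sources add in power, not in decibels, so two identical sources combine to roughly +3dB. The back-of-the-envelope check below assumes incoherent sources and ignores room acoustics and microphone placement:

```python
import math

def combine_db(*levels_db):
    """Sum incoherent noise sources: convert each dB level to linear power,
    add, and convert the total back to dB."""
    total_power = sum(10 ** (level / 10) for level in levels_db)
    return 10 * math.log10(total_power)

# Two identical 51.7dB cards would naively combine to about 54.7dB.
print(round(combine_db(51.7, 51.7), 1))  # -> 54.7
```

The measured 53.7dB comes in about 1dB under that naive estimate, consistent with the SLI cards throttling a bit harder and their fans not behaving identically under load.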
Finally, with load noise levels under FurMark we can see where our various cards peak for noise. The GTX 780 Ti creeps up to 52.3dB, essentially tying with the GTX 780 and GTX Titan. Otherwise it comes in just behind the 290X, and at the start of the pack for our multi-GPU setups.
As for the GTX 780 Ti SLI, like our single-card comparison points, it’s up slightly compared to the GTX 780 SLI.
Overall, our look at power, temperatures, and noise has been a rather straightforward validation of our earlier suspicions. GTX 780 Ti’s higher performance leads to higher power consumption, and with all other factors held equal – including the cooler – power, temps, and noise levels all rise a bit compared to GTX Titan and GTX 780. There’s no such thing as a free lunch here, and while GPU Boost 2.0 will keep the maximum levels suitably in check, on average GTX 780 Ti is going to be a bit worse than the other GK110 cards due to those factors. Though even with the increased noise levels in particular, GTX 780 Ti is still able to outperform 290X on noise while also delivering better gaming performance, which makes this another tidy victory for NVIDIA.
Wreckage - Thursday, November 7, 2013
The 290X = Bulldozer. Hot, loud, power hungry and unable to compete with an older architecture.
Kepler is still king even after being out for over a year.
trolledboat - Thursday, November 7, 2013
Hey look, it's a comment from a permanently banned user at this website for trolling, done before someone could have even read the first page.
Back in reality, very nice card, but sorely overpriced for such a meagre gain over 780. It also is slower than the cheaper 290x in some cases.
Nvidia needs more price cuts right now. 780 and 780ti are both badly overpriced in the face of 290 and 290x
neils58 - Thursday, November 7, 2013
I think Nvidia probably have the right strategy, G-Sync is around the corner and it's a game changer that justifies the premium for their brand - AMD's only answer to it at this time is going crossfire to try and ensure >60FPS at all times for V-Sync. Nvidia are basically offering a single card solution that even with the brand premium and G-sync monitors comes out less expensive than crossfire. 780Ti for 1440p gamers, 780 for 1920p gamers.
Kamus - Thursday, November 7, 2013
I agree that G-Sync is a gamechanger, but just what do you mean AMD's only answer is crossfire? Mantle is right up there with g-sync in terms of importance. And from the looks of it, a good deal of AAA developers will be supporting Mantle.
As a user, it kind of sucks, because I'd love to take advantage of both.
That said, we still don't know just how much performance we'll get by using mantle, and it's only limited to games that support it, as opposed to G-Sync, which will work with every game right out of the box.
But on the flip side, you need a new monitor for G-Sync, and at least at first, we know it will only be implemented on 120hz TN panels. And not everybody is willing to trade their beautiful looking IPS monitor for a TN monitor, especially since they will retail at $400+ for 23" 1080p.
Wreckage - Thursday, November 7, 2013
Gsync will work with every game past and present. So far Mantle is only confirmed in one game. That's a huge difference.
Basstrip - Thursday, November 7, 2013
TLDR: When considering Gsync as a competitive advantage, add the cost of a new monitor. When considering Mantle support, think multiplatform and think next-gen consoles having AMD GPUs. Another plus side for NVidia is shadowplay and SHIELD though (but again, added costs if you consider SHIELD).
Gsync is not such a game changer as you have yet to see both a monitor with Gsync AND its pricing. The fact that I would have to upgrade my monitor and that the Gsync branding will add another few $$$ on the price tag is something you guys have to consider.
So to consider Gsync as a competitive advantage when considering a card, add the cost of a monitor to that. Perfect for those that are going to upgrade soon but for those that won't, Gsync is moot.
Mantle on its plus side will be used on consoles and pc (as both PS4 and Xbox One have AMD processors, developers of games will most probably be using it). You might not care about consoles but they are part of the gaming ecosystem and sadly, we pc users tend to get shafted by developers because of consoles. I remember Frankieonpc mentioning he used to play tons of COD back in the COD4 days and said that development tends to have shifted towards consoles so the tuning was a bit more off for pc (paraphrasing slightly).
I'm in the market for both a new monitor and maybe a new card so I'm a bit on the fence...
Wreckage - Thursday, November 7, 2013
Mantle will not be used on consoles. AMD already confirmed this.
althaz - Thursday, November 7, 2013
Mantle is not used on consoles...because the consoles already have something very similar.
Kamus - Thursday, November 7, 2013
You are right, consoles use their own API for GCN, guess what mantle is used for?
*spoiler alert* GCN
EJS1980 - Thursday, November 7, 2013
Mantle is irrefutably NOT coming to consoles, so do your due diligence before trying to make a point. :)