Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

GeForce GTX 780 Series Voltages
GTX 780 Ti Boost Voltage   1.187v
GTX 780 Boost Voltage      1.1625v
GTX 780 Ti Base Voltage    1.012v

Taking a quick look at voltages, we find that our GTX 780 Ti operates at a slightly higher voltage at its maximum boost bin than the original GTX 780 did. The difference is minor, but the additional voltage may be necessary to hit the slightly higher clockspeeds GTX 780 Ti operates at relative to GTX Titan and GTX 780.

GeForce GTX 780 Ti Average Clockspeeds
Max Boost Clock    1020MHz
Metro: LL          1000MHz
CoH2               997MHz
Bioshock           954MHz
Battlefield 3      980MHz
Crysis 3           980MHz
Crysis: Warhead    1000MHz
TW: Rome 2         950MHz
Hitman             993MHz
GRID 2             967MHz
FurMark            823MHz

Moving on to clockspeeds, we find that the GTX 780 Ti does very well when it comes to boosting. With a maximum boost clock of 1020MHz, we have 2 benchmarks averaging 1000MHz, and another 4 averaging 980MHz or better.
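For reference, the per-game figures above are simple means of the clockspeeds logged over each benchmark run. A minimal sketch of that bookkeeping (the sample readings below are hypothetical, not our actual logs):

```python
# Average the clockspeed samples logged during a benchmark run.
# The sample values below are hypothetical, not our actual logs.
def average_clock(samples_mhz):
    """Return the mean observed clockspeed in MHz, rounded to the nearest MHz."""
    return round(sum(samples_mhz) / len(samples_mhz))

run = [1020, 1006, 993, 980, 1006]  # per-second GPU clock readings (MHz)
print(average_clock(run))  # 1001
```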

With all of our GK110 cards sharing a common design, at idle there’s very little to differentiate them. Other than GTX Titan’s extra 3GB of VRAM, we’re essentially looking at identical cards when idling.

Moving on to load power, we can see the power/heat ramifications of the slight clockspeed increase coupled with the activation of the 15th SMX. Even with the further optimizations NVIDIA has put into the new revision of GK110, power consumption has gone up in accordance with the higher performance of the card, just as we’d expect. Since NVIDIA doesn’t notably alter their power efficiency here, that increased performance has to come at the cost of increased power consumption. Though in this benchmark it’s worth pointing out that we’re measuring from the wall and that GTX 780 Ti outperforms GTX Titan by 8%, so some of that 29W power difference will come from the higher CPU load caused by the increased framerates.
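As a rough illustration of why wall measurements overstate the GPU's share of a power delta, here is a back-of-the-envelope split of that 29W difference. The PSU efficiency and CPU power figures are assumptions for the sake of the example, not measured values:

```python
# Back-of-the-envelope split of a wall-power delta between GPU and CPU.
# All figures here are illustrative assumptions, not measured values.
def gpu_power_delta(wall_delta_w, cpu_delta_w, psu_efficiency=0.88):
    """Estimate the DC-side GPU power increase hiding inside a wall-power delta.

    wall_delta_w  : measured difference at the wall (AC)
    cpu_delta_w   : assumed extra CPU power from the higher framerates (DC)
    psu_efficiency: assumed PSU conversion efficiency
    """
    dc_delta = wall_delta_w * psu_efficiency  # convert the AC delta to DC
    return dc_delta - cpu_delta_w

# 29W at the wall, assuming ~10W of that is extra CPU load
print(round(gpu_power_delta(29, 10)))  # 16 (watts attributable to the GPU itself)
```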

As for the GTX 780 Ti SLI, here we see power level off at 556W, 20W more than the GTX 780 SLI. Some (if not most) of that is going to be explained by the increased CPU power consumption from the GTX 780 Ti SLI’s higher framerates. Coupled with that is the fact that in SLI setups these cards get hotter, and hence have to downclock a bit more to maintain equilibrium, which helps to offset the increased power requirements of GTX 780 Ti and keep the SLI results so close to the GTX 780 SLI results.

Switching over to FurMark, we find that power consumption is also up, but only slightly. With GPU Boost 2.0 clamping down on power consumption, all of our GK110 cards should be held to 250W here, and with a difference of under 10W between GTX 780 and GTX 780 Ti, that’s exactly what appears to be happening.
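Conceptually, a power limit like GPU Boost 2.0's 250W cap amounts to stepping down through the boost bins until estimated board power fits under the ceiling. A toy sketch of that idea (the bin clocks and power figures are made up for illustration, not NVIDIA's actual tables):

```python
# Toy model of a TDP clamp in the style of GPU Boost 2.0's 250W power limit.
# The bin clocks and power estimates are illustrative, not NVIDIA's tables.
def clamp_to_power_limit(bins_mhz, power_per_bin_w, limit_w=250.0):
    """Walk down the boost bins until estimated board power fits the limit.

    bins_mhz        : boost bins from highest to lowest clock
    power_per_bin_w : estimated board power at each bin
    """
    for clock, power in zip(bins_mhz, power_per_bin_w):
        if power <= limit_w:
            return clock
    return bins_mhz[-1]  # floor: the lowest bin available

bins  = [1020, 1006, 993, 980, 967]
power = [262.0, 256.0, 251.0, 248.0, 244.0]
print(clamp_to_power_limit(bins, power))  # 980
```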

On a side note, it’s interesting to note here that under FurMark we’re seeing the GTX 780 Ti draw more power than the Radeon R9 290X. Despite the fact that the 290X has a higher rated TDP, in the card’s default quiet mode the card can’t actually dissipate as much heat (and thereby consume as much power) as the GTX 780 Ti can.

For idle temperatures we’re once again looking at cards that are for all intents and purposes identical. At 30C the GTX 780 Ti easily stays nice and cool.

As we mentioned in our look at the GTX 780 Ti hardware, NVIDIA has increased their default temperature throttle point from 80C on the GTX Titan/780 to 83C on the GTX 780 Ti. The end result is that in all of our temperature limited tests the GTX 780 Ti will peak at 83C-84C, whereas the older GK110 cards will peak at 80C-81C.
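The effect of raising the throttle point can be pictured with a toy thermal-equilibrium loop: the card sheds boost bins (13MHz apiece on Kepler) until it sits at or under its temperature target, so a higher target settles at a higher clock. The linear thermal model below is purely illustrative, not a real GPU thermal characteristic:

```python
# Toy thermal-equilibrium loop: drop one boost bin at a time until the GPU
# fits under its temperature target (83C on GTX 780 Ti vs 80C on Titan/780).
# The linear temperature model is a made-up approximation for illustration.
def settle_clock(start_mhz, target_c, ambient_c=25.0, c_per_100mhz=6.0,
                 bin_mhz=13, floor_mhz=876):
    clock = start_mhz
    while clock > floor_mhz:
        temp = ambient_c + (clock / 100.0) * c_per_100mhz  # crude linear model
        if temp <= target_c:
            break
        clock -= bin_mhz  # step down one 13MHz boost bin
    return clock

# A higher throttle point settles at a higher equilibrium clock
print(settle_clock(1020, 83), settle_clock(1020, 80))  # 955 916
```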

FurMark reiterates what we saw with Crysis 3. The temps are up a bit across the board, while the GK110 cards are holding near their throttle points. The SLI setups meanwhile approach the upper-80s at 88C, reflecting the fact that even with blowers, there’s some impact on neighboring cards in high load situations.

In our last idle scenario, we once again see all of our GK110 cards performing similarly, with idle noise levels in the 38dB-39dB range.

Moving on to our gaming load noise results, we can see the full repercussions of the GTX 780 Ti’s higher average power consumption coupled with the card’s higher temperature throttle point. Raising the throttle point along the same fan curve raises the equilibrium point, and with it the card’s operating noise levels. As the fastest single-GPU card in this chart, the GTX 780 Ti is still doing very well for itself and for a blower based design at 51.7dB, though at 1.5dB louder than GTX Titan and 4.2dB louder than GTX 780 the noise tradeoff for the card’s higher performance is very clear. Meanwhile the fact that it’s tied with the GTX 780 SLI comes with its own bit of irony.

Speaking of the GTX 780 SLI, we can see the noise impact of SLI configurations too. The GTX 780 Ti SLI levels out at 53.7dB, 2dB louder than our single-card configuration and 2dB louder than the GTX 780 SLI. At this point it’s just a bit louder than the 290X and quieter than a number of other 290 series setups.
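For context on why two cards aren't simply twice as loud, incoherent noise sources add on a logarithmic scale, so doubling identical sources adds about 3dB. A quick sketch of that arithmetic:

```python
import math

# Combine sound pressure levels of multiple incoherent sources:
# convert each dB value to linear power, sum, and convert back.
def combine_spl(levels_db):
    """Return the combined SPL in dB of several independent sources."""
    total = sum(10 ** (level / 10.0) for level in levels_db)
    return 10 * math.log10(total)

# Two cards each as loud as a lone GTX 780 Ti (51.7dB)
print(round(combine_spl([51.7, 51.7]), 1))  # 54.7
```

Two cards each as loud as a single GTX 780 Ti would top out around 54.7dB, so the measured 53.7dB suggests each card in the SLI pair is running its fan a bit slower than a lone card would.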

Finally, with load noise levels under FurMark we can see where our various cards peak for noise. The GTX 780 Ti creeps up to 52.3dB, essentially tying the GTX 780 and GTX Titan. Otherwise it comes in just behind the 290X, and just ahead of the start of the pack for our multi-GPU setups.

As for the GTX 780 Ti SLI, like our single-card comparison points it’s up slightly as compared to the GTX 780 SLI.

Overall, our look at power, temperatures, and noise has been a rather straightforward validation of our earlier suspicions. GTX 780 Ti’s higher performance leads to higher power consumption, and with all other factors held equal – including the cooler – power, temps, and noise levels all rise a bit as compared to GTX Titan and GTX 780. There’s no such thing as a free lunch here, and while GPU Boost 2.0 will keep the maximum levels suitably in check, on average GTX 780 Ti is going to be a bit worse than the other GK110 cards due to those factors. Though even with the increased noise levels, GTX 780 Ti still comes in quieter than the 290X while also delivering better gaming performance, which makes this another tidy victory for NVIDIA.


  • 1Angelreloaded - Thursday, November 7, 2013 - link

    They can't because of antitrust/monopoly laws, the penalties for NVidia would be retarded from the Gov. TBH since ati has been lowballing it lately this has caused NVidia to cap yields for higher prices.
  • Mondozai - Friday, December 13, 2013 - link

    EJS the buttboy for Nvidia keeps entertaining us! Dance monkey, dance!
  • Kodongo - Thursday, November 7, 2013 - link

    Us? Speak for yourself. If you willingly allowed nVidia to rape your wallet, more fool you. Me, I will go for the best price-performance cards which puts me firmly in the Radeon camp at the moment.
  • 1Angelreloaded - Thursday, November 7, 2013 - link

    Depends on your perspective, SLI is just better overall, and supported better. I'll gladly pay for a better product versus 1 at mainstream budget with less feature sets.
  • anubis44 - Thursday, November 7, 2013 - link

    "SLI is just better overall".

    Not anymore. HardOCP said: "We've been telling our readers for years that CrossFire just didn't feel as good as SLI while gaming.

    Those times have changed, at least on the new Radeon R9 290/X series. The new CrossFire technology has improved upon the CrossFire experience in a vastly positive way. Playing games on the Radeon R9 290X CrossFire configuration was a smooth experience. In fact, it was smoother than SLI in some games. It was also smoother on the 4K display at 3840x2160 gaming, and it was noticeably smoother in Eyefinity at 5760x1200."

    Read the whole R9 290X crossfire article here:

    http://www.hardocp.com/article/2013/11/01/amd_rade...

    Finally, ignore the noise about noise on the reference R9 290(X) cards. The custom cooled versions are coming out by the end of November and they'll be as quiet and cool as the nVidia cards, but faster and cheaper.
  • TheJian - Thursday, November 7, 2013 - link

    You need to read balance sheets: paste from another post I made at tomshardware (pre 780ti)-
    Simple economics...NV doesn't make as much as they did in 2007. They are not gouging anyone and should be charging more (so should AMD) and neither side should be handing out free games. Do you want them to be able to afford engineers and good drivers or NOT? AMD currently can't afford them due to your price love, so you get crap drivers that still are not fixed. It's sad people don't understand the reason you have crap drivers is they have lost $6Billion in 10yrs! R&D isn't FREE and the king of the hill gets to charge more than the putz. Why do you think their current card is 10db’s higher in noise, 50-70 watts higher and far hotter? NO R&D money.

    NV made ~550mil last 12 months (made $850 in 2007). Intel made ~10Billion (made under 7B 2007, so profits WAY UP, NV way down). Also INtel had 54B in assets 2007, now has 84billion! Who's raping you? The Nvidia hate is hilarious. I like good drivers, always improving products, and new perf/features. That means they need to PROFIT or we'll get crappy drivers from NV also.

    Microsoft 2007=14B, this year $21B (again UP HUGE!)
    Assets 2007=64B, 2013=146Billion HOLY SHITE.

    Who's raping you...IT isn't Nvidia...They are not doing nearly as well as 2007. So if they were even raping you then, now they're just asking you to show them your boobs...ROFL. MSFT/Intel on the other hand are asking you to bend over and take it like a man, oh and give me your wallet when I'm done, hey and that car too, heck sign over your house please...

    APPLE 2007=~3Bil profits 2013=41Billion (holy 13.5x the raping).
    Assets 2007=25B, wait for it...2013=176Billion!
    bend over and take it like a man, oh and give me your wallet when I'm done, hey and that car too, heck sign over your house please...Did you mention you're planning on having kids?...Name them Apple and I want them as slaves too...LOL

    Are we clear people. NV makes less now than 2007 and hasn't made near that 850mil since. Why? Because market forces are keeping them down which is only hurting them, and their R&D (that force is AMD, who by the way make ZERO). AMD is killing themselves and fools posting crap like this is why (OK, it's managements fault for charging stupidly low prices and giving out free games). You can thank the price of your card for your crappy AMD drivers

    Doesn't anyone want AMD to make money? Ask for HIGHER PRICES! Not lower, and quit demonizing NV who doesn't make NEAR what they did in 2007! Intel killed their chipset business and cost them a few hundred million each year. See how that works. If profits for these two companies don't start going up we're all going to get slower product releases (witness what just happened, no new cards for 2yrs if you can't even call AMD's new as it just catches OLD NV cards and runs hot doing it), and we can all expect CRAP DRIVERS with those slower released cards.
  • mohammadm5 - Monday, November 11, 2013 - link

    http://www.aliexpress.com/item/Wholesale-Price-GeF...

    thats the wholesale price its not nvidia that charges so much is the resellers. the profit nvidia makes per gpu is very low but the reseller make alot of money, also the new amd r9 290 is going for $255 per unit at wholesale price and the r9 280x is going for $160 dollar per unit. you have to also remember thats the distributer price not the manufacturer price,witch should be alot lower. i know the gtx 780 at manufacturer price sells from $200 to $280 depending on brand.

    so remember this is america were they sell you something made in china for 1 dollar for 10 dollars
  • RussianSensation - Thursday, November 7, 2013 - link

    Looks overpriced to be honest.

    I'd rather get MSI Lightning 780 or better yet grab 2 after-market R9 290s once they are out for $100-150 more and likely get 50-60% more performance. High resolution gaming advantage over R9 290X melts away to less than 8%. It looks even worse against $399 R9 290 - only a 15% advantage for a 75% price increase. Terrible value proposition. NV should have priced this guy at $599.

    http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_780_T...
  • A5 - Thursday, November 7, 2013 - link

    The article repeatedly points out that is overpriced. Like every other flagship card ever.

    Anyone looking for price/performance is getting a 280 or 770 (or lower).
  • Dantte - Thursday, November 7, 2013 - link

    Can we remove Battlefield 3 from the benchmarks and add Battlefield 4 please. BF3 is now 2 years old and is no long current with the genre. When's the last time you heard someone say "hey, I wonder how well this card will perform in BF3," I bet not for a while, but I have been hearing that exact statement for BF4 for the last year!
