Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

GTX 780 comes into this phase of our testing with a very distinct advantage. Being based on an already exceptionally solid card in the GTX Titan, it’s guaranteed to do at least as well as Titan here. At the same time, because its practical power consumption is going to be a bit lower due to the fewer enabled SMXes and fewer RAM chips, it effectively has Titan’s cooler paired with a yet lower TDP, which can be a silent (but deadly) combination.

GeForce GTX 780 Voltages
GTX 780 Max Boost   GTX 780 Base   GTX 780 Idle
1.1625v             1.025v         0.875v

Unsurprisingly, voltages are unchanged from Titan. GK110’s max safe load voltage is 1.1625v, with 1.2v being the maximum overvoltage allowed by NVIDIA. Meanwhile idle remains at 0.875v, and as we’ll see idle power consumption is equal too.

Meanwhile we also took the liberty of capturing the average clockspeeds of the GTX 780 in all of the games in our benchmark suite. In short, although the GTX 780 has a higher base clock than Titan (863MHz versus 837MHz), the fact that it only goes one boost bin higher (1006MHz versus 993MHz) means that the GTX 780 doesn’t usually clock much higher than GTX Titan under load; for one reason or another it typically settles at the same boost bin as the GTX Titan on tests that offer consistent workloads. In practice, then, the GTX 780 is closer to a straight-up harvested GTX Titan, with no practical clockspeed differences.

GeForce GTX 780 Average Clockspeeds
                  GTX 780    GTX Titan
Max Boost Clock   1006MHz    992MHz
DiRT:S            1006MHz    992MHz
Shogun 2          966MHz     966MHz
Hitman            992MHz     992MHz
Sleeping Dogs     969MHz     966MHz
Crysis            992MHz     992MHz
Far Cry 3         979MHz     979MHz
Battlefield 3     992MHz     992MHz
Civilization V    1006MHz    979MHz
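To put a number on just how close the two cards run, the per-game averages can be diffed directly. This quick sketch simply restates the clockspeed table in code, with the clocks hard-coded from our results; nothing here is new data:

```python
# Average clockspeeds (MHz) transcribed from our benchmark results.
gtx780 = {"DiRT:S": 1006, "Shogun 2": 966, "Hitman": 992,
          "Sleeping Dogs": 969, "Crysis": 992, "Far Cry 3": 979,
          "Battlefield 3": 992, "Civilization V": 1006}
titan  = {"DiRT:S": 992, "Shogun 2": 966, "Hitman": 992,
          "Sleeping Dogs": 966, "Crysis": 992, "Far Cry 3": 979,
          "Battlefield 3": 992, "Civilization V": 979}

# Per-game clockspeed advantage of the GTX 780 over GTX Titan.
for game, clock in gtx780.items():
    print(f"{game}: {clock - titan[game]:+d}MHz")
```

In five of the eight games the delta is 0MHz, which is the basis for calling the GTX 780 a straight-up harvested Titan in practice.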

Idle power consumption is by the book. With the GTX 780 equipped, our test system sees 110W at the wall, a mere 1W difference from GTX Titan, and tied with the 7970GE. Idle power consumption of video cards is getting low enough that there’s not a great deal of difference between the latest generation cards, and what’s left is essentially lost as noise.

Moving on to power consumption under Battlefield 3, we get our first real confirmation of our earlier theories on power consumption. Between the slightly lower load placed on the CPU from the lower framerate, and the lower power consumption of the card itself, GTX 780 draws 24W less at the wall than GTX Titan. Interestingly this is exactly what our system draws with the GTX 580 installed, which, accounting for the lower CPU power consumption with that card, means that video card power consumption on the GTX 780 is down compared to the GTX 580. GTX 780 being a harvested part helps a bit with that, but it still means we’re looking at quite the boost in performance relative to the GTX 580 for a simultaneous decrease in video card power consumption.
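Since we measure at the wall rather than at the card, the card-level delta has to be backed out. As a rough sketch of that arithmetic: the PSU efficiency (~88%) and the CPU power delta used here are assumptions for illustration; only the 24W wall figure comes from our testing.

```python
def card_delta(wall_delta_w, cpu_delta_w, psu_efficiency=0.88):
    """Estimate the change in video card power from a change at the wall.

    wall_delta_w:  difference measured at the wall outlet (measured)
    cpu_delta_w:   assumed change in CPU power (higher framerate -> more CPU load)
    psu_efficiency: assumed PSU efficiency; wall draw includes PSU losses
    """
    dc_delta = wall_delta_w * psu_efficiency  # convert wall watts to DC watts
    return dc_delta - cpu_delta_w

# 24W less at the wall for GTX 780 vs Titan under BF3, of which we
# hypothetically attribute ~5W to reduced CPU load.
print(card_delta(24, 5))
```

The point of the exercise is that the card-level difference is smaller than the raw wall delta suggests, since both PSU losses and CPU load scale along with it.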

Moving along, we see that power consumption at the wall is higher than both the GTX 680 and 7970GE. The former is self-explanatory: the GTX 780 features a bigger GPU and more RAM, but is made on the same 28nm process as the GTX 680. So for a tangible performance improvement within the same generation, there’s nowhere for power consumption to go but up. Meanwhile as compared to the 7970GE, we are likely seeing a combination of CPU power consumption differences and at least some difference in video card power consumption, though it isn’t possible to say how much of each.

Switching to FurMark and its more pure GPU load, our results become compressed somewhat as the GTX 780 moves slightly ahead of the 7970GE. Power consumption relative to Titan is lower than what we expected it to be considering both cards are hitting their TDP limits, though compared to GTX 680 it’s roughly where it should be. At the same time this reflects a somewhat unexpected advantage for NVIDIA; despite the fact that GK110 is a bigger and logically more power hungry GPU than AMD’s Tahiti, the power consumption of the resulting cards isn’t all that different. Somehow NVIDIA has a slight efficiency advantage here.

Moving on to idle temperatures, we see that GTX 780 hits the same 30C mark as GTX Titan and 7970GE.

With GPU Boost 2.0, load temperatures are kept tightly in check when gaming. The GTX 780’s default throttle point is 80C, and that’s exactly what happens here, with GTX 780 bouncing around that number while shifting between its two highest boost bins. Note however that, like Titan, this means it’s quite a bit warmer than the open air cooled 7970GE, so it will be interesting to see whether semi-custom GTX 780 cards change this picture at all.
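The bouncing behavior falls out of a simple feedback loop: shed a boost bin when over the temperature target, reclaim one when under it. This is only a sketch of the idea; the bin ladder and temperature readings below are hypothetical, and only the 80C throttle point comes from our testing.

```python
TARGET_C = 80  # GTX 780's default GPU Boost 2.0 temperature target

def step(temp_c, bin_index, bins):
    """Move one boost bin down when over target, up when under."""
    if temp_c > TARGET_C and bin_index > 0:
        return bin_index - 1      # too warm: shed a bin to cool off
    if temp_c < TARGET_C and bin_index < len(bins) - 1:
        return bin_index + 1      # thermal headroom: boost again
    return bin_index              # at target (or at a ladder end): hold

bins = [979, 992, 1006]           # top boost bins in MHz (illustrative)
idx = 2
for temp in [81, 82, 79, 78, 81]: # made-up temperature readings
    idx = step(temp, idx, bins)
    print(f"{temp}C -> {bins[idx]}MHz")
```

Run against a temperature trace hovering around 80C, the clock oscillates between the top bins rather than settling, which is exactly the bouncing we observe on the real card.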

Whereas GPU Boost 2.0 keeps a lid on things when gaming, it’s apparently a bit more flexible on FurMark, likely because the video card is already heavily TDP throttled.

Last but not least we have our look at idle noise. At 38dB GTX 780 is essentially tied with GTX Titan, which again comes as no great surprise. At least in our testing environment one would be hard pressed to tell the difference between GTX 680, GTX 780, and GTX Titan at idle. They’re essentially as quiet as a card can get without being silent.

Under BF3 we see the payoff of NVIDIA’s fan modifications, along with the slightly lower effective TDP of GTX 780. Despite – or rather because of – the fact that it’s built on the same platform as GTX Titan, load noise has nowhere to go but down. As a result we have a 250W blower based card hitting 48.1dB under load, which is simply unheard of. At nearly 4dB below both GTX 680 and GTX 690 it’s a small but significant improvement over NVIDIA’s previous generation cards, and even Titan has the right to be embarrassed. Silent it is not, but this is incredibly impressive for a blower. The only way to beat something like this is with an open air card, as evidenced by the 7970GE, though that does come with the usual tradeoffs for using such a cooler.
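For a sense of what a ~4dB gap means, the standard acoustics rule of thumb (not a measurement from our testing) is that a difference of X dB corresponds to a sound pressure ratio of 10^(X/20):

```python
def pressure_ratio(db_delta):
    """Sound pressure ratio for a given difference in dB SPL."""
    return 10 ** (db_delta / 20)

# GTX 780 at 48.1dB versus roughly 52dB for GTX 680/690
# (the ~52dB figure is inferred from the ~4dB gap, not measured here)
print(round(pressure_ratio(4), 2))
```

In other words, a ~4dB reduction is roughly a 1.6x drop in sound pressure, which is very much audible in practice.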

Because of the slightly elevated FurMark temperatures we saw previously, GTX 780 ends up being a bit louder than GTX Titan under FurMark. This isn’t something that we expect to see under any non-pathological workload, and I tend to favor BF3 over FurMark here anyhow, but it does point to there being some kind of minor difference in throttling mechanisms between the two cards. At the same time this means that GTX 780 is still a bit louder than our open air cooled 7970GE, though not by as large a difference as we saw with BF3.

Overall the GTX 780 generally meets or exceeds the GTX Titan in our power, temp, and noise tests, just as we’d expect for a card almost identical to Titan itself. The end result is that it maintains every bit of Titan’s luxury and stellar performance, and if anything improves on it slightly when we’re talking about the all-important aspect of load noise. It’s a shame that coolers such as the 780’s are not a common fixture on cheaper cards, as this one is essentially unparalleled as far as blower based coolers are concerned.

At the same time this sets up an interesting challenge for NVIDIA’s partners. To pass Greenlight they need to produce cards with coolers that perform as well as or better than the reference GTX 780 in NVIDIA’s test environment. This is by no means impossible, but it’s not going to be an easy task. So it will be interesting to see what partners cook up, especially with the obligatory dual fan open air cooled models.

155 Comments


  • varad - Thursday, May 23, 2013 - link

    You do realize that a GPU like Titan has almost 5 times the number of transistors compared to Intel's biggest Core i7 CPU, right? There are 7.1 billion transistors in Titan vs 1.4 billion in Core i7 3770k. So, it means they cannot match the price of "a good CPU" unless they decide to become a non-profit organization :)
  • AssBall - Thursday, May 23, 2013 - link

    Well if all you needed was a single Titan to run your OS, computations, games, and nothing else, then no problem.
  • krutou - Sunday, May 26, 2013 - link

    Two problems with your logic

    22 nm fabrication is more expensive (price per transistor)

    CPUs are more difficult to design
  • An00bis - Friday, May 31, 2013 - link

    it's not like you can just shove your hand in a jar full of transistors and just slap it on a chip and consider it a cpu, a cpu is required to do a gpu's task (integrated gpu) AND be good at everything a gpu can't do, which is... well lots of things actually. A gpu is much simpler, hence why the manufacturing + designing cost is probably more expensive than a big ass card that has to include memory+a pcb+a gpu
  • chizow - Thursday, May 23, 2013 - link

    Great card, but a year late. This is what GTX 600 series should've been but we all know how that went.

    I think Nvidia made some pretty big mistakes with how they handled the entire Kepler generation after Tahiti's launch price debacle. I know their financial statements and stockholders don't agree but they've managed to piss off their core consumers at every performance segment.

    Titan owners have to feel absolutely gutted at this point having paid $1000 for a part that is only ~10-15% faster than the GTX 780. End result of this generation is we are effectively paying 50-100% more for the same class of card than previous generations. While the 780 is a great card and a relatively good value compared to Titan, we're still paying $650 for what is effectively Kepler's version of the GTX 470.
  • Crisium - Thursday, May 23, 2013 - link

    People who bought a Titan knew what they were getting into. If you have regrets, you were in no position to buy a $1000 GPU to begin with and made a grievous financial error.

    $650 isn't a horrible price, but you are still paying the Nvidia Tax.
  • chizow - Thursday, May 23, 2013 - link

    I don't think so, if you polled GTX Titan owners on whether they would've paid $1000 knowing that 2-3 months later there would be a part that performed similarly at a 35% lower price, I think you would hear most of them would've waited to buy not 1, but 2 for just a bit more. Or instead of buying 2 Titans, buying 3x780s.

    Also, it really has nothing to do with being in a financial position or not, it's funny when Titan released I made the comment anyone interested in Titan would be better served by simply investing that money into Nvidia stock, letting that money grow on Titan's fat margins, and then buying 2x780s when they released. All according to plan, for my initial investment of 1 Titan I can buy 2x780s.

    But I won't. Nvidia blew it this generation, I'll wait for Maxwell.
  • IanCutress - Thursday, May 23, 2013 - link

    Titan was a compute card with optional gaming, rather than a gaming card with optional FP64 compute. That's why the price difference exists. If you bought a Titan card for gaming, then you would/should have been smart enough to know that a similar card without compute was around the corner.
  • chizow - Thursday, May 23, 2013 - link

    Unfortunately, that was never how *GTX* Titan was marketed, straight from the horse's mouth:
    "With the DNA of the world’s fastest supercomputer and the soul of NVIDIA® Kepler™ architecture, GeForce® GTX TITAN GPU is a revolution in PC gaming performance."

    Not to mention the fact Titan is a horrible compute card and value outside of CUDA workloads, and even there it suffers as a serious compute card due to the lack of ECC. It's an overpriced gaming card, plain and simple.

    At the time, it was still uncertain whether or not Nvidia would launch more SKUs based on GK110 ASIC, but informed consumers knew Nvidia had to do something with all the chips that didn't make the TDP cut as Tesla parts.
  • mayankleoboy1 - Thursday, May 23, 2013 - link

    Really? Apart from a few apps, Titan is poor compared to a 7970. It has bad OpenGL performance, which 90% of industry renderfarms use.
    Titan is really an overpriced gaming card.
