Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

GTX 780 comes into this phase of our testing with a very distinct advantage. Being based on an already exceptionally solid card in the GTX Titan, it’s guaranteed to do at least as well as Titan here. At the same time, because its practical power consumption is going to be a bit lower due to the fewer enabled SMXes and fewer RAM chips, it effectively pairs Titan’s cooler with an even lower TDP, which can be a silent (but deadly) combination.

GeForce GTX 780 Voltages
GTX 780 Max Boost: 1.1625v
GTX 780 Base: 1.025v
GTX 780 Idle: 0.875v

Unsurprisingly, voltages are unchanged from Titan. GK110’s max safe load voltage is 1.1625v, with 1.2v being the maximum overvoltage allowed by NVIDIA. Meanwhile idle remains at 0.875v, and as we’ll see idle power consumption is equal too.

Meanwhile we also took the liberty of capturing the average clockspeeds of the GTX 780 in all of the games in our benchmark suite. In short, although the GTX 780 has a higher base clock than Titan (863MHz versus 837MHz), the fact that it only goes one boost bin higher (1006MHz versus 992MHz) means that the GTX 780 doesn’t usually clock much higher than GTX Titan under load; for one reason or another it typically settles at the same boost bin as the GTX Titan on tests that offer consistent workloads. This means that in practice the GTX 780 is closer to a straight-up harvested GTX Titan, with no practical clockspeed differences.

GeForce GTX 780 Average Clockspeeds
                    GTX 780     GTX Titan
Max Boost Clock     1006MHz     992MHz
DiRT:S              1006MHz     992MHz
Shogun 2            966MHz      966MHz
Hitman              992MHz      992MHz
Sleeping Dogs       969MHz      966MHz
Crysis              992MHz      992MHz
Far Cry 3           979MHz      979MHz
Battlefield 3       992MHz      992MHz
Civilization V      1006MHz     979MHz
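
For readers who want to gather similar numbers on their own hardware, average clockspeeds can be approximated by simply polling the driver while a benchmark runs. The sketch below is not how the data above was collected; it’s a minimal example built on nvidia-smi, and it assumes a reasonably current driver that exposes the clocks.sm query field (not necessarily the era-appropriate tools).

```python
# Minimal sketch: poll the SM clock once per second for the duration of a
# benchmark pass, then report the average. Assumes nvidia-smi is on PATH and
# that the driver supports the "clocks.sm" query field.
import subprocess
import time

samples = []
duration_s = 120  # hypothetical length of the benchmark pass being sampled

start = time.time()
while time.time() - start < duration_s:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.sm",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    samples.append(int(out.strip().splitlines()[0]))  # MHz, first GPU only
    time.sleep(1)

print(f"Average SM clock over {len(samples)} samples: "
      f"{sum(samples) / len(samples):.0f} MHz")
```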

Idle power consumption is by the book. With the GTX 780 equipped, our test system sees 110W at the wall, a mere 1W difference from GTX Titan, and tied with the 7970GE. Idle power consumption of video cards is getting low enough that there’s not a great deal of difference between the latest generation cards, and what’s left is essentially lost in the noise.

Moving on to power consumption under Battlefield 3, we get our first real confirmation of our earlier theories on power consumption. Between the slightly lower load placed on the CPU from the lower framerate, and the lower power consumption of the card itself, GTX 780 draws 24W less at the wall than GTX Titan. Interestingly, this puts total system draw at exactly the same level as with the GTX 580, which, after accounting for the lower CPU power consumption, means that video card power consumption on the GTX 780 is down compared to the GTX 580. GTX 780 being a harvested part helps a bit with that, but it still means we’re looking at quite the boost in performance relative to the GTX 580 for a simultaneous decrease in video card power consumption.
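
As a rough illustration of that accounting (only the 24W wall delta comes from our measurements; the PSU efficiency and CPU figures below are assumptions for the sake of the example), the wall delta can be decomposed into PSU losses, CPU power, and the card itself:

```python
# Back-of-the-envelope decomposition of a wall-power delta into CPU and video
# card components. Only the 24W wall delta is from the review; the PSU
# efficiency and CPU delta are illustrative assumptions, not measurements.
wall_delta_w = 24.0    # GTX 780 vs. GTX Titan, measured at the wall
psu_efficiency = 0.88  # assumed PSU efficiency at this load level
cpu_delta_w = 5.0      # assumed drop in CPU power from the 780's lower framerate

# The wall reading is DC power divided by PSU efficiency, so the saving on
# the DC side is the wall delta scaled back down by that efficiency.
dc_delta_w = wall_delta_w * psu_efficiency
card_delta_w = dc_delta_w - cpu_delta_w

print(f"Approximate DC-side saving: {dc_delta_w:.1f} W")
print(f"Approximate saving attributable to the card: {card_delta_w:.1f} W")
```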

Moving along, we see that power consumption at the wall is higher than both the GTX 680 and 7970GE. The former is self-explanatory: the GTX 780 features a bigger GPU and more RAM, but is made on the same 28nm process as the GTX 680. So for a tangible performance improvement within the same generation, there’s nowhere for power consumption to go but up. Meanwhile as compared to the 7970GE, we are likely seeing a combination of CPU power consumption differences and at least some difference in video card power consumption, though it isn’t possible to say how much of each from wall measurements alone.

Switching to FurMark and its more pure GPU load, our results become compressed somewhat as the GTX 780 moves slightly ahead of the 7970GE. Power consumption relative to Titan is lower than what we expected it to be considering both cards are hitting their TDP limits, though compared to GTX 680 it’s roughly where it should be. At the same time this reflects a somewhat unexpected advantage for NVIDIA; despite the fact that GK110 is a bigger and logically more power hungry GPU than AMD’s Tahiti, the power consumption of the resulting cards isn’t all that different. Somehow NVIDIA has a slight efficiency advantage here.
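
For anyone wanting to verify that power-limit behavior on their own card, one quick sanity check is to compare reported power draw against the enforced power limit while FurMark runs; if the two track each other, the card is TDP throttled rather than temperature throttled. The sketch below leans on query fields present in current nvidia-smi builds, which we’re assuming rather than the era-appropriate tools.

```python
# Minimal sketch: while FurMark (or any similarly heavy load) is running,
# compare the card's reported power draw against its enforced power limit.
# Assumes a reasonably current nvidia-smi that supports these query fields.
import subprocess

out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=power.draw,power.limit",
     "--format=csv,noheader,nounits"],
    text=True,
)
draw_w, limit_w = (float(v) for v in out.strip().splitlines()[0].split(","))
print(f"Power draw: {draw_w:.1f} W of a {limit_w:.1f} W limit "
      f"({100 * draw_w / limit_w:.0f}%)")
```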

Moving on to idle temperatures, we see that GTX 780 hits the same 30C mark as GTX Titan and 7970GE.

With GPU Boost 2.0, load temperatures are kept tightly in check when gaming. The GTX 780’s default throttle point is 80C, and that’s exactly what happens here, with GTX 780 bouncing around that number while shifting between its two highest boost bins. Note that like Titan however this means it’s quite a bit warmer than the open air cooled 7970GE, so it will be interesting to see if semi-custom GTX 780 cards change this picture at all.
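
That bouncing between boost bins around the temperature target is easy to observe for yourself by logging temperature and SM clock together while a game runs. The following sketch uses the pynvml bindings to NVML, which are an assumption on our part rather than anything used for this review.

```python
# Minimal sketch: log GPU temperature and SM clock once per second to watch
# GPU Boost 2.0 step between boost bins as the card hovers around its
# temperature target (80C by default on GTX 780). Run it alongside a game.
import time

from pynvml import (NVML_CLOCK_SM, NVML_TEMPERATURE_GPU,
                    nvmlDeviceGetClockInfo, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetTemperature, nvmlInit, nvmlShutdown)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    for _ in range(300):  # roughly five minutes of samples
        temp_c = nvmlDeviceGetTemperature(handle, NVML_TEMPERATURE_GPU)
        sm_mhz = nvmlDeviceGetClockInfo(handle, NVML_CLOCK_SM)
        print(f"{time.strftime('%H:%M:%S')}  {temp_c:3d} C  {sm_mhz:4d} MHz")
        time.sleep(1)
finally:
    nvmlShutdown()
```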

Whereas GPU Boost 2.0 keeps a lid on things when gaming, it’s apparently a bit more flexible on FurMark, likely because the video card is already heavily TDP throttled.

Last but not least we have our look at idle noise. At 38dB GTX 780 is essentially tied with GTX Titan, which comes as no great surprise. At least in our testing environment one would be hard pressed to tell the difference between GTX 680, GTX 780, and GTX Titan at idle. They’re essentially as quiet as a card can get without being silent.

Under BF3 we see the payoff of NVIDIA’s fan modifications, along with the slightly lower effective TDP of GTX 780. Because it’s built on the same platform as GTX Titan, idle noise had nowhere left to go, but load noise is another matter. As a result we have a 250W blower based card hitting 48.1dB under load, which is simply unheard of. At nearly 4dB quieter than both GTX 680 and GTX 690, it’s a small but significant improvement over NVIDIA’s previous generation cards, and even Titan has reason to be a bit embarrassed. Silent it is not, but this is incredibly impressive for a blower. The only way to beat something like this is with an open air card, as evidenced by the 7970GE, though that does come with the usual tradeoffs for using such a cooler.

Because of the slightly elevated FurMark temperatures we saw previously, GTX 780 ends up being a bit louder than GTX Titan under FurMark. This isn’t something that we expect to see under any non-pathological workload, and I tend to favor BF3 over FurMark here anyhow, but it does point to there being some kind of minor difference in throttling mechanisms between the two cards. At the same time this means that GTX 780 is still a bit louder than our open air cooled 7970GE, though not by as large a difference as we saw with BF3.

Overall the GTX 780 generally meets or exceeds the GTX Titan in our power, temperature, and noise tests, just as we’d expect for a card almost identical to Titan itself. The end result is that it maintains every bit of Titan’s luxury and stellar performance, and if anything improves on it slightly in the all-important matter of load noise. It’s a shame that coolers such as the 780’s are not a common fixture on cheaper cards, as this is essentially unparalleled as far as blower based coolers are concerned.

At the same time this sets up an interesting challenge for NVIDIA’s partners. To pass Greenlight they need to produce cards with coolers that perform as well as or better than the reference GTX 780 cooler in NVIDIA’s test environment. This is by no means impossible, but it’s not going to be an easy task. So it will be interesting to see what partners cook up, especially with the obligatory dual fan open air cooled models.

155 Comments


  • SymphonyX7 - Thursday, May 23, 2013 - link

    *mildly/narrowly trailing the GTX 680
  • chizow - Thursday, May 23, 2013 - link

    AMD released some significant driver updates in ~Oct 2012, branded "Never Settle" drivers that did boost GCN performance significantly, ~10-20% in some cases where they were clearly deficient relative to Nvidia parts. It was enough to make up the difference in a lot of cases or extend the lead to where the GE is generally faster than the 680.

    On the flipside, some of AMD's performance claims, particularly with CF, have come under fire due to concerns about microstutter and frame latency, i.e. the ongoing runt frame saga.
  • Vayra - Thursday, May 23, 2013 - link

    Drivers possibly?
  • kallogan - Thursday, May 23, 2013 - link

    High end overpriced gpu again ! Next !
  • wumpus - Thursday, May 23, 2013 - link

    Except that the 780 is nothing more than a Titan with even more cuda performance disabled. Presumably, they are expecting to get Titan sales to people interested in GPU computing, if only for geeklust/boasting.
  • wumpus - Thursday, May 23, 2013 - link

    My above comment was supposed to be a reply. Ignore/delete if possible.
  • ifrit39 - Thursday, May 23, 2013 - link

    Shadow Play is the most interesting news here. It costs a not-insignificant amount of money to buy a decent capture card that will record HD video. This is a great alternative as it requires no extra hardware and has little CPU/GPU overhead. Anything that ends up on the net will be compressed by youtube or other service anyway. I can't wait to remove fraps and install shadow play.
  • ahamling27 - Saturday, May 25, 2013 - link

    Fraps isn't the best, but they somehow have the market cornered. Look up Bandicam, I use it exclusively and I get great captures at a fraction of the size. Plus they aren't cut up into 4 gig files. It has at least 15x more customization like putting watermarks in your capture or if you do like to segment your files you can have it do that at any size or time length you want. Plus you can record two sound sources at once, like your game and mic, or your game and whatever voice chat software you use.

    Anyway, I probably sound like I work for them now, but I can assure you I don't. This Shadow Play feature is definitely piquing my interest. If it's implemented wisely, it might just shut all the other software solutions down.
  • garadante - Thursday, May 23, 2013 - link

    There were two things that instantly made me dislike this card, much as I've liked Nvidia in the past: completely disabling the compute performance down to 600 series levels which was the reason I was more forgiving towards AMD in the 600/7000 series generation, and that they've priced this card at $650. If I remember correctly, the 680 was priced at $500-550 at launch, and that itself was hard to stomach as it was and still is widely believed GK104 was meant to be their mid-range chip. This 780 is more like what I imagined the 680 having been and if it launched at that price point, I'd be more forgiving.

    As it is... I'm very much rooting for AMD. I hope with these new hires, of which Anandtech even has an article of their new dream team or some such, that AMD can become competitive. Hopefully the experience developers get with their kind-of-funky architecture with the new consoles, however underwhelming they are, brings software on the PC both better multithreaded programming and performance, and better programming and performance to take advantage of AMD's module scheme. Intel and Nvidia both need some competition so we can get this computer hardware industry a bit less stagnated and better for the consumer.
  • EJS1980 - Tuesday, May 28, 2013 - link

    The 680 was $500 at launch, and was the main reason why AMD received so much flak for their 7970 pricing. At the time it launched, the 680 blew the 7970 away in terms of gaming performance, which was the reason AMD had to respond with across the board price drops on the 7950/70, even though it took them a few months.
