Power, Temperature, & Noise

Last but certainly not least, we have our obligatory look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason to ignore the noise.

It’s for that reason that GPU manufacturers also seek to keep power usage down, and under normal circumstances there’s a pretty clear relationship between power consumption, heat generated, and the amount of noise the fans will generate to remove that heat. At the same time, however, this is an area NVIDIA is focusing on for Titan: a premium product means they can use premium materials, going above and beyond what more traditional plastic cards can do for noise dampening.

GeForce GTX Titan Voltages
Titan Max Boost: 1.1625v
Titan Base: 1.012v
Titan Idle: 0.875v

Stopping quickly to take a look at voltages, Titan’s peak stock voltage is 1.1625v, which correlates to its highest speed bin of 992MHz. As the clockspeeds go down these voltages drop as well, down to a load low of 0.95v at 744MHz. This ends up being a bit less than the GTX 680 and most other desktop Kepler cards, which go just a bit higher to 1.175v. Since NVIDIA is classifying 1.175v as an “overvoltage” on Titan, it looks like GK110 isn’t going to be quite as tolerant of high voltages as GK104 was.

GeForce GTX Titan Average Clockspeeds
Max Boost Clock 992MHz
DiRT:S 992MHz
Shogun 2 966MHz
Hitman 992MHz
Sleeping Dogs 966MHz
Crysis 992MHz
Far Cry 3 979MHz
Battlefield 3 992MHz
Civilization V 979MHz

One thing we quickly notice about Titan is that, thanks to GPU Boost 2.0’s shift from a primarily power-based boost system to a temperature-based one, Titan hits its maximum speed bin far more often and sustains it longer, especially since there’s no longer a separate power target with Titan; any power limits are based entirely on TDP. Half of our games have an average clockspeed of 992MHz, which is to say they never triggered a power or thermal condition that would require Titan to scale back its clockspeed. For the rest of our tests the worst average clockspeed was all of 2 bins (26MHz) lower at 966MHz, the result of hitting a mix of thermal and power limits.
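To put the mechanism in more concrete terms, below is a minimal sketch of how a temperature-target boost loop of this sort behaves. The 13MHz bin size, 992MHz top bin, 80C target, and 250W TDP come from our data; the step logic itself is our simplification for illustration, not NVIDIA’s actual GPU Boost 2.0 algorithm.

```python
# Simplified sketch of a temperature-target boost loop. The 13MHz bin,
# 992MHz top bin, 80C target, and 250W TDP are from our data; the step
# logic is an illustrative simplification, not NVIDIA's actual algorithm.
BIN_MHZ = 13
MAX_BOOST_MHZ = 992
TEMP_TARGET_C = 80
TDP_W = 250

def next_clock(clock_mhz: int, gpu_temp_c: float, board_power_w: float) -> int:
    """Step the core clock one bin up or down based on temperature and power."""
    if gpu_temp_c > TEMP_TARGET_C or board_power_w > TDP_W:
        return clock_mhz - BIN_MHZ      # over a limit: back off one bin
    if clock_mhz < MAX_BOOST_MHZ:
        return clock_mhz + BIN_MHZ      # headroom left: step up one bin
    return clock_mhz                    # already at the maximum speed bin

# At 79C and 240W there is headroom, so 966MHz steps up one bin to 979MHz.
print(next_clock(966, 79.0, 240.0))
```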

On a side note, it’s worth pointing out that these are well in excess of NVIDIA’s official boost clock for Titan. With Titan boost bins being based almost entirely on temperature, the average boost speed for Titan is going to be more dependent on environment (intake) temperatures than GTX 680 was, so our numbers are almost certainly a bit higher than what one would see in a hotter environment.

Starting as always with a look at power, there’s nothing particularly out of the ordinary here. AMD and NVIDIA have become very good at managing idle power through power gating and other techniques, and as a result idle power has come down by leaps and bounds over the years. At this point we still typically see some correlation between die size and idle power, but that’s a few watts at best. So at 111W at the wall, Titan is up there with the best cards.

Moving on to our first load power measurement: as we’ve dropped Metro 2033 from our benchmark suite, we’ve replaced it with Battlefield 3 as our game of choice for measuring peak gaming power consumption. BF3 is a demanding game, but it presents a rather typical power profile, which makes it one of the best representatives of the games in our benchmark suite.

In any case, as we can see Titan’s power consumption comes in below all of our multi-GPU configurations, but higher than any other single-GPU card. Titan’s 250W TDP is 55W higher than GTX 680’s 195W TDP, and with a 73W difference at the wall this isn’t too far off. A bit more surprising is that it’s drawing nearly 50W more than our 7970GE at the wall, given that we know the 7970GE usually gets close to its 250W TDP. At the same time, since this is a live game benchmark, there are more factors than just the GPU in play. Generally speaking, the higher a card’s performance here, the harder the rest of the system has to work to keep said card fed, which further increases power consumption at the wall.
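As a quick sanity check on that 73W figure, a back-of-the-envelope calculation shows how a 55W TDP difference grows at the wall once PSU losses and extra system load are factored in. The ~87% PSU efficiency used here is an assumption for illustration, not a measured value:

```python
# Rough check: how a 55W GPU TDP delta becomes a ~73W delta at the wall.
# The PSU efficiency figure is an assumption, not a measured value.
psu_efficiency = 0.87            # assumed ~87% efficient at this load

gpu_delta_dc = 250 - 195         # Titan TDP minus GTX 680 TDP (watts, DC side)
gpu_delta_wall = gpu_delta_dc / psu_efficiency
print(f"GPU-only delta at the wall: {gpu_delta_wall:.0f}W")   # ~63W

measured_delta_wall = 73         # measured wall power difference
system_overhead = measured_delta_wall - gpu_delta_wall
print(f"Attributable to extra CPU/system load: {system_overhead:.0f}W")  # ~10W
```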

Moving to FurMark, our results keep the same order, but the gap between the GTX 680 and Titan widens, while the gap between Titan and the 7970GE narrows. Titan and the 7970GE shouldn’t be too far apart in most situations due to their similar TDPs (even if NVIDIA and AMD TDPs aren’t calculated in quite the same way), so in a pure GPU power consumption scenario this is what we would expect to see.

Titan for its part is the traditional big NVIDIA GPU, and while NVIDIA does what they can to keep it in check, at the end of the day it’s still going to be among the more power hungry cards in our collection. Power consumption itself isn’t generally a problem with these high-end cards, so long as the system has the means to cool the card and doesn’t generate much noise in doing so.

Moving on to temperatures, for a single card idle temperatures should be under 40C for anything with at least a decent cooler. Titan for its part is among the coolest at 30C; its large heatsink combined with its relatively low idle power consumption makes it easy to cool here.

Because Titan’s boost mechanisms are now temperature based, Titan’s temperatures are going to naturally gravitate towards its default temperature target of 80C as the card raises and lowers clockspeeds to maximize performance while keeping temperatures at or under that level. As a result just about any heavy load is going to see Titan within a couple of degrees of 80C, which makes for some very predictable results.

Looking at our other cards, while the various NVIDIA cards are still close in performance, the 7970GE ends up being quite a bit cooler due to its open air cooler. This is typical of what we see with good open air coolers, though with NVIDIA’s temperature-based boost system I’m left wondering if perhaps those days are numbered. So long as 80C is a safe temperature, there’s little reason not to gravitate towards it with a system like NVIDIA’s, regardless of the cooler used.

Load GPU Temperature - FurMark

With FurMark we see everything pull closer together, as Titan holds fast at 80C while most of the other cards, especially the Radeons, rise in temperature. At this point Titan is clearly cooler than a GTX 680 SLI, 2C warmer than a single GTX 680, and still a good 10C warmer than our 7970GE.

Idle Noise Levels

Just as with the GTX 690, one of the things NVIDIA focused on was construction choices and materials to reduce noise generated. So long as you can keep noise down, then for the most part power consumption and temperatures don’t matter.

Simply looking at idle shows that NVIDIA is capable of delivering on their claims. At 37.8dB, Titan is the quietest actively cooled high-end card we’ve measured yet, besting even the luxury GTX 690 and the also well-constructed GTX 680. Though really, with the loudest setup being all of 40.5dB, none of these setups is anywhere near loud at idle.

It’s with load noise that we finally see the full payoff of Titan’s build quality. At 51dB it’s only marginally quieter than the GTX 680, but as we recall from our earlier power data, Titan draws nearly 70W more than GTX 680 at the wall. In other words, despite drawing significantly more power than GTX 680, Titan is still as quiet as or quieter than that card. This, coupled with Titan’s already high performance, is Titan’s true power in NVIDIA’s eyes: it’s not just fast, but despite its speed and despite its TDP it’s as quiet as any other blower-based card out there, allowing NVIDIA to get away with things such as Tiki and tri-SLI systems with reasonable noise levels.

Much like what we saw with temperatures under FurMark, noise under FurMark has our single-GPU cards bunching up. Titan goes up just enough to tie GTX 680 in our pathological scenario; meanwhile our multi-GPU cards shoot up well past Titan, while the 7970GE jumps to just shy of Titan. This is a worst case scenario, but it’s a good example of how GPU Boost 2.0’s temperature functionality means that Titan quite literally keeps its cool, and thereby keeps its noise in check.

Of course we would be remiss not to point out that in all these scenarios the open air cooled 7970GE is still quieter, and in our gaming scenario by quite a bit, actually. Not that Titan is loud, but it doesn’t compare to the 7970GE. Ultimately we get to the age-old debate between blowers and open air coolers; open air coolers are generally quieter, but blowers allow for more flexibility with products, and are more forgiving of cases with poor airflow.

Ultimately Titan uses a blower so that NVIDIA can do concept PCs like Tiki, something an open air cooler would never be suitable for. For DIY builders the benefits may not be as pronounced, but this is also why NVIDIA is focusing so heavily on boutique systems where the space difference really matters. Realistically speaking, AMD’s best blower-capable card is the vanilla 7970, a less power hungry but also much less powerful card.

Comments (337)

  • chizow - Thursday, February 21, 2013 - link

    You must not have followed the development of GPUs, and particularly flagship GPUs very closely in the last decade or so.

    G80, the first "Compute GPGPU" as Nvidia put it, was first and foremost a graphics part and a kickass one at that. Each flagship GPU after (GT200, GT200b, GF100, GF110) has continued in this vein...driven by the desktop graphics market first, Tesla/compute market second. Hell, the Tesla business did not even exist until GT200. Jensen Huang, Nvidia's CEO, even got on stage likening his GPUs to superheroes with day jobs as graphics cards while transforming into supercomputers at night.

    Now Nvidia flips the script, holds back the flagship GPU from the gaming market that *MADE IT POSSIBLE* and wants to charge you $1K because it's got "SuperComputer Guts"??? That's bait and switch, stab in the back, whatever you want to call it. So yes, if you were actually in this market before, Nvidia has screwed you over to the tune of $1K for something that used to cost $500-$650 max.
  • CeriseCogburn - Saturday, February 23, 2013 - link

    You only spend at max $360 for a video card as you stated, so this doesn't affect you and you haven't been screwed.

    Grow up crybaby. A company may charge what it desires, and since you're never buying, who cares how many times you scream they screwed everyone?
    NO ONE CARES, not even you, since you never even pony up $500, as you yourself stated in this long, continuous crybaby whine you made here, and have been making, since the 680 was released, or rather, since Charlie fried your brain with his propaganda.

    Go get your 98 cent a gallon gasoline while you're at it, you fool.
  • chizow - Saturday, February 23, 2013 - link

    Uh no, I've spent over $1K in a single GPU purchasing transaction, have you? I didn't think so.

    I'm just unwilling to spend *$2K* for what cost $1K in the past for less than the expected increase in performance. I spent $700 this round instead of the usual $1K because that's all I was willing to pay for a mid-range ASIC in GK104, and while it was still a significant upgrade over my last set of $1K worth of graphics cards, I wasn't going to plunk down $1K for a set of mid-range GK104 GTX 680s.

    It's obvious you have never bought in this range of GPUs in the past, otherwise you wouldn't be posting such retarded replies about what is clearly usurious pricing by Nvidia.

    Now go away, idiot.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Wrong again, as usual.
    So what it boils down to is you're a cheapskate, still disgruntled, still believe in Charlie D's lie, and are angry you won't have the current top card at a price you demand.
    I saw your whole griping list in the other thread too, but none of what you purchase or don't purchase makes a single bit of difference when it comes to your insane tinfoil hat lies that you have used for your entire argument.

    Once again, pretending you aren't aware of production capacity leaves you right where your brainless rant started a long time ago.

    You cover your tracks whining about ATI's initial price, which wasn't out of line either, and ignore nVidia's immediate crushing of it when the 680 came out, as you still complained about the performance increase there. You're a crybaby, that's it.

    That's what you have done now for months on end, whined and whined and whined, and got caught over and over in exaggerations and lies, demanding a perfectly increasing price/perf line slanting upwards, for years on end, lying about its past, which I caught you on in the earlier reviews.

    Well dummy, that's not how performance/price increases work in any area of computer parts, anyway.
    Glad you're just another freaking parrot, as the reviewers have trained you fools to automaton levels.
  • Pontius - Thursday, February 21, 2013 - link

    My only interest at the moment is OpenCL compute performance. Sad to see it's not working at the moment, but once they get the kinks worked out, I would really love to see some benchmarks.

    Also, as any GPGPU programmer knows, the number one bottleneck for GPU computing is randomly accessing memory. If you are working only within the on-chip local memory, then yes, you get blazingly fast speeds on a GPU. However, the second you do something as simple as a += on a global memory location, your performance grinds to a screeching halt. I would really like to see the performance of these cards on random memory heavy OpenCL benchmarks. Thanks for the review!
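For readers wondering what that bottleneck looks like in practice, here is a minimal PyOpenCL sketch contrasting the naive approach (every work-item doing an atomic += on a single global memory location) with a local-memory reduction that touches global memory only once per work-group. Kernel names and sizes are illustrative, not taken from the comment above:

```python
# Minimal PyOpenCL sketch: naive global-memory atomic += vs. a
# local-memory reduction. Kernel names and sizes are illustrative.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

src = """
// Every work-item hammers one global counter: serializes on the atomic.
__kernel void sum_global(__global const int *in, __global int *out) {
    atomic_add(out, in[get_global_id(0)]);
}

// Reduce within fast on-chip local memory first, then one atomic per group.
__kernel void sum_local(__global const int *in, __global int *out,
                        __local int *scratch) {
    int lid = get_local_id(0);
    scratch[lid] = in[get_global_id(0)];
    barrier(CLK_LOCAL_MEM_FENCE);
    for (int stride = get_local_size(0) / 2; stride > 0; stride /= 2) {
        if (lid < stride) scratch[lid] += scratch[lid + stride];
        barrier(CLK_LOCAL_MEM_FENCE);
    }
    if (lid == 0) atomic_add(out, scratch[0]);
}
"""
prg = cl.Program(ctx, src).build()

n, group = 1 << 20, 256
data = np.ones(n, dtype=np.int32)
in_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=data)
out = np.zeros(1, dtype=np.int32)
out_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=out)

prg.sum_local(queue, (n,), (group,), in_buf, out_buf,
              cl.LocalMemory(4 * group))
cl.enqueue_copy(queue, out, out_buf)
print(out[0])  # 1048576
```

The local-memory version avoids serializing every work-item on a single global address, which is exactly the difference the comment describes.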
  • codedivine - Thursday, February 21, 2013 - link

    We may do this in the future if I get some time off from univ work. Stay tuned :)
  • Pontius - Thursday, February 21, 2013 - link

    Thanks codedivine, I'll keep an eye out.
  • Bat123Man - Thursday, February 21, 2013 - link

    The Titan is nothing more than a proof-of-concept; "Look what we can do! Whohoo! Souped up to the max!" Nvidia is not intending this card to be for everyone. They know it will be picked up by a few well-moneyed enthusiasts, but it is really just a science project so that when people think about "the fastest GPU on the market", they think Nvidia.

    How often do you guys buy the best of the best as soon as it is out the door anyway? $1000, $2000, it makes no difference, most of us wouldn't buy it even at 500 bucks. This is all about bragging rights, pure and simple.
  • Oxford Guy - Thursday, February 21, 2013 - link

    Not exactly. The chip isn't fully enabled.
