Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

The GTX Titan X represents a very interesting intersection for NVIDIA, crossing Maxwell’s unparalleled power efficiency with the original GTX Titan’s flagship-level performance goals and similarly high power allowance. The end result gives us a chance to see how well the Maxwell architecture holds up when pushed to its limit, in the form of a 601mm² GPU with a 250W TDP.

GeForce GTX Titan X Voltages
GTX Titan X Boost Voltage    GTX 980 Boost Voltage    GTX Titan X Idle Voltage
1.162v                       1.225v                   0.849v

Starting off with voltages, based on our samples we find that NVIDIA has been rather conservative in their voltage allowance, presumably to keep power consumption down. With the highest stock boost bin hitting a voltage of just 1.162v, the GTX Titan X operates notably lower on the voltage curve than the GTX 980. This goes hand-in-hand with the GTX Titan X’s stock clockspeeds, which are around 100MHz lower than the GTX 980’s.
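
To illustrate what “lower on the voltage curve” means in practice, here’s a minimal sketch of a boost-bin voltage/frequency table. Aside from the two measured top-bin points from the table above, every bin/voltage pair here is hypothetical; NVIDIA does not publish its per-bin tables.

```python
# Illustrative boost-bin voltage/frequency curve. Only the top-bin points
# are measured values from this review; the intermediate pairs are
# hypothetical placeholders, not NVIDIA's actual tables.

TITAN_X_CURVE = {  # boost clock (MHz) -> core voltage (V)
    1088: 1.025,   # hypothetical
    1126: 1.062,   # hypothetical
    1164: 1.100,   # hypothetical
    1202: 1.143,   # hypothetical
    1215: 1.162,   # measured: highest stock boost bin on our sample
}

GTX_980_TOP_BIN = (1252, 1.225)  # measured, for comparison

def voltage_for_clock(curve, clock_mhz):
    """Return the voltage of the lowest bin that covers clock_mhz."""
    for bin_clock in sorted(curve):
        if bin_clock >= clock_mhz:
            return curve[bin_clock]
    return curve[max(curve)]  # clamp at the top bin

print(voltage_for_clock(TITAN_X_CURVE, 1150))  # -> 1.1 (hypothetical)
```

Capping the top bin at 1.162v rather than 1.225v is what keeps the much larger GM200 inside its power budget despite having 50% more functional units than GM204.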

GeForce GTX Titan X Average Clockspeeds
Game                    GTX Titan X    GTX 980
Max Boost Clock         1215MHz        1252MHz
Battlefield 4           1088MHz        1227MHz
Crysis 3                1113MHz        1177MHz
Shadow of Mordor        1126MHz        1164MHz
Civilization: BE        1088MHz        1215MHz
Dragon Age              1189MHz        1215MHz
Talos Principle         1126MHz        1215MHz
Far Cry 4               1101MHz        1164MHz
Total War: Attila       1088MHz        1177MHz
GRID Autosport          1151MHz        1190MHz

Speaking of clockspeeds, a look at our average clockspeeds for the GTX Titan X and GTX 980 showcases just why the 50% larger GM200 GPU leads to an average performance advantage of only 35% for the GTX Titan X. While both cards’ max boost bins are over 1.2GHz, the GTX Titan X has to back off far more often to stay within its power and thermal limits. The final clockspeed difference between the two cards depends on the game in question, but we’re looking at a real-world clockspeed deficit of 50-100MHz for the GTX Titan X.
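
As a quick sanity check on that claim, here’s a back-of-the-envelope sketch using the per-game averages from the table above and the two cards’ published CUDA core counts (3072 for GM200, 2048 for GM204). It’s a crude throughput model, not a performance simulator:

```python
# Rough throughput model: performance ~ CUDA cores * average clockspeed.
# Clock values are the per-game averages from the table above.

titan_x_clocks = [1088, 1113, 1126, 1088, 1189, 1126, 1101, 1088, 1151]
gtx_980_clocks = [1227, 1177, 1164, 1215, 1215, 1215, 1164, 1177, 1190]

titan_x_avg = sum(titan_x_clocks) / len(titan_x_clocks)  # ~1119MHz
gtx_980_avg = sum(gtx_980_clocks) / len(gtx_980_clocks)  # ~1194MHz

ratio = (3072 * titan_x_avg) / (2048 * gtx_980_avg)
print(f"theoretical advantage: {ratio:.2f}x")  # -> ~1.41x
```

The model predicts a ~41% edge once the clockspeed deficit is factored in; the measured ~35% average sits a bit below that, as real-world performance never scales perfectly with shader count.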

Idle Power Consumption

Starting off with idle power consumption, the GTX Titan X comes out strong as expected. Even at 8 billion transistors, NVIDIA is able to keep power consumption at idle very low, with all of our recent single-GPU NVIDIA cards coming in at 73-74W at the wall.

Load Power Consumption - Crysis 3

Load Power Consumption - FurMark

Meanwhile load power consumption for GTX Titan X is more or less exactly what we’d expect. With NVIDIA having nailed down their throttling mechanisms for Kepler and Maxwell, the GTX Titan X has a load power profile almost identical to the GTX 780 Ti, the closest equivalent GK110 card. Under Crysis 3 this manifests itself as a 20W increase in power consumption at the wall – generally attributable to the greater CPU load from GTX Titan X’s better GPU performance – while under FurMark the two cards are within 2W of each other.

Compared to the GTX 980 on the other hand, this is of course a sizable increase in power consumption. With a TDP difference on paper of 85W, the difference at the wall is an almost perfect match. GTX Titan X still offers Maxwell’s overall energy efficiency, delivering greatly superior performance for the power consumption, but this is a 250W card and it shows. Meanwhile the GTX Titan X’s power consumption also ends up being very close to the unrestricted R9 290X Uber, which in light of the Titan’s 44% 4K performance advantage further drives home the point about NVIDIA’s power efficiency lead at this time.
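
We measure at the wall, but readers who want a rough equivalent at home can read the board’s own power telemetry through the driver. A minimal sketch, assuming nvidia-smi is on the PATH and the card supports power readings (Maxwell boards do):

```python
# Log GPU-reported board power once per second and print the average.
# This reads driver telemetry, not wall power, so it excludes the CPU,
# PSU losses, and the rest of the system.
import subprocess
import time

def sample_power_watts():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])  # first GPU only

samples = []
for _ in range(60):  # one sample per second for a minute
    samples.append(sample_power_watts())
    time.sleep(1)

print(f"average board power: {sum(samples) / len(samples):.1f}W")
```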

Idle GPU Temperature

With the same Titan cooler and same idle power consumption, it should come as no surprise that the GTX Titan X offers the same idle temperatures as its GK110 predecessors: a relatively cool 32C.

Load GPU Temperature - Crysis 3

Load GPU Temperature - FurMark

Moving on to load temperatures, the GTX Titan X has a stock temperature limit of 83C, just like the GTX 780 Ti. Consequently this is exactly where the card tops out under both FurMark and Crysis 3. Running at 83C does lead to temperature throttling in most cases, though as we’ve seen in our look at average clockspeeds, it’s generally not a big drop.
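
To illustrate why sitting at 83C only costs a bin or two, here’s a deliberately simplified control loop. This is not NVIDIA’s actual GPU Boost 2.0 algorithm, just a sketch of the bin-stepping behavior we observe, using the card’s published 1000MHz base clock and ~13MHz boost bins:

```python
# Simplified thermal throttling loop (NOT NVIDIA's actual boost algorithm):
# at the temperature limit the clock steps down one ~13MHz bin at a time,
# which is why the observed drops from the max boost bin are small.

TEMP_LIMIT_C = 83
BIN_MHZ = 13
MAX_BOOST_MHZ = 1215   # highest stock boost bin on our sample
BASE_MHZ = 1000        # GTX Titan X base clock

def next_clock(current_mhz, temp_c):
    """Step the clock one bin up or down based on a temperature reading."""
    if temp_c >= TEMP_LIMIT_C:
        return max(BASE_MHZ, current_mhz - BIN_MHZ)   # back off one bin
    return min(MAX_BOOST_MHZ, current_mhz + BIN_MHZ)  # recover one bin

clock = MAX_BOOST_MHZ
for temp in [80, 82, 83, 84, 83, 82, 81]:  # hypothetical sensor readings
    clock = next_clock(clock, temp)
print(clock)  # -> 1202, hovering just below the max boost bin
```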

Idle Noise Levels

Last but not least we have our noise results. With the Titan cooler backing it, the GTX Titan X has no problem keeping quiet at idle. At 37.0dB(A) it's technically the quietest card among our entire collection of high-end cards, and from a practical perspective it is close to silent.

Load Noise Levels - Crysis 3

Load Noise Levels - FurMark

Much like its power profile, the GTX Titan X’s noise profile almost perfectly mirrors the GTX 780 Ti’s. With the card hitting 51.3dB(A) under Crysis 3 and 52.4dB(A) under FurMark, it is respectively only 0.4dB and 0.1dB off from the GTX 780 Ti. From a practical perspective, this means the GTX Titan X isn’t quite the hushed card the GTX 980 was – nor with a 250W TDP would we expect it to be – but for its chart-topping gaming performance it delivers some very impressive acoustics. The Titan cooler continues to serve NVIDIA well, allowing them to dissipate 250W in a blower without making a lot of noise in the process.
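
For context on what those deltas mean acoustically, here’s a quick conversion of dB(A) differences into sound-pressure ratios, using the standard 20·log10 relationship for sound pressure level:

```python
# Convert dB(A) deltas from the results above into sound-pressure ratios.

def pressure_ratio(delta_db):
    """Sound-pressure ratio corresponding to a dB difference."""
    return 10 ** (delta_db / 20)

print(f"{pressure_ratio(52.4 - 51.3):.2f}x")  # FurMark vs. Crysis 3: ~1.13x
print(f"{pressure_ratio(0.4):.2f}x")          # vs. GTX 780 Ti: ~1.05x
print(f"{pressure_ratio(52.4 - 37.0):.1f}x")  # FurMark load vs. idle: ~5.9x
```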

Overall then, from a power/temperature/noise perspective the GTX Titan X is every bit as impressive as the original GTX Titan and its GTX 780 Ti sibling. Thanks to the Maxwell architecture and the Titan cooler, NVIDIA has been able to deliver a 50% increase in gaming performance over the GTX 780 Ti without an increase in power consumption or noise, once again fielding a flagship video card that tops the performance charts without unnecessary sacrifices.

Comments

  • Braincruser - Wednesday, March 18, 2015

    The Titan was teased 10 days ago...
  • Tunnah - Wednesday, March 18, 2015

    It feels like nVidia are just taking the pee out of us now. I was semi-miffed at the 970 controversy; I know for business reasons etc. it doesn't make sense to truly trounce the competition (and your own products) when you can instead hold something back and keep it tighter, and have something to release in case they surprise you.

    And I was semi-miffed when I heard it would be more like a 33% improvement over the current cream of the crop, instead of the closer to 50% increase the Titan was over the 680, because they have to worry about the 390x, and leave room for a Titan X White Y Grey SuperHappyTime version.

    But to still charge $1000 even though they are keeping the DP performance low, this is just too far. The whole reasoning for the high price tag was you were getting a card that was not only a beast of a gaming card, but it would hold its own as a workstation card too, as long as you didn't need the full Quadro service. Now it is nothing more than a high end card, a halo product...that isn't actually that good!

    When it comes down to it, you're paying 250% the cost for 33% more performance, and that is disgusting. Don't even bring RAM into it; it's not only super cheap and in no way a justification for the cost, but in fact is useless, because NO GAMER WILL EVER NEED THAT MUCH, IT WAS THE FLIM FLAMMING WORKSTATION CROWD WHO NEEDED THAT FLIM FLAMMING AMOUNT OF FLOOMING RAM YOU FLUPPERS!

    This feels like a big juicy gob of spit in our faces. I know most people bought these purely for the gaming option and didn't use the DP capability, but that's not the point - it was WORTH the $999 price tag. This simply is not, not in the slightest. $650, $750 tops because it's the best, after all..but $999 ? Not in this lifetime.

    I've not had an AMD card since way back in the days of ATi, I am well and truly part of the nVidia crowd, even when they had a better card I'd wait for the green team reply. But this is actually insulting to consumers.

    I was never gonna buy one of these, I was waiting on the 980Ti for the 384-bit bus and the bumps that come along with it... but now I'm not only hoping the 390x is better than people say because then nVidia will have to make it extra good... I'm hoping it's better than they say so I can actually buy it.

    For shame nVidia, what you're doing with this card is unforgivable
  • Michael Bay - Wednesday, March 18, 2015

    So you're blaming a for-profit company for being for-profit.
  • maximumGPU - Wednesday, March 18, 2015

    No, he's not. He's blaming a for-profit company for abusing its position at the expense of its customers.
    Maxwell is great, and I've got 2 of them in my rig. But the Titan X is a bit of a joke. The only justification the previous Titan had was that it could be viewed as a cheap professional card. Now that's gone, but you're still paying the same price.
    Unfortunately nvidia will set the highest price they can get away with, and $999 doesn't seem to deter some hardcore fans no matter how much poor value it represents.
    I certainly hope the sales don't meet their expectations.
  • TheinsanegamerN - Wednesday, March 18, 2015

    I would argue that the VRAM may be needed later on. 4GB is already tight with SoM, and future games will only push that up.
    People said that 6GB was too much for the OG Titan, but SoM can eat that up at 4K, and other games are not far behind. Especially for SLI setups, that memory will come in handy.
    That's what really killed the 770. The GPU was fine for me, but 2GB was way too little VRAM.
  • Tal Greywolf - Wednesday, March 18, 2015

    Not being a gamer, I would like to see a review in which many of these top-of-the-line gaming cards are tested in a different sort of environment. For example, I'd love to see how the cards compare when handling graphics software packages such as Photoshop, Premiere Pro, Lightwave, Cinema 4D, SolidWorks and others. If these cards are really pushing the envelope, then they should compare against the Quadro and FirePro lines.
  • Ranger101 - Wednesday, March 18, 2015

    I think it's safe to say that Nvidia makes technically superior cards compared to AMD, at least as far as the last 2 generations of GPUs are concerned. While the AMD cards consume more power and produce more heat, this issue is not a determining factor when I upgrade, unlike price and choice.

    I will not buy this card, despite the fact that I find it to be a very desirable and technically impressive card, because I don't like being price-raped and because I want AMD to be competitive.

    I will buy the 390X because I prefer a "consumer wins" situation where there are at least 2 companies producing competitive products, and let's be clear, AMD GPUs are competitive, even when you factor in what is ultimately a small increase in heat and noise, not to mention lower prices.

    It was a pleasant surprise to see the R9 295X2 at one point described as "very impressive", yet I think it would have been fair if Ryan had drawn more attention to AMD "wins", even though they are not particularly significant, such as the most stressful Shadow of Mordor benchmarks.

    Most people favour a particular brand, but surely even the most ardent supporters wouldn't want to see a situation where there is ONLY Intel and ONLY Nvidia. We are reaping the rewards of this scenario already in terms of successive generations of Intel CPUs offering performance improvements that are mediocre at best.

    I can only hope that the 390X gets a positive review at Anandtech.
  • Mystichobo - Wednesday, March 18, 2015

    Looking forward to a 390 with the same performance for $400-500. I certainly got my money's worth out of the R9 290 when it was released. I don't understand how anyone could advocate the $1000 single-card price bracket created for "top tier".
  • Geforce man - Wednesday, March 18, 2015

    What still frustrates me is the lack of a modern aftermarket R9 290/X in these comparisons.
  • Crunchy005 - Wednesday, March 18, 2015

    I actually really like how the new Titan looks; it shows what can be done. The problem with this card at this price point is that it defeats what the Titan really should be. Without the double precision performance, I feel this card becomes irrelevant (an overpriced gaming card). The original Titan was an entry-level compute card outside of the Quadro lineup. I know there are drawbacks to multi-GPU setups, but I would go for 2 980s or 970s for the same or less money than the Titan X.

    I also found these benchmarks very interesting because you can see how much each game can be biased toward a certain card. AMD's 290X, an old card, beat out the 980 in some cases, mostly at 4K resolutions, and lost in others at the same resolution. It just goes to show that you have to look at individual game performance as well as overall performance when buying a card.

    Can't wait for the 390X from AMD; that should be very interesting.
