Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance, these are among the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore it.

The GTX Titan X represents a very interesting intersection for NVIDIA, crossing Maxwell’s unparalleled power efficiency with the GTX Titan’s flagship-level performance goals and similarly high power allowance. The end result gives us a chance to see how well Maxwell holds up when pushed to the limit, in the form of a 601mm² GPU with a 250W TDP.

GeForce GTX Titan X Voltages
GTX Titan X Boost Voltage    GTX 980 Boost Voltage    GTX Titan X Idle Voltage
1.162v                       1.225v                   0.849v

Starting off with voltages, based on our samples we find that NVIDIA has been rather conservative in their voltage allowance, presumably to keep power consumption down. With the highest stock boost bin hitting a voltage of just 1.162v, GTX Titan X operates notably lower on the voltage curve than the GTX 980. This goes hand-in-hand with GTX Titan X’s stock clockspeeds, which are around 100MHz lower than the GTX 980’s.
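
As a first-order illustration of why this matters, CMOS dynamic power scales roughly as P ≈ C·V²·f, so running lower on the voltage curve pays off quadratically. The quick Python sketch below uses the voltages and peak boost clocks from our tables; switched capacitance differs between GM200 and GM204 and leakage is ignored, so only the ratio between the two operating points is meaningful.

    # First-order approximation: dynamic power ~ C * V^2 * f. C and
    # leakage are not modeled, so only the ratio of the V^2 * f terms
    # between the two operating points means anything here.
    def dynamic_power_ratio(v_a, f_a, v_b, f_b):
        """Relative V^2 * f term between two voltage/clock points."""
        return (v_a ** 2 * f_a) / (v_b ** 2 * f_b)

    # GTX Titan X peak boost bin vs. GTX 980 peak boost bin
    ratio = dynamic_power_ratio(1.162, 1215, 1.225, 1252)
    print(f"Per-unit switching power vs. GTX 980: {ratio:.2f}x")  # ~0.87x

By this rough math each GM200 unit switches at about 87% of the power of its GM204 counterpart, which goes a long way towards explaining how NVIDIA fits 50% more hardware into a 250W envelope.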

GeForce GTX Titan X Average Clockspeeds
Game                 GTX Titan X    GTX 980
Max Boost Clock      1215MHz        1252MHz
Battlefield 4        1088MHz        1227MHz
Crysis 3             1113MHz        1177MHz
Mordor               1126MHz        1164MHz
Civilization: BE     1088MHz        1215MHz
Dragon Age           1189MHz        1215MHz
Talos Principle      1126MHz        1215MHz
Far Cry 4            1101MHz        1164MHz
Total War: Attila    1088MHz        1177MHz
GRID Autosport       1151MHz        1190MHz

Speaking of clockspeeds, taking a look at our average clockspeeds for GTX Titan X and GTX 980 showcases just why the 50% larger GM200 GPU only leads to an average performance advantage of 35% for the GTX Titan X. While the max boost bins are both over 1.2GHz, the GTX Titan X has to back off far more often to stay within its power and thermal limits. The final clockspeed difference between the two cards depends on the game in question, but we’re looking at a real-world clockspeed deficit of 50-100MHz for GTX Titan X.
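
The arithmetic here is easy to check. The sketch below pulls the per-game clocks from the table above and scales the 50% unit-count advantage by the real-world clock ratio; it’s a naive shader-throughput estimate, nothing more.

    # Back-of-the-envelope check on the ~35% average performance lead
    titan_x = [1088, 1113, 1126, 1088, 1189, 1126, 1101, 1088, 1151]
    gtx_980 = [1227, 1177, 1164, 1215, 1215, 1215, 1164, 1177, 1190]

    avg_tx = sum(titan_x) / len(titan_x)    # ~1119MHz
    avg_980 = sum(gtx_980) / len(gtx_980)   # ~1194MHz
    print(f"Average clock deficit: {avg_980 - avg_tx:.0f}MHz")  # 75MHz

    # 1.5x the functional units, at the lower average clockspeed
    est = 1.5 * (avg_tx / avg_980)
    print(f"Estimated advantage: {est - 1:.0%}")  # ~41% vs. ~35% measured

The remaining gap between the ~41% estimate and the ~35% we measure comes down to the parts of the rendering pipeline that don’t scale cleanly with a wider GPU, along with the occasional CPU bottleneck.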

Idle Power Consumption

Starting off with idle power consumption, the GTX Titan X comes out strong as expected. Even at 8 billion transistors, NVIDIA is able to keep power consumption at idle very low, with all of our recent single-GPU NVIDIA cards coming in at 73-74W at the wall.

Load Power Consumption - Crysis 3

Load Power Consumption - FurMark

Meanwhile load power consumption for GTX Titan X is more or less exactly what we’d expect. With NVIDIA having nailed down their throttling mechanisms for Kepler and Maxwell, the GTX Titan X has a load power profile almost identical to the GTX 780 Ti, the closest equivalent GK110 card. Under Crysis 3 this manifests itself as a 20W increase in power consumption at the wall – generally attributable to the greater CPU load from GTX Titan X’s better GPU performance – while under FurMark the two cards are within 2W of each other.

Compared to the GTX 980 on the other hand, this is of course a sizable increase in power consumption. With a TDP difference on paper of 85W, the difference at the wall is an almost perfect match. GTX Titan X still offers Maxwell’s overall energy efficiency, delivering greatly superior performance for the power consumption, but this is a 250W card and it shows. Meanwhile the GTX Titan X’s power consumption also ends up being very close to the unrestricted R9 290X Uber, which in light of the Titan’s 44% 4K performance advantage further drives home the point about NVIDIA’s power efficiency lead at this time.
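
For reference, the arithmetic behind these comparisons runs as follows. It’s a rough sketch, since TDP and actual wall power are not identical; the TDPs are the official figures and the performance deltas are the averages cited in this review.

    # TDPs are the official figures; performance deltas are this
    # review's averages (~35% over GTX 980, 44% over 290X Uber at 4K)
    tdp_titan_x, tdp_980 = 250, 165
    print(f"TDP gap on paper: {tdp_titan_x - tdp_980}W")    # 85W

    # Vs. GTX 980: more performance, but proportionally more power too
    eff_vs_980 = 1.35 / (tdp_titan_x / tdp_980)
    print(f"Perf-per-watt vs. GTX 980: {eff_vs_980:.2f}x")  # ~0.89x

    # Vs. 290X Uber: near-equal wall power, so the 44% performance
    # lead is effectively a 44% perf-per-watt lead
    print("Perf-per-watt vs. 290X Uber: ~1.44x")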

Idle GPU Temperature

With the same Titan cooler and same idle power consumption, it should come as no surprise that the GTX Titan X offers the same idle temperatures as its GK110 predecessors: a relatively cool 32C.

Load GPU Temperature - Crysis 3

Load GPU Temperature - FurMark

Moving on to load temperatures, the GTX Titan X has a stock temperature limit of 83C, just like the GTX 780 Ti. Consequently this is exactly where we see the card top out under both FurMark and Crysis 3. 83C does lead to temperature-based throttling in most cases, though as we’ve seen in our look at average clockspeeds it’s generally not a big drop.
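
For those curious how a temperature-target mechanism of this sort behaves, the sketch below is a minimal illustration and not NVIDIA’s actual GPU Boost algorithm; the 13MHz bin size, polling cadence, and clock floor are all assumptions on our part.

    # Hypothetical temperature-target controller: once the 83C limit
    # is reached, clocks step down one boost bin per tick until the
    # card holds at the limit; headroom is recovered the same way.
    TEMP_LIMIT_C = 83   # stock temperature target on GTX Titan X
    BIN_MHZ = 13        # assumed boost bin granularity

    def adjust_clock(clock_mhz, temp_c, base_mhz=1000, max_boost_mhz=1215):
        """Step the boost clock down (or back up) one bin per tick."""
        if temp_c >= TEMP_LIMIT_C:
            return max(clock_mhz - BIN_MHZ, base_mhz)
        return min(clock_mhz + BIN_MHZ, max_boost_mhz)

    clock = 1215
    for temp in (75, 80, 83, 84, 83, 82, 79):  # hypothetical readings
        clock = adjust_clock(clock, temp)
        print(f"{temp}C -> {clock}MHz")

Behavior like this is consistent with the modest average clockspeed deficits we saw earlier: rather than hard drops, the card settles a few bins below its top boost state.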

Idle Noise Levels

Last but not least we have our noise results. With the Titan cooler backing it, the GTX Titan X has no problem keeping quiet at idle. At 37.0dB(A) it's technically the quietest card among our entire collection of high-end cards, and from a practical perspective it is close to silent.

Load Noise Levels - Crysis 3

Load Noise Levels - FurMark

Much like GTX Titan X’s power profile, GTX Titan X’s noise profile almost perfectly mirrors the GTX 780 Ti. With the card hitting 51.3dB(A) under Crysis 3 and 52.4dB(A) under FurMark, it is respectively only 0.4dB and 0.1dB off from the GTX 780 Ti. From a practical perspective what this means is that the GTX Titan X isn’t quite the hushed card that was the GTX 980 – nor with a 250W TDP would we expect it to be – but for its chart-topping gaming performance it delivers some very impressive acoustics. The Titan cooler continues to serve NVIDIA well, allowing them to dissipate 250W in a blower without making a lot of noise in the process.
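
To put those deltas in perspective, a difference of d decibels corresponds to a 10^(d/10) ratio in radiated sound power. A quick sketch using the measurements above:

    # dB deltas -> sound power ratios via 10^(d/10)
    def power_ratio(delta_db):
        return 10 ** (delta_db / 10)

    print(f"Crysis 3: +0.4dB vs. 780 Ti = {power_ratio(0.4):.2f}x")  # ~1.10x
    print(f"FurMark:  +0.1dB vs. 780 Ti = {power_ratio(0.1):.2f}x")  # ~1.02x
    # +10dB is 10x the sound power, commonly taken as a perceived
    # doubling of loudness; these cards are nowhere near that far apart
    print(f"+10dB reference: {power_ratio(10):.0f}x")

In other words, the Titan X and GTX 780 Ti are acoustically interchangeable; the measured differences are well below the roughly 1dB threshold of audibility.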

Overall then, from a power/temp/noise perspective the GTX Titan X is every bit as impressive as the original GTX Titan and its GTX 780 Ti sibling. Thanks to the Maxwell architecture and the Titan cooler, NVIDIA has been able to deliver a 50% increase in gaming performance over the GTX 780 Ti without an increase in power consumption or noise, once again giving us a flagship video card that can top the performance charts without unnecessarily sacrificing power consumption or acoustics.

Comments

  • Denithor - Wednesday, March 18, 2015 - link

    Correct, but then they should have priced it around $800, not $1k. The reason they could demand $1k for the original Titan was due to the FP64 compute functionality on board.

    This is exactly what they did when they made the GTX 560 Ti: chopped out the compute features to maximize gaming power at a low cost. That one was such a great card because of its price positioning, not just performance.
  • chizow - Monday, March 23, 2015 - link

    @Denithor, I disagree; the reason they could charge $1K for the original Titan was that there was still considerable doubt there would ever be a traditionally priced GeForce GTX card based on GK110. The compute aspect was just add-on BS to fluff up the price.

    Since then of course, they released not 1, but 2 traditional GTX cards (780 and Ti) that were much better received by the gaming market in terms of both price and, in the case of the Ti, performance. Most notable was the fact that the original Titan's price on FS/FT and eBay markets quickly dropped below that of the 780 Ti. If the allure of the Titan was indeed its DP compute, it would have held its price, but the fact that Titan owners were dumping their cards for less than what it cost to buy a 780 Ti clearly showed the demand and price justification for a Titan for compute alone simply wasn't there. Also important to note: Titan's drivers were still GeForce, so even if it did have better DP performance, there were still a lot of CUDA-related driver limitations preventing it from reaching Quadro/Tesla levels of performance.

    Simply put, Nvidia couldn't pull that trick again under the guise of compute this time around, and people like me who weren't willing to pay a penny for compute over gaming weren't willing to justify that price tag for features we had no use for. Titan X on the other hand is 100% dedicated to gamers, not a single transistor budgeted for something I don't care about, and no false pretenses to go with it.
  • Samus - Thursday, March 19, 2015 - link

    The identity crisis this card has is that for all the effort, it's still slower than two 980s in SLI, and when overclocked to try to catch up to them, it ends up using MORE POWER than two 980s in SLI.

    So for the price (being identical), wouldn't you just pick up two 980s, which offer more performance, less power consumption, and FP64 (even if you don't need it, it'll help the resale value in the future)?
  • LukaP - Thursday, March 19, 2015 - link

    The 980 has the same 1/32 DP performance as the Titan X. And the Titan never was a sensible card. No one sensible buys it over an x80 of that generation (which I assume will be the 1080 or whatever they call it, based on GM200 with less RAM, and maybe some disabled ROPs).

    The Titan is a true flagship: making no sense economically, but increasing your penis size by miles.
  • chizow - Monday, March 23, 2015 - link

    I considered going this route but ultimately decided against it despite having used many SLI setups in the past. There's a number of things to like about the 980 but ultimately I felt I didn't want to be hamstrung by the 4GB in the future. There are already a number of games that push right up to that 4GB VRAM usage at 1440p and in the end I was more interested in bringing up min FPS than absolutely maxing out top-end FPS with 980 SLI.

    Power I would say is about the same; the 980 is super efficient, but once overclocked, with two of them I am sure the 980 setup would use as much as, if not more than, the single Titan X.
  • naxeem - Saturday, March 21, 2015 - link

    You're forgetting three things:

    1. NO game uses even close to 8GB, let alone 12

    2. $1000/1300€ puts it at exactly double the price of the same performance level you get with any other solution: 970 SLI beats it at $750, the 295X2 does the same, 2x290X as well...
    In Europe, the card is even 30% more expensive than in the US and than other cards, so even fewer people will buy it there.

    3. In summer, when AMD releases the 390X for $700 with even better performance, Nvidia will either have to drop the Titan X to the same price or suffer being smashed around in the market.

    Keep in mind HBM is seriously a performance kicker for the high-resolution, end-game gaming that the Titan X is intended for. No amount of RAM can counter RAM bandwidth, especially when you don't really need over 6-7GB for even the most demanding games out there.
  • ArmedandDangerous - Saturday, March 21, 2015 - link

    Or they could just say fuck it and keep the Titan at its exact price and release an x80 GM200 at a lower price with some features cut that will still compete with whatever AMD has to offer. This is the 3rd Titan; how can you not know this by now?
  • naxeem - Tuesday, March 24, 2015 - link

    Well, yes. But without the compute performance of previous Titans, who would buy a $1000 Titan X, and why, when you could get the exact same performance in some 980 Ti or the like?
    Those who need 12GB for rendering may as well buy Quadros with more VRAM... When you need 12, you need more anyway... For gaming, 12GB means jack sht.
  • Thetrav55 - Friday, March 20, 2015 - link

    Well, it's only the fastest card in the WORLD, look at it that way, the fastest card in the world, ONLY $1000. I know, I know, $1000 does not justify the performance, but it's the fastest card in the WORLD!!!
  • agentbb007 - Wednesday, June 24, 2015 - link

    LOL, had to laugh @ farealstarfareal's comment that the 390X would likely blow the doors off the Titan X. The 390X is nowhere near the Titan X; it's closer to a 980. The almighty R9 Fury X reviews posted this morning and it's not even beating the 980 Ti.
