Meet The GeForce GTX Titan X

Now that we’ve had a chance to look at the GM200 GPU at the heart of GTX Titan X, let’s take a look at the card itself.

From a design standpoint NVIDIA put together a very strong card with the original GTX Titan, combining a revised, magnesium-free version of their all-metal shroud with a high-performance blower and vapor chamber assembly. The end result was a high-performance 250W card that was quieter than some open-air cards, much quieter than most other blowers, and shiny to look at to boot. This design was carried forward for the reference GTX 780 series, its stylings copied for the GTX Titan Z, and reused with a cheaper cooling apparatus for the reference GTX 980.

For GTX Titan X, NVIDIA has opted to leave well enough alone, making virtually no changes to the shroud or cooling apparatus. And truth be told it’s hard to fault NVIDIA here, as this design remains the gold (well, aluminum) standard for a blower. Looks aside, after years of blowers that rattled, were too loud, or didn’t cool discrete components very well, NVIDIA is sitting on a very solid design that I’m not sure anyone could top (but I’d love to see them try).

In any case, our favorite metal shroud is back once again. Composed of a cast aluminum housing held together with a combination of rivets and screws, it’s as physically solid a shroud as we’ve ever seen. Meanwhile, having already done a partial black dye job for GTX Titan Black and GTX 780 Ti – using black lettering and a black-tinted polycarbonate window – NVIDIA has more or less completed the job by making the metal shroud itself almost completely black. All that remains unpainted are the aluminum accents and the Titan lettering (Titan, not Titan X, curiously enough). The card measures 10.5” long overall, which at this point is NVIDIA’s standard size for high-end GTX cards.

Drilling down we have the card’s primary cooling apparatus, composed of a nickel-tipped, wedge-shaped heatsink and a ringed radial fan. The heatsink itself is attached to the GPU via a copper vapor chamber, something that has been exclusive to GTX 780/Titan cards and provides the best possible heat transfer between the GPU and heatsink. Meanwhile the rest of the card is covered with a black aluminum baseplate, providing basic heatsink functionality for the VRMs and other components while also protecting them.

Finally, at the bottom of the stack we have the card itself, complete with the GM200 GPU, VRAM chips, and various discrete components. Unlike the shroud and cooler, GTX Titan X’s PCB isn’t a complete carry-over from the GK110 cards, but it is nonetheless very similar, with only a handful of changes made. This means we’re looking at the GPU and VRAM chips towards the front of the card, while the VRMs and other discrete components occupy the back. New to GTX Titan X specifically, NVIDIA has done some minor reworking to improve airflow to the discrete components and reduce temperatures, along with employing molded inductors.

As with GK110, NVIDIA still employs a 6+2 phase VRM design, with 6 phases for the GPU and another 2 for the VRAM. This means that GTX Titan X has a bit of power delivery headroom – NVIDIA allows the power limit to be increased by 10% to 275W – but hardcore overclockers will find that there isn’t an extreme amount of additional headroom to play with. Based on our sample the actual shipping voltage at the max boost clock is fairly low at 1.162v, so in non-TDP constrained scenarios there is some additional headroom through overvolting, up to 1.237v in the case of our sample.
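To put those voltage figures in context, the headroom works out as follows. This is a minimal illustrative sketch; the 1.162v and 1.237v values are specific to our sample, so other cards will vary:

```python
# Overvolting headroom on our GTX Titan X sample, per the figures above.
# These are sample-specific readings, not guaranteed limits for every card.

v_stock = 1.162   # shipping voltage at the max boost clock
v_max = 1.237     # maximum voltage exposed through overvolting

delta_mv = (v_max - v_stock) * 1000
pct = (v_max / v_stock - 1) * 100
print(f"Overvolting headroom: {delta_mv:.0f}mV ({pct:.1f}%)")
# -> Overvolting headroom: 75mV (6.5%)
```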

In terms of overall design, the need to house 24 VRAM chips to reach 12GB of VRAM means that the GTX Titan X has chips on the back of the card as well as the front. For this reason, unlike the GTX 980, NVIDIA is once again skipping the backplate, leaving the back side of the card bare just as with the previous GTX Titan cards.
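As a quick sanity check on that memory configuration, the chip count works out as follows (a small sketch, assuming the standard 4Gb/0.5GB GDDR5 packages of the era and GM200’s 384-bit memory bus):

```python
# Reaching 12GB with 4Gb (0.5GB) GDDR5 packages requires 24 chips,
# which is why they end up on both sides of the PCB.

chip_density_gbit = 4                      # per-package density, in gigabits
chips = 24
total_gb = chips * chip_density_gbit / 8   # 8 bits per byte
print(f"{chips} x {chip_density_gbit}Gb = {total_gb:.0f}GB")   # 12GB

# With a 384-bit bus and 32-bit GDDR5 devices, that works out to two chips
# per memory channel (clamshell mode), one on each side of the board.
channels = 384 // 32
print(f"Chips per channel: {chips // channels}")               # 2
```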

Moving on, in accordance with GTX Titan X’s 250W TDP and the reuse of the GTX Titan cooler, power delivery for the GTX Titan X is identical to that of its predecessors. This means a 6-pin and an 8-pin power connector at the top of the card, providing up to 225W, with the final 75W coming from the PCIe slot. Interestingly, the board does have another 8-pin PCIe connector position facing the rear of the card, but it goes unused on this specific GM200 card.
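For reference, the power budget that connector configuration implies looks like this. The per-source limits are the PCIe specification values, not measured draw, so treat this as illustrative:

```python
# Board power available from the connector configuration described above,
# using PCIe specification limits for each source.

pcie_slot_w = 75     # PCIe x16 slot
six_pin_w = 75       # 6-pin auxiliary connector
eight_pin_w = 150    # 8-pin auxiliary connector

available = pcie_slot_w + six_pin_w + eight_pin_w
print(f"Spec power available: {available}W")                     # 300W

tdp_w, max_limit_w = 250, 275
print(f"Margin at stock TDP: {available - tdp_w}W")              # 50W
print(f"Margin at max power limit: {available - max_limit_w}W")  # 25W
```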

Meanwhile display I/O follows the same configuration we saw on GTX 980: 1x DL-DVI-I, 3x DisplayPort 1.2, and 1x HDMI 2.0, with a limit of 4 simultaneous displays. In the case of GTX Titan X the DVI port is somewhat antiquated at this point – the card is generally overpowered for the relatively low maximum resolutions of DL-DVI – but on the other hand the HDMI 2.0 port is actually going to be of some value here, since it means GTX Titan X can drive a 4K TV. Meanwhile if you have money to spare and need to drive more than a single 4K display, GTX Titan X also features a pair of SLI connectors for even more power.

In fact 4K will be a recurring theme for GTX Titan X, as this is one of the primary markets/use cases NVIDIA will be going after with the card. With GTX 980 generally good up to 2560x1440, the even more powerful GTX Titan X is best suited for 4K and VR, the two areas where GTX 980 came up short. In the case of 4K even a single GTX Titan X is going to struggle at times – we’re not at 60fps at 4K with a single GPU quite yet – but GTX Titan X should be good enough for framerates between 30fps and 60fps at high quality settings. To fill the rest of the gap NVIDIA is also going to be promoting 4Kp60 G-Sync monitors alongside the GTX Titan X, as the 30-60fps range is where G-Sync excels. And while G-Sync can’t make up for lost frames, it can take some of the bite out of sub-60fps framerates, making for a smoother/cleaner experience than it would otherwise be.
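To see why that 30-60fps window is where variable refresh pays off, it helps to look at the frame times involved; a quick illustrative sketch:

```python
# Frame times across the 30-60fps band where NVIDIA is positioning
# G-Sync at 4K.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent displaying each frame at a given framerate."""
    return 1000.0 / fps

for fps in (30, 45, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms/frame")
# 30 fps -> 33.3 ms/frame
# 45 fps -> 22.2 ms/frame
# 60 fps -> 16.7 ms/frame

# On a fixed 60Hz panel, a 45fps average means alternating 16.7ms/33.3ms
# frame presentation (judder); G-Sync instead refreshes as frames arrive.
```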

Longer term, NVIDIA also sees the GTX Titan X as their most potent card for VR headsets, and they made sure that GTX Titan X was on the show floor at GDC to drive a few of the major VR demos. Certainly VR will take just about whatever rendering power you can throw at it, if only in the name of reducing rendering latency. But overall we’re still very early in the game, especially with commercial VR headsets still being in development.

Finally, speaking of the long term, I wanted to hit upon the subject of the GTX Titan X’s 12GB of VRAM. With most other Maxwell cards already using 4Gb VRAM chips, the inclusion of 12GB of VRAM in NVIDIA’s flagship card was practically a given, especially since it doubles the 6GB of VRAM the original GTX Titan came with. At the same time however I’m curious to see just how long it takes for games to grow into this space. The original GTX Titan was fortunate enough to come out with 6GB right before the current-generation consoles launched, and with them their 8GB memory configurations, leading to a rather sudden jump in VRAM requirements that the GTX Titan was well positioned to handle. Much like 6GB in 2013, 12GB is overkill in 2015, but unlike the original GTX Titan I suspect 12GB will remain overkill for a much longer period of time, especially without a significant technology bump like the consoles to drive up VRAM requirements.

Comments

  • Denithor - Wednesday, March 18, 2015

    Correct, but then they should have priced it around $800, not $1k. The reason they could demand $1k for the original Titan was due to the FP64 compute functionality on board.

    This is exactly what they did when they made the GTX 560 Ti: chopped out the compute features to maximize gaming power at a low cost. The reason that one was such a great card was its price positioning, not just performance.
  • chizow - Monday, March 23, 2015

    @Denithor, I disagree; the reason they could charge $1K for the original Titan was that there was still considerable doubt there would ever be a traditionally priced GeForce GTX card based on GK110. The compute aspect was just add-on BS to fluff up the price.

    Since then of course, they released not 1 but 2 traditional GTX cards (780 and Ti) that were much better received by the gaming market in terms of both price and, in the case of the Ti, performance. Most notable was the fact that the original Titan's price on FS/FT and eBay markets quickly dropped below that of the 780 Ti. If the allure of the Titan was indeed its DP compute, it would have held its price, but the fact that Titan owners were dumping their cards for less than what it cost to buy a 780 Ti clearly showed that the demand and price justification for a Titan for compute alone simply wasn't there. Also important to note: Titan's drivers were still GeForce, so even if it did have better DP performance, there were still a lot of CUDA-related driver limitations preventing it from reaching Quadro/Tesla levels of performance.

    Simply put, Nvidia couldn't pull that trick again under the guise of compute this time around, and people like me who weren't willing to pay a penny for compute over gaming weren't willing to justify that price tag for features we had no use for. Titan X, on the other hand, is 100% dedicated to gamers: not a single transistor budgeted for something I don't care about, and no false pretenses to go with it.
  • Samus - Thursday, March 19, 2015

    The identity crisis this card has is that for all the effort, it's still slower than two 980s in SLI, and when overclocked to try to catch up to them, it ends up using MORE POWER than two 980s in SLI.

    So for the price (being identical), wouldn't you just pick up two 980s, which offer more performance, less power consumption, and FP64 (even if you don't need it, it'll help the resale value in the future)?
  • LukaP - Thursday, March 19, 2015

    The 980 has the same 1/32 DP performance as the Titan X. And the Titan never was a sensible card. No one sensible buys it over the x80 of that generation (which I assume will be the 1080 or whatever they call it, based on GM200 with less RAM, and maybe some disabled ROPs).

    The Titan is a true flagship: it makes no sense economically, but it increases your penis size by miles.
  • chizow - Monday, March 23, 2015

    I considered going this route but ultimately decided against it, despite having used many SLI setups in the past. There are a number of things to like about the 980, but ultimately I didn't want to be hamstrung by the 4GB in the future. There are already a number of games that push right up to that 4GB VRAM usage at 1440p, and in the end I was more interested in bringing up min FPS than absolutely maxing out top-end FPS with 980 SLI.

    Power I would say is about the same: the 980 is super efficient, but once overclocked, with 2 of them I am sure the 980 setup would use as much power as, if not more than, the single Titan X.
  • naxeem - Saturday, March 21, 2015

    You're forgetting three things:

    1. NO game uses even close to 8GB, let alone 12

    2. $1000/1300€ puts it at exactly double the price of exactly the same performance level you get with any other solution: 970 SLI kicks it at $750, the 295X2 does the same, 2x 290X also...
    In Europe, the card is even 30% more expensive than in the US and relative to other cards, so even fewer people will buy it there.

    3. In summer, when AMD releases the 390X for $700 with even better performance, Nvidia will either have to drop the Titan X to the same price or suffer being smashed around in the market.

    Keep in mind HBM is a serious performance kicker for the high-resolution, end-game gaming that the Titan X is intended for. No amount of RAM can counter RAM bandwidth, especially when you don't really need over 6-7GB for even the most demanding games out there.
  • ArmedandDangerous - Saturday, March 21, 2015

    Or they could just say fuck it and keep the Titan at its exact price, and release an x80 GM200 card at a lower price with some features cut that will still compete with whatever AMD has to offer. This is the 3rd Titan; how can you not know this by now.
  • naxeem - Tuesday, March 24, 2015

    Well, yes. But without the compute performance of the previous Titans, who would buy a $1000 Titan X, and why, when the exact same performance will be available in some 980 Ti or the like?
    Those who need 12GB for rendering may as well buy Quadros with more VRAM... when you need 12, you need more anyway... For gaming, 12GB means jack sht.
  • Thetrav55 - Friday, March 20, 2015

    Well, it's only the fastest card in the WORLD, look at it that way: the fastest card in the world, ONLY $1000. I know, I know, $1000 does not justify the performance, but it's the fastest card in the WORLD!!!
  • agentbb007 - Wednesday, June 24, 2015

    LOL, had to laugh @ farealstarfareal's comment that the 390X would likely blow the doors off the Titan X. The 390X is nowhere near the Titan X; it's closer to a 980. The almighty R9 Fury X reviews posted this morning and it's not even beating the 980 Ti.
