Meet The GeForce GTX Titan X

Now that we’ve had a chance to look at the GM200 GPU at the heart of GTX Titan X, let’s take a look at the card itself.

From a design standpoint, NVIDIA put together a very strong card with the original GTX Titan, combining a revised, magnesium-less version of their all-metal shroud with a high performance blower and vapor chamber assembly. The end result was a high performance 250W card that was quieter than some open-air cards, much quieter than most other blowers, and shiny to look at to boot. This design was carried forward for the reference GTX 780 series, its stylings copied for the GTX Titan Z, and used with a cheaper cooling apparatus for the reference GTX 980.

For GTX Titan X, NVIDIA has opted to leave well enough alone, having made virtually no changes to the shroud or cooling apparatus. And truth be told it’s hard to fault NVIDIA here, as this design remains the gold (well, aluminum) standard for a blower. Looks aside, after years of blowers that rattled, were too loud, or didn’t cool discrete components very well, NVIDIA is sitting on a very solid design that I’m not sure anyone could top (but I’d love to see them try).

In any case, our favorite metal shroud is back once again. Composed of a cast aluminum housing and held together using a combination of rivets and screws, it’s as physically solid a shroud as we’ve ever seen. Meanwhile, having already done a partial black dye job for GTX Titan Black and GTX 780 Ti – using black lettering and a black-tinted polycarbonate window – NVIDIA has more or less completed the dye job by making the metal shroud itself almost completely black. What remains unpainted are the aluminum accents and the Titan lettering (Titan, not Titan X, curiously enough). The card measures 10.5” long overall, which at this point is NVIDIA’s standard size for high-end GTX cards.

Drilling down we have the card’s primary cooling apparatus, composed of a nickel-tipped wedge-shaped heatsink and ringed radial fan. The heatsink itself is attached to the GPU via a copper vapor chamber, something that has been exclusive to GTX 780/Titan cards and provides the best possible heat transfer between the GPU and heatsink. Meanwhile the rest of the card is covered with a black aluminum baseplate, providing basic heatsink functionality for the VRMs and other components while also protecting them.

Finally at the bottom of the stack we have the card itself, complete with the GM200 GPU, VRAM chips, and various discrete components. Unlike the shroud and cooler, the PCB isn’t a complete carry-over from GK110 boards, but it is nonetheless very similar, with only a handful of changes made. This means we’re looking at the GPU and VRAM chips towards the front of the card, while the VRMs and other discrete components occupy the back. Specific to GTX Titan X, NVIDIA has done some minor reworking to improve airflow to the discrete components and reduce temperatures, along with employing molded inductors.

As with GK110, NVIDIA still employs a 6+2 phase VRM design, with 6 phases for the GPU and another 2 for the VRAM. This means that GTX Titan X has a bit of power delivery headroom – NVIDIA allows the power limit to be increased by 10% to 275W – but hardcore overclockers will find that there isn’t an extreme amount of additional headroom to play with. Based on our sample the actual shipping voltage at the max boost clock is fairly low at 1.162v, so in non-TDP constrained scenarios there is some additional headroom through overvolting, up to 1.237v in the case of our sample.
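To put those figures together, here is a quick back-of-the-envelope sketch; the wattages and voltages are the ones quoted above, and the script itself is purely illustrative:

```python
# GTX Titan X power/voltage headroom, using the figures quoted above.
BASE_TDP_W = 250          # stock power limit
MAX_POWER_OFFSET = 0.10   # NVIDIA exposes a +10% power target

max_power_w = BASE_TDP_W * (1 + MAX_POWER_OFFSET)
print(f"Maximum power target: {max_power_w:.0f}W")        # 275W

STOCK_VOLTAGE_V = 1.162   # max boost voltage observed on our sample
MAX_OVERVOLT_V  = 1.237   # maximum overvoltage exposed on our sample

delta_v = MAX_OVERVOLT_V - STOCK_VOLTAGE_V
print(f"Overvolting headroom: {delta_v:.3f}V (~{delta_v / STOCK_VOLTAGE_V * 100:.1f}%)")
```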

In terms of overall design, the need to house 24 VRAM chips to reach 12GB means that the GTX Titan X has chips on the back of the card as well as the front. Unlike the GTX 980, then, NVIDIA is once again skipping the backplate, leaving the back side of the card bare just as with the previous GTX Titan cards.
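For those keeping score, the arithmetic behind 24 chips and 12GB is straightforward; a quick sketch, assuming the 4Gb-per-chip density mentioned later in this article and GM200’s 384-bit bus split into 32-bit channels:

```python
# How 24 GDDR5 chips add up to 12GB on a 384-bit bus.
CHIP_COUNT       = 24
CHIP_DENSITY_GB  = 4 / 8   # 4Gb (gigabit) chips = 0.5GB each
BUS_WIDTH_BITS   = 384
CHANNEL_BITS     = 32      # each GDDR5 chip presents a 32-bit interface

total_vram_gb     = CHIP_COUNT * CHIP_DENSITY_GB
channels          = BUS_WIDTH_BITS // CHANNEL_BITS
chips_per_channel = CHIP_COUNT // channels

print(f"Total VRAM: {total_vram_gb:.0f}GB")                     # 12GB
print(f"{channels} channels x {chips_per_channel} chips each")  # 12 x 2 (one chip front, one back)
```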

Moving on, in accordance with GTX Titan X’s 250W TDP and the reuse of the GTX Titan cooler, power delivery for the GTX Titan X is identical to its predecessors. This means a 6-pin and an 8-pin power connector at the top of the card, providing up to 225W, with the final 75W coming from the PCIe slot. Interestingly, the board does have a position for another 8-pin PCIe connector facing the rear of the card, but it goes unused on this particular GM200 card.
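Summing the in-spec power sources against the card’s limits makes the margins clear; a quick sketch using the standard PCIe per-connector ratings:

```python
# GTX Titan X power budget, using standard PCIe power delivery ratings.
PCIE_SLOT_W = 75    # PCIe x16 slot
SIX_PIN_W   = 75    # 6-pin PCIe power connector
EIGHT_PIN_W = 150   # 8-pin PCIe power connector

available_w  = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
stock_tdp_w  = 250
max_limit_w  = 275  # with the +10% power target applied

print(f"In-spec power delivery: {available_w}W")               # 300W
print(f"Margin at stock TDP:    {available_w - stock_tdp_w}W") # 50W
print(f"Margin at max limit:    {available_w - max_limit_w}W") # 25W
```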

Meanwhile display I/O follows the same configuration we saw on GTX 980: 1x DL-DVI-I, 3x DisplayPort 1.2, and 1x HDMI 2.0, with a limit of 4 displays driven at once. In the case of GTX Titan X the DVI port is somewhat antiquated at this point – the card is generally overpowered for the relatively low maximum resolutions of DL-DVI – but on the other hand the HDMI 2.0 port is going to be of real value here, since it means GTX Titan X can drive a 4K TV at 60Hz. Meanwhile, if you have money to spare and need to drive more than a single 4K display, GTX Titan X also features a pair of SLI connectors for even more power.

In fact 4K will be a repeating theme for GTX Titan X, as this is one of the primary markets/use cases NVIDIA will be going after with the card. With GTX 980 generally good up to 2560x1440, the even more powerful GTX Titan X is best suited for 4K and VR, the two areas where GTX 980 came up short. In the case of 4K even a single GTX Titan X is going to struggle at times – we’re not at 60fps at 4K with a single GPU quite yet – but GTX Titan X should be good enough for framerates between 30fps and 60fps at high quality settings. To fill the rest of the gap NVIDIA is also going to be promoting 4Kp60 G-Sync monitors alongside the GTX Titan X, as the 30-60fps range is where G-Sync excels. And while G-Sync can’t make up for lost frames, it can take some of the bite out of sub-60fps framerates, making for a smoother/cleaner experience than it would otherwise be.
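To illustrate why that 30-60fps window is where variable refresh matters most, here is a heavily simplified sketch (not NVIDIA’s actual implementation, and it ignores triple buffering): with v-sync on a fixed 60Hz panel, any frame that misses a scanout waits for the next one, while G-Sync lets the panel refresh as soon as the frame is ready.

```python
import math

REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ   # ~16.67ms between fixed refreshes

def presented_fps_vsync(render_ms: float) -> float:
    """Effective frame rate when frames must land on a 60Hz scanout boundary."""
    present_ms = math.ceil(render_ms / SCANOUT_MS) * SCANOUT_MS
    return 1000 / present_ms

def presented_fps_gsync(render_ms: float) -> float:
    """Effective frame rate when the display instead waits for each frame."""
    return 1000 / render_ms

for target_fps in (40, 45, 50, 55):
    render_ms = 1000 / target_fps
    print(f"{target_fps}fps render -> fixed 60Hz v-sync: {presented_fps_vsync(render_ms):.0f}fps, "
          f"G-Sync: {presented_fps_gsync(render_ms):.0f}fps")
```

In this simplified model every frame rendered between 30fps and 60fps gets quantized down to 30fps on a fixed 60Hz display, whereas a variable refresh display presents it at its native rate, which is exactly the range NVIDIA is targeting with 4Kp60 G-Sync monitors.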

Longer term, NVIDIA also sees the GTX Titan X as their most potent card for VR headsets, and they made sure that GTX Titan X was on the show floor at GDC to drive a few of the major VR demos. Certainly VR will take just about whatever rendering power you can throw at it, if only in the name of reducing rendering latency. But overall we’re still very early in the game, especially with commercial VR headsets still being in development.

Finally, speaking of the long term, I wanted to hit upon the subject of the GTX Titan X’s 12GB of VRAM. With most other Maxwell cards already using 4Gb VRAM chips, the inclusion of 12GB of VRAM in NVIDIA’s flagship card was practically a given, especially since it doubles the 6GB of VRAM the original GTX Titan came with. At the same time however I’m curious to see just how long it takes for games to grow into this space. The original GTX Titan was fortunate enough to come out with 6GB right before the current-generation consoles launched, and with them their 8GB memory configurations, leading to a rather sudden jump in VRAM requirements that the GTX Titan was well positioned to handle. Much like 6GB in 2013, 12GB is overkill in 2015, but unlike the original GTX Titan I suspect 12GB will remain overkill for a much longer period of time, especially without a significant technology bump like the consoles to drive up VRAM requirements.

Comments

  • Braincruser - Wednesday, March 18, 2015

    The titan was teased 10 days ago...
  • Tunnah - Wednesday, March 18, 2015

    It feels nVidia are just taking the pee out of us now. I was semi-miffed at the 970 controversy, I know for business reasons etc. it doesn't make sense to truly trounce the competition (and your own products) when you can instead hold something back and keep it tighter, and have something to release in case they surprise you.

    And I was semi-miffed when I heard it would be more like a 33% improvement over the current cream of the crop, instead of the closer to 50% increase the Titan was over the 680, because they have to worry about the 390x, and leave room for a Titan X White Y Grey SuperHappyTime version.

    But to still charge $1000 even though they are keeping the DP performance low, this is just too far. The whole reasoning for the high price tag was you were getting a card that was not only a beast of a gaming card, but it would hold its own as a workstation card too, as long as you didn't need the full Quadro service. Now it is nothing more than a high end card, a halo product...that isn't actually that good!

    When it comes down to it, you're paying 250% the cost for 33% more performance, and that is disgusting. Don't even bring RAM into it, it's not only super cheap and in no way a justification for the cost, but in fact is useless, because NO GAMER WILL EVER NEED THAT MUCH, IT WAS THE FLIM FLAMMING WORKSTATION CROWD WHO NEEDED THAT FLIM FLAMMING AMOUNT OF FLOOMING RAM YOU FLUPPERS!

    This feels like a big juicy gob of spit in our faces. I know most people bought these purely for the gaming option and didn't use the DP capability, but that's not the point - it was WORTH the $999 price tag. This simply is not, not in the slightest. $650, $750 tops because it's the best, after all..but $999 ? Not in this lifetime.

    I've not had an AMD card since way back in the days of ATi, I am well and truly part of the nVidia crowd, even when they had a better card I'd wait for the green team reply. But this is actually insulting to consumers.

    I was never gonna buy one of these, I was waiting on the 980Ti for the 384bit bus and the bumps that come along with it...but now I'm not only hoping the 390x is better than people say because then nVidia will have to make it extra good..I'm hoping it's better than they say so I can actually buy it.

    For shame nVidia, what you're doing with this card is unforgivable
  • Michael Bay - Wednesday, March 18, 2015

    So you're blaming a for-profit company for being for-profit.
  • maximumGPU - Wednesday, March 18, 2015

    No, he's not. He's blaming a for-profit company for abusing its position at the expense of its customers.
    Maxwell is great, and I've got 2 of them in my rig. But Titan X is a bit of a joke. The only justification the previous Titan had was that it could be viewed as a cheap professional card. Now that's gone but you're still paying the same price.
    Unfortunately nvidia will charge the highest price they can get away with, and $999 doesn't seem to deter some hardcore fans no matter how poor a value it represents.
    I certainly hope the sales don't meet their expectations.
  • TheinsanegamerN - Wednesday, March 18, 2015

    I would argue that the VRAM may be needed later on. 4GB is already tight with SoM, and future games will only push that up.
    People said that 6GB was too much for the OG Titan, but SoM can eat that up at 4K, and other games are not far behind. Especially for SLI setups, that memory will come in handy.
    That's what really killed the 770. The GPU was fine for me, but 2GB was way too little VRAM.
  • Tal Greywolf - Wednesday, March 18, 2015

    Not being a gamer, I would like to see a review in which many of these top-of-the-line gaming cards are tested against a different sort of environment. For example, I'd love to see how cards compare handling graphics software packages such as Photoshop, Premiere Pro, Lightwave, Cinema 4D, SolidWorks and others. If these cards are really pushing the envelope, then they should compare against the Quadro and FirePro lines.
  • Ranger101 - Wednesday, March 18, 2015

    I think it's safe to say that Nvidia make technically superior cards as compared to AMD, at least as far as the last 2 generations of GPUs are concerned. While the AMD cards consume more power and produce more heat, this issue is not a determining factor when I upgrade, unlike price and choice.

    I will not buy this card, despite the fact that I find it to be a very desirable and technically impressive card, because I don't like being price-gouged and because I want AMD to be competitive.

    I will buy the 390X because I prefer a "consumer wins" situation where there are at least 2 companies producing competitive products, and let's be clear, AMD GPUs are competitive, even when you factor in what is ultimately a small increase in heat and noise, not to mention lower prices.

    It was a pleasant surprise to see the R9 295X2 at one point described as "very impressive", yet I think it would have been fair if Ryan had drawn more attention to AMD "wins," even though they are not particularly significant, such as the most stressful Shadow of Mordor benchmarks.

    Most people favour a particular brand, but surely even the most ardent supporters wouldn't want to see a situation where there is ONLY Intel and ONLY Nvidia. We are reaping the rewards of this scenario already in terms of successive generations of Intel CPUs offering performance improvements that are mediocre at best.

    I can only hope that the 390X gets a positive review at Anandtech.
  • Mystichobo - Wednesday, March 18, 2015

    Looking forward to a 390 with the same performance for $400-500. I certainly got my money's worth out of the R9 290 when it was released. I don't understand how anyone could advocate this $1000 single-card price bracket created for "top tier".
  • Geforce man - Wednesday, March 18, 2015

    What still frustrates me is the continued lack of a modern aftermarket R9 290/290X in these comparisons.
  • Crunchy005 - Wednesday, March 18, 2015

    I actually really like how the new Titan looks; it shows what can be done. The problem with this card at this price point is that it defeats what the Titan really should be. Without the double precision performance this card becomes irrelevant, I feel (an overpriced gaming card). The original Titan was an entry level compute card outside of the Quadro lineup. I know there are drawbacks to multi-GPU setups but I would go for 2 980s or 970s for the same or less money than the Titan X.

    I also found these benchmarks very interesting because you can see how much each game can be biased toward a certain card. AMD's 290X, an old card, beat out the 980 in some cases, mostly at 4K, and lost in others at the same resolution. Just goes to show that you also have to look at individual game performance as well as overall performance when buying a card.

    Can't wait for the 390x from AMD that should be very interesting.
