Final Words

When NVIDIA introduced the original GTX Titan in 2013, they set a new bar for performance, quality, and price for a high-end video card. The GTX Titan ended up being a major success for the company, a success that the company is keen to repeat. And now, with their Maxwell architecture in hand, NVIDIA is in a position to do just that.

For as much of a legacy as the GTX Titan line carries at this point, it's clear that the GTX Titan X is as worthy a successor as NVIDIA could hope for. NVIDIA has honed the already solid GTX Titan design and coupled it with their largest Maxwell GPU, in the process putting together a card that once again sets a new bar for performance and quality. That said, from a design perspective the GTX Titan X is clearly evolutionary as opposed to the revolution that was the original GTX Titan, but it is nonetheless an impressive evolution.

Overall then, it should come as no surprise that from a gaming performance standpoint the GTX Titan X stands alone. Delivering an average performance increase over the GTX 980 of 33%, the GTX Titan X further builds on what was already a solid single-GPU performance lead for NVIDIA. Meanwhile, compared to its immediate predecessors such as the GTX 780 Ti and the original GTX Titan, the GTX Titan X represents a significant, though perhaps not-quite-generational, 50%-60% increase in performance. Perhaps most importantly, this performance improvement comes without any further increase in noise or power consumption compared to NVIDIA's previous-generation flagship.

Meanwhile, from a technical perspective, the GTX Titan X and the GM200 GPU represent an interesting shift in high-end GPU design goals for NVIDIA, one whose ramifications I'm not sure we fully understand yet. By building what's essentially a bigger version of GM204, heavy on graphics and light on FP64 compute, NVIDIA has been able to drive up gaming performance without spending precious die space on FP64 resources. At 601mm² GM200 is NVIDIA's largest GPU to date, but by producing their purest graphics GPU in quite some time, NVIDIA has been able to pack more graphics horsepower than ever before into a 28nm GPU. What remains to be seen then is whether this graphics/FP32-centric design is a one-off occurrence for 28nm, or the start of a permanent shift in NVIDIA GPU design.

But getting back to the video card at hand, there's little doubt of the GTX Titan X's qualifications. Already in possession of the single-GPU performance crown, NVIDIA has further secured it with the release of their latest GTX Titan card. In fact there's really only one point we can pick at with the GTX Titan X, and that of course is the price. At $999 it's priced the same as the original GTX Titan, so today's price tag comes as no surprise, but it's still a high price to pay for Big Maxwell. NVIDIA is not bashful about treating GTX Titan as a luxury card line, and for better and worse the GTX Titan X continues this tradition. The GTX Titan X, like the GTX Titan before it, is a card that is purposely removed from the price/performance curve.

Meanwhile, the competitive landscape is, we feel, solidly in NVIDIA's favor. We would be remiss not to mention multi-GPU alternatives such as the GTX 980 in SLI and AMD's excellent Radeon R9 295X2. But as we've mentioned when reviewing these setups before, multi-GPU is really only worth chasing once you've exhausted single-GPU performance. The R9 295X2 in turn is a big spoiler on price, but we continue to believe that a single powerful GPU is the better choice for consistent performance, at least if you can cover the cost of the GTX Titan X.

Finally, on a lighter note, with the launch of the GTX Titan X we wave good-bye to GTX Titan as an entry-level double precision compute card. NVIDIA's decision to drop high-performance FP64 compute has made the GTX Titan X a better graphics card and even a better FP32 compute card, but it means that the original GTX Titan's time as NVIDIA's first prosumer card was short-lived. I suspect that we haven't seen the end of NVIDIA's forays into entry-level FP64 compute cards like the original GTX Titan, but that next card will not be the GTX Titan X.

276 Comments

View All Comments

  • nos024 - Wednesday, March 18, 2015 - link

    Well, let's see. Even when it launches, will it be readily available and not highly priced like the 290X? If the 290X had been readily available when it launched, I would've bought one.
  • eanazag - Wednesday, March 18, 2015 - link

    Based on leaked slides referencing Battlefield 4 at 4K resolution, the 390X is 1.6x the 290X. In the context of this review's results, we could guess it comes up slightly short at 4K ultra and 10 fps faster than the Titan X at 4K medium. Far Cry 4 came in at 1.55x the 290X.

    290X non-uber 4K ultra - BF4 - 35.5 fps x 1.6 = 56.8. >> Titan 58.3
    290X non-uber 4K medium - BF4 - 65.9 fps x 1.6 = 105.44 >> Titan 94.8

    290X non-uber 4K ultra - FC4 - 31.2 fps x 1.55 = 48.36 >> Titan 42.1
    290X non-uber 4K medium - FC4 - 40.9 fps x 1.55 = 63.395 >> Titan 60.5
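    The arithmetic above can be reproduced with a short Python sketch (the `project_390x` helper is hypothetical; the multipliers come from the leaked slides, and the 290X and Titan X figures are from this review):

    ```python
    # Project R9 390X performance from the commenter's extrapolation:
    # projected 390X fps = 290X (non-uber) fps x the multiplier from the leaked slides.

    def project_390x(fps_290x: float, multiplier: float) -> float:
        """Scale a 290X baseline by a leaked per-game multiplier."""
        return fps_290x * multiplier

    # (game/setting, 290X fps, leaked multiplier, Titan X fps from this review)
    data = [
        ("BF4 4K ultra",  35.5, 1.60, 58.3),
        ("BF4 4K medium", 65.9, 1.60, 94.8),
        ("FC4 4K ultra",  31.2, 1.55, 42.1),
        ("FC4 4K medium", 40.9, 1.55, 60.5),
    ]

    for name, fps_290x, mult, fps_titan in data:
        proj = project_390x(fps_290x, mult)
        verdict = "ahead of" if proj > fps_titan else "behind"
        print(f"{name}: projected 390X {proj:.2f} fps, {verdict} Titan X at {fps_titan} fps")
    ```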

    These numbers don't tell the whole story of how AMD arrived at the figures, but they paint the picture of a GPU that goes toe-to-toe with the Titan X. The slides also talk about a water cooler edition. I'm suspecting the wattage will be in the same ballpark as the 290X, and likely higher.

    With the Titan X's full breadth of compute muscle, I am not sure what the 980 Ti will look like. I suspect Nvidia is holding that back based on whatever AMD releases, so they can unload a smackdown trump card. Rumored $700 for the 390X WCE with 8GB HBM (high bandwidth memory, 4096-bit width) and in Q2 (April-June). With the Titan X and 390X at the same price, given what I know at the moment, I would go with the Titan X.

    Stack your GPU $'s for July.
  • FlushedBubblyJock - Thursday, April 2, 2015 - link

    If the R9 390X doesn't come out at $499 months and months from now, it won't be worth it.
  • shing3232 - Tuesday, March 17, 2015 - link

    1/32 FP64? So, this is a big gaming core.
  • Railgun - Tuesday, March 17, 2015 - link

    Exactly why it's not a $999 card.
  • shing3232 - Tuesday, March 17, 2015 - link

    But it was priced at $999.
  • Railgun - Tuesday, March 17, 2015 - link

    What I mean is that it's not worth being a $999 card. Yes, it's priced at that, but its value doesn't support it.
  • Flunk - Tuesday, March 17, 2015 - link

    Plenty of dolts bought the first Titan as a gaming card so I'm sure someone will buy this. At least there's a bigger performance difference between the Titan X and GTX 980 than there was between the Titan and GTX 780.
  • Kevin G - Tuesday, March 17, 2015 - link

    Except the GTX 780 came after the Titan launched. Rather, it was the original Titan compared to the GTX 680, and here we see a similar gap between the Titan X and the GTX 980. It is also widely speculated that we'll see a cut-down GM200 to fit between the GTX 980 and the Titan X, so history looks like it will repeat itself.
  • chizow - Tuesday, March 17, 2015 - link

    @Railgun, I'd disagree, and I was very vocal against the original Titan for a number of reasons. Mainly because Nvidia used the 7970 launch as an opportunity to sell their 2nd-fastest chip as their flagship. Secondly, because they held back their flagship chip nearly a full year (GTX 680 launched Mar 2012, Titan Feb 2013) while claiming the whole time there was no bigger chip. Thirdly, they tried to justify the higher price point because it was a "compute" card, and lastly, because it was a cut-down chip and we knew it.

    Titan X isn't being sold under any of those pretenses, and now that the new pricing/SKU structure has settled in (2nd-fastest chip = new $500 flagship), there isn't any of that sticker shock anymore. It's the full chip, there are no complaints about them holding anything back, and 12GB of VRAM is a ridiculous amount to stick on a card, and that costs money. If EVGA can release an $800 Classified 980 and people see value in it, then certainly this Titan X has value as well.

    At least for me, it is a more appealing option now than getting a 2nd 980 for SLI: slightly lower performance, but lower heat, no SLI/scaling issues, and no VRAM framebuffer concerns for the foreseeable future. I game at 2560x1440 on an ROG Swift btw, so that is right in this card's wheelhouse.
