Final Words

When NVIDIA introduced the original GTX Titan in 2013, they set a new bar for performance, quality, and price for a high-end video card. The GTX Titan ended up being a major success, one the company is keen to repeat. And now with their Maxwell architecture in hand, NVIDIA is in a position to do just that.

For as brief a legacy as the GTX Titan line has at this point, it's clear that the GTX Titan X is as worthy a successor as NVIDIA could hope for. NVIDIA has honed the already solid GTX Titan design and coupled it with their largest Maxwell GPU, and in the process has put together a card that once again sets a new bar for performance and quality. That said, from a design perspective the GTX Titan X is clearly an evolution rather than the revolution that was the original GTX Titan, but it is an impressive evolution nonetheless.

Overall, then, it should come as no surprise that from a gaming performance standpoint the GTX Titan X stands alone. Delivering an average performance increase of 33% over the GTX 980, the GTX Titan X further builds on what was already a solid single-GPU performance lead for NVIDIA. Meanwhile, compared to its immediate predecessors, the GTX 780 Ti and the original GTX Titan, the GTX Titan X represents a significant, though perhaps not-quite-generational, 50-60% increase in performance. Perhaps most importantly, this improvement comes without any further increase in noise or power consumption relative to NVIDIA's previous-generation flagship.

Meanwhile, from a technical perspective, the GTX Titan X and its GM200 GPU represent an interesting shift in high-end GPU design goals for NVIDIA, one whose ramifications I'm not sure we fully understand yet. By building what's essentially a bigger version of GM204, heavy on graphics and light on FP64 compute, NVIDIA has been able to drive up gaming performance without spending die space on double precision hardware. At 601mm², GM200 is NVIDIA's largest GPU to date, but producing their purest graphics GPU in quite some time has allowed NVIDIA to pack more graphics horsepower than ever before into a 28nm GPU. What remains to be seen is whether this graphics/FP32-centric design is a one-off occurrence for 28nm, or the start of a permanent shift in NVIDIA GPU design.
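As an illustrative aside, the FP32/FP64 trade-off is easy to see from the compute side. The following CUDA micro-benchmark is a minimal sketch of our own construction, not NVIDIA code or anything used in this review; the kernel names and launch parameters are arbitrary assumptions. It times long chains of fused multiply-adds in single and double precision, and on a GM200-based card the FP64 number should land at roughly 1/32 of the FP32 number, versus the 1/3 rate of GK110.

```cuda
// Illustrative sketch only (not from the review): a crude micro-benchmark
// estimating a GPU's FP32:FP64 throughput ratio via dependent FMA chains.
#include <cstdio>
#include <cuda_runtime.h>

// Each thread performs a long dependent chain of fused multiply-adds.
__global__ void fma_fp32(float *out, int iters) {
    float c = 0.0f;
    for (int i = 0; i < iters; ++i)
        c = fmaf(1.000001f, c, 0.5f);                 // 1 FMA = 2 FLOPs
    out[blockIdx.x * blockDim.x + threadIdx.x] = c;   // keep the result live
}

__global__ void fma_fp64(double *out, int iters) {
    double c = 0.0;
    for (int i = 0; i < iters; ++i)
        c = fma(1.000001, c, 0.5);
    out[blockIdx.x * blockDim.x + threadIdx.x] = c;
}

int main() {
    // Enough threads to hide FMA latency and approach peak throughput.
    const int blocks = 2048, threads = 256, iters = 20000;
    const int n = blocks * threads;
    const double flops = 2.0 * (double)iters * n;     // total FLOPs per kernel

    float *f; double *d;
    cudaMalloc(&f, n * sizeof(float));
    cudaMalloc(&d, n * sizeof(double));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    float ms32 = 0.0f, ms64 = 0.0f;

    fma_fp32<<<blocks, threads>>>(f, iters);          // warm-up launch
    cudaEventRecord(start);
    fma_fp32<<<blocks, threads>>>(f, iters);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    cudaEventElapsedTime(&ms32, start, stop);

    fma_fp64<<<blocks, threads>>>(d, iters);          // warm-up launch
    cudaEventRecord(start);
    fma_fp64<<<blocks, threads>>>(d, iters);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    cudaEventElapsedTime(&ms64, start, stop);

    printf("FP32: %8.1f GFLOPS\n", flops / (ms32 * 1e6));
    printf("FP64: %8.1f GFLOPS\n", flops / (ms64 * 1e6));
    printf("FP32:FP64 time ratio: %.1fx\n", ms64 / ms32);

    cudaEventDestroy(start); cudaEventDestroy(stop);
    cudaFree(f); cudaFree(d);
    return 0;
}
```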

But getting back to the video card at hand, there's little doubt about the GTX Titan X's qualifications. Already in possession of the single-GPU performance crown, NVIDIA has further secured it with the release of their latest GTX Titan card. In fact there's really only one point we can pick at with the GTX Titan X, and that of course is the price. At $999 it's priced the same as the original GTX Titan, so the price tag comes as no surprise, but it's still a high price to pay for Big Maxwell. NVIDIA is not bashful about treating GTX Titan as a luxury card line, and for better or worse the GTX Titan X continues this tradition. Like the GTX Titan before it, it is a card that is purposely removed from the price/performance curve.

Meanwhile, we feel the competitive landscape is solidly in NVIDIA's favor. We would be remiss not to mention multi-GPU alternatives such as the GTX 980 in SLI and AMD's excellent Radeon R9 295X2. But as we've noted when reviewing these setups before, multi-GPU is really only worth chasing once you've exhausted single-GPU performance. The R9 295X2 in turn is a big spoiler on price, but we continue to believe that a single powerful GPU is the better choice for consistent performance, at least if you can cover the cost of the GTX Titan X.

Finally, on a lighter note, with the launch of the GTX Titan X we wave goodbye to the GTX Titan line as an entry-level double-precision compute option. Dropping high-performance FP64 compute has made the GTX Titan X a better graphics card and even a better FP32 compute card, but it means the original GTX Titan's time as NVIDIA's first prosumer card was short-lived. I suspect we haven't seen the end of NVIDIA's forays into entry-level FP64 compute cards like the original GTX Titan, but that next card will not be the GTX Titan X.

276 Comments

  • chizow - Wednesday, March 18, 2015 - link

    And custom-cooled, higher clocked cards should? It took months for AMD to bring those to market and many of them cost more than the original reference cards and are also overclocked.

    http://www.newegg.com/Product/ProductList.aspx?Sub...

    Like I said, AMD fanboys made this bed, time to lie in it.
  • Witchunter - Wednesday, March 18, 2015 - link

    I hope you do realize that calling out AMD fanboys in each and every one of your comments essentially paints you as an Nvidia fanboy in the eyes of other readers. I'm here to read some constructive comments, and all I see is you bitching about fanboys while being one yourself.
  • chizow - Wednesday, March 18, 2015 - link

    @Witchunter, the difference is, I'm not afraid to admit I'm a fan of the best, but I'm at least going to be consistent in my views and opinions. Whereas these AMD fanboys are crying foul over the same thing they threw a tantrum about a few years ago, which ultimately led to this policy to begin with. You don't find it ironic that what they were crying about 4 years ago is suddenly a problem when the shoe is on the other foot? Maybe that tells you something about yourself and where your own biases reside? :)
  • Crunchy005 - Wednesday, March 18, 2015 - link

    @chizow either way, you don't really offer constructive criticism, and you call people dishonest without proving them wrong in any way or offering facts. You are one of the biggest fanboys out there, and it kind of makes you lose credibility.
  • Crunchy005 - Wednesday, March 18, 2015 - link

    Ok, wanted to add to this: I do like some of the comments you make, but you are so fanboyish that I am unable to take much stock in what you say. AMD has its advantages and has outperformed Nvidia in many ways, just as Nvidia has outperformed AMD; they leapfrog each other. If you could offer more facts and stop just bashing AMD and praising the all-powerful Nvidia as better in every way, we might all like to hear what you have to say.
  • FlushedBubblyJock - Thursday, April 2, 2015 - link

    I know what the truth is so I greatly enjoy what he says.
    If you can't handle the truth, that should be your problem, not everyone else's, obviously.
  • chizow - Monday, March 23, 2015 - link

    Like I said, I'm not here to sugarcoat things or keep it constructive, I'm here to set the record straight and keep the discussion honest. If that involves bruising some fragile AMD fanboy egos and sensibilities, so be it.

    I'm completely comfortable in my own skin knowing I'm a fan of the best, and that just happens to be Nvidia for graphics cards for the last near-decade since G80, and I'm certainly not afraid to tell you why that's the case, backed with my usual facts, references, etc. You're free to verify my sources and references to come to your own conclusion, but at the end of the day, that's the whole point of the internet, isn't it? Lay out the facts, let informed people draw their own conclusions?

    In any case, the entire discussion is linked below, and you can be the judge of whether my take on the topic is fair. As you can clearly see, AMD fanboys caused this dilemma for themselves, and many of them are the ones you see crying in this thread. Cue that Alanis Morissette song....

    http://anandtech.com/comments/3987/amds-radeon-687...
    http://anandtech.com/show/3988/the-use-of-evgas-ge...
  • Phartindust - Wednesday, March 18, 2015 - link

    Um, AMD doesn't manufacture aftermarket cards.
  • dragonsqrrl - Tuesday, March 17, 2015 - link

    "use less power"

    ...right, and why would these non-reference cards consume less power? Just hypothetically speaking, ignoring for a moment all the benchmarks out there that suggest otherwise.
  • squngy - Tuesday, March 17, 2015 - link

    Undervolting?
