During today’s GDC session on Epic’s Unreal Engine, NVIDIA CEO Jen-Hsun Huang dropped in as a special guest to announce NVIDIA’s next high-performance video card, the GeForce GTX Titan X.

In order to capitalize on the Unreal session’s large audience while not spoiling too much ahead of NVIDIA’s own event in 2 weeks – the NVIDIA GPU Technology Conference – NVIDIA is playing coy with details on the product, releasing only a handful of specifications along with a product image.

NVIDIA Titan Specification Comparison

                        GTX Titan X     GTX Titan Black   GTX Titan
Stream Processors       ?               2880              2688
Texture Units           ?               240               224
ROPs                    96?             48                48
Core Clock              ?               889MHz            837MHz
Boost Clock             ?               980MHz            876MHz
Memory Clock            ?               7GHz GDDR5        6GHz GDDR5
Memory Bus Width        384-bit?        384-bit           384-bit
VRAM                    12GB            6GB               6GB
FP64                    ?               1/3 FP32          1/3 FP32
TDP                     ?               250W              250W
Transistor Count        8B              7.1B              7.1B
Architecture            Maxwell         Kepler            Kepler
Manufacturing Process   TSMC 28nm?      TSMC 28nm         TSMC 28nm
Launch Date             Soon            02/18/14          02/21/13
Launch Price            A Large Number  $999              $999

The GPU underlying the GTX Titan X packs 8 billion transistors, which, much like the original GTX Titan’s launch, means we’re almost certainly looking at Big Maxwell. NVIDIA will be pairing it with 12GB of VRAM – indicating a 384-bit memory bus – and the card will once again use NVIDIA’s excellent metal cooler and shroud, first introduced on the original GTX Titan.
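For those wondering how 12GB points to a 384-bit bus: GDDR5 is organized into 32-bit channels, so twelve 1GB (8Gb) chips on twelve channels gives 384 bits in total. A minimal back-of-the-envelope sketch in Python – the chip organization and the 7GHz data rate (borrowed from GTX Titan Black) are assumptions on our part, not confirmed specifications:

    # Back-of-the-envelope: 12GB of GDDR5 and the bus width it implies.
    # Assumptions (not confirmed by NVIDIA): one 1GB (8Gb) chip per 32-bit
    # GDDR5 channel, and GTX Titan Black's 7Gbps effective data rate.

    CHANNEL_WIDTH_BITS = 32        # one GDDR5 channel
    CHIP_CAPACITY_GB = 1           # 8Gb chip = 1GB
    VRAM_GB = 12

    channels = VRAM_GB // CHIP_CAPACITY_GB          # 12 chips, one per channel
    bus_width_bits = channels * CHANNEL_WIDTH_BITS
    print(f"Implied bus width: {bus_width_bits}-bit")   # 384-bit

    # Theoretical bandwidth if the 7Gbps data rate carries over:
    data_rate_gbps = 7                                  # per pin, effective
    bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
    print(f"Theoretical bandwidth: {bandwidth_gb_s:.0f} GB/sec")  # ~336 GB/sec

If those assumptions hold, the card would match GTX Titan Black’s 336GB/sec of theoretical memory bandwidth.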

No further details are being provided at this time, and we’re expecting to hear more at GTC. Meanwhile Epic founder Tim Sweeney was gifted the first GTX Titan X card, in recognition of NVIDIA and Epic’s long development partnership and the fact that Epic’s engineers are always looking for more powerful video cards to push the envelope with Unreal Engine 4.

Comments

  • abianand - Thursday, March 5, 2015 - link

    @kyuu: lol, thx. took some time to understand.
    wish there was an edit button
  • D. Lister - Thursday, March 5, 2015 - link

    Tacking 12GB of VRAM onto a gaming GPU is just a terrible waste of resources. Keep the extra 4GB, Nvidia, and give us more raw horsepower instead.
  • ShieTar - Thursday, March 5, 2015 - link

    Funny you should say that. Just last week the German magazine "PC Games Hardware" showed off some UHD gaming benchmarks with an 8GB FirePro and a 4GB 290X, and they noticed the FirePro using more than 6GB of VRAM, while the 290X was using an additional 1.5GB of system memory.

    And we're only just getting started with UHD gaming; I think 12GB will be very much appreciated by enthusiast gamers over the next 2 years.
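    For anyone wanting to reproduce that kind of measurement on an NVIDIA card, polling nvidia-smi is the simplest route. A minimal sketch, assuming the NVIDIA driver (and its nvidia-smi tool) is installed and a single GPU is present:

        # Poll total VRAM usage via nvidia-smi every 5 seconds.
        # Assumes a single GPU; multi-GPU systems return one line per card.
        import subprocess
        import time

        def vram_usage_mib():
            out = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=memory.used,memory.total",
                 "--format=csv,noheader,nounits"],
                text=True,
            )
            used, total = (int(v) for v in out.strip().split(", "))
            return used, total

        for _ in range(12):  # sample for about a minute while the game runs
            used, total = vram_usage_mib()
            print(f"{used} / {total} MiB VRAM in use")
            time.sleep(5)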
  • D. Lister - Thursday, March 5, 2015 - link

    http://www.digitalstormonline.com/unlocked/video-m...

    At 4K resolution with Ultra settings plus 4xAA, BF4 uses 3.167GB of VRAM, while Crysis 3 uses 3.668GB.

    The main reason for the recent jump in VRAM usage is not 4K, but the fact that the new-gen consoles have a unified memory architecture to compensate for their otherwise much inferior hardware, and ports to PC are done lazily as always.

    That is why 8GB of VRAM is going to be the gold standard for top-tier GPUs in the coming years. Adding an extra 4GB is just wasteful, 4K or not, and end users would be better off with higher-bandwidth/lower-latency 8GB. On the other hand, I suppose Nvidia is going to have to justify the ~$1,500 price tag, and that extra 4GB would be helpful there.
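    The arithmetic behind that argument: resolution-dependent buffers at 4K are small next to multi-gigabyte totals, so the growth has to come from assets. A rough sketch, assuming 32-bit color and depth, a five-target deferred G-buffer, and 4x MSAA (real engines vary widely):

        # Rough estimate of resolution-dependent render-target memory at 4K.
        # Assumptions: 3840x2160, 4 bytes/pixel for color (RGBA8) and for
        # depth (D24S8), a 5-target G-buffer, 4x MSAA on color + depth.

        W, H = 3840, 2160
        BPP = 4                               # bytes per pixel
        pixels = W * H

        backbuffers = 2 * pixels * BPP        # double-buffered swap chain
        gbuffer     = 5 * pixels * BPP        # deferred shading targets
        msaa        = 4 * pixels * BPP * 2    # 4x samples, color + depth

        total_mb = (backbuffers + gbuffer + msaa) / 2**20
        print(f"~{total_mb:.0f} MB of render targets at 4K")   # ~475 MB

    Even with generous assumptions, the resolution-dependent buffers land well under 1GB, so the 3-4GB figures above are dominated by textures and geometry – which is exactly what lazy console ports inflate.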
  • CPUGPUGURU - Saturday, March 7, 2015 - link

    It all depends on what the new APIs, DX12 and Vulkan, deliver. DX12 stacks VRAM across GPUs, so that should extend the usefulness of most DX12-capable GPUs; anyone running Fermi, Kepler, or fully DX12-capable Maxwell cards in SLI will be very happy with their GPU investment. One big reason prices have come down is that TSMC's 28nm process is mature and offering high yields. I expect Titan X prices to fall sharply after a few months, and that's when I'll time my new build; Skylake and a Titan X will bring great performance for many years to come, and I'll add another Titan X for an SLI upgrade boost when needed, at even lower prices. All GPUs carry a high price tag when first released but drop within a few months. It's all good for the patient gamer.
  • RussianSensation - Friday, March 6, 2015 - link

    Enthusiasts who can afford $1K cards in pairs don't keep them for 2 years, sorry. By the time 12GB on a card is useful, the Titan X's performance could be had in a $300 Pascal/Volta card. In 12 months alone we went from a $700 GTX 780 Ti to a $330 GTX 970, and from a $1500 R9 295X2 to $600 today. Anyone who buys $2000-2500 worth of graphics cards to "future proof" is doing it wrong. Most people who will buy this are those for whom $1350 or so is not a lot of money – they use it for work/PhD research/universities – or early adopters who flip cards before they tank in value too much, which means selling well before the 2-year mark.
  • AndrewJacksonZA - Thursday, March 5, 2015 - link

    Anandtech, please, I beg of you, use the internationally recognised ISO 8601 yyyy-mm-dd date format. Your current date format is confusing for non-Americans to read.

    *PRETTY* please?

    https://en.wikipedia.org/wiki/ISO_8601
  • Laststop311 - Thursday, March 5, 2015 - link

    Anandtech is based in the US, so of course its date format will be the US format. If we have to suffer the crappy format on euro websites, you will suffer on US websites.
  • Railgun - Thursday, March 5, 2015 - link

    I'm sure you can figure out that it's not yet April 3rd, 2015. As an American in the UK, it's not too hard to figure out the context of the date based on what site I'm looking at.
  • AndrewJacksonZA - Thursday, March 5, 2015 - link

    As far as current posting dates go (not historical), you're right, Railgun. I was referring to the launch dates. Not these two specific launch dates, obviously, but my Date Format Frustration Level just happened to be reached with this specific article.

    The Radeon 6400 was launched on 02/07/2011. What date is that exactly? ( https://en.wikipedia.org/wiki/Radeon_HD_6000_Serie... )
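    To make the ambiguity concrete, parsing that same string under both conventions yields two different dates:

        # "02/07/2011" is ambiguous: US format reads it as February 7,
        # day-first formats read it as July 2. ISO 8601 removes the ambiguity.
        from datetime import datetime

        s = "02/07/2011"
        us_read   = datetime.strptime(s, "%m/%d/%Y")   # month first
        intl_read = datetime.strptime(s, "%d/%m/%Y")   # day first

        print(us_read.date().isoformat())    # 2011-02-07 (February 7)
        print(intl_read.date().isoformat())  # 2011-07-02 (July 2)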
