During today’s GDC session on Epic’s Unreal Engine, NVIDIA CEO Jen-Hsun Huang dropped in as a special guest to announce NVIDIA’s next high performance video card, the GeForce GTX Titan X.

In order to capitalize on the large audience of the Unreal session while not spoiling too much ahead of NVIDIA’s own event in two weeks – the NVIDIA GPU Technology Conference – NVIDIA is playing coy with the product, but they have released a handful of details along with a product image.

NVIDIA Titan Specification Comparison

                          GTX Titan X       GTX Titan Black   GTX Titan
Stream Processors         ?                 2880              2688
Texture Units             ?                 240               224
ROPs                      96?               48                48
Core Clock                ?                 889MHz            837MHz
Boost Clock               ?                 980MHz            876MHz
Memory Clock              ?                 7GHz GDDR5        6GHz GDDR5
Memory Bus Width          384-bit?          384-bit           384-bit
VRAM                      12GB              6GB               6GB
FP64                      ?                 1/3 FP32          1/3 FP32
TDP                       ?                 250W              250W
Transistor Count          8B                7.1B              7.1B
Architecture              Maxwell           Kepler            Kepler
Manufacturing Process     TSMC 28nm?        TSMC 28nm         TSMC 28nm
Launch Date               Soon              2/18/14           2/21/13
Launch Price              A Large Number    $999              $999

The GPU underlying the GTX Titan X packs 8 billion transistors, which, as with the original GTX Titan’s launch, means we’re almost certainly looking at Big Maxwell. NVIDIA will be pairing it with 12GB of VRAM – indicating a 384-bit memory bus – and it will once again be using NVIDIA’s excellent metal cooler and shroud, originally introduced on the original GTX Titan.
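For the curious, here’s a back-of-the-envelope sketch of why 12GB points to a 384-bit bus, along with the peak bandwidth arithmetic for the known Titan cards. The chip density and clamshell assumptions are ours, not anything NVIDIA has confirmed, and the function names are purely illustrative:

```python
# GDDR5 runs a 32-bit channel per chip. Assuming 4Gb (0.5GB) chips, a
# 12GB card needs 24 chips; run in clamshell mode (two chips sharing a
# channel), that's 12 channels x 32 bits = a 384-bit bus, matching the
# GTX Titan and Titan Black.

def gddr5_bus_width(vram_gb, chip_gb=0.5, clamshell=True):
    """Bus width in bits implied by capacity and assumed chip density."""
    chips = vram_gb / chip_gb
    channels = chips / 2 if clamshell else chips
    return int(channels * 32)

def bandwidth_gbps(bus_width_bits, effective_clock_ghz):
    """Peak memory bandwidth in GB/s from bus width and effective clock."""
    return bus_width_bits / 8 * effective_clock_ghz

print(gddr5_bus_width(12))         # 384
print(bandwidth_gbps(384, 7.0))    # 336.0 GB/s at Titan Black's 7GHz
print(bandwidth_gbps(384, 6.0))    # 288.0 GB/s at the original Titan's 6GHz
```

If the Titan X keeps the Titan Black’s 7GHz memory clock on that bus, it would land at the same 336GB/s peak; a faster memory clock would scale that number up proportionally.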

No further details are being provided at this time, and we’re expecting to hear more about the card at GTC. Meanwhile, Epic’s master engine programmer Tim Sweeney was gifted the first GTX Titan X card, in recognition of NVIDIA and Epic’s long development partnership and the fact that the Epic guys are always looking for more powerful video cards to push the envelope on Unreal Engine 4.

104 Comments

  • Urizane - Thursday, March 05, 2015 - link

    The GAMERS who bought the original Titan should have done their own research and realized that was not a product for them, just like all successive Titans. The regret they felt was all on them.
  • Urizane - Thursday, March 05, 2015 - link

    I should also mention that I believe gamers will buy whatever they want. Just realize that the product you're paying extra cash for has a different reason for its price than gaming. It's a heavier compute card that just so happens to also do gaming pretty well. Gaming alone on a Titan is a waste of the extra resources provided for workloads that gaming won't utilize.
  • FlushedBubblyJock - Wednesday, March 11, 2015 - link

    But no one is whining it's not future proof because of lack of memory.

    Yes, I know, the core can't do it anyway (future games at high res with massive memory usage and decent frames), but telling the lemmings that doesn't work.
  • deeps6x - Wednesday, March 04, 2015 - link

    Ya'all really should post a 980 and its specs on that chart as well for reference purposes.
  • Innokentij - Wednesday, March 04, 2015 - link

    You write in the article Ryan Smith "and it will once again be using NVIDIA’s excellent metal cooler and shroud, >originally< introduced on the original GTX Titan." This is nonsense, the GTX 690 had it 4 months before TITAN.
  • Urizane - Thursday, March 05, 2015 - link

    The design for the 690 cooler was adjusted for the original Titan. The concepts existed 4 months before the Titan, but that specific cooler was created for the Titan.
  • Urizane - Thursday, March 05, 2015 - link

    How did my comment get here? I meant to reply to the 690 cooler comment.
  • Urizane - Thursday, March 05, 2015 - link

    OMG. The live comment updates showed my comment being in reply to "You a word." There's a bug, but good luck trying to replicate it.
  • abianand - Thursday, March 05, 2015 - link

    damn, when i looked at the launch price of the Titan X i almost the coffee i was drinking on my keyboard.
  • kyuu - Thursday, March 05, 2015 - link

    You a word.
