During today’s GDC session on Epic’s Unreal Engine, NVIDIA CEO Jen-Hsun Huang dropped in as a special guest to announce NVIDIA’s next high-performance video card, the GeForce GTX Titan X.

In order to capitalize on the large audience of the Unreal session without spoiling too much ahead of NVIDIA’s own event in two weeks – the NVIDIA GPU Technology Conference – NVIDIA is playing coy with details on the product, but the company has released a handful of details along with a product image.

NVIDIA Titan Specification Comparison
                         GTX Titan X      GTX Titan Black   GTX Titan
Stream Processors        ?                2880              2688
Texture Units            ?                240               224
ROPs                     96?              48                48
Core Clock               ?                889MHz            837MHz
Boost Clock              ?                980MHz            876MHz
Memory Clock             ?                7GHz GDDR5        6GHz GDDR5
Memory Bus Width         384-bit?         384-bit           384-bit
FP64                     ?                1/3 FP32          1/3 FP32
TDP                      ?                250W              250W
Transistor Count         8B               7.1B              7.1B
Architecture             Maxwell          Kepler            Kepler
Manufacturing Process    TSMC 28nm?       TSMC 28nm         TSMC 28nm
Launch Date              Soon             02/18/14          02/21/13
Launch Price             A Large Number   $999              $999

The GPU underlying GTX Titan X packs 8 billion transistors, which, much as with the original GTX Titan’s launch, means we’re almost certainly looking at Big Maxwell. NVIDIA will be pairing it with 12GB of VRAM – indicating a 384-bit memory bus – and it will once again be using NVIDIA’s excellent metal cooler and shroud, originally introduced on the original GTX Titan.
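Assuming the 384-bit bus carries over along with GDDR5 at the GTX Titan Black’s 7GHz effective data rate – neither of which NVIDIA has confirmed for GTX Titan X – the peak memory bandwidth is a quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope GDDR5 bandwidth estimate.
# Assumes a 384-bit bus and a 7 Gbps effective per-pin data rate
# (the GTX Titan Black's figures; unconfirmed for GTX Titan X).

bus_width_bits = 384   # memory bus width
data_rate_gbps = 7     # effective transfer rate per pin, in Gbps

# Bandwidth = (bus width in bytes) * (per-pin data rate)
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps

print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 336 GB/s
```

That would leave it at the same 336GB/sec as GTX Titan Black; a faster memory clock would of course raise that figure proportionally.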

No further details are being provided at this time, and we’re expecting to hear more about it at GTC. Meanwhile Epic’s master engine programmer Tim Sweeney was gifted the first GTX Titan X card, in recognition of NVIDIA and Epic’s long development partnership and the fact that Epic’s engineers are always looking for more powerful video cards to push the envelope with Unreal Engine 4.



View All Comments

  • PatHeist - Friday, March 6, 2015 - link

    Except that's not what happens at all. Either the assets requested are in the VRAM, in which case they're pointed to and accessed via the memory controller on the GPU, or they're not in the VRAM, and they need to be moved in, taking the place of something else. There is absolutely no way in which having that extra RAM is going to cause more stuttering than if it were simply not there.
  • Kutark - Sunday, March 8, 2015 - link

    People's logic amazes me sometimes. That's like saying you could have 6x 512GB Samsung 850 PRO SSDs AND 2x 850 EVOs (which are slower), and you'd rather have JUST the PROs.
  • dragonsqrrl - Wednesday, March 4, 2015 - link

    There was an interesting article published about this recently; it basically said that the 390X would be limited to 4GB at launch due to the HBM memory config. That's the current limit given a 4096-bit interface and 4-Hi 1GB stacks. The question is when Hynix will launch stacked modules with 2GB capacities; that's when you'll most likely see an 8GB 390X option.
  • Laststop311 - Wednesday, March 4, 2015 - link

    First-generation HBM is limited to 4GB, so the 390X will only be 4GB.
  • Kutark - Sunday, March 8, 2015 - link

    Perhaps someone who knows more about memory than I do can help me with this, but isn't using HBM with such a high bandwidth kind of overkill at this point? I.e. it would be like having a water line that can carry 600 gallons/min and a pump that can do 550 gallons/min, and then upgrading that pump to one that can do 900 gallons/min.

    I mean, right now the memory already isn't the main bottleneck from what I understand, so while you will get performance gains from HBM, aren't they relatively minuscule?

    I just have the feeling it's being put in more to give the marketing department a "big number" for the spec sheet than to actually significantly improve the video card.
  • FlushedBubblyJock - Wednesday, March 11, 2015 - link

    Accessing it should be faster, so it's better - at least that's the plan.
  • dstockwell23 - Wednesday, March 4, 2015 - link

    looks nice, hopefully the full vapor chamber for the cooler at that price!
    "and it will once again be using NVIDIA’s excellent metal cooler and shroud, originally introduced on the original GTX Titan."
    would read better as
    "and it will once again be using NVIDIA’s excellent metal cooler and shroud, first introduced on the original GTX Titan."
  • ingwe - Wednesday, March 4, 2015 - link

    I love the launch price!

    This thing is going to be beastly. I buy AMD because I don't need the best, and I would like some competition. But Nvidia has just been better in terms of performance per watt.
  • Yojimbo - Wednesday, March 4, 2015 - link

    Buying something you don't want for the sake of competition is sort of like having no competition to begin with.
  • sr1030nx - Wednesday, March 4, 2015 - link

    Agreed, launch price is pretty funny.
