Today at GTC, NVIDIA announced the next card in its GTX Titan family. Dubbed the GTX Titan Z (no word yet on what the Z stands for), the card is NVIDIA's obligatory entry into the dual-GPU/single-card market, finally bringing the company's flagship GK110 GPU into a dual-GPU desktop/workstation product.

While NVIDIA has not released the complete details about the product – in particular we don’t know precise clockspeeds or TDPs – we have been given some information on core configuration, memory, pricing, and availability.

| | GTX Titan Z | GTX Titan Black | GTX 780 Ti | GTX Titan |
|---|---|---|---|---|
| Stream Processors | 2 × 2880 | 2880 | 2880 | 2688 |
| Texture Units | 2 × 240 | 240 | 240 | 224 |
| ROPs | 2 × 48 | 48 | 48 | 48 |
| Core Clock | 700MHz? | 889MHz | 875MHz | 837MHz |
| Boost Clock | ? | 980MHz | 928MHz | 876MHz |
| Memory Clock | 7GHz GDDR5 | 7GHz GDDR5 | 7GHz GDDR5 | 6GHz GDDR5 |
| Memory Bus Width | 2 × 384-bit | 384-bit | 384-bit | 384-bit |
| VRAM | 2 × 6GB | 6GB | 3GB | 6GB |
| FP64 | 1/3 FP32 | 1/3 FP32 | 1/24 FP32 | 1/3 FP32 |
| TDP | ? | 250W | 250W | 250W |
| Transistor Count | 2 × 7.1B | 7.1B | 7.1B | 7.1B |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 28nm |
| Launch Date | 04/XX/14 | 02/18/14 | 11/07/13 | 02/21/13 |
| Launch Price | $2999 | $999 | $699 | $999 |

In brief, the GTX Titan Z is a pair of fully enabled GK110 GPUs. NVIDIA isn’t cutting any SMXes or ROP partitions to bring down power consumption, so each half of the card is equivalent to a GTX 780 Ti or GTX Titan Black, operating at whatever (presumably lower) clockspeeds NVIDIA has picked. And although we don’t have precise clockspeeds, NVIDIA has quoted the card as having 8 TFLOPS of FP32 performance, which would put the GPU clockspeed at around 700MHz, nearly 200MHz below GTX Titan Black’s base clock (to say nothing of boost clocks).
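That ~700MHz estimate falls directly out of NVIDIA's quoted FP32 number. A quick sketch, assuming exactly 8 TFLOPS and all 5760 CUDA cores active, with each core retiring one FMA (2 FLOPs) per clock:

```python
# Back-of-the-envelope: infer the core clock from NVIDIA's quoted 8 TFLOPS FP32.
cores = 2 * 2880          # two fully enabled GK110 GPUs
flops = 8e12              # NVIDIA's quoted FP32 throughput
clock_hz = flops / (cores * 2)  # 2 FLOPs per core per clock (FMA)
print(f"Implied core clock: {clock_hz / 1e6:.0f} MHz")  # ~694 MHz
```

That works out to roughly 694MHz, hence the 700MHz placeholder in the table above; the real base and boost clocks could land somewhat differently.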

On the memory front the GTX Titan Z is configured with 12GB of VRAM, 6GB per GPU. NVIDIA has also released the memory clockspeed specifications, telling us that the card won't be making any compromises there: it operates at the same 7GHz memory clockspeed as the GTX Titan Black, which is something of an accomplishment given the minimal routing space a dual-GPU card provides.
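From those quoted specs the memory bandwidth follows directly; a sketch, treating the 7GHz figure as the effective GDDR5 data rate on each GPU's 384-bit bus:

```python
# Per-GPU memory bandwidth from the quoted specs: 7GHz effective GDDR5
# on a 384-bit bus, with two independent memory systems on the card.
mem_clock = 7e9                 # effective transfers/sec per pin
bus_width_bits = 384
per_gpu_gbps = mem_clock * bus_width_bits / 8 / 1e9
print(f"{per_gpu_gbps:.0f} GB/s per GPU, {2 * per_gpu_gbps:.0f} GB/s aggregate")
# 336 GB/s per GPU, 672 GB/s aggregate
```

Note that the two GPUs cannot pool that bandwidth; each works out of its own 6GB of VRAM, as on any dual-GPU card.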

In terms of build the GTX Titan Z shares a lot of similarities with NVIDIA's previous-generation dual-GPU card, the GTX 690. NVIDIA is keeping the split blower design, with a single axial fan pushing air out via both the front and the back of the card, essentially exhausting half the hot air and sending the other half back into the case. We haven't had any hands-on time with the card, but NVIDIA is clearly staying with the black metal styling of the GTX Titan Black.

The other major unknown right now is power consumption. GTX Titan Black is rated for 250W, and meanwhile NVIDIA was able to get a pair of roughly 200W GTX 680s into the 300W GTX 690 (with reduced clockspeeds). So it’s not implausible that GTX Titan Z is a 375W card, but we’ll have to wait and see.

But perhaps the biggest shock will be the price. The GTX Titan series has already straddled the prosumer line with its $1000/GPU pricing; the GTX Titan was by far the fastest thing on the gaming market in the winter of 2013, while the GTX Titan Black leans a bit more professional due to the existence of the GTX 780 Ti. With the GTX Titan Z, NVIDIA will be asking a cool $3000 for the card, or three times the price of a GTX Titan Black.

It goes without saying then that GTX Titan Z is aimed at an even more limited audience than the GTX Titan and GTX Titan Black. To be sure, NVIDIA is still targeting both gamers and compute users with this card, and since it is a GeForce card it will use the standard GeForce driver stack, but the $3000 price tag is much more within the realm of compute users than gamers. For gamers this may as well be a specialty card, like an Asus ARES.

For compute users this will still be an expensive card, but potentially a very captivating one. Per FLOP the GTX Titan Black is still a better deal, but compute users place a far greater emphasis on density. Meanwhile the GTX Titan brand has by all accounts been a success for NVIDIA, selling more cards to compute users than the company had ever expected, so a product like the GTX Titan Z is more directly targeted at those users. I have no doubt that there are compute users who will be happy with it – like the original GTX Titan, it's far cheaper per FP64 FLOP than any Tesla card, maintaining its "budget compute" status – but I do wonder if part of the $3000 pricing is a reaction to the GTX Titan undercutting Tesla sales.
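To put the per-FLOP comparison in numbers, here is a rough $/GFLOP (FP64) estimate from the table's specs and launch prices. The Titan Z figure is derived from NVIDIA's 8 TFLOPS FP32 quote at the 1/3 FP64 rate; the Titan Black figure uses its 889MHz base clock (boost clocks would improve its $/GFLOP further):

```python
# Rough FP64 price/performance estimate from quoted specs and launch prices.
def fp64_tflops(cores, clock_ghz):
    # FMA = 2 FLOPs per core per clock, FP64 at 1/3 the FP32 rate on GK110
    return cores * 2 * clock_ghz / 3 / 1000

titan_z_fp64 = 8.0 / 3                       # ~2.67 TFLOPS, from the FP32 quote
titan_black_fp64 = fp64_tflops(2880, 0.889)  # ~1.71 TFLOPS at base clock

print(f"Titan Z:     ${2999 / (titan_z_fp64 * 1000):.2f}/GFLOP FP64")
print(f"Titan Black: ${999 / (titan_black_fp64 * 1000):.2f}/GFLOP FP64")
```

By this estimate the Titan Z lands around $1.12/GFLOP versus roughly $0.59/GFLOP for the Titan Black, which is why density, not raw price/performance, is the Titan Z's real selling point for compute buyers.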

Anyhow, we should have more details next month. NVIDIA tells us that they’re expecting to launch the card in April, so we would expect to hear more about it in the next few weeks.

Source: NVIDIA

Comments

  • aggiechase37 - Tuesday, March 25, 2014 - link

    I guess I'm just not seeing the draw. I can get two Titan blacks or this (essentially the exact same thing but less power draw) for a grand more? What am I missing here?
  • Nagorak - Tuesday, March 25, 2014 - link

    "What am I missing here?"

    You should be spending that money on something more worthwhile, like say a trip to Europe.
  • boeush - Tuesday, March 25, 2014 - link

    What you're missing, is that you can get a pair of Z's for $4,000 more, and thereby have 4 full GK110's running inside your dual-card box (and you'll probably need a refrigerator-sized cooling unit to service that box...)

  • aggiechase37 - Wednesday, March 26, 2014 - link

    Haha. Yeah I don't think I will need that sort of computational power anytime soon. Talk to me again in 2 years when we have this sort of thing in our phones though.
  • DIYEyal - Thursday, May 8, 2014 - link

    Lower clock speeds..
  • Alecanto - Tuesday, March 25, 2014 - link

    Does anyone else feel like Nvidia and Intel both need to step back and let software catch up to hardware? I mean, is there any game out now or coming in the near future that's going to actually use 6GB of VRAM or 8 processor cores? If there is I would like to know, because I want a reason to upgrade from a 4700k and a GTX 780, but I'm not seeing any right now (in fact it's overkill for 99% of games). I understand that gamers aren't their only target and that many will buy a card like this without having to worry about budget, and I'm rooting for Nvidia to take the throne from Sony and Microsoft (we all know they want to), but I'm not sure that releasing these ridiculously priced products will help them achieve that.
  • Assimilator87 - Wednesday, March 26, 2014 - link

    4K Surround with SSAA? I don't know. I'm just guessing.
  • sascha - Wednesday, March 26, 2014 - link

    Not everybody uses the system to play games.
  • tim851 - Thursday, March 27, 2014 - link

    "Does anyone else feel like Nvidia and Intel both need to step back and let software catch up to hardware?"

    Why? Does anybody force you to upgrade?

    "Let's stop the manufacturing lines, Alecanto doesn't need our products. He promised he'll call when he wants something new."

    Talk about entitlement issues...
  • SilthDraeth - Tuesday, March 25, 2014 - link


    Can it play Crysis?
