NVIDIA Releases GeForce GTX Titan Z
by Ryan Smith on May 28, 2014 10:45 AM EST
Back in March at GTC 2014, NVIDIA announced their forthcoming flagship dual-GPU video card, the GeForce GTX Titan Z. Based on a pair of fully enabled GK110 GPUs, NVIDIA was shooting to deliver around twice the performance of a single Titan Black in a single card form factor.
At the time of NVIDIA’s initial announcement, GTX Titan Z was scheduled for release in April. April of course came and went with no official word from NVIDIA on why the card was delayed, and now towards the tail end of May it is finally up for release. To that end NVIDIA sent out a press release a short while ago announcing the card’s availability, along with putting up the card's product page and confirming its final specifications.
| | GTX Titan Z | GTX Titan Black | GTX 780 Ti | GTX Titan |
|---|---|---|---|---|
| Stream Processors | 2 x 2880 | 2880 | 2880 | 2688 |
| Texture Units | 2 x 240 | 240 | 240 | 224 |
| ROPs | 2 x 48 | 48 | 48 | 48 |
| Memory Clock | 7GHz GDDR5 | 7GHz GDDR5 | 7GHz GDDR5 | 6GHz GDDR5 |
| Memory Bus Width | 2 x 384-bit | 384-bit | 384-bit | 384-bit |
| VRAM | 2 x 6GB | 6GB | 3GB | 6GB |
| FP64 | 1/3 FP32 | 1/3 FP32 | 1/24 FP32 | 1/3 FP32 |
| Width | Triple Slot | Double Slot | Double Slot | Double Slot |
| Transistor Count | 2 x 7.1B | 7.1B | 7.1B | 7.1B |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 28nm | TSMC 28nm |
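As a quick sanity check on the memory figures in the table, peak theoretical GDDR5 bandwidth is just the effective data rate times the bus width. A minimal sketch in Python (the helper name is ours, for illustration only):

```python
# Peak theoretical GDDR5 bandwidth: effective data rate (GT/s) x bus
# width (bits) / 8 bits per byte.

def gddr5_bandwidth_gbps(effective_clock_gtps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return effective_clock_gtps * bus_width_bits / 8

per_gpu = gddr5_bandwidth_gbps(7.0, 384)  # Titan Z / Titan Black / 780 Ti
print(per_gpu)                            # 336.0 GB/s per GPU (672 GB/s across both of Titan Z's GPUs)
print(gddr5_bandwidth_gbps(6.0, 384))     # 288.0 GB/s for the original GTX Titan
```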
First and foremost, with NVIDIA initially holding back on publishing some of the specifications of the GTX Titan Z at its announcement in March, we now have the final two pieces of the puzzle: the card’s official GPU clockspeeds, and the TDP. Our earlier estimation of the core clock, based on NVIDIA’s performance figures, turned out to be correct, with the card shipping at 706MHz. Meanwhile the boost clock is revealed to be at 876MHz.
This makes for an especially large delta between the base and boost clocks – 170MHz – which is consistent with the TDP-constrained nature of this card. NVIDIA’s last dual-GPU card, the GTX 690, also had a larger than average clock delta, so this is not unexpected, though it is the widest delta we’ve seen yet. What this means is that GTX Titan Z’s performance is likely to be more TDP-sensitive than GTX Titan Black’s; in TDP-heavy scenarios the card will have to throttle back more often, while in TDP-light scenarios it should still have the chance to perform near its maximum boost clock. Speaking of which, NVIDIA doesn’t publish the maximum boost clock, so while from these figures it’s reasonable to expect the GTX Titan Z to underperform the GTX Titan Black in SLI, it’s not possible to tell how peak performance will compare.
Meanwhile we also have final confirmation of the card’s TDP. As we suspected back in March, NVIDIA has configured the card with a 375W TDP, roughly 50% higher than a single GTX Titan Black’s 250W, indicating that along with a wider range of clockspeeds NVIDIA is aggressively binning GPUs for this part. This relatively low TDP means that while we expect GTX Titan Z to underperform GTX Titan Black in SLI, it should also significantly undercut the latter’s power consumption, improving overall power efficiency.
Looking at the power delivery mechanism itself, NVIDIA has also sent over a shot of the bare board itself, along with a bit of information on how it’s configured. GTX Titan Z uses 12 power phases (split in half for each GPU), which as we can see mostly reside at the center of the card between the two GPUs. Delivering power to these VRMs is a pair of 8pin PCIe power sockets, which combined with the PCIe slot itself allow up to 375W to be pulled, the card’s TDP.
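The 375W figure falls straight out of the PCIe power budget: per the PCIe specification, an x16 slot supplies up to 75W and each 8-pin auxiliary connector up to 150W. A quick illustrative calculation (the helper below is our own sketch, not an NVIDIA tool):

```python
# In-spec PCIe board power: slot power plus auxiliary connector power.

PCIE_SLOT_W = 75   # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin auxiliary connector

def board_power_limit(n_eight_pin: int, n_six_pin: int = 0) -> int:
    """Maximum in-spec board power (watts) from the slot plus aux connectors."""
    return PCIE_SLOT_W + n_eight_pin * EIGHT_PIN_W + n_six_pin * SIX_PIN_W

print(board_power_limit(2))  # 375 -- exactly GTX Titan Z's TDP
```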
This 375W beast will in turn be cooled by a triple slot cooler, owing to the greater amount of heat to dissipate. Triple slot cards are commonly seen in high-end partner designs, but this marks the first time we’ve seen a triple slot card as a reference design. The triple slot design is also notable because, when coupled with the axial-fan design of the cooler, it further increases the amount of space the card occupies. Axial fan designs such as the one used on GTX Titan Z need a PCIe slot’s worth of breathing room to operate, which means that altogether the GTX Titan Z is going to take up 4 slots of space. This in turn is notable because it means that in principle GTX Titan Z won’t save any space compared to GTX Titan Black in SLI; the latter uses a tried and true blower design that allows the cards to be used directly next to each other (though it’s not preferable), likewise consuming 4 slots of space in an SLI configuration.
Moving on, today’s announcement also confirms the I/O port configuration and the number of displays the card supports. NVIDIA’s specs say that GTX Titan Z will support up to 4 displays, indicating that all I/O ports are being routed through a single GPU. However NVIDIA’s port configuration is downright odd for a $3000 card: 1x DVI-I, 1x DVI-D, 1x DisplayPort, and 1x HDMI. This is admittedly us being picky, but the inclusion of the HDMI port is what stands out. The DVI ports make sense in as much as they work with legacy DVI displays at a time when a DisplayPort-to-DL-DVI adapter runs $100, but the HDMI port offers neither flexibility nor cost savings. Replacing it with a second DisplayPort would grant the card far more flexibility – including the ability to drive a second 4K@60Hz monitor – while still allowing HDMI output through a simple passive DisplayPort-to-HDMI adapter. But I digress…
As far as pricing and availability are concerned, as per NVIDIA’s initial announcement the GTX Titan Z is retailing at $2999 (ed: or about £2350 in the UK), making it NVIDIA’s most expensive GeForce card yet. We’ve seen announcements from MSI, Zotac, and EVGA so far, so it looks like a decent selection of NVIDIA’s partners will be selling the card, though it’s not clear at this time which regions each of those partners will be selling in. With the GTX Titan cards thus far, NVIDIA has only let a couple of partners sell the card since they’re selling identical low volume products. In any case availability is immediate, with Newegg already listing the EVGA card as in stock as of press time.
Of course it goes without saying that $3000 is going to be a steep price to pay for GTX Titan Z, compared to both the AMD and even the NVIDIA competition. A pair of GTX Titan Blacks would run $2000, a full $1000 less, and as we discussed before the triple slot design of the GTX Titan Z doesn’t afford much in the way of space savings over dual slot cards. That doesn’t mean we’re writing off GTX Titan Z – NVIDIA is many things, and diligent about their market research is one of those – but it will be interesting to see what end users and OEM/boutique builders do with the card. The benefits of GTX Titan Z over two single-GPU cards are not as cut-and-dried as with NVIDIA’s other dual-GPU cards, which means that it’s more of a lateral move than usual.
A big part of how GTX Titan Z is going to be used will in turn depend on who the buyer is. NVIDIA’s compute group is pushing GTX Titan Z as the ultimate compute card at the same time as their gaming group is pushing it as the ultimate gaming card, and like NVIDIA’s other Titan cards this product will be serving two masters. That said it’s clear from NVIDIA’s presentations and discussions with the company that they intend it to be a compute product first and foremost (a fate similar to GTX Titan Black), in which case this is going to be the single most powerful CUDA card NVIDIA has ever released. NVIDIA’s Kepler compute products have been received very well by buyers so far, including the previous Titan cards, so there’s ample evidence that this will continue with GTX Titan Z. At the end of the day the roughly 2.66 TFLOPS of double precision performance on a single card (more than some low-end supercomputers, we hear) is going to be a big deal, especially for users invested in NVIDIA’s CUDA ecosystem.
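That 2.66 TFLOPS figure is easy to sanity-check from the spec table above: each GK110 runs its FP64 units at 1/3 the rate of its 2880 CUDA cores, and each FMA counts as two FLOPs. A back-of-envelope sketch (the helper is our own, computed at the 706MHz base clock, which lands within a couple percent of NVIDIA’s figure):

```python
# Peak theoretical double-precision throughput for a GK110-based card.

def fp64_tflops(cuda_cores: int, fp64_ratio: float, clock_ghz: float,
                n_gpus: int = 1) -> float:
    """Peak theoretical FP64 TFLOPS, counting an FMA as two FLOPs."""
    fp64_units = cuda_cores * fp64_ratio   # e.g. 2880 / 3 = 960 per GK110
    return n_gpus * fp64_units * 2 * clock_ghz / 1000

# GTX Titan Z at its 706MHz base clock, across both GPUs:
print(round(fp64_tflops(2880, 1/3, 0.706, n_gpus=2), 2))  # 2.71
```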
Gaming, on the other hand, looks to be murkier. Certainly GTX Titan Z can and will be used as a gaming card (expect to see this one popular in high-end boutique systems), but NVIDIA faces extremely stiff competition from AMD’s recently released Radeon R9 295X2, which at $1500 retails for half the price of GTX Titan Z. Given GTX Titan Z’s sub-Titan Black clockspeeds and higher price, NVIDIA faces an uphill battle here on price and performance, which makes it all the clearer why GTX Titan Z is first and foremost a compute card. Nonetheless, with NVIDIA controlling around 2/3rds of the discrete GPU market and GTX Titan Z consuming around 25% less power than the R9 295X2 (on paper), we certainly expect it to appear in gaming systems, especially in builds where price is no object and two cards can be installed.
Wrapping things up, the launch of GTX Titan Z appears to be the capstone of Kepler’s career at NVIDIA. While we will likely see rebadges and reconfigurations over the coming generations, with NVIDIA now shipping a dual-GPU GK110 card they have assembled virtually every Kepler combination possible. With that, Kepler goes out with a bang, while over the longer term we turn our eyes towards NVIDIA’s new Maxwell architecture and what it might mean for the high-end once it makes its way into NVIDIA’s most powerful GPUs.
**Spring 2014 GPU Pricing Comparison**

| NVIDIA | Price | AMD |
|---|---|---|
| GeForce GTX Titan Z | $3000 | |
| | $1500 | Radeon R9 295X2 |
| GeForce GTX Titan Black | $1100 | |
| GeForce GTX 780 Ti | $650 | |
| | $550 | Radeon R9 290X |
| GeForce GTX 780 | $500 | |
Comments
HisDivineOrder - Thursday, May 29, 2014
Currently being manufactured to be released end of this year with delays probably pushing it into next year around March? ;)
Ettepet - Thursday, May 29, 2014
The main beauty of this device is its 6GB / GPU. If it was a bit more efficient (speed/power/size-wise) and not so expensive, one or two would be in a PC or PCIe expansion board I plan to buy within a month.
Where are the 4GB+ GTX 780 Ti's? Or 8GB+ GTX 790's? Bring them on... :-)
krskipp - Thursday, May 29, 2014
I'm definitely jumping ship to AMD from now on. Nvidia seem to have put the brakes on double memory cards by essentially calling them Titan, and charging double the money. They routinely give their standard cards *just* enough memory for today's applications, guaranteeing you'll need to upgrade in 18 months or so. My 2 680s in SLI would be desperate for an upgrade if I hadn't been able to get the 4GB version that I currently have (I have a 1440p monitor). So given I'll have a choice between a (likely) 4GB 880, or an 8GB rip-off Titan for my next GPU, I'll go with AMD's 290X follow up which will probably have at least 5GB RAM. In fact since memory is relatively cheap, AMD are missing a trick by not releasing a 6GB version of the 290X, for a small premium. It would be a complete Titan killer if it was priced right. NVidia have just got greedy and cynical with their specs and pricing structure.
piroroadkill - Thursday, May 29, 2014
There's an 8GiB 290X by Sapphire.
Evilwake - Sunday, June 8, 2014
Yea, and in reviews it craps all over the Titan Black. We need more companies like Sapphire that step out of the box for their gaming fans.
Evilwake - Sunday, June 8, 2014
At least someone sees the light. I used to love Nvidia but they've gone crazy; for that amount of money I could put in 3 290Xs and blow that card to bits. Oh, I forgot the CUDA thing. Oh well, good thing I know how to code in OpenCL.
Hrel - Friday, May 30, 2014
that shit cray
jman9295 - Monday, March 16, 2015
Looking at the PCB in the 2nd pic, I'm assuming that if Nvidia or one of their vendors wanted to, they could fit all the components of a single Titan Black with the full 6GB on a half length PCB since that is essentially what they are doing here. Basically, the PCB will end where the PCIe teeth end. And, if they omitted one of the DVI ports, they could make it a single slot card assuming a thinner heat sink would be able to adequately cool a Titan Black with a single fan.