Meet The GeForce GTX Titan

As we briefly mentioned at the beginning of this article, the GeForce GTX Titan takes a large number of cues from the GTX 690. Chief among these is that it’s a luxury card, and as such is built to similar standards as the GTX 690. Consequently, like the GTX 690, Titan is essentially in a league of its own when it comes to build quality.

Much as the GTX 690’s cooler was an evolution of the GTX 590’s, Titan’s cooler is an evolution of the one found on the GTX 580. This means we’re looking at a card roughly 10.5” in length using a double-wide cooler. The basis of Titan’s cooler is a radial fan (blower) sitting towards the back of the card, with the GPU, RAM, and most of the power regulation circuitry in front of the fan. As a result the bulk of the hot air generated by Titan is blown forwards and out of the card. However it’s worth noting that, unlike most other blowers, the back side technically isn’t sealed; and while there is relatively little circuitry behind the fan, it would be incorrect to state that the card is fully exhausting. With that said, leaving the back side of the card open seems to be more about noise and aesthetics than it is about heat management.

Like the GTX 580 but unlike the GTX 680, heat transfer is provided by a nickel-tipped aluminum heatsink attached to the GPU via a vapor chamber. We typically only see vapor chambers on premium cards due to their greater cost, or when space is at a premium. NVIDIA seems to be pushing the limits of heatsink size here, with the fins on Titan’s heatsink actually running beyond the base of the vapor chamber. Meanwhile, providing the thermal interface between the GPU itself and the vapor chamber is a silk-screened application of a high-end Shin-Etsu thermal compound; NVIDIA claims this compound offers over twice the performance of the GTX 680’s grease, although of all of NVIDIA’s claims this is the hardest for us to validate.

Moving on, catching what the vapor chamber doesn’t cover is an aluminum baseplate that runs along the card, not only providing structural rigidity but also providing cooling for the VRMs and for the RAM on the front side of the card. Baseplates aren’t anything new for NVIDIA, but again this is something that we don’t see a lot of except on their more premium cards.

Capping off Titan we have its most visible luxury aspects. Like the GTX 690 before it, NVIDIA has replaced virtually every bit of plastic with metal for aesthetic/perceptual purposes. This time the entire shroud and fan housing is composed of cast aluminum, which NVIDIA tells us is easier to cast than the mix of aluminum and magnesium that the GTX 690 used. Meanwhile the polycarbonate window makes its return, allowing you to see Titan’s heatsink solely for the sake of it.

As for the back side of the card, in keeping with most of NVIDIA’s cards Titan runs with a bare back. The GDDR5 RAM chips don’t require any kind of additional cooling, and a metal backplate, while making for a great-feeling card, occupies precious space and would impede cooling in tight quarters.

Moving on, let’s talk about the electrical details of Titan’s design. Whereas the GTX 680 was a 4+2 power phase design – 4 power phases for the GPU and 2 for the VRAM – Titan improves on this by moving to a 6+2 power phase design. I suspect the most hardcore of overclockers will be disappointed with Titan having only 6 phases for the GPU, but for most overclocking purposes this should be enough.

Meanwhile for RAM it should come as no particular surprise that NVIDIA is once more using 6GHz RAM here. Specifically, NVIDIA is using 24 6GHz Samsung 2Gb modules, totaling the 6GB of RAM we see on the card; 12 modules are on the front with the other 12 on the rear. The overclocking headroom on 6GHz RAM seems to vary from chip to chip, so while Titan should have some memory overclocking headroom it’s hard to say what the combination of luck and the wider 384-bit memory bus will do.
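The capacity figure follows directly from the module count, and the memory bus width plus data rate give the theoretical peak bandwidth. A quick sketch of the arithmetic, using only the figures cited above (the bandwidth result is simple math, not an NVIDIA-quoted spec):

```python
# Titan memory math, using the figures from this article.
modules = 24                 # Samsung 2Gb GDDR5 chips, 12 per board side
capacity_gb = modules * 2 / 8            # 2Gb per chip, 8 bits per byte

bus_width_bits = 384
data_rate_gbps = 6                       # 6GHz effective data rate per pin
peak_bw_gbps = data_rate_gbps * bus_width_bits / 8   # bits -> bytes

print(capacity_gb)    # 6.0 (GB)
print(peak_bw_gbps)   # 288.0 (GB/s, theoretical peak)
```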

Providing power for all of this is a pair of PCIe power sockets, a 6-pin and an 8-pin, for a combined total of 300W of capacity once the PCIe slot itself is counted. With Titan having a TDP of just 250W in the first place, this leaves quite a bit of headroom before ever needing to run outside of the PCIe specification.
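That 300W figure is the sum of the PCIe specification limits for the x16 slot (75W), the 6-pin connector (75W), and the 8-pin connector (150W); a minimal sketch of the budget:

```python
# PCIe power budget, per the PCIe specification's per-source limits.
slot_w = 75         # PCIe x16 slot
six_pin_w = 75      # 6-pin auxiliary connector
eight_pin_w = 150   # 8-pin auxiliary connector
total_w = slot_w + six_pin_w + eight_pin_w

tdp_w = 250         # Titan's TDP
headroom_w = total_w - tdp_w

print(total_w)      # 300 (W of in-spec capacity)
print(headroom_w)   # 50 (W of headroom at TDP)
```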

At the other end of Titan we can see that NVIDIA has once again gone back to their “standard” port configuration for the GeForce 600 series: two DL-DVI ports, one HDMI port, and one full-size DisplayPort. Like the rest of the 600 family, Titan can drive up to four displays, so this configuration is a good match. Though I would still like to see two mini-DisplayPorts in place of the full-size DisplayPort, in order to tap the greater functionality DisplayPort offers through its port conversion mechanisms.

Comments

  • AeroJoe - Wednesday, February 20, 2013 - link

    Very good article - but now I'm confused. If I'm building an Adobe workstation to handle video and graphics, do I want a TITAN for $999 or the Quadro K5000 for $1700? Both are Kepler, but TITAN looks like more bang for the buck. What am I missing?
  • Rayb - Wednesday, February 20, 2013 - link

    The extra money you are paying is for the driver support in commercial applications like Adobe CS6 with a Quadro card vs. a non-certified card.
  • mdrejhon - Wednesday, February 20, 2013 - link

    Excellent! Geforce Titan will make it much easier to overclock an HDTV set to 120 Hz

    Some HDTVs, such as the Vizio e3d420vx, can be successfully “overclocked” to a 120 Hz native PC signal from a computer. This was difficult because an EDID override was necessary. However, the GeForce Titan should make this a piece of cake!
  • Blazorthon - Wednesday, February 20, 2013 - link

    Purely as a gaming card, Titan is obviously way too overpriced to be worth considering. However, its compute performance is intriguing. It can't totally replace a Quadro or Tesla, but there are still many compute workloads that don't need those extremely expensive extra features such as ECC and Quadro/Tesla drivers. Many of them may be better suited to a Tahiti card's far better value, but stuff like CUDA workloads may find Titan to be the first card to truly succeed the GF100/GF110 based cards as a gaming and compute-oriented card, although like I said, I think the price could still be at least somewhat lower. I understand it not being around $500 like GF100/110 launched at for various reasons, but come on, at most give us an around $700-750 price...
  • just4U - Thursday, February 21, 2013 - link

    Someone here stated that AMD is at fault for pricing their 7x series so high last year. Perhaps many were disappointed with the $550 price range, but that's still somewhat lower than previously released Nvidia products through the years. Several of those cards (at various price points) handily beat the 580 (which, btw, never did get much of a price drop), and at the time that's what it was competing against.

    So I can't quite connect the dots on why they say it's AMD's fault for originally pricing the 7x series so high, when in reality it was still lower than newly released Nvidia products over the past several years.
  • CeriseCogburn - Monday, March 04, 2013 - link

    For the most part, correct.
    The 7970 came out at $579 though, not $550. And it was scarcely available for many months, until just the day prior to the 680's $499 launch.

    In any case, ALL these cards drop in price over the first six months or so, EXCEPT sometimes, if they are especially fast, like the 580, they hold at the launch price, which it did, until the 7970 was launched - the 580 was $499 till the day the 7970 launched.

    So what we have here is the tampon express. The tampon express has not paid attention to anything but fps/price vs. their revised and memory-holed history, so it will continue forever.

    They have completely ignored capital factors like the extreme lack of production space at the node, ongoing prior to the 7970 release and at emergency-low levels prior to the months-later 680 release, with the emergency board meeting and multi-billion dollar borrowing build-out for die space production expansion, not to mention the huge change in wafer payment terms, which went from per good die to per wafer, thus placing the burden of failures on the GPU company's side.

    It's not like they could have missed that; it was all over the place for months on end. The AMD fanboys were bragging that AMD got die space early, constantly hammering away at nVidia and calling them stupid for not having reserved space, and screaming they would go bankrupt from the low yields they had to pay for from the "housefires" dies.

    So what we have now is well trained (not potty trained) crybabies pooping their diapers over and over again, and let's face it, they do believe they have the power to lower the prices if they just whine loudly enough.

    AMD has been losing billions, and nVidia's profit ratio is 10% - but the crying babies' screams are meant to assist their own pocketbooks at any expense, including the demise of AMD, even though they all preach competition and personal CEO capitalist understanding after they spew out 6th-grader information, or even make MASSIVE market lies and mistakes with illiterate interpretation of standard articles, or completely blissful denial of things like die space (mentioned above) or long-standing standard industry tape-out times for producing the GPUs in question.

    They want to be "critical reporters" but they fail miserably at it, and merely show crybaby ignorance and thus false outrage. At least they consider themselves "the good hipster!"
  • clickonflick - Thursday, March 07, 2013 - link

    I agree that the price of this GPU is really high; one could easily assemble a mainstream laptop or desktop online with Dell at this price tag. But for gamers, to whom performance is above price, it is a boon.
