Meet The GeForce GTX Titan

As we briefly mentioned at the beginning of this article, the GeForce GTX Titan takes a large number of cues from the GTX 690. Chief among these is that it’s a luxury card, and as such is built to similar standards as the GTX 690. Consequently, Titan is essentially in a league of its own when it comes to build quality.

Much like the GTX 690’s cooler was an evolution of the GTX 590’s, Titan’s cooler is an evolution of the one found on the GTX 580. This means we’re looking at a card roughly 10.5” in length using a double-wide cooler. The basis of Titan’s cooler is a radial fan (blower) sitting towards the back of the card, with the GPU, RAM, and most of the power regulation circuitry in front of the fan. As a result the bulk of the hot air generated by Titan is blown forward and out of the card. However it’s worth noting that unlike most other blower designs, the back side technically isn’t sealed; while there is relatively little circuitry behind the fan, it would be incorrect to state that the card is fully exhausting. With that said, leaving the back side of the card open seems to be more about noise and aesthetics than heat management.

Like the GTX 580 but unlike the GTX 680, heat transfer is provided by a nickel-tipped aluminum heatsink attached to the GPU via a vapor chamber. We typically only see vapor chambers on premium cards due to their greater cost, or where space is at a premium. NVIDIA seems to be pushing the limits of heatsink size here, with the fins on Titan’s heatsink actually running beyond the base of the vapor chamber. Meanwhile, providing the thermal interface between the GPU itself and the vapor chamber is a silk-screened application of a high-end Shin-Etsu thermal compound; NVIDIA claims this compound offers over twice the performance of the GTX 680’s grease, although of all of NVIDIA’s claims this is the hardest for us to validate.

Moving on, catching what the vapor chamber doesn’t cover is an aluminum baseplate that runs along the card, not only providing structural rigidity but also cooling for the VRMs and the RAM on the front side of the card. Baseplates aren’t anything new for NVIDIA, but again they’re something we rarely see outside of their more premium cards.

Capping off Titan we have its most visible luxury aspects. Like the GTX 690 before it, NVIDIA has replaced virtually every bit of plastic with metal for aesthetic/perceptual purposes. This time the entire shroud and fan housing is composed of cast aluminum, which NVIDIA tells us is easier to cast than the mix of aluminum and magnesium that the GTX 690 used. Meanwhile the polycarbonate window makes its return, allowing you to see Titan’s heatsink solely for the sake of it.

As for the back side of the card, in keeping with most of NVIDIA’s cards Titan runs with a bare back. The GDDR5 RAM chips there don’t require any kind of additional cooling, and while a metal backplate makes for a great-feeling card, it occupies precious space and can impede cooling in tight spaces.

Moving on, let’s talk about the electrical details of Titan’s design. Whereas the GTX 680 was a 4+2 power phase design – 4 power phases for the GPU and 2 for the VRAM – Titan improves on this by moving to a 6+2 power phase design. I suspect the most hardcore of overclockers will be disappointed with Titan having only 6 phases for the GPU, but for most overclocking purposes this should be enough.

Meanwhile for RAM, it should come as no particular surprise that NVIDIA is once more using 6GHz RAM here. Specifically, NVIDIA is using 24 6GHz Samsung 2Gb modules, adding up to the 6GB of RAM we see on the card, with 12 modules on the front and the other 12 on the rear. The overclocking headroom on 6GHz RAM seems to vary from chip to chip, so while Titan should have some memory overclocking headroom, it’s hard to say just what the combination of luck and the wider 384-bit memory bus will allow.
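As a quick sanity check on those numbers (treating the quoted 6GHz as the effective per-pin data rate, the usual shorthand for GDDR5 of this era), the capacity and peak memory bandwidth work out as follows; this is just back-of-the-envelope arithmetic, not a figure from NVIDIA:

```python
# Titan memory configuration: 24 Samsung 2Gb GDDR5 modules on a 384-bit bus
modules = 24
gbit_per_module = 2                # gigabits per chip
capacity_gb = modules * gbit_per_module / 8   # bits -> bytes
print(capacity_gb)                 # 6.0 GB total

bus_width_bits = 384
data_rate_gbps = 6                 # effective per-pin data rate ("6GHz")
bandwidth_gbps = bus_width_bits * data_rate_gbps / 8
print(bandwidth_gbps)              # 288.0 GB/s peak theoretical bandwidth
```

The 288GB/s figure also makes clear why the wider 384-bit bus matters for overclocking: every extra 100MHz of effective memory clock is worth half again as much bandwidth as it would be on the GTX 680’s 256-bit bus.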

Providing power for all of this is a pair of PCIe power sockets, a 6pin and an 8pin, which combined with the PCIe slot itself give the card up to 300W of capacity to draw from. With Titan only having a TDP of 250W in the first place, this leaves quite a bit of headroom before ever needing to run outside of the PCIe specification.
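The headroom math is simple enough to sketch out, assuming the standard in-spec limits of 75W from a PCIe x16 slot, 75W from a 6pin socket, and 150W from an 8pin socket:

```python
# In-spec PCIe power sources available to Titan (watts)
slot_w = 75        # PCIe x16 slot
six_pin_w = 75     # 6pin auxiliary power socket
eight_pin_w = 150  # 8pin auxiliary power socket

capacity_w = slot_w + six_pin_w + eight_pin_w
tdp_w = 250
headroom_w = capacity_w - tdp_w
print(capacity_w, headroom_w)  # 300 50
```

That 50W of in-spec headroom is what overclockers will be dipping into when raising the card’s power target.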

At the other end of Titan we can see that NVIDIA has once again gone back to their “standard” port configuration for the GeForce 600 series: two DL-DVI ports, one HDMI port, and one full-size DisplayPort. Like the rest of the 600 family, Titan can drive up to four displays, so this configuration is a good match. Though I would still like to see two mini-DisplayPorts in place of the full-size DisplayPort, in order to tap the greater functionality DisplayPort offers through its port conversion mechanisms.


  • WhoppingWallaby - Thursday, February 21, 2013 - link

    Dude, you have some gall calling another person a fanboy. We could all do without your ranting and raving, so go troll elsewhere or calm down a little.
  • CeriseCogburn - Sunday, February 24, 2013 - link

    Oh shut up yourself you radeon rager.

    You idiots think you have exclusive rights to spew your crap all over the place, and when ANYONE disagrees you have a ***** fit and demand they stop.

    How about all you whining critical diaper pooping fanatics stop instead ?
  • IanCutress - Tuesday, February 19, 2013 - link

    It's all about single card performance. Everything just works easier with a single card. Start putting SLI into the mix and you need to account for drivers, and doing compute requires a complete reworking of code. Not to mention Titan's potentially lower power draw and better OC capabilities compared to a dual GPU card.

    At any given price point, getting two cards up to that cost will always be quicker than a single card in any scenario that can take advantage, if you're willing to put up with it. So yes, two GTX 680s, a 690, or a Titan is a valid question, and it's up to the user preference which one to get.

    I need to double check my wallet, see if it hasn't imploded after hearing the price.
  • wreckeysroll - Tuesday, February 19, 2013 - link

    lost their minds?
    how about fell and cracked their head after losing it. Smoking too much of that good stuff down there in California.

    How stupid do they take us for? Way to thumb your customers in the eye, nvidia. $1000 for a single gpu kit.

    Good laugh for the morning.
  • B3an - Tuesday, February 19, 2013 - link

    Use some ****ing common sense. You get what you pay for.

    6GB with a 384-bit memory bus, and a 561mm2 GPU. Obviously this won't be cheap, and there's no way this could be sold for anywhere near the price of a 680 without losing tons of money.

    Nvidia already had this thing in supercomputers anyway, so why not turn it into a consumer product? Some people WILL buy this. If you have the money, why not. At least NV aren't sitting on their arses like AMD, with no new high-end GPUs this year. Even though I have AMD cards I'm very disappointed with AMD's crap lately, as an enthusiast and someone who's just interested in GPU tech. First they literally gave up on competitive performance CPUs, and now it looks like they're doing it with GPUs.
  • siliconfiber - Tuesday, February 19, 2013 - link

    Common sense is what you are missing.

    The GTX 580, 480, and 285 were all sold for much less than this card and were all used in HPC applications, had the same or much bigger die sizes, and the same or bigger buses. DDR memory is dirt cheap as well.

    I have seen it all now. Largest rip-off in the history of video cards right here.
  • Genx87 - Tuesday, February 19, 2013 - link

    Oh look, I have never seen this argument before. Biggest rip-off in the history of video cards. Preceded only by every high end video card release since the introduction of high end discrete GPUs. And it will remain a rip-off until the next high end GPU is released, surpassing this card's rip-off factor.
  • Blibbax - Tuesday, February 19, 2013 - link

    It's not a rip off because you don't have to buy it. The 680 hasn't gotten any slower.

    Just like with cars and anything else, when you add 50% more performance to a high-end product, it's gunna be a lot more than 50% more expensive.
  • johnthacker - Tuesday, February 19, 2013 - link

    The largest rip-off in the history of video cards are some of the Quadro cards. This is extremely cheap for a card with such good FP64 performance.
  • TheJian - Wednesday, February 20, 2013 - link

    The GTX 580 (40nm) was not in the same league as this and only had 3B transistors. Titan has 7.1B on 28nm. 512 CUDA cores compared to 2688? It came with 1.5GB memory too; this has 6GB. Etc etc. The 580 did not run like a $2500 pro card at a $1500 discount either. Also a chip this complicated doesn't YIELD well. It's very expensive to toss out the bad ones.

    Do you know the difference between system memory and graphics memory (you said DDR)? They do not cost the same. You meant GDDR? Well this stuff is 4x as much, running at 6GHz not 4GHz.

    Ref clock is 876 but these guys got theirs to 1176:
    http://www.guru3d.com/articles-pages/geforce_gtx_t...

    The card is a monster value vs. the $2500 K20. Engineering is not FREE. Ask AMD. They lost $1.18B last year selling crap at prices that would make you happy, I guess. That's how you go out of business. Get it? They haven't made money in 10yrs (lost $3-4B over that time as a whole). Think they should've charged more for their cards/chips the last ten years? I DO. If Titan is priced wrong, they will remain on the shelf. Correct? So if you're right, they won't sell. But these will be gone in a day, because there are probably enough people who would pay even $1500 for them that they'll sell out quickly. You have to pay $2500 to get this on the pro side.
