Meet The GeForce GTX Titan

As we briefly mentioned at the beginning of this article, the GeForce GTX Titan takes a large number of cues from the GTX 690. Chief among these is that it’s a luxury card, and as such is built to similar standards as the GTX 690. Consequently, like the GTX 690, Titan is essentially in a league of its own when it comes to build quality.

Much like the GTX 690 was to the GTX 590, Titan’s cooler is an evolution of the cooler found on the GTX 580. This means we’re looking at a card roughly 10.5” in length using a double-wide cooler. The basis of Titan’s cooler is a radial fan (blower) sitting towards the back of the card, with the GPU, RAM, and most of the power regulation circuitry in front of the fan. As a result the bulk of the hot air generated by Titan is blown forward and out of the card. However it’s worth noting that, unlike most other blowers, the back side technically isn’t sealed, and while there is relatively little circuitry behind the fan, it would be incorrect to state that the card is fully exhausting. With that said, leaving the back side of the card open seems to be more about noise and aesthetics than about heat management.

Like the GTX 580 but unlike the GTX 680, heat transfer is provided by a nickel-tipped aluminum heatsink attached to the GPU via a vapor chamber. We typically only see vapor chambers on premium cards due to their greater cost, or when space is at a premium. NVIDIA seems to be pushing the limits of heatsink size here, with the fins on Titan’s heatsink actually running beyond the base of the vapor chamber. Meanwhile, providing the thermal interface between the GPU itself and the vapor chamber is a silk-screened application of a high-end Shin-Etsu thermal compound; NVIDIA claims this compound offers over twice the performance of the GTX 680’s grease, although of all of NVIDIA’s claims this is the one we are least able to validate.

Moving on, catching what the vapor chamber doesn’t cover is an aluminum baseplate that runs along the card, not only providing structural rigidity but also cooling the VRMs and the RAM on the front side of the card. Baseplates aren’t anything new for NVIDIA, but again this is something we typically only see on their more premium cards.

Capping off Titan we have its most visible luxury aspects. Like the GTX 690 before it, NVIDIA has replaced virtually every bit of plastic with metal for aesthetic/perceptual purposes. This time the entire shroud and fan housing is composed of cast aluminum, which NVIDIA tells us is easier to cast than the mix of aluminum and magnesium the GTX 690 used. Meanwhile the polycarbonate window makes its return, allowing you to see Titan’s heatsink solely for the sake of it.

As for the back side of the card, in keeping with most of NVIDIA’s cards Titan runs with a bare back. The GDDR5 RAM chips don’t require any kind of additional cooling, and a metal backplate, while making for a great-feeling card, would occupy precious space and impede cooling in tight builds.

Moving on, let’s talk about the electrical details of Titan’s design. Whereas the GTX 680 was a 4+2 power phase design – 4 power phases for the GPU and 2 for the VRAM – Titan improves on this by moving to a 6+2 power phase design. I suspect the most hardcore of overclockers will be disappointed that Titan only has 6 phases for the GPU, but for most overclocking purposes this should be enough.

Meanwhile for RAM it should come as no particular surprise that NVIDIA is once more using 6GHz RAM here. Specifically, NVIDIA is using 24 6GHz Samsung 2Gb modules, adding up to the 6GB of RAM we see on the card; 12 modules are on the front of the card with the other 12 on the rear. The overclocking headroom on 6GHz RAM seems to vary from chip to chip, so while Titan should have some memory overclocking headroom, it’s hard to say just what the combination of luck and the wider 384-bit memory bus will allow.
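
For those who want to check the math, here is a minimal back-of-the-envelope sketch in Python, using only the figures quoted above (24 chips at 2Gb each, a 6GHz effective data rate, and the 384-bit bus), of how the chip count adds up to 6GB and what that implies for theoretical peak memory bandwidth:

```python
# Back-of-the-envelope check of Titan's memory figures quoted above.
chips = 24
chip_density_gbit = 2       # gigabits per chip
data_rate_gbps = 6          # effective GDDR5 data rate per pin
bus_width_bits = 384

total_vram_gb = chips * chip_density_gbit / 8             # 48 gigabits -> 6 GB
peak_bandwidth_gbs = data_rate_gbps * bus_width_bits / 8  # ~288 GB/s theoretical peak

print(f"Total VRAM: {total_vram_gb:.0f} GB")
print(f"Theoretical peak bandwidth: {peak_bandwidth_gbs:.0f} GB/s")
```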

Providing power for all of this is a pair of PCIe power sockets, a 6pin and an 8pin, for a combined total of 300W of capacity. With Titan only having a TDP of 250W in the first place, this leaves quite a bit of headroom before ever needing to run outside of the PCIe specification.
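
As a rough illustration of where that 300W figure comes from – assuming the standard PCIe allotments of 75W from the slot, 75W from a 6-pin connector, and 150W from an 8-pin connector – the arithmetic works out as follows:

```python
# Rough power-budget arithmetic; the per-connector limits are the standard PCIe
# spec allotments, not figures taken from NVIDIA.
slot_w = 75        # PCIe x16 slot
six_pin_w = 75     # 6-pin PCIe power connector
eight_pin_w = 150  # 8-pin PCIe power connector

capacity_w = slot_w + six_pin_w + eight_pin_w  # 300W of in-spec power delivery
tdp_w = 250                                    # Titan's rated TDP

print(f"In-spec capacity: {capacity_w}W, TDP: {tdp_w}W, headroom: {capacity_w - tdp_w}W")
```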

At the other end of Titan we can see that NVIDIA has once again gone back to their “standard” port configuration for the GeForce 600 series: two DL-DVI ports, one HDMI port, and one full-size DisplayPort. Like the rest of the 600 family, Titan can drive up to four displays, so this configuration is a good match. Though I would still like to see two mini-DisplayPorts in place of the full-size DisplayPort, in order to tap the greater functionality DisplayPort offers through its port conversion mechanisms.

Comments

  • vacaloca - Tuesday, February 19, 2013 - link

    A while ago when K20 released and my advisor didn't want to foot the bill, I ended up doing it myself. Looks like the K20 might be going to eBay since I don't need HyperQ MPI and GPU Direct RDMA or ECC for that matter. I do suspect that it might be possible to crossflash this card with a K20 or K20X BIOS and mod the softstraps to enable the missing features... but probably the video outputs would be useless (and warranty void, and etc) so it's not really an exercise worth doing.

    Props to NVIDIA for releasing this for us compute-focused people and thanks to AnandTech for the disclosure on FP64 enabling. :)
  • extide - Tuesday, February 19, 2013 - link

    Can you please run some F@H benchmarks on this card? I would be very very interested to see how well it folds. Also if you could provide some power consumption numbers (watts @ system idle and watts when gpu only is folding).

    That would be great :)
    Thanks!
  • Ryan Smith - Tuesday, February 19, 2013 - link

    OpenCL is broken with the current press drivers. So I won't have any more information until NVIDIA issues new drivers.
  • jimhans1 - Tuesday, February 19, 2013 - link

    Alright, the whining about this being a $1000 card is just stupid; nVidia has priced this right in my eyes on the performance/noise/temperature front. They have never billed this as being anything other than an Extreme style GPU, just like the 690. Yes, the 690 will outperform this in raw usage, but not by much I'm guessing, and it will run hotter, louder and use more power than the Titan, not to mention possible SLI issues that have plagued ALL SLI/CF on one PCB cards to date. If you want THE high end MAINSTREAM card, you get the 680; if you want the EXTREME card(s), you get the Titan or 690.

    Folks, we don't yell at Ferrari or Bugatti for pricing their vehicles to their performance capabilities; nobody yelled at Powercolor for pricing the Devil 13 at $1000 even though the 690 spanks it on ALMOST all fronts for $100 LESS.

    Yes, I wish I could afford 1 or 3 of the Titans; but I am not going to yell and whine about the $1000 price because I CAN'T afford them. It gives me a goal to try and save my shekels to get at least 2 of them before year's end; hopefully the price may (but probably won't) have dropped by then.
  • chizow - Tuesday, February 19, 2013 - link

    The problem with your car analogy is that Nvidia is now charging you Bugatti prices for the same BMW series you bought 2 years ago. Maybe an M3 level of trim this time around, but it's the same class of car, just 2x the price.
  • Sandcat - Wednesday, February 20, 2013 - link

    The high end 28nm cards have all been exercises in gouging. At least they're being consistent with the 'f*ck the customer because we have a duopoly' theme.
  • Kevin G - Tuesday, February 19, 2013 - link

    The card is indeed a luxury product. Like all consumer cards, this is crippled in some way compared to the Quadro and Tesla lines. Not castrating FP64 performance is big. I guess nVidia finally realized that the HPC market values reliability more than raw compute, and hence why EDC/ECC is disabled. Ditto for RDMA, though I strongly suspect that RDMA is still used for SLI between GeForce cards – just locked out with another vendor's hardware.

    The disabling of GPU Boost for FP64 workloads is odd. Naturally it should consume a bit more energy to do FP64 workloads, which would result in either higher temps at the same clocks as FP32 or lower clocks at the same temps. The surprise is that users don't have the flexibility to choose or adjust those settings.

    Display overclocking has me wondering exactly what is being altered. DVI and DP operate at distinct frequencies, and moving to a higher refresh rate at higher resolutions should also increase the required pixel clock (see the rough sketch after this thread). Cable quality would potentially have an impact here as well, though for lower resolutions, driving them at a higher refresh rate should still be within the cabling spec.
  • Kepe - Tuesday, February 19, 2013 - link

    The comment section is filled with NVIDIA hate, on how they dropped the ball, lost their heads, smoked too much and so on. What you don't seem to understand is that this is not a mainstream product. It's not meant for those who look at performance/$ charts when buying their graphics cards. This thing is meant for those who have too much money on their hands. Not the average Joe building his next gaming rig. And as such, this is a valid product at a valid price point. A bit like the X-series Intel processors. If you look at the performance compared to their more regular products the 1000+ dollar price is completely ridiculous.

    You could also compare the GTX Titan to a luxury phone. They use extravagant building materials, charge a lot of extra for the design and "bling", but raw performance isn't on the level of what you'd expect by just looking at the price tag.
  • jimhans1 - Tuesday, February 19, 2013 - link

    I agree, the pricing is in line with the EXPECTED user base for the card; it is NOT a mainstream card.
  • Sandcat - Tuesday, February 19, 2013 - link

    The disconnect regards the Gx110 chip. Sure, it's a non-mainstream card, however people do have the impression that it is the lock-step successor to the 580, and as such should be priced similarly.

    Nvidia does need to be careful here, they enjoy a duopoly in the market but goodwill is hard to create and maintain. I've been waiting for the 'real' successor to the 580 to replace my xfire 5850's and wasn't impressed with the performance increase of the 680. Looks like it'll be another year....at least.

    :(
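
On the display overclocking question raised in the thread above, here is a purely illustrative sketch of how resolution and refresh rate drive the pixel clock that a DVI or DisplayPort link has to carry. The blanking-interval overhead and the ~330MHz dual-link DVI ceiling used here are ballpark assumptions for illustration, not figures from the article:

```python
# Illustrative only: pixel clock scales with resolution and refresh rate, and the
# cable/link has to keep up. Blanking overhead and the DL-DVI limit are assumptions.
def pixel_clock_mhz(h_active, v_active, refresh_hz, blanking_overhead=0.10):
    """Approximate pixel clock with a simple allowance for blanking intervals."""
    return h_active * v_active * refresh_hz * (1 + blanking_overhead) / 1e6

DL_DVI_CEILING_MHZ = 330  # rough dual-link DVI pixel clock ceiling (assumption)

for refresh in (60, 85):
    clk = pixel_clock_mhz(2560, 1440, refresh)
    verdict = "within" if clk <= DL_DVI_CEILING_MHZ else "beyond"
    print(f"2560x1440 @ {refresh}Hz -> ~{clk:.0f} MHz pixel clock "
          f"({verdict} a ~{DL_DVI_CEILING_MHZ} MHz DL-DVI link)")
```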
