Who’s Titan For, Anyhow?

Having established performance expectations, let’s talk about where Titan fits into NVIDIA’s product stack. First and foremost, though Titan is certainly geared in part towards gaming (and that’s largely how we’ll be looking at it), that’s not the only role it serves. Titan is also going to be NVIDIA’s entry-level compute card. We’ll dive into why that is in our feature breakdown, but the biggest factor is that for the first time on any consumer-level NVIDIA card, double precision (FP64) performance is uncapped. That means 1/3 FP32 performance, or roughly 1.3 TFLOPS of theoretical FP64 performance. NVIDIA has taken other steps to keep Titan from being treated as a cheap Tesla K20, but for lighter workloads it should fit the bill.
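
To put that ratio in numbers, here’s a quick back-of-the-envelope sketch of the theoretical throughput. To be clear, this is our own illustration rather than anything from NVIDIA: the 2688 CUDA cores and 837MHz base clock are Titan’s published specifications, the peak_tflops helper is purely illustrative, and the sustained FP64 clock is an open question since GPU Boost is disabled for FP64 work, which is presumably why the quoted figure is closer to 1.3 TFLOPS than the 1.5 TFLOPS the base clock alone would imply.

```python
# Back-of-the-envelope peak-throughput arithmetic; an illustration, not an
# official NVIDIA figure. 2688 CUDA cores and the 837MHz base clock are
# Titan's published specs; 2 ops/cycle assumes fused multiply-add (FMA).

def peak_tflops(cuda_cores, clock_ghz, fp64_ratio=1.0 / 3.0):
    """Peak rate = cores x 2 FMA ops/cycle x clock; FP64 is a fraction of FP32."""
    fp32 = cuda_cores * 2 * clock_ghz / 1000.0
    return fp32, fp32 * fp64_ratio

fp32, fp64 = peak_tflops(cuda_cores=2688, clock_ghz=0.837)
print(f"FP32: {fp32:.1f} TFLOPS, FP64 at 1/3 rate: {fp64:.1f} TFLOPS")
# -> FP32: 4.5 TFLOPS, FP64 at 1/3 rate: 1.5 TFLOPS at the base clock; sustained
#    FP64 clocks (and thus real throughput) will be somewhat lower.
```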

As compared to the server and high-end workstation market that Tesla carves out, NVIDIA will be targeting the compute side of Titan towards researchers, engineers, developers, and others who need access to (relatively) cheap FP64 performance and don’t need the scalability or reliability that Tesla brings. To that end Titan essentially stands alone in NVIDIA’s product stack; the only alternatives are the FP64-constrained consumer cards below it and the much more expensive Tesla K20 above it.

Far more complex will be the gaming situation. Titan will not be pushing anything down in NVIDIA’s product stack; rather, NVIDIA’s product stack will be growing upwards to accommodate Titan. Like the GTX 690, NVIDIA is going to position Titan as a premium/luxury product, releasing it at the same $999 price point. The GTX 690 itself will continue to exist at that $999 price point, while the GTX 680 will continue at its current price of roughly $450.

Continuing the GTX 690 analogies, Titan will not only be sharing the GTX 690’s price point, but also its design principles and distribution. This means Titan is a well-built card whose housing is composed primarily of metal, with the same kind of luxury finish as the GTX 690. On the distribution side of things, Asus and EVGA are once again NVIDIA’s exclusive partners for North America, and they will essentially be distributing reference Titan cards. In time we will see some specialized variations, with water cooling in particular being an obvious route for EVGA to take. Factory overclocks are also on the table.

With the above in mind, it goes without saying that while the GTX 690 had some historical precedent for its price, the same cannot be said for Titan. The price of NVIDIA’s top-tier single-GPU video cards hovered around $500 for the GeForce 400/500 series, and while NVIDIA attempted a $650 launch price with the GTX 280, the arrival of the Radeon HD 4870 quickly brought that price back down to earth. As such, Titan will be the most expensive single-GPU product out of NVIDIA yet.


Top To Bottom: GTX 680, GTX Titan, GTX 690

Ultimately NVIDIA is not even going to try to compete on a price-performance basis with Titan. There are a number of potential reasons for this – ranging from the competitive landscape to yields to the need for GK110 GPUs elsewhere within NVIDIA – and all of those reasons are probably true to some extent. Regardless, NVIDIA believes that, like the GTX 690, they can sell Titan as a luxury product, hence the $999 price tag. The GTX 680 and below will compose NVIDIA’s more traditional price-performance competitive fare.

As to be expected from such a price, NVIDIA’s marketing department will be handling Titan in a similar fashion to the GTX 690. This not only includes reiterating that Titan is intended to be a luxury product, but also focusing on markets where luxury products are accepted and where Titan in particular makes sense.

Perhaps not surprisingly, with $999 video cards the makeup of consumers shifts away from both traditional big-box OEMs and DIY builders, and towards boutique builders. Boutique builders are essentially already in the business of providing luxury computers, offering an outlet for luxury buyers who would rather not spend their time building their own computer and want something of higher quality than what the typical OEM provides. As such, while Titan will be sold on the open market just like any other card, NVIDIA tells us that they expect a lot of those buyers to be the boutiques.

For Titan in particular, NVIDIA is going to be focusing on two boutique computer concepts, reflecting the blower design of Titan as opposed to the front/back exhausting design of the GTX 690. The first concept will be small form factor (SFF) PCs, where blowers are a necessity due to a lack of space (and often, a lack of heavy sound dampening), and where such cards can draw in fresh air from outside the chassis.

On the other end of the spectrum will be the ultra-enthusiast market, where one Titan isn’t enough and even two may come up short. Again thanks to the fact that it’s a blower, Titan can easily be fitted to an ATX motherboard for tri-SLI operation, which NVIDIA envisions not just as the ultimate gaming computer, but as the ultimate NV/3D Surround computer in particular. Multi-monitor gaming with graphically intensive games can quickly overwhelm even a single Titan, so tri-SLI is NVIDIA’s solution to driving three monitors as well as one Titan can drive one monitor. At the same time, however, NVIDIA intends to showcase that a tri-SLI system doesn’t need to be loud, despite the cramped conditions and despite the 750W+ that three Titans will pull, thanks to the high quality construction of the cards. Tri-SLI has been possible for a number of years, but NVIDIA believes that with Titan in particular they have a solid grip on the heat and noise concerns it typically comes with.
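
As a rough sanity check on that 750W+ figure and what it implies for the rest of the build, here’s a simple power-budget sketch. Only the 250W per-card TDP is an official Titan spec; the platform draw and PSU efficiency figures are illustrative assumptions on our part.

```python
# Rough tri-SLI power-budget sketch. The 250W TDP is Titan's official board
# power; the platform draw and PSU efficiency below are illustrative
# assumptions, not measurements.

TITAN_TDP_W = 250         # per-card board power (official spec)
NUM_CARDS = 3
PLATFORM_W = 250          # assumed: CPU, motherboard, drives, fans under load
PSU_EFFICIENCY = 0.90     # assumed: a good PSU at this load level

gpu_load_w = TITAN_TDP_W * NUM_CARDS      # the "750W+" the cards alone pull
dc_load_w = gpu_load_w + PLATFORM_W       # what the PSU must deliver
wall_draw_w = dc_load_w / PSU_EFFICIENCY  # what the wall outlet sees

print(f"GPUs: {gpu_load_w}W, DC load: {dc_load_w}W, at the wall: ~{wall_draw_w:.0f}W")
# -> GPUs: 750W, DC load: 1000W, at the wall: ~1111W
```

Under those assumptions a tri-SLI Titan build lands squarely in kilowatt-power-supply territory.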

To that end, as part of the Titan launch NVIDIA has shipped out a number of boutique systems in either SFF or tri-SLI full tower configurations to reviewers, in order to show off these usage concepts in complete and well-constructed systems. Anand received an SFF Tiki from Falcon Northwest, while I have received a tri-SLI equipped Genesis from Origin PC. As with Titan itself, we can’t talk about the performance of these systems yet, but we’ll be able to go into greater detail on Thursday when the complete NDA lifts. In the meantime we’ve been able to post a few impressions, which we’ve put up in their respective articles.

Moving on, with a $999 launch price NVIDIA’s competition will be rather limited. The GTX 690 is essentially a companion product; NVIDIA’s customers can either get the most powerful single-GPU card NVIDIA offers in a blower design, or two lesser GPUs in SLI on one board with a front and rear exhausting design. The GTX 690 will be the faster card, but at a higher TDP and with the general drawbacks of SLI. Titan, on the other hand, will be the more consistent card, the lower TDP card, and the easier card to cool, but also the slower card. Meanwhile, though it’s not a single product, GTX 680 SLI will be another option, offering higher performance, a higher TDP, more noise, and a cheaper price tag of around $900.

As for AMD, their fastest single-GPU video card is the 7970 GHz Edition, which offers performance closer to the GTX 680 than to Titan, so Titan essentially sits in a class of its own on the single-GPU front. AMD’s competition for Titan will be the 7970GE in CrossFire, and then the officially unofficial 7990 family, composed of the air-cooled PowerColor 7990 and the closed-loop water-cooled Asus Ares II. But with NVIDIA keeping the GTX 690 around, these are probably closer competitors to the multi-GPU 690 than they are to the single-GPU Titan.

Finally, let’s talk launch availability. By scheduling the launch of Titan during the Chinese New Year, NVIDIA has essentially guaranteed this will be a delayed availability product. Widespread availability is expected on the 25th, though cards may start popping up a couple of days earlier. NVIDIA hasn’t gone into detail on launch quantities, but they did specifically shoot down the 10,000 card rumor; this won’t be a limited run product, and we don’t have any reason at this time to believe this launch will be much different from the GTX 690’s (tight at first, but available and increasingly plentiful).

February 2013 GPU Pricing Comparison

AMD                            Price    NVIDIA
                               $1000    GeForce GTX Titan / GeForce GTX 690
Radeon HD 7990 (Unofficial)    $900
Radeon HD 7970 GHz Edition     $450     GeForce GTX 680
Radeon HD 7970                 $390
                               $350     GeForce GTX 670
Radeon HD 7950                 $300

 

Comments

  • vacaloca - Tuesday, February 19, 2013 - link

    A while ago when K20 released and my advisor didn't want to foot the bill, I ended up doing it myself. Looks like the K20 might be going to eBay since I don't need HyperQ MPI and GPU Direct RDMA or ECC for that matter. I do suspect that it might be possible to crossflash this card with a K20 or K20X BIOS and mod the softstraps to enable the missing features... but probably the video outputs would be useless (and warranty void, and etc) so it's not really an exercise worth doing.

    Props to NVIDIA for releasing this for us compute-focused people and thanks to AnandTech for the disclosure on FP64 enabling. :)
  • extide - Tuesday, February 19, 2013 - link

    Can you please run some F@H benchmarks on this card? I would be very very interested to see how well it folds. Also if you could provide some power consumption numbers (watts @ system idle and watts when gpu only is folding).

    That would be great :)
    Thanks!
  • Ryan Smith - Tuesday, February 19, 2013 - link

    OpenCL is broken with the current press drivers. So I won't have any more information until NVIDIA issues new drivers.
  • jimhans1 - Tuesday, February 19, 2013 - link

    Alright, the whining about this being a $1000 card is just stupid; nVidia has priced this right in my eyes on the performance/noise/temperature front, and they have never billed this as being anything other than an Extreme style GPU, just like the 690. Yes, the 690 will outperform this in raw usage, but not by much I’m guessing, and it will run hotter, louder and use more power than the Titan, not to mention possible SLI issues that have plagued ALL SLI/CF on one PCB cards to date. If you want THE high end MAINSTREAM card, you get the 680; if you want the EXTREME card(s), you get the Titan or 690.

    Folks, we don't yell at Ferrari or Bugatti for pricing their vehicles to their performance capabilities; nobody yelled at Powercolor for pricing the Devil 13 at $1000 even though the 690 spanks it on ALMOST all fronts for $100 LESS.

    Yes, I wish I could afford 1 or 3 of the Titans; but I am not going to yell and whine about the $1000 price because I CAN'T afford them. It gives me a goal to try and save my shekels to get at least 2 of them before year's end; hopefully the price may (but probably won't) have dropped by then.
  • chizow - Tuesday, February 19, 2013 - link

    The problem with your car analogy is that Nvidia is now charging you Bugatti prices for the same BMW series you bought 2 years ago. Maybe an M3 level of trim this time around, but it's the same class of car, just 2x the price.
  • Sandcat - Wednesday, February 20, 2013 - link

    The high end 28nm cards have all been exercises in gouging. At least they're being consistent with the 'f*ck the customer because we have a duopoly' theme.
  • Kevin G - Tuesday, February 19, 2013 - link

    The card is indeed a luxury product. Like all consumer cards, this is crippled in some way compared to the Quadro and Tesla lines. Not castrating FP64 performance is big. I guess nVidia finally realized that the HPC market values reliability more than raw compute, and hence why EDC/ECC is disabled. Ditto for RDMA, though I strongly suspect that RDMA is still used for SLI between Geforce cards - just a lock out to another vendor's hardware.

    The disabling of GPU Boost for FP64 workloads is odd. Naturally it should consume a bit more energy to do FP64 workloads, which would result in either higher temps at the same clocks as FP32 or lower clocks at the same temps as FP32. The surprise is that users don't have the flexibility to choose or adjust those settings.

    Display overclocking has me wondering exactly what is being altered. DVI and DP operate at distinct frequencies and moving to a higher refresh rate at higher resolutions should also increase this. Cable quality would potentially have an impact here as well. Though for lower resolutions, driving them at a higher refresh rate should still be within the cabling spec.
  • Kepe - Tuesday, February 19, 2013 - link

    The comment section is filled with NVIDIA hate, on how they dropped the ball, lost their heads, smoked too much and so on. What you don't seem to understand is that this is not a mainstream product. It's not meant for those who look at performance/$ charts when buying their graphics cards. This thing is meant for those who have too much money on their hands. Not the average Joe building his next gaming rig. And as such, this is a valid product at a valid price point. A bit like the X-series Intel processors. If you look at the performance compared to their more regular products the 1000+ dollar price is completely ridiculous.

    You could also compare the GTX Titan to a luxury phone. They use extravagant building materials, charge a lot of extra for the design and "bling", but raw performance isn't on the level of what you'd expect by just looking at the price tag.
  • jimhans1 - Tuesday, February 19, 2013 - link

    I agree, the pricing is in line with the EXPECTED user base for the card; it is NOT a mainstream card.
  • Sandcat - Tuesday, February 19, 2013 - link

    The disconnect regards the GK110 chip. Sure, it's a non-mainstream card, however people do have the impression that it is the lock-step successor to the 580, and as such should be priced similarly.

    Nvidia does need to be careful here, they enjoy a duopoly in the market but goodwill is hard to create and maintain. I've been waiting for the 'real' successor to the 580 to replace my xfire 5850's and wasn't impressed with the performance increase of the 680. Looks like it'll be another year....at least.

    :(
