Final Thoughts

Bringing things to a close, most of what we’ve seen with Titan has been a long time coming. Since the introduction of GK110 back at GTC 2012, we’ve had a solid idea of how NVIDIA’s grandest GPU would be configured, and it was mostly a question of when it would make its way to consumer hands, and at what clockspeeds and prices.

The end result is that with the largest Kepler GPU now in our hands, the performance situation closely resembles the Fermi and GT200 generations. Which is to say that so long as you have a solid foundation to work from, he who builds the biggest GPU builds the most powerful GPU. And at 551mm², once more NVIDIA is alone in building massive GPUs.

No one should be surprised then when we proclaim that GeForce GTX Titan has unquestionably reclaimed the single-GPU performance crown for NVIDIA. It’s simply in a league of its own right now, reaching levels of performance no other single-GPU card can touch. At its very best, AMD’s Radeon HD 7970 GHz Edition can just match Titan, which is quite an accomplishment for AMD; but at Titan’s best, Titan is nearly a generation ahead of the 7970GE. Like its predecessors, Titan delivers the kind of awe-inspiring performance we have come to expect from NVIDIA’s most powerful video cards.

With that in mind, as our benchmark data has shown, Titan’s performance isn’t quite enough to unseat this generation’s multi-GPU cards like the GTX 690 or Radeon HD 7990. But with that said, this isn’t a new situation for us, and our editorial stance has not changed: we still suggest single-GPU cards over multi-GPU cards when performance allows for it. Multi-GPU technology is a great way to improve performance beyond what a single GPU can do, but as it’s always beholden to the need for profiles and to the inherent drawbacks of AFR rendering, we don’t believe it’s desirable in situations such as Titan versus the GTX 690. The GTX 690 may be faster, but Titan is going to deliver a more consistent experience, just not at quite the same framerates as the GTX 690.

Meanwhile, in the world of GPGPU computing Titan stands alone. Unfortunately we’re not able to run a complete cross-platform comparison due to Titan’s outstanding OpenCL issue, but from what we have been able to run, Titan is not only flat-out powerful, but NVIDIA has also seemingly delivered on their compute efficiency goals, giving us a Kepler family part capable of getting far closer to its theoretical efficiency than the GTX 680, and closer than any other GPU before it. We’ll of course be taking a further look at Titan in comparison to other GPUs once the OpenCL situation is resolved, in order to come to a better understanding of its relative strengths and weaknesses, but for the first wave of Titan buyers I’m not sure that’s going to matter. If you’re doing GPU computing, are invested in CUDA, and need a fast compute card, then Titan is the card CUDA developers and researchers have been dreaming of.
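
For those who want to put “theoretical efficiency” in concrete terms, the arithmetic is simple enough to sketch out. Below is a quick Python sketch of GK110’s peak numbers as configured on Titan; the measured SGEMM figure in it is purely illustrative, not one of our results:

    # Back-of-the-envelope peak throughput for Titan's GK110
    cuda_cores = 2688          # CUDA cores enabled on Titan
    base_clock_ghz = 0.837     # 837MHz base clock; boost runs slightly higher

    # Each CUDA core can retire one FMA (2 FLOPs) per clock
    peak_sp_gflops = cuda_cores * 2 * base_clock_ghz   # ~4500 GFLOPS FP32
    peak_dp_gflops = peak_sp_gflops / 3   # up to 1/3-rate FP64 with the double precision toggle on

    # "Compute efficiency" is simply measured throughput over theoretical peak
    measured_sgemm_gflops = 3100   # hypothetical measurement, for illustration only
    efficiency = measured_sgemm_gflops / peak_sp_gflops

    print(f"Peak: {peak_sp_gflops:.0f} GFLOPS SP, {peak_dp_gflops:.0f} GFLOPS DP")
    print(f"SGEMM efficiency: {efficiency:.0%}")

The closer a card’s sustained throughput gets to that ceiling, the less of its silicon is sitting idle, and it’s that ratio where Titan pulls ahead of the GTX 680.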

Back in the land of consumer gaming though, we have to contend with the fact that unlike any big-GPU card before it, Titan is purposely removed from the price/performance curve. NVIDIA has long wanted to ape Intel’s ability to have an extreme/luxury product at the very top end of the consumer product stack, and with Titan they’re finally doing exactly that.

The end result is that Titan is targeted at a different demographic than the GTX 580 or other such cards, a demographic that has the means and the desire to purchase such a product. Being used to seeing the best video cards go for less, we won’t call this a great development for the competitive landscape; but ultimately this is far from the first luxury-level computer part, so there’s not much else to say other than that this is a product for a limited audience. What that limited audience is getting, however, is nothing short of an amazing card.

As with the GTX 690, NVIDIA has once again set the gold standard for GPU construction, this time for a single-GPU card. The GTX 680 was a well-built card, but next to Titan it suddenly looks outdated. For example, despite Titan’s significantly higher TDP it’s no louder than the GTX 680, and the GTX 680 was already a quiet card. Next to price/performance, the most important metric is noise, and by focusing on build quality NVIDIA has unquestionably set a new standard for high-end, high-TDP video cards.

On a final note, normally I’m not one for video card gimmicks, but after having seen both of NVIDIA’s Titan concept systems I have to say NVIDIA has taken an interesting route in justifying the luxury status of Titan. With the Radeon HD 7970 GHz Edition only available with open-air or exotic cooling, Titan has been put into a position where it’s the ultimate blower card by a wide margin. The end result is that in scenarios where blowers are preferred and/or required, such as SFF PCs or tri-SLI, Titan is an even bigger improvement over the competition than it is in traditional desktops. Or as Anand so eloquently put it in his look at Falcon Northwest’s Tiki, when it comes to Titan, “The days of a high end gaming rig being obnoxiously loud are thankfully over.”

Wrapping things up, on Monday we’ll be taking a look at the final piece of the puzzle: Origin’s tri-SLI full tower Genesis PC. The Genesis has been an interesting beast for its use of water cooling with Titan, and with the Titan launch behind us we can now focus on what it takes to feed 3 Titan video cards and why it’s an impeccable machine for multi-monitor/surround gaming. So until then, stay tuned.

337 Comments

  • JeBarr - Thursday, February 21, 2013 - link

    I would guess because as time goes by the reviewers here (and elsewhere) think they need to bench at settings used by the "majority". Even when that majority doesn't frequent, or even know the existence of, Anandtech.com. Go figure.

    I don't like it any more than you do...but for different reasons.

    I for one was happy to have a review site still benching at 16:10...which is what the long-time hardware enthusiasts/gamers prefer, that is, when they can't find a good CRT monitor ;)

    Just think of this review as the new bench standard going forward. A new starting point, if you will.
  • Ryan Smith - Monday, February 25, 2013 - link

    Bench 2013 will be going live soon. The backend is done (it's what I used to store and generate the charts here), but the frontend is part of a larger project...

    As for why the settings change: when we refresh our suite, we sometimes change our settings to match what the latest generation of cards can do. With Titan setting the high bar, for example, running 2560 at Ultra with 4xMSAA is actually practical.
  • TheJian - Thursday, February 21, 2013 - link

    NO Borderlands 2 (~6 million copies sold, rated 89! not counting the add-ons, also rated high)
    No Diablo 3 (I hate the DRM, but 10 million+ sold; of course rated high, though not by users)
    No Guild Wars 2 (MMO with 3 million copies sold, rated 90!). Even WOW: Mists of Pandaria has 3 million or so now, and 11 million are playing the game's total content. I don't play WOW but it's still got a TON of users.
    No Assassin's Creed 3 (brings the 680/7970 to the low 30's at 2560x1600)
    Crysis 3: Warhead needs to die, and this needs to replace it (at the very LEAST). As shown below, NOBODY is playing Warhead. Wasted page space, and time spent benching it.

    Instead we get Crysis Warhead...ROFL. Well, what can we expect? Ryan still loves AMD.
    http://www.gametracker.com/search/warhead/
    Notice all the empty servers? Go ahead, sort them by players: only 3 had over 10!...Most have ZERO players...LOL...Why even waste your time benchmarking this ignored game? Just to show NV weakness?
    Dirt Showdown - Raise your hand if you play this...Nope, you're all playing Dirt 3 (wisely, or F1 etc., anything that rates better than Showdown)
    Ratings on Metacritic of 70 from critics and 4.7 from users (out of TEN, not 5), best summarized by GameSpy, which rated it a 40/100, on the front page of the Metacritic site: http://www.metacritic.com/game/pc/dirt-showdown
    "DiRT: Showdown delivers bargain-basement entertainment value for the high, high price of $50. With its neutered physics, limited driving venues, clunky multiplayer, and diminished off-road racing options, discerning arcade racing fans should just write this one off as an unanticipated pothole in Codemaster's trailblazing DiRT series. "
    If you're going to use a racing game, at least make it a good one, not just the one AMD wins in. Why not F1 2012 (scored 80 at Metacritic, 6.8 from users)? AMD wins in Warhead, which is also why Crysis Warhead is chosen even though nobody plays it (it's from 2008!). Again, check the server list: who are you testing this for? What does it represent today? What other game is based on its engine? It represents nothing, correct? Nobody plays Showdown either.

    How about adding some games people actually PLAY? I thought the whole point of benchmarking was to show us how the games WE PLAY will run; is that not true at Anandtech?

    Also, no discussion of the frame delay a la Tech Report:
    http://techreport.com/review/24381/nvidia-geforce-...
    No discussion of the frame latency issues that AMD is working on game by game. Their current beta, I think, just fixed the Skyrim/Borderlands/Guild Wars 2 issues, which were awful.
    http://techreport.com/review/24218/a-driver-update...
    This has been an ongoing problem Anandtech (Ryan?) seems to just ignore. AMD is only just getting around to fixing this stuff in Jan...LOL. You can read more about it in the rematch of the 660 Ti/7950 here:
    http://techreport.com/review/23981/radeon-hd-7950-...
    Of course you can start at the beginning, but this is where they recommend the 660 Ti and why (Dec 2012 article).
    "The FPS average suggests near-parity performance between the 7950 and the GTX 660 Ti, with a tiny edge to the GeForce. The 99th percentile frame time, though, captures the impact of the Radeon's frame latency issues and suggests the GTX 660 Ti is easily the superior performer."
    More:
    "Instead, we have a crystal clear recommendation of the GeForce GTX 660 Ti over the Radeon HD 7950 for this winter's crop of blockbuster games. Perhaps AMD will smooth out some of the rough patches in later driver releases, but the games we've tested are already on the market—and Nvidia undeniably delivers the better experience in them, overall. "
    Even Tom's Hardware reports on delays now (albeit with the wrong metric...LOL). Read the comments at Tech Report for why it's the wrong one.

    No wonder they left out the Xmas blockbusters and Diablo 3 (which will still sell probably 15 million over its life, even though I would never buy it). I can name other games that are hot and new also:
    Dishonored, Dead Space 3, Max Payne 3, all highly rated. Max Payne 3 barely hits the 50's on top cards at 2560x1600 (7970GHz; the 680 even lower), an excellent test game, and those are NOT the minimums (which can bring you to the 20's/teens on lower cards). Witcher 2 (Witcher 3 is coming) with ubersampling ENABLED is a taxer also.

    Dragon Age 2 at 2560x1600 will bring the 7970/680 to the teens/20's at minimums also, and barely hits 40's average (why TechSpot uses ONLY averages I don't know, but it's better than maxes).
    http://www.techspot.com/review/603-best-graphics-c...

    START reporting MIN FPS for every game benched! There should be more discussion of the fact that in a lot of these games you hit the teens even on $500 cards at 2560x1600 maxed out. Max FPS means NOTHING. If you hit 10-20fps a lot in a game, your max means nothing. You won't want to play at that res, so what have you shown me? NOTHING. You should ALWAYS report MIN FPS, as that dictates our gameplay experience, and if it isn't always above 30, life usually sucks. Far Cry 3 drops below 30 on both the 680 and 7970 at 2560x1600.
    http://www.hardocp.com/article/2013/02/21/nvidia_g...
    And they don't have them on ULTRA; only Titan is, and none are on 4xMSAA. At least they're giving the max details/res you can expect to play at and what the min will be (better: you at least have USEFUL info after reading their benchmarks).

    From your article:
    "This is enough to get Titan to 74fps at 2560 with 4xMSAA, which is just fast enough to make BF3 playable at those settings with a single GPU."
    Why didn't you just report the minimums so we can see when ALL cards hit 30fps or less at all resolutions tested? If the game doesn't give a way to do this, use FRAPS while running it (again, for ALL games). So it takes 74fps to get playable in BF3? It's easier to just give the minimums so people can see; otherwise, are we supposed to extrapolate every one of your games without MINS listed? You did it for us in this sentence, but for ONE card, and even then it's just a comment, not a number we can work with. It's YOU extrapolating your own guess that it would be playable given 74fps. What kind of benchmarking is this?

    I won't even get into your other comments throughout the Titan articles; it's more important to me to key on what you totally ignore that is VERY important to anyone picking ANY gpu: SMOOTHNESS of gameplay (latency testing) and MIN FPS, so we know where we have no prayer of playing and what to expect to be playable on a given gpu. This is why Hardocp actually points to you guys as an example of why your benchmarks suck. It's linked in most of their articles...LOL. FIX IT.
    http://www.hardocp.com/article/2008/02/11/benchmar...
    They have that in nearly every gpu article, including the Titan article. It's a valid point. But if you're not going to use IN-GAME play, at least give min fps for the canned benchmarks. That link is in the test setup page of nearly every article on Hardocp; you'd think you'd fix this so they'd stop. Your benchmarks represent something that doesn't reflect gameplay in most cases. Max FPS doesn't dictate the fun factor. MIN does.
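
    And in case it's not clear what I'm asking for, the math on a frametime dump is trivial. Here's a quick sketch in Python (assuming a FRAPS-style log with one frametime in milliseconds per line; the filename is made up):

        # Turn a per-frame frametime log (ms) into the metrics that matter
        frametimes_ms = [float(line) for line in open("frametimes.csv")]

        avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
        min_fps = 1000 / max(frametimes_ms)   # the single worst frame

        # Techreport's 99th percentile frame time: 1% of frames are slower than this
        slowest_first = sorted(frametimes_ms, reverse=True)
        p99_ms = slowest_first[len(slowest_first) // 100]

        print(f"avg {avg_fps:.1f} fps, min {min_fps:.1f} fps, 99th pct {p99_ms:.1f} ms")

    The data is already sitting in the logs every time you bench; reporting the average and throwing the rest away is the whole problem.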

    One comment on Titan: I'd think about it at $800-850. Compute isn't important at home for me today, and won't be until more games use it like Civ 5 does (they're just scratching the surface here). At that point this card could become a monster compared to the 690, without the heat, noise, etc. One day it may be worth $1000 to me, but for now it's not worth more than $800 (to me; no SFF needed, no compute needed). I don't like any dual-chip cards or running multiple cards (see microstutter, latency delays, etc.), so once it's cheaper this would be tops on my list, though I don't usually spend over $360 on a card anyway...LOL. Most of the first run will go to boutique shops (20K first run, I think). Maybe they'll drop the price after that.

    LOL at anyone thinking the price sucks. Clearly you are NOT the target market. If your product sells out at a given price, you priced it right. That's good business, and if it's gone in hours you probably should have asked for more. You can still run Titans in SLI in an SFF; what other card can do that? You always pay a premium for the TOP card. Intel's Extreme chips are $1000 too...no surprise. The same thing on the pro side is $2500 and not much different. It's 20% slower than the 690, but the 690 can't go into an SFF for the most part, and certainly not as quietly or as controllably. It also blows away the 690 in compute, if someone is after that. Though they need APPS that test this, not some homemade anandtech benchmark. How about testing something I can actually USE that is relevant (no, I don't count folding@home or bitcoin mining either; they don't make me money...a few coins?...LOL).
  • JeBarr - Thursday, February 21, 2013 - link

    I'm pretty sure Ryan has mentioned the benches you want are forthcoming. Maybe they haven't figured it all out yet...I dunno...but like you, I've been waiting what seems like a year or more for Anandtech to catch up with reality in GPU benching.
  • CeriseCogburn - Tuesday, February 26, 2013 - link

    Yes, well I've found Frame Rate Target to be an absolute GEM in this area:

    " START reporting MIN FPS for every game benched! There should be more discussion of the fact that in a lot of these games you hit teens for even $500 cards at 2560x1600 maxed out. Max fps means NOTHING. IF you hit 10-20fps a lot in a game your max means nothing. "

    If you crank up to max settings and then have frame drop issues, FRAME RATE TARGET, by nVidia of course, is excellent for minimizing or eliminating that issue.
    It really is a great and usable feature, and of course it's now for the most part completely ignored.

    It was ported back to at least the top 500 series cards (I don't remember exactly which ones right now), but that feature should have an entire article dedicated to it at every review site. It is AWESOME, and directly impacts minimum frame rates, lifting nVidia to absolutely playable levels vs. AMD.

    I really think the bias won't ever be overcome. We used to hear nothing but Eyefinity, yet now that nVidia cards are capable of 4 monitors out of the box, it has suddenly become very unpopular for reviewers to mention Eyefinity, or Surround, and Surround plus ONE MORE in nVidia's case, without the need for any special adapters in many of nVidia's partners' card releases.

    So, it's really a sick situation.
  • Urbanos - Friday, February 22, 2013 - link

    He went through all the trouble of benchmarking in order to show that Titan can serve as an entry point for budget-conscious GPGPU users, but it doesn't actually prove that Titan is even worth the money without comparing it to at least 1 of its bigger competitors in the GPGPU market. Can you please consider adding that, or doing a new review based on compute only?
  • codedivine - Friday, February 22, 2013 - link

    I am certainly interested in looking at the Xeon Phi if I can find the time and if we can arrange the resources to do so.

    My performance expectation (based on Intel white papers) is about 1700-1800 GFLOPS for SGEMM and 800-900 GFLOPS for DGEMM on the Xeon Phi 5110P. However, there are also a few benchmarks where I expect the Phi to win as well, thanks to its large cache. Stay tuned.
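
    For anyone wondering where those numbers come from: a GEMM on NxN matrices costs about 2*N^3 FLOPs, so the rate is just that divided by the runtime. A minimal sketch of the accounting in Python (NumPy on the CPU purely for illustration; the GPU and Phi runs do the same bookkeeping around the vendor BLAS):

        import time
        import numpy as np

        # SGEMM: C = A*B on NxN single precision matrices costs ~2*N^3 FLOPs
        n = 4096
        a = np.random.rand(n, n).astype(np.float32)
        b = np.random.rand(n, n).astype(np.float32)

        start = time.perf_counter()
        c = np.dot(a, b)
        elapsed = time.perf_counter() - start

        gflops = 2 * n**3 / elapsed / 1e9
        print(f"SGEMM {n}x{n}: {gflops:.0f} GFLOPS")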
  • Ryan Smith - Monday, February 25, 2013 - link

    This is really a consumer/prosumer level review, so the cards we're going to judge it against need to be comparable in price and intended audience. Not only can we not get some of those parts, but all of them cost many times more than Titan.

    If we were ever able to review K20, then they would be exactly the kinds of parts we'd try to include though.
  • kivig - Friday, February 22, 2013 - link

    There is a whole community of 3D people interested.
    Or when will it get added to the bench table?
  • etriky - Saturday, February 23, 2013 - link

    +1
    Since this card at this price point is pointless for gaming, I figured the article would be heavy on compute applications in order to give us a reason for its existence.

    But then, nothing. No SmallLuxGPU or Cycles. Not even any commercial packages like Octane, or any of the Adobe products. I know LuxGPU and Blender used to be in the test suite. What happened?
