Final Thoughts

NVIDIA is primarily pitching the GeForce GTX 780 as the next step in their high-end x80 line of video cards, a role it fits well. At the same time, however, I can't help but keep coming back to GTX Titan comparisons, because the GTX 780 is by every metric a cut-down GTX Titan. Whether that's a good thing or not is open to debate, but between NVIDIA's entry into the prosumer market with GTX Titan and the fact that there's now a single-GPU video card above the traditionally top-tier x80 card, this launch is more complicated than past x80 card launches.

Anyhow, we’ll start with the obvious: the GeForce GTX 780 is a filler card whose most prominent role will be filling the gap between sub-$500 cards and this odd prosumer/luxury/ultra-enthusiast market that has taken root above $500. If there’s to be a $1000 single-GPU card in NVIDIA’s product stack then it’s simply good business to have something between that and the sub-$500 market, and that something is the GTX 780.

For the small number of customers that can afford a card in this price segment, the GTX 780 is an extremely strong contender. In fact it’s really the only contender – at least as far as single-GPU cards go – as AMD won’t directly be competing with GK110. The end result is that with the GTX 780 delivering an average of 90% of Titan’s gaming performance for 65% of the price, this is by all rights the Titan Mini, the cheaper video card Titan customers have been asking for. From that perspective the GTX 780 is nothing short of an amazing deal for the level of performance offered, especially since it maintains the high build quality and impressive acoustics that helped to define Titan.

On the other hand, as an x80 card the GTX 780 is pretty much a tossup. The full generational performance improvement is absolutely there, as the GTX 780 beats the last-generation GTX 580 by an average of 80%. NVIDIA knows their market well, and for most buyers in a 2-3 year upgrade cycle this is the level of performance necessary to spur on an upgrade.

The catch comes down to pricing. $650 for the GTX 780 makes all the sense in the world from NVIDIA’s perspective – GTX Titan sales have exceeded NVIDIA’s expectations – so between that and Tesla K20 sales the GK110 GPU is in high demand right now. At the same time the performance of the GTX 780 is high enough that AMD can’t directly compete with the card, leaving NVIDIA without competition and free to set prices as they would like, and this is exactly what they have done.

This doesn’t make GTX 780 a bad card, and on the contrary it’s probably a better card than any x80 card before it, particularly when it comes to build quality. But it’s $650 for a product tier that for the last 5 years was a $500 product tier. To that end no one likes a price increase, ourselves included. Ultimately some fraction of the traditional x80 market will make the jump to $650, and for the rest there will be the remainder of the GeForce 700 family or holding out for the eventual GeForce 800 family.

Moving on, it’s interesting to note that with the launch of Titan and now the GTX 780, the high-end single-GPU market looks almost exactly like it did back in 2011. The prices have changed, but otherwise we’ve returned to unchallenged NVIDIA domination of the high end, with AMD fighting the good fight at lower price points. The 22% performance advantage that the GTX 780 enjoys over the Radeon HD 7970GHz Edition cements NVIDIA’s performance lead, while the price difference between the cards means that the 7970GE is still a very strong contender in its current $400 market and a clear budget-saving spoiler like the 6970 before it.

Finally, to bring things to a close we turn our gaze towards the future of the rest of the GeForce 700 family. The GTX 780 is the first member of the GeForce 700 family, but it clearly won’t be the last. A cut-down GK110 card as the GTX 780 was the logical progression for NVIDIA, but what will replace the GTX 670 is a far murkier question, as NVIDIA has a number of good choices at their disposal. Mull that over for a bit, and hopefully we’ll be picking up the subject soon.


  • Stuka87 - Thursday, May 23, 2013 - link

    The video card does handle the decoding and rendering for the video. Anand has done several tests over the years comparing their video quality. There are definite differences between AMD/nVidia/Intel.
  • JDG1980 - Thursday, May 23, 2013 - link

    Yes, the signal is digital, but the drivers often have a bunch of post-processing options which can be applied to the video: deinterlacing, noise reduction, edge enhancement, etc.
    Both AMD and NVIDIA have some advantages over the other in this area. Either is a decent choice for an HTPC. Of course, no one in their right mind would use a card as power-hungry and expensive as a GTX 780 for an HTPC.

    In the case of interlaced content, either the PC or the display device *has* to apply post-processing or else it will look like crap. The rest of the stuff is, IMO, best left turned off unless you are working with really subpar source material.
  • Dribble - Thursday, May 23, 2013 - link

    To both of you above: that's true of DVD, but not Blu-ray - there's no interlacing, noise, or edges to reduce. Blu-ray decodes to a clean 1080p picture which you send straight to the TV.

    All the video card has to do is decode it, which is why a $20 Blu-ray player with a $5 cable will give you exactly the same picture and sound quality as a $1000 Blu-ray player with a $300 cable - as long as the TV can take the 1080p input and the hifi can handle the HD audio signal.
  • JDG1980 - Thursday, May 23, 2013 - link

    You can do any kind of post-processing you want on a signal, whether it comes from DVD, Blu-Ray, or anything else. A Blu-Ray is less likely to get subjective quality improvements from noise reduction, edge enhancement, etc., but you can still apply these processes in the video driver if you want to.

    The video quality of Blu-Ray is very good, but not "perfect". Like all modern video formats, it uses lossy encoding. A maximum bit rate of 40 Mbps makes artifacts far less common than with DVDs, but they can still happen in a fast-motion scene - especially if the encoders were trying to fit a lot of content on a single layer disc.

    Most Blu-Ray content is progressive scan at film rates (1080p23.976) but interlaced 1080i is a legal Blu-Ray resolution. I believe some variants of the "Planet Earth" box set use it. So Blu-Ray playback devices still need to know how to deinterlace (assuming they're not going to delegate that task to the display).
  • Dribble - Thursday, May 23, 2013 - link

    I admit it's possible to post-process, but you wouldn't - real-time post-processing is highly unlikely to add anything good to the picture, and fancy Blu-ray players don't post-process; they just pass on the signal. As for 1080i, that's a very unusual case for Blu-ray, but as it's just the standard HD TV resolution, again pass it to the TV - it'll de-interlace it just like it does all the 1080i coming from your cable/satellite box.
  • Galidou - Sunday, May 26, 2013 - link

    ''All the video card has to do is decode it, which is why a $20 Blu-ray player with a $5 cable will give you exactly the same picture and sound quality as a $1000 Blu-ray player with a $300 cable - as long as the TV can take the 1080p input and the hifi can handle the HD audio signal.''

    I'm an audiophile and a professional when it comes to high-end home theater; I've built tons of HT systems around PCs and/or receivers, and I have to admit this is the funniest crap I've had to read. I'd just like to know how many Blu-ray players you've personally compared up to, let's say, the OPPO BDP-105 (I've dealt with pricier units than this mere $1,200 but still awesome Blu-ray player).

    While I can certainly say that image quality is not affected much, the audio on the other hand sees DRASTIC improvements. Saying hardware has no effect on sound would be like saying there's no difference between a $200 and a $5,000 integrated amplifier/receiver - pure nonsense.

    ''the same picture and sound quality''

    The part about sound quality should really be removed from your comment, as it astounds me that you can believe what you said is true.
  • eddman - Thursday, May 23, 2013 - link

    http://i.imgur.com/d7oOj7d.jpg
  • EzioAs - Thursday, May 23, 2013 - link

    If I were a Titan owner (and had actually purchased the card, rather than won it in some giveaway), I would regret that purchase very, very badly. $650 is still a very high price for a normal GTX x80 card, but it makes the Titan basically a product with incredibly bad pricing (not that we didn't know that already). Still, I'm no Titan owner, so what do I know...

    On the other hand, when I look at the graphs, I think the HD 7970 is an even better card than ever despite being 1.5 years old. However, as Ryan pointed out, for previous GTX 500 users who plan on sticking with Nvidia and are considering high-end cards like this, it may not be a bad card at all, since there are situations (most of the time) where the performance improvement is about twice the GTX 580.
  • JeffFlanagan - Thursday, May 23, 2013 - link

    I think $350 is almost pocket-change to someone who will drop $1000 on a video card. $1K is way out of line with what high-quality consumer video cards go for in recent years, so you have to be someone who spends to say they spent, or someone mining one of the bitcoin alternatives in which case getting the card months earlier is a big benefit.
  • mlambert890 - Thursday, May 23, 2013 - link

    I have 3 Titans and don't regret them at all. While I wouldn't say $350 is "pocket change" (or in this case $1050, since it's x3), it's a price I'm willing to pay for twice the VRAM and more performance. With performance at this level, "close" honestly doesn't count if you're looking for the *highest* performance possible. Gaming in 3D Surround, even 3x Titan *still* isn't fast enough, so there's no way I'd have been happy with 3x GTX 780s for $1000 less.
