Final Thoughts

NVIDIA is primarily pitching the GeForce GTX 780 as the next step in their high-end x80 line of video cards, a role it fits into well. At the same time, however, I can't help but keep going back to GTX Titan comparisons, since the GTX 780 is by every metric a cut-down GTX Titan card. Whether this is a good thing or not is open to debate, but between NVIDIA's emergence into the prosumer market with GTX Titan and the fact that there's now a single-GPU video card above the traditionally top-tier x80 card, this launch is more complicated than past x80 card launches.

Anyhow, we'll start with the obvious: the GeForce GTX 780 is a filler card whose most prominent role will be filling the gap between sub-$500 cards and the odd prosumer/luxury/ultra-enthusiast market that has taken root above $500. If there's to be a $1000 single-GPU card in NVIDIA's product stack then it's simply good business to have something between that and the sub-$500 market, and that something is the GTX 780.

For the small number of customers that can afford a card in this price segment, the GTX 780 is an extremely strong contender. In fact it’s really the only contender – at least as far as single-GPU cards go – as AMD won’t directly be competing with GK110. The end result is that with the GTX 780 delivering an average of 90% of Titan’s gaming performance for 65% of the price, this is by all rights the Titan Mini, the cheaper video card Titan customers have been asking for. From that perspective the GTX 780 is nothing short of an amazing deal for the level of performance offered, especially since it maintains the high build quality and impressive acoustics that helped to define Titan.

On the other hand, as an x80 card the GTX 780 is a more mixed proposition. The full generational performance improvement is absolutely there, as the GTX 780 beats the last-generation GTX 580 by an average of 80%. NVIDIA knows their market well, and for most buyers on a 2-3 year upgrade cycle this is the level of performance necessary to spur an upgrade.

The catch comes down to pricing. $650 for the GTX 780 makes all the sense in the world from NVIDIA's perspective: GTX Titan sales have exceeded NVIDIA's expectations, and between Titan and Tesla K20 sales the GK110 GPU is in high demand right now. At the same time the performance of the GTX 780 is high enough that AMD can't directly compete with the card, leaving NVIDIA without competition and free to set prices as they like, and that is exactly what they have done.

This doesn't make the GTX 780 a bad card; on the contrary, it's probably a better card than any x80 card before it, particularly when it comes to build quality. But it's $650 for a product tier that for the last 5 years has been a $500 product tier, and needless to say no one likes a price increase, ourselves included. Ultimately some fraction of the traditional x80 market will make the jump to $650, and for the rest there will be the remainder of the GeForce 700 family or holding out for the eventual GeForce 800 family.

Moving on, it’s interesting to note that with the launch of Titan and now the GTX 780, the high-end single-GPU market looks almost exactly like it did back in 2011. The prices have changed, but otherwise we’ve returned to unchallenged NVIDIA domination of the high end, with AMD fighting the good fight at lower price points. The 22% performance advantage that the GTX 780 enjoys over the Radeon HD 7970GHz Edition cements NVIDIA’s performance lead, while the price difference between the cards means that the 7970GE is still a very strong contender in its current $400 market and a clear budget-saving spoiler like the 6970 before it.

Finally, to bring things to a close we turn our gaze towards the future of the rest of the GeForce 700 family. The GTX 780 is the first of the GeForce 700 family, but it clearly won't be the last. A cut-down GK110 card like the GTX 780 was the logical progression for NVIDIA, but what to use to replace the GTX 670 is a far murkier question, as NVIDIA has a number of good choices at their disposal. Mull that over for a bit, and hopefully we'll be picking up the subject soon.

Comments

  • milkman001 - Friday, May 24, 2013 - link

    FYI,

    On the "Total War: Shogun 2" page, you have the 2650x1440 graph posted twice.
  • JDG1980 - Saturday, May 25, 2013 - link

    I don't think that the release of this card itself is problematic for Titan owners - everyone knows that GPU vendors start at the top and work their way down with new silicon, so this shouldn't have come as much of a surprise.

    What I do find problematic is their refusal to push out BIOS-based fan controller improvements to Titan owners. *That* comes off as a slap in the face. Someone spends $1000 on a new video card, they deserve top-notch service and updates.
  • inighthawki - Saturday, May 25, 2013 - link

    The typical swapchain format is something like R8G8B8A8, and the alpha channel is typically ignored (a value of 0xFF is usually written), since it is of no use to the OS: it will not alpha blend with the rest of the GUI. You can create a 24-bit format, but it's very likely that for performance reasons the driver will allocate it as if it were a 32-bit format and simply not write to the upper 8 bits. The hardware is often only capable of writing to 32-bit-aligned locations, so it's more beneficial for the hardware to just waste 8 bits per pixel than to do any fancy shifting to read or write each pixel. I've actually seen cases where some drivers will allocate 8-bit formats as 32-bit formats, wasting 4x the space the user thought they were allocating.
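
    A quick back-of-the-envelope sketch of the padding cost described above, assuming a 1920x1080 surface and 32-bit pixel alignment (illustrative numbers only, not measured from any particular driver):

        # Memory cost of padding a packed "24-bit" RGB surface out to 32 bits per pixel.
        width, height = 1920, 1080
        pixels = width * height

        logical_bytes = pixels * 3      # what a tightly packed R8G8B8 surface would need
        allocated_bytes = pixels * 4    # what the driver actually allocates (R8G8B8X8)

        waste = allocated_bytes - logical_bytes
        print(f"logical:   {logical_bytes / 2**20:.1f} MiB")    # ~5.9 MiB
        print(f"allocated: {allocated_bytes / 2**20:.1f} MiB")  # ~7.9 MiB
        print(f"wasted:    {waste / allocated_bytes:.0%}")      # 25%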
  • jeremyshaw - Saturday, May 25, 2013 - link

    As a current GTX580 owner running at 2560x1440, I don't feel much need to upgrade, especially when it comes to compute performance. I think I'll hold out for at least one more generation before deciding.
  • ahamling27 - Saturday, May 25, 2013 - link

    As a GTX 560 Ti owner, I am chomping at the bit to pick up an upgrade. The Titan was out of the question, but the 780 looks a lot better at 65% of the cost for 90% of the performance. The only thing holding me back is that I'm still on z67 with a 2600K overclocked to 4.5 GHz. I don't see a need to rebuild my entire system as it's almost on par with the z77/3770. The problem is that I'm still on PCIe 2.0 and I'm worried that it would bottleneck a 780.

    Considering a 780 is aimed at us with 5xx or lower cards, it doesn't make sense if we have to abandon our platform just to upgrade our graphics card. So could you maybe test a 780 on PCIe 2.0 vs 3.0 and let us know if it's going to bottleneck on 2.0?
  • Ogdin - Sunday, May 26, 2013 - link

    There will be no bottleneck.
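
    For a rough sense of scale (back-of-the-envelope only, not a substitute for actual testing), the theoretical link bandwidths work out roughly as in this sketch, assuming a full x16 slot; real-world throughput will be somewhat lower:

        # Theoretical bandwidth of a x16 slot per PCIe generation.
        # PCIe 2.0: 5 GT/s with 8b/10b encoding; PCIe 3.0: 8 GT/s with 128b/130b.
        lanes = 16
        gen2_bytes_per_sec = 5e9 * (8 / 10) / 8 * lanes
        gen3_bytes_per_sec = 8e9 * (128 / 130) / 8 * lanes

        print(f"PCIe 2.0 x16: {gen2_bytes_per_sec / 1e9:.1f} GB/s")  # ~8.0 GB/s
        print(f"PCIe 3.0 x16: {gen3_bytes_per_sec / 1e9:.1f} GB/s")  # ~15.8 GB/s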
  • mapesdhs - Sunday, May 26, 2013 - link


    Ogdin is right, it shouldn't be a bottleneck. And with a decent air cooler, you ought to be
    able to get your 2600K to 5.0 GHz, so you have some headroom there as well.

    Lastly, you say you have a GTX 560 Ti. Are you aware that adding a 2nd card will give
    performance akin to a GTX 670? And two 560 Tis oc'd are better than a stock 680 (VRAM
    capacity notwithstanding, i.e. I'm assuming you have a 1GB card). Here's my 560 Ti SLI
    at stock:

    http://www.3dmark.com/3dm11/6035982

    and oc'd:

    http://www.3dmark.com/3dm11/6037434

    So, if you don't want the expense of an all-new card at the cost level of a 780 for a while,
    but do want some extra performance in the meantime, then just get a 2nd 560 Ti (good
    prices on eBay these days); it will run very nicely indeed. My two Tis were only 177 UKP
    total - less than half the cost of a 680, though atm I just run them at stock speed and don't
    need the extra from an oc. The only caveat is VRAM, but that shouldn't be too much of
    an issue unless you're running at 2560x1440, etc.

    Ian.
  • ahamling27 - Wednesday, May 29, 2013 - link

    Thanks for the reply! I thought about SLI, but ultimately the 1 GB of VRAM is really going to hurt going forward. I'm not going to grab a 780 right away, because I want to see what custom models come out in the next few weeks. Although EVGA's ACX cooler looks nice, I just want to see some performance numbers before I bite the bullet.

    Thanks again!
  • inighthawki - Tuesday, May 28, 2013 - link

    Your comment is inaccurate. Just because a game requires "only 512MB" of video RAM doesn't mean that's all it'll use. Video memory can be streamed in on the fly as needed off the hard drive, and as a result a game can easily use a lot more than the minimum as a performance optimization. I would not be the least bit surprised to see games on next gen consoles using WAY more video memory than regular memory. Running a game that "requires" 512MB of VRAM on a GPU with 4GB of VRAM gives it 3.5GB more storage to cache higher resolution textures.
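
    A minimal sketch of the caching idea described above: keep recently used textures resident up to a VRAM budget and evict the least recently used ones when space runs out. The TextureCache class, names, and sizes here are hypothetical, purely for illustration:

        from collections import OrderedDict

        class TextureCache:
            """Keep textures resident in 'VRAM' up to a byte budget, evicting LRU entries."""
            def __init__(self, budget_bytes):
                self.budget = budget_bytes
                self.used = 0
                self.entries = OrderedDict()   # texture name -> size in bytes

            def request(self, name, size):
                if name in self.entries:       # already resident: mark as recently used
                    self.entries.move_to_end(name)
                    return "hit"
                while self.used + size > self.budget and self.entries:
                    _, evicted_size = self.entries.popitem(last=False)  # evict LRU texture
                    self.used -= evicted_size
                self.entries[name] = size      # "stream in" from disk
                self.used += size
                return "miss"

        # A 4GB card can keep far more textures resident than a 1GB card,
        # even if the game only *requires* 512MB at any one moment.
        cache = TextureCache(budget_bytes=4 * 2**30)
        print(cache.request("rock_diffuse_4k", 64 * 2**20))   # miss -> streamed from disk
        print(cache.request("rock_diffuse_4k", 64 * 2**20))   # hit  -> already in VRAM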
  • AmericanZ28 - Tuesday, May 28, 2013 - link

    NVIDIA=FAIL....AGAIN! 780 Performs on par with a 7970GE, yet the GE costs $100 LESS than the 680, and $250 LESS than the 780.
