Final Words

Bringing this review to a close: having seen NVIDIA upstage themselves with GK110 a couple of times already this year, it's admittedly getting a bit harder each time to write about NVIDIA's flagship GPU. NVIDIA won't break significant new ground just by refreshing GK110, but gradual performance increases in conjunction with periodic price drops have kept the market fresh while making NVIDIA's high-end cards a bit faster and a bit cheaper each time. So in that respect we're enthusiastic to see NVIDIA finally release a fully enabled GK110 GeForce card, and about the performance improvements it brings.

With that in mind, the release of the GeForce GTX 780 Ti once more leaves NVIDIA solidly in control of the single-GPU performance crown. It won't quite get to claim a massive performance advantage over its closest competitors, but at the end of the day it's going to be faster than any other single-GPU card out there. This breaks down to being 11% faster than the Radeon R9 290X, 9% faster than GTX Titan, and a full 20% faster than the original GTX 780 that it formally replaces.

To that end, while NVIDIA can still charge top dollar for their flagship card, it's a sign of the times and of the competition that they released their fully enabled GK110 part as a cheaper GTX 780 series card. At $700 it's by no means cheap – and this has been and always will be the drawback to NVIDIA's flagships so long as NVIDIA can hold the lead – but it also means that NVIDIA does need to take AMD's Radeon R9 290 series into account. As such the 290X and the GTX 780, though lesser performing parts, will remain spoilers for the GTX 780 Ti due to their better balance of performance and pricing. All the while the GTX 780 Ti stands at the top of the heap for those who want the best.

Meanwhile we bid au revoir to the original GK110 GeForce card, the GTX Titan. Though GTX Titan will remain on the market as an entry-level compute card, it is finally dethroned as the fastest single-GPU gaming card in NVIDIA's lineup. At least for the time being GTX Titan is still very secure in its place in the market as a compute card, and so there it will continue, a position that reflects the fact that there's little need for NVIDIA to keep their gaming and compute products commingled in a single product. We wouldn't be the least bit surprised if NVIDIA made additional prosumer products of this nature in the future, as GTX Titan clearly worked out well for the company.

And though GTX Titan is falling off of our radar, we're glad to see that NVIDIA has kept around Titan's second most endearing design element, the Titan cooler. We won't hazard a guess at just how much it costs NVIDIA over a cheaper design (or what it adds to the final price tag), but with the GTX 780 Ti NVIDIA has once again proven just how capable the cooler is when paired with GK110. Even with the slightly higher power consumption of the GTX 780 Ti versus the cards that have come before it, thanks to that cooler the GTX 780 Ti still hits an excellent sweet spot between performance and noise, offering the flexibility and simplicity of a blower without the noise that has traditionally accompanied such a cooler. And all the while it still delivers more than enough performance to hold on to the performance crown.

Finally, let's talk about SLI for a moment. Much like GTX Titan before it, the GTX 780 Ti is so fast that it's more than enough on its own for any standard single-monitor resolution. Even 2560x1440 with high settings isn't enough to bog down the GTX 780 Ti in most games, which makes a pair of GTX 780 Tis in SLI overkill by any definition. Properly using that much power requires multiple monitors, be it an Eyefinity/Surround setup or, more recently, a tiled 4K monitor.

In either scenario a GTX 780 Ti is going to be a solid performer for those segments, but NVIDIA is going to have to deal with the fact that their performance advantage melts away as the resolution increases. Right now a single GTX 780 Ti has a solid lead over a single 290X, but a pair of GTX 780 Tis is going to tie with a pair of cheaper 290Xs at 4K resolutions. And with the 290X's frame pacing under control, NVIDIA no longer has that advantage to help build their case. The GTX 780 Ti still has other advantages – power and noise in particular – but it does mean we're in an interesting situation where NVIDIA can claim the single-GPU performance crown while the dual-GPU crown remains up for grabs. It's still very early in the game for 4K and NVIDIA isn't under any great pressure, but it will be an area of improvement for the next generation when Maxwell arrives in 2014.


302 Comments

  • A5 - Thursday, November 7, 2013 - link

    BF4 has a built-in benchmark too, but I have no idea how good it is. I'd guess they're waiting on a patch?

    If nothing else, there will be BF4 results if/when that Mantle update comes out.
  • IanCutress - Thursday, November 7, 2013 - link

    BF4 has a built in benchmark tool? I can't find any reference to one.
  • Ryan Smith - Thursday, November 7, 2013 - link

    BF3 will ultimately get replaced with BF4 later this month. For the moment with all of the launches in the past few weeks, we haven't yet had the time to sit down and validate BF4, let alone collect all of the necessary data.
  • 1Angelreloaded - Thursday, November 7, 2013 - link

    Hell man, people still run FEAR as a benchmark because of how brutal it is on the GPU/CPU/HDD.
  • Bakes - Thursday, November 7, 2013 - link

    I think it's better to wait until driver performance stabilizes for new applications before basing benchmarks on them. If you don't, then early benchmark numbers become useless for comparison's sake.
  • TheJian - Thursday, November 7, 2013 - link

    I would argue Warhead needs to go. Servers for that game have been EMPTY for ages and ZERO people play it. You can ask to add BF4, but to remove BF3 (while claiming BF3 is old) when Warhead is still included is ridiculous. How old is Warhead? 7-8 years? People still play BF3. A LOT of people. I would argue they need to start benchmarking based on game sales.
    StarCraft 2, Diablo 3, World of Warcraft: Pandaria, COD Black Ops 2, Splinter Cell Blacklist, Assassin's Creed 3, etc., etc... IE, Black Ops 2 has over 5x the sales of Hitman Absolution. Which one should you be benchmarking?
    Warhead...OLD.
    Grid 2: 0.03 mil total PC sales, says vgchartz.
    StarCraft 2: 5.2 mil units (just PC).
    Which do you think should be benchmarked?

    Even Crysis 3 only has 0.27 mil units, says vgchartz.
    Diablo 3? ROFL...3.18 mil for PC. So again, 11.5x Crysis 3.

    Why are we not benchmarking games that are selling in the MILLIONS of units?
    WoW still has 7 million people playing, and it can slow down a lot with tons of people doing raids, etc.
  • TheinsanegamerN - Friday, November 8, 2013 - link

    Because any halfway decent machine can run WoW? They use the most demanding games to show how powerful the GPU really is. 5760x1080 with 4xMSAA gets 69 FPS with the 780 Ti.
    Why benchmark Hitman over Black Ops? Simple: Black Ops is not what we call demanding.
    They use demanding games, not the super popular games that'll run on hardware from 3 years ago.
  • powerarmour - Thursday, November 7, 2013 - link

    Well, that time on the throne for the 290X lasted about as long as Ned Stark...
  • Da W - Thursday, November 7, 2013 - link

    I look at 4K gaming since I play in 3x1 Eyefinity (being +/- 3.5K gaming).
    At these resolutions I see an average 1 FPS lead for the 780 Ti over the 290X. For $200 more.
    Power consumption is about the same.
    And as far as temperature goes, it's the temperature AT THE CHIP level. Both cards will heat your room equally if they consume the same amount of power.

    The debate is really about the cooler, and Nvidia got an outright lead as far as cooling goes.
  • JDG1980 - Thursday, November 7, 2013 - link

    It seems to me that both Nvidia and AMD are charging too much of a price premium for their top-end cards. The GTX 780 Ti isn't worth $200 more than the standard GTX 780, and the R9 290X isn't worth $150 more than the standard R9 290.

    For gamers who want a high-end product but don't want to unnecessarily waste money, it seems like the real competition is between the R9 290 ($399) and the GTX 780 ($499). At the moment the R9 290 has noise issues, but once non-reference cards become available (supposedly by the end of this month), AMD should hold a comfortable lead. That said, the Titan Cooler is indeed a really nice piece of industrial design, and I can see someone willing to pay a bit extra for it.
