Final Words

To bring this review to a close: after watching NVIDIA upstage themselves a couple of times this year already with GK110, it’s admittedly getting a bit harder each time to write about NVIDIA’s flagship GPU. NVIDIA won’t break significant new ground just by refreshing GK110, but gradual performance increases in conjunction with periodic price drops have kept the market fresh while making NVIDIA’s high-end cards a bit faster and a bit cheaper each time. So in that respect we’re enthusiastic to see NVIDIA finally release a fully enabled GK110 GeForce card, and to see the performance improvements it brings.

With that in mind, with the release of the GeForce GTX 780 Ti, NVIDIA is once more left solidly in control of the single-GPU performance crown. It won’t quite get to claim a massive performance advantage over its closest competitors, but at the end of the day it’s going to be faster than any other single-GPU card out there. In practice this breaks down to being 11% faster than the Radeon R9 290X, 9% faster than GTX Titan, and a full 20% faster than the original GTX 780 that it formally replaces.

To that end, while NVIDIA can still charge top dollar for their flagship card, it’s a sign of the times and of the competition that they released their fully enabled GK110 part as a cheaper GTX 780 series card. At $700 it’s by no means cheap, and this has been and always will be the drawback to NVIDIA’s flagships so long as NVIDIA can hold the lead, but it also means that NVIDIA does need to take AMD’s Radeon R9 290 series into account. As such the 290X and the GTX 780, though lesser performing parts, will remain spoilers for GTX 780 Ti due to their better balance of performance and pricing. All the while GTX 780 Ti stands at the top of the heap for those who want the best.

Meanwhile we bid au revoir to the original GK110 GeForce card, GTX Titan. Though GTX Titan will remain on the market as an entry-level compute card, it is finally dethroned as the fastest single-GPU gaming card in NVIDIA’s lineup. For the time being GTX Titan is still very secure in its place in the market as a compute card, and so there it will continue; with GTX 780 Ti taking over gaming duties, there’s little need for NVIDIA to keep their gaming and compute products commingled in a single product. And we wouldn’t be the least bit surprised if NVIDIA made additional prosumer products of this nature in the future, as GTX Titan clearly worked out well for the company.

And though GTX Titan is falling off of our radar, we’re glad to see that NVIDIA has kept around Titan’s second most endearing design element, the Titan cooler. We won’t hazard a guess at just how much it costs NVIDIA over a cheaper design (or what it adds to the final price tag), but with GTX 780 Ti NVIDIA has once again proven just how capable the cooler is when paired with GK110. Even with the slightly higher power consumption of GTX 780 Ti versus the cards that came before it, thanks to that cooler GTX 780 Ti still hits an excellent sweet spot between performance and noise, offering the flexibility and simplicity of a blower without the noise that has traditionally accompanied such a design. And it does so while still delivering more than enough performance to hold on to the performance crown.

Finally, let’s talk about SLI for a moment. Much like GTX Titan before it, GTX 780 Ti is so fast that it’s more than enough on its own for any standard single-monitor resolution. Even 2560x1440 at high settings isn’t enough to bog down GTX 780 Ti in most games, which makes a pair of GTX 780 Tis in SLI overkill by any definition. Properly using that much power requires multiple monitors, be it an Eyefinity/Surround setup, or more recently a tiled 4K monitor.

In either scenario a GTX 780 Ti is going to be a solid performer for those segments, but NVIDIA is going to have to deal with the fact that their performance advantage melts away as the resolution increases. Right now a single GTX 780 Ti has a solid lead over a single 290X, but a pair of GTX 780 Tis is going to tie with a pair of cheaper 290Xs at 4K resolutions. And with the 290X’s frame pacing under control, NVIDIA no longer has that advantage to help build their case. GTX 780 Ti still has other advantages, power and noise in particular, but it does mean we’re in an interesting situation where NVIDIA can claim the single-GPU performance crown while the dual-GPU crown remains up for grabs. It’s still very early in the game for 4K and NVIDIA isn’t under any great pressure, but it will be an area of improvement for the next generation when Maxwell arrives in 2014.

Comments

  • Kutark - Friday, November 8, 2013 - link

    The argument was never whether one or the other was a better value, but rather which is the better card. That's what this discussion is about. Anybody with their head not firmly planted in their ass can see the 290X is a good value. The problem is that value changes from person to person. Some people don't care how noisy their video card is. Some people don't care how hot it runs under load. Those people would find that card to be an excellent value. Other people do care. For them it might be worth it to pay extra for a quieter card that runs 8-10C cooler. Just like some people don't give two shits about having leather seats in their car, and don't think a $2000 option for leather would be a good value. Others think it's great.

    Secondly, the Ti only loses to the 290X at 4K resolution, which is a complete non-point, as there is all of one 4K monitor out right now and it costs twice what most people spend on their entire computer. Let's also not forget that to get decent framerates you need a minimum of two 780/Tis or 290/Xs. So we're talking about a $5K investment, on top of the rest of your shit, to play at 4K? I'd be willing to bet less than 1 in a million PC gamers have that much money in their rigs.
  • Mondozai - Friday, December 13, 2013 - link

    AMD fucked up with their reference cooler, and they fucked up with not providing data early enough to OEMs for aftermarket coolers. But comparing a stock 290/290X to the GTX 780 Ti is misleading. You have to compare both cards once there are aftermarket coolers for both of them.

    But you don't do that. Why? Because you know an aftermarket cooler isn't going to be a big difference to 780 Ti, but it will make a massive difference to 290/290X.

    Stock versions of both cards with an aftermarket cooler from the same OEM will show very little variance in performance at higher (1440p and above) resolutions. Except that the 780 Ti will remain completely overpriced.

    Price-to-performance ratio isn't just about budget and mid-range cards. You can apply it to high-end cards too. A 290X with an aftermarket cooler is simply going to beat out the 780 Ti in anybody's but a fanboy's eyes. Sorry, but you're fanboying.

    And I say that as an Nvidia card owner myself. But the fact remains that Nvidia has been able to rape the wallets of a lot of people for so long (and I blame AMD for this) that some people like you have internalized it and come to defend it.

    I can only look at you with pity.
  • Owls - Thursday, November 7, 2013 - link

    Ryan I'm sorry but the video card reviews as of late have been very poor in quality and objectivity. Stop rushing to be the first. I don't go to Anand to read a crappy review, that's what HardOCP is for.

    That said, your testing is flawed with old games, and claiming the Ti is faster than a 290X that is in silent mode is disingenuous. We all expect better from this site.
  • ol1bit - Thursday, November 7, 2013 - link

    First, Fantastic Review as always!

    As a side note, it's amazing to me that AMD can't match this performance at the same heat output as NVIDIA. After all, they're a semi-big chip company; what gives?
  • FuriousPop - Thursday, November 7, 2013 - link

    Wow! As a current AMD owner, I must say it is impressive. Temps and noise are great! Power consumption not as low as expected, but hey, a good card indeed. Now if only we could see benchmarks at resolutions of 1600p and above. If you're going to call SLI of these overkill at 1440p, then why not show 1600p and see how it really matches up!?

    I know that in the past it has been AMD for >1440p and Nvidia for 1080p and below, but as of late we are starting to see that change dramatically.

    Can we please see some 1600p and higher resolutions being tested in benchmarks! If those benchmarks were already out and showed impressive results at higher resolutions, I probably would have gone out and bought two of these ASAP. However, I'll have to wait and see...
  • AnotherGuy - Thursday, November 7, 2013 - link

    So Ryan, you made the R9 290 out to be a total disappointment because it had slightly higher noise than the rest at the time, but now when you see the 780 getting close at 52dB, it's "a little high but ok" for nVidia and they won... That is not fair.
    You need to control those emotions, think it over, and then finally find the right words to describe a product; don't let your first thought become the final conclusion of a review.
  • Ma Deuce - Thursday, November 7, 2013 - link

    You need to control those emotions. Think many times and then finally find the right words to criticize a review, not let your first thought into the comment...

    290 is loud and hot and he didn't recommend it. Sorry man, it's not the end of the world though. You can buy it without their expressed written consent.
  • sf101 - Friday, November 8, 2013 - link

    AnotherGuy, you're right though; in the 290X review they made it sound like the end of the world over the noise/temp and power use levels.

    Yet when Nvidia does it, it's ok.

    And everyone just does a point-in-the-other-direction, "look, it's Superman" type thing and pretends it didn't happen.

    If you're going to be nitpicky about noise and heat, then at least be consistent. From what I've seen, both the 290 and 780 Ti are fairly close in wattage use now, so there go your TDP arguments.

    And just because Nvidia has a better reference cooler doesn't change the fact that it's still using similar wattage to the 290/290X; the only difference is that Nvidia's cooler deals with it a bit better.

    Just finding the review a tad biased...
  • Morawka - Friday, November 8, 2013 - link

    The 290 got up to 62dB.
  • formulav8 - Monday, November 11, 2013 - link

    Ryan's reviews have stunk for a while now. I used to defend him and Anand against accusations of appearing biased and such. But there does seem to be something to it. And the thing is, if it is true, they couldn't care less. Literally.
