Final Words

Bringing this review to a close: after seeing NVIDIA upstage themselves a couple of times already this year with GK110, it admittedly gets a bit harder each time to write about NVIDIA’s flagship GPU. NVIDIA won’t break significant new ground just by refreshing GK110, but gradual performance increases in conjunction with periodic price drops have kept the market fresh while making NVIDIA’s high-end cards a bit faster and a bit cheaper each time. So in that respect we’re enthusiastic to finally see NVIDIA release a fully enabled GK110 GeForce card, and about the performance improvements it brings.

With the release of the GeForce GTX 780 Ti, NVIDIA is once more left solidly in control of the single-GPU performance crown. It won’t quite get to claim a massive performance advantage over its closest competitors, but at the end of the day it’s going to be faster than any other single-GPU card out there. This breaks down to being 11% faster than the Radeon R9 290X, 9% faster than GTX Titan, and a full 20% faster than the original GTX 780 that it formally replaces.
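As an aside, the relative-performance percentages quoted in reviews like this one are just ratios of benchmark averages. A minimal sketch of the arithmetic, using hypothetical average frame rates chosen purely for illustration (the real figures come from the full benchmark suite):

```python
# Hypothetical suite-average frame rates (fps); illustrative only.
results = {
    "GTX 780 Ti": 60.0,
    "R9 290X": 54.0,
    "GTX Titan": 55.0,
    "GTX 780": 50.0,
}

baseline = results["GTX 780 Ti"]
for card, fps in results.items():
    if card == "GTX 780 Ti":
        continue
    # Advantage = how much faster the baseline is, as a percentage.
    advantage = (baseline / fps - 1) * 100
    print(f"GTX 780 Ti vs {card}: {advantage:+.0f}%")
```

With these placeholder numbers the script reproduces the 11%/9%/20% spreads cited above, which is simply a consequence of how the fps values were picked.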

To that end, while NVIDIA can still charge top dollar for their flagship card, it’s a sign of the times and of the competition that they released their fully enabled GK110 part as a cheaper GTX 780 series card. At $700 it’s by no means cheap – and this has been and always will be the drawback to NVIDIA’s flagships so long as NVIDIA can hold the lead – but it also means that NVIDIA does need to take AMD’s Radeon R9 290 series into account. As such the 290X and the GTX 780, though lesser-performing parts, will remain spoilers for GTX 780 Ti due to their better balance of performance and pricing. All the while GTX 780 Ti stands at the top of the heap for those who want the best.

Meanwhile we bid au revoir to the original GK110 GeForce card, GTX Titan. Though GTX Titan will still be on the market as an entry-level compute card, it is finally dethroned as the fastest single-GPU gaming card in NVIDIA’s lineup. At least for the time being GTX Titan is still very secure in its place in the market as a compute card, and so there it will continue – a position that reflects the fact that there’s little need for NVIDIA to keep their gaming and compute products commingled in a single product. Though we wouldn’t be the least bit surprised if NVIDIA made additional prosumer products of this nature in the future, as GTX Titan clearly worked out well for the company.

And though GTX Titan is falling off of our radar, we’re glad to see that NVIDIA has kept around Titan’s second most endearing design element, the Titan cooler. We won’t hazard a guess as to just how much it costs NVIDIA over a cheaper design (or what it adds to the final price tag), but with GTX 780 Ti NVIDIA has once again proven just how capable the cooler is when paired with GK110. Even with the slightly higher power consumption of GTX 780 Ti versus the cards that have come before it, thanks to that cooler GTX 780 Ti still hits an excellent sweet spot between performance and noise, offering the flexibility and simplicity of a blower without the noise that has traditionally accompanied such a cooler. And all the while it still delivers more than enough performance to hold on to the performance crown.

Finally, let’s talk about SLI for a moment. Much like GTX Titan before it, GTX 780 Ti is more than fast enough on its own for any standard single-monitor resolution. Even 2560x1440 with high settings isn’t enough to bog down GTX 780 Ti in most games, which makes a pair of GTX 780 Tis in SLI overkill by any definition. Properly using that much power requires multiple monitors, be it an Eyefinity/Surround setup or, more recently, a tiled 4K monitor.

In either scenario GTX 780 Ti is going to be a solid performer, but NVIDIA is going to have to deal with the fact that their performance advantage melts away as the resolution increases. Right now a single GTX 780 Ti has a solid lead over a single 290X, but a pair of GTX 780 Tis is going to tie with a pair of cheaper 290Xs at 4K resolutions. And with the 290X’s frame pacing under control, NVIDIA no longer has that advantage to help build their case. GTX 780 Ti still has other advantages – power and noise in particular – but it does mean we’re in an interesting situation where NVIDIA can claim the single-GPU performance crown while the dual-GPU crown remains up for grabs. It’s still very early in the game for 4K and NVIDIA isn’t under any great pressure, but it will be an area of improvement for the next generation when Maxwell arrives in 2014.


302 Comments


  • althaz - Thursday, November 7, 2013 - link

    G-Sync a game-changer, seriously? I admit to not having seen it in action, but at best it seems like a small advantage, and at worst it's something nobody in the whole world has a monitor to support yet.
  • MonkeyM - Sunday, November 10, 2013 - link

    The 780 isn't nearly as overpriced as the Ti. It's $500 now, not $650, which, in all honesty, is a pretty fair price for a card that draws almost 70 watts less than the 290 or 290X. Badly overpriced? False. Overpriced? That's more than fair for the Ti, but a bit of a stretch for the 780. "Meagre gain" is also bullshit: you get the last 3 missing SMXes, an extra 1,000MHz on the GDDR5 clock, and a sizable 576 more stream processors. Other than those, it's a fair comment. I do wish they would feel the need to drop prices more, but you certainly get consistency when you buy from big green...
  • Da W - Thursday, November 7, 2013 - link

    Hey look, an NVIDIA fanboy! So happy to get a small framerate advantage, as if he owned the company or worked for it.
    WHO GIVES A DAMN?
    At the end of the day I'm looking at performance/price/temperature/noise. That being said, living in Canada, every degree of heat my video card produces, I save on my heating bill.
  • euskalzabe - Thursday, November 7, 2013 - link

    hahahaha... I totally understand, that is one of the reasons I still keep my GTX 470: the heat it provides during cold Chicago winters is a plus until I move elsewhere next year and buy an 8xx Maxwell :)
  • EzioAs - Thursday, November 7, 2013 - link

    The GTX 780 Ti is also quite power-hungry and loud, and you would know that if you had read the review.
  • Wreckage - Thursday, November 7, 2013 - link

    I'm guessing you ignored the "uber mode" setting for the 290X; it is off the charts compared to the 780 Ti.

    Nothing I said in my above post is wrong. I think it's the truth that is upsetting people.
  • EzioAs - Thursday, November 7, 2013 - link

    You also didn't clarify that it was the Uber mode... and it is still on the charts.

    Without the "uber mode", the 290X is still quite close to the GTX 780ti in terms of gaming performance, power consumption and noise.
  • TheJian - Thursday, November 7, 2013 - link

    You must not be reading anywhere but here, and even then the 290X isn't close.
    Oddly, Anandtech doesn't seem to know it has special tech in it that allows better OCing – power balancing (unbalancing?). You guys not using it or something? :)

    http://www.bit-tech.net/hardware/graphics/2013/11/...
    “A new power management feature for the GTX 780 Ti related to clock speeds and overclocking in particular is called Power Balancing. A card like the GTX 780 Ti draws power across three rails: the PCI-Express lane and the two additional PCI-E power connections. Power is balanced between the three but can become unbalanced when overclocking and possibly limit your overclocks if you max out one rail while having headroom elsewhere. Power Balancing simply allows the balance to be maintained when overclocking, potentially allowing for higher overclocks than previous GK110 cards, on top of the already higher clock speeds.”
    They only hit 1152, but in practice saw it hitting 1230. Mem hit 1950!
    http://www.guru3d.com/articles_pages/geforce_gtx_7...
    More on power balancing. They hit 1276 boost 7948mem.
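[Ed: the rail-balancing idea quoted from bit-tech above can be sketched numerically. The per-rail wattages below are hypothetical, and the limits are simply the nominal PCIe values (75W slot, 75W 6-pin, 150W 8-pin) – this is an illustration of the constraint, not NVIDIA's actual algorithm.]

```python
# Nominal PCIe power limits per rail, in watts (spec values, not
# NVIDIA-specific): the slot, the 6-pin plug, and the 8-pin plug.
RAIL_LIMITS = {"slot": 75.0, "6pin": 75.0, "8pin": 150.0}

def unbalanced_ceiling(draw):
    """Total power reachable if the current per-rail split is fixed:
    scaling up stops as soon as the most-loaded rail hits its limit."""
    scale = min(RAIL_LIMITS[rail] / watts for rail, watts in draw.items())
    return sum(draw.values()) * scale

def balanced_ceiling():
    """With power balancing, draw can be shifted between rails, so the
    ceiling is simply the sum of all rail limits."""
    return sum(RAIL_LIMITS.values())

# Hypothetical overclocked draw: the 6-pin is nearly maxed while the
# other two rails still have headroom.
draw = {"slot": 60.0, "6pin": 70.0, "8pin": 120.0}
print(unbalanced_ceiling(draw))  # ≈ 267.9 W – limited by the 6-pin rail
print(balanced_ceiling())        # 300.0 W – all headroom usable
```

In this toy example the fixed split strands roughly 30W of headroom on the slot and 8-pin rails, which is the kind of gap the quoted Power Balancing feature is described as closing.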

    http://www.legitreviews.com/nvidia-geforce-gtx-780...
    1289 OC/1900 mem

    https://www.youtube.com/watch?v=m1JOhT015ww
    Linustechtips, as always, both cards OC'ed to the wall. He mentions over 1200 core (not sure if that's base or boost). But as you can see, when both the 780ti/290x are clocked to max, the 780ti dominates everything. Benchmarks at 8:35 or so. Also note Luke says 1080p will still be tough in upcoming games like Star Citizen etc., as he shows. Pretty much a landslide by 15-25%, "crushing everything" Luke says. He actually discusses 1080p and shows Far Cry 3 (55 avg, 290x hits 47 avg) and Crysis 3 (50fps vs. 40fps for the 290x) maxed not hitting above 55fps, and at 2560 shows they don't even hit 30fps avg – and this is OC'ed to the max and already kicking the crap out of AMD here (24fps in Crysis 3 for a max-OC'ed 290x). So if you like to MAX everything in your game, neither card is even playable in Crysis 3 or Far Cry 3 at 2560, or in many other games. You will constantly be turning stuff down at 1600p, so I'm not quite sure how anyone can say these cards are overkill for 1080p when, as he notes, games like Star Citizen will no doubt slow you down even more than Crysis 3 (same engine, later game, well duh). You'll need 20nm to max 2560, or always run things on low, medium etc. like Anandtech does. You can play there, but with how many sacrifices?

    http://www.overclockersclub.com/reviews/nvidia_gtx...
    1304 OC/1940 mem
    Note also these guys show the quiet mode dropped the 290x to 669MHz!

    While Anandtech still uses very few games and a useless Warhead game:
    Games the 780 Ti wins or dominates at 2560, ALL vs. the UBER 290x (of course all worse for quiet mode; note bit-tech only does 1080p and 5760):
    Skyrim (bit-tech w/hires texture packs, techpowerup without)
    Assassins Creed 3 (techpowerup, 5.3%)
    SplinterCell Blacklist (techpowerup, blows away 690, crushes UBER 36%, also same shown at overclockersclub even 5760)
    Battlefield 3 (techpowerup, legitreviews, overclockersclub 1080/5760)
    Battlefield 4 (bit-tech, but barely, same 1080p, tweaktown shows big loss? But guru3d shows big win@2xMSAA…LOL – guru3d shows losses below)
    Batman Arkham City (overclockersclub at both 1080/5760)
    Tombraider (legitreviews, techpowerup, tweaktown etc)
    WOW Mysts of Pandaria (techpowerup, over 25% faster, over 20% 5760)
    StarCraft 2 HOS (techpowerup, over 15%, beat 690 too)
    Diablo 3 (techpowerup, over 15%, 20% in 1080p also)
    COD Black Ops 2 (techpowerup 17%, again over 22% in 1080p also)
    Sleeping Dogs (techpowerup)
    Crysis 3 (techpowerup, bit-tech)
    Bioshock Infinite (Techpowerup, bit-tech etc – everyone I guess)
    Phantasy Star online 2 (tweaktown, 17%+, even beats 1065mhz OC 290x)
    Lost Planet 2 (tweaktown, over 34%! Same vs. 1065mhz 290x, same 1080p)
    F1 2012 (tweaktown, beats 1065mhz 290x also, all resolutions)
    Dirt Showdown (tweaktown tie 2560, but wins 1200p/1680x1050)
    Far Cry2 (tweaktown, anyone play this? Still they show it over 10% NV)
    Guild Wars 2 (techreport, dominated by old 780, so 780ti will be better)
    Medal of Honor Warfighter (guru3d 17%).

    Maybe there's a reason anandtech has chosen their games? Still waiting for the NVIDIA PORTAL.

    The point here? Gsync, GeforceExp, Physx, Cuda, streaming, shadowplay, lower noise, power, heat, 3 AAA games, massive OCing, and all the games above with some major victories (BEFORE an overclock). This is without mentioning all the driver issues, including AMD admitting they have a current problem with "VARIANCE" on the 290/290x and will fix it with a driver, supposedly in response to the Tomshardware, Techreport etc. articles about retail cards' perf being lower than press cards'. For anyone thinking $700 is a rip-off, I suggest you look at the numbers/features above. On top of that you'd need a new fan, or to wait for better models, before I'd even touch the 290x/290 due to noise.

    The only disappointment I can see as a buyer is no full DP. Titan still has that and 6GB, though nobody can show a game using more than 3GB that runs into the problem while staying OVER 30fps. To force this into a problem (not sure you can – Skyrim modded out?), you will be CRAWLING in fps.
  • Galidou - Thursday, November 7, 2013 - link

    Wow dude, hardcore fan or working for NVIDIA or I don't know. You took the time to find every link and type all that to make us realise this: NVIDIA's reference cooler is amazing like before, we know how the GTX 780 Ti performs pushed to the max (I don't think custom coolers will go much past 1300MHz on the core), the 290x reference cooler is crap (like we didn't already know), and because of that we still don't know how it performs pushed to the max.

    Oh, and maybe YOU didn't choose your games for comparison... And yes it is close: from your carefully handpicked games it's averaging 15-20% faster while costing 28% more.

    $700 is not a ripoff for top-of-the-line performance, but it's still too much for 99.8% of us (PC gamers who still use 1080p monitors).

    For the 3GB argument, did you travel to the future? How do you know games two years from now will never use more? Skyrim with a couple of mods gets close to 2GB at 1080p! Heavily modded, over 2GB easily. I'm right now at the limit of mods with my GTX 660 Ti 2GB; sometimes it suffers from a little lack of memory...
