With the launch of AMD’s new flagship Radeon R9 290X only a couple of days behind us, NVIDIA has wasted surprisingly little time in responding to the latest salvo in the unending GPU wars. Timed to coincide with the launch of NVIDIA’s holiday GeForce game bundle, the launch of ShadowPlay (more on that later today), and the final (non-beta) release of GameStream, NVIDIA has rounded out their Monday by announcing a pair of price cuts for their high-end consumer video cards, and setting a launch date and launch price for their recently announced GTX 780 Ti.

First and foremost, both the GeForce GTX 780 and GeForce GTX 770 are getting price cuts, effective tomorrow (October 29th). The GTX 780 will be reduced by $150 to $499, while the GTX 770 will be getting a smaller $70 trim, bringing the price of that card down to $329.

For the GTX 770 this is something of a delayed price cut – AMD launched their competing Radeon R9 280X just shy of 3 weeks ago – but as the saying goes, better late than never. Between the two, the GTX 770 is about 5% faster while the 280X has the 3GB memory advantage, so at $329 the GTX 770 won’t significantly threaten the 280X, but it is where we would have expected NVIDIA to place the card given their performance advantage.

For the GTX 780, on the other hand, this is a rapid response from NVIDIA, coming just days after the launch of the Radeon R9 290X. The 290X, its $550 price tag, and its superior performance unquestionably left NVIDIA with little choice but to cut prices. But we had not been expecting NVIDIA to drop the GTX 780 below $500, even with the 290X’s performance advantage. The end result is that the 290X is now the more expensive part by 10% (or $50), which coincidentally is also the 290X’s performance advantage. This puts the two cards on equal footing on the price/performance continuum, with NVIDIA’s kicker – their superior build quality and cooling performance – remaining. Furthermore, we were also able to confirm with NVIDIA that the metal reference cooler will still be available even after the price cut, so alongside the collection of custom designs we’ve seen, the high performance reference blower will still be an option for buyers seeking a quieter card.
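For readers who want to check the math, here is a minimal, hypothetical Python sketch of the price/performance comparison described above. It assumes the roughly 10% performance gap cited in this article (the exact figure will of course vary by game and resolution), and the dollars_per_perf helper is purely illustrative:

    # Rough price/performance check using the prices and ~10% performance
    # delta cited in the article; real-world deltas vary by game and settings.
    def dollars_per_perf(price_usd, relative_perf):
        """Dollars paid per unit of relative performance."""
        return price_usd / relative_perf

    # Normalize GTX 780 performance to 1.0; the 290X is ~10% faster per the article.
    gtx_780 = dollars_per_perf(499, 1.00)   # ~$499 per performance unit
    r9_290x = dollars_per_perf(550, 1.10)   # ~$500 per performance unit

    print(f"GTX 780: ${gtx_780:.0f}/perf unit, R9 290X: ${r9_290x:.0f}/perf unit")

With both cards working out to roughly $500 per unit of performance, the remaining differentiators are the ones noted above: build quality, cooling and noise, and the respective game bundles.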

Fall 2013 GPU Pricing Comparison
AMD                 Price    NVIDIA
                    $700     GeForce GTX 780 Ti (Nov. 7th)
Radeon R9 290X      $550
                    $500     GeForce GTX 780
                    $330     GeForce GTX 770
Radeon R9 280X      $300
                    $250     GeForce GTX 760
Radeon R9 270X      $200
                    $180     GeForce GTX 660
                    $150     GeForce GTX 650 Ti Boost
Radeon R7 260X      $140

Meanwhile, as previously mentioned, today’s announcement also coincides with the launch of NVIDIA’s “The Way It’s Meant to Be Played Holiday Bundle with SHIELD” promotion, which for both the GTX 780 and GTX 770 will consist of Assassin’s Creed IV, Batman: Arkham Origins, Splinter Cell: Blacklist, and the $100 SHIELD discount. So on top of NVIDIA’s price cuts, they will also be offering an unusually strong bundle in direct opposition to AMD’s price-premium 290X Battlefield 4 bundle. The true value of a bundle will, as always, ultimately depend on the buyer, but it’s very unusual to see such a significant bundle attached to what’s already a competitively priced card. So come tomorrow when these price cuts hit, NVIDIA is going to be in a very good position to counter the 280X and 290X.

NVIDIA Holiday Game Bundles
Video Card                          Bundle                                                                  SHIELD Discount
GeForce GTX 770/780/Titan           Assassin's Creed IV, Batman: Arkham Origins, Splinter Cell: Blacklist   $100
GeForce GTX 660/660 Ti/670/680/760  Assassin's Creed IV, Splinter Cell: Blacklist                           $50
GeForce GTX 650 Series              $75 Free-To-Play (Continuing)                                           None
GeForce GT 640 (& Below)            None                                                                    None

Finally, alongside tomorrow’s price cuts NVIDIA has also set a launch date for the previously announced GeForce GTX 780 Ti: November 7th (next Thursday). It will be priced at $699, placing it $200 above the GTX 780 and $150 above the 290X. We still don’t have the specifications for the GTX 780 Ti, but the fact that NVIDIA is pricing it so far above the 290X indicates that they have a lot of confidence that they will be able to beat the 290X’s performance, and will do so by enough of a margin to justify the price. This isn’t wholly unexpected – after all, the GTX 780 wasn’t a fully enabled GK110 consumer part – so it should be interesting to see just what NVIDIA has prepared to carry on as their new flagship gaming card.
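As a quick back-of-the-envelope check on that positioning, a hypothetical snippet along the same lines shows how much faster than the 290X the GTX 780 Ti would need to be simply to match it on price/performance at $699 (its actual performance remains unknown until launch):

    # Break-even performance advantage the GTX 780 Ti needs over the 290X
    # just to match it on price/performance at the announced prices.
    gtx_780_ti_price = 699
    r9_290x_price = 550

    required_advantage = gtx_780_ti_price / r9_290x_price - 1
    print(f"Break-even advantage over the 290X: {required_advantage:.0%}")  # ~27%

In other words, NVIDIA’s pricing implies an expectation of a sizable lead over the 290X, or at least confidence that buyers will pay a premium for the rest of the package.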

Comments

  • jnad32 - Monday, October 28, 2013 - link

    Correct me if I’m wrong, but I thought every review I read said that quiet mode was basically inaudible. I also haven’t really read anything about bad CrossFire performance from 290Xs in any reviews either.
  • MrSpadge - Monday, October 28, 2013 - link

    It's "not noisy" only while being idle. Under load in quiet mode it's "okay, better than HD7970GE", but very clearly audible. Unless you've got some other massive noise source nearby.. which would make talking about noise almost pointless.
  • Torashin - Monday, October 28, 2013 - link

    What, like the game audio? *facepalm*
  • Gigaplex - Monday, October 28, 2013 - link

    Not all games have loud sounds at all times. And some of us don't turn the volume up all that loud.
  • Klimax - Wednesday, October 30, 2013 - link

    Not even a game with massive explosions will drown out such noise, and that level of loudness is already quite problematic...
  • Gadgety - Monday, October 28, 2013 - link

    "One 290X is noisy but not a big deal, 2 or 3 of them goes from "noisy" to "completely intolerable"." Well, going multi GPU I would definitely put them under water. In my opinion, multi GPU are all completely intolreable regardless of brand.
  • eanazag - Monday, October 28, 2013 - link

    We'll know how the Ti fares after it is released. It is interesting that AMD has laid its cards out and NVIDIA still has the Ti priced so high. The Titan hasn’t gotten a price cut yet and likely should. I am having a hard time believing that the Ti will outperform the Titan. Yet what we do know is that there is plenty of thermal room for NVIDIA to ratchet up their existing lineup to parity with AMD and edge them out on build quality with a performance advantage.

    I guess what we need to see are performance, heat, and noise numbers for a GTX 780 with a 300W TDP - before the 7th.

    Back on the Titan - it is too easy to get two 290Xs for the price of a Titan, and the numbers are very much in favor of AMD.

    All-in-all this is great for customers, especially compared to the drought we are getting on the CPU side.
  • Gadgety - Monday, October 28, 2013 - link

    "The Titan didn't get a price cut yet and likely should. I am having a hard time believing that the Ti will out perform the Titan." Well, not necessarily. The Titan offers high capacity double precision, which I assume the 780ti won't. My impression is in the compute/workstation segment the Titan already is a high value option. No price cut necessary.
  • b3nzint - Monday, October 28, 2013 - link

    I think you are wrong; take a look at this: http://www.sisoftware.co.uk/?d=qa&f=gpu_financ...
    The R9 290X also shines in compute!
  • TheJian - Tuesday, October 29, 2013 - link

    Can you make money with SiSoftware, Folding@home, bitcoin mining, etc.? Nope. CUDA is where the money is in pro apps. No AMD card can do it, and the Titan excels at it for $1500 off Tesla pricing ($2500).

    From your link...LOL
    "Note: OpenCL was used as it is supported by all GPUs/APUs. The tests are also available through CUDA which provides better optimisations for nV hardware."

    Gee, let's run NV cards in their worst-case scenario, ignoring CUDA, so AMD looks reasonable... Let's run them in that crap that's not funded by anybody called OpenCL. Why not test CUDA vs. OpenCL here? They are already telling you they can do it, but then NV would blow away AMD... Tom's/AnandTech both refuse to pit them against each other while it is EASY to do with any pro app (Adobe, Cinema 4D, 3ds Max, Blender, etc.). Just swap plugins and bench: LuxRender for AMD and Furryball/Octane etc. for NV. What excuse is there for never pitting them against each other? OpenCL (or OpenGL, whichever is needed) for AMD vs. CUDA in the same app for NV. They all hate CUDA and love OPEN crap, which is why they avoid this scenario that can easily be shown. CUDA is taught in 600+ universities for a reason... LOL.

    Your link is running NV in CRAP MODE. Nice try. What moron purposely turns off 7 years of CUDA funding and runs a Titan (or any NV card) without CUDA when, as they note, it is already in there? TURN IT ON. Useless results.
