While it was roughly two years from Maxwell 2 to Pascal, the journey to Turing has felt much longer despite a similar two-year gap. There's some truth to that feeling: over the past couple of years the GPU space has seen nearly every other possible development except next-generation gaming video cards, from Intel's planned return to discrete graphics, to NVIDIA's Volta, to cryptomining-specific cards. Finally, at Gamescom 2018, NVIDIA announced the GeForce RTX 20 series, built on TSMC's 12nm "FFN" process and powered by the Turing GPU architecture. Launching today with full general availability is the GeForce RTX 2080; the GeForce RTX 2080 Ti has been delayed a week to the 27th, while the GeForce RTX 2070 is due in October. Up for review today are the GeForce RTX 2080 Ti and GeForce RTX 2080.

But a standard new generation of gaming GPUs this is not. The "GeForce RTX" brand, ousting the long-lived "GeForce GTX" moniker in favor of the company's announced "RTX technology" for real time ray tracing, aptly underlines NVIDIA's new vision for the video card future. As we saw last Friday, Turing and the GeForce RTX 20 series are designed around a set of specialized low-level hardware features and an intertwined ecosystem of supporting software currently in development. The central goal is a long-held dream of computer graphics researchers and engineers alike, real time ray tracing, and NVIDIA is aiming to bring that to gamers with their new cards, willing to break some traditions on the way.

NVIDIA GeForce Specification Comparison
                        RTX 2080 Ti            RTX 2080               RTX 2070               GTX 1080
CUDA Cores              4352                   2944                   2304                   2560
Core Clock              1350MHz                1515MHz                1410MHz                1607MHz
Boost Clock             1545MHz (FE: 1635MHz)  1710MHz (FE: 1800MHz)  1620MHz (FE: 1710MHz)  1733MHz
Memory Clock            14Gbps GDDR6           14Gbps GDDR6           14Gbps GDDR6           10Gbps GDDR5X
Memory Bus Width        352-bit                256-bit                256-bit                256-bit
VRAM                    11GB                   8GB                    8GB                    8GB
Single Precision Perf.  13.4 TFLOPS            10.1 TFLOPS            7.5 TFLOPS             8.9 TFLOPS
Tensor Perf. (INT4)     430 TOPS               322 TOPS               238 TOPS               N/A
Ray Perf.               10 GRays/s             8 GRays/s              6 GRays/s              N/A
"RTX-OPS"               78T                    60T                    45T                    N/A
TDP                     250W (FE: 260W)        215W (FE: 225W)        175W (FE: 185W)        180W
GPU                     TU102                  TU104                  TU106                  GP104
Transistor Count        18.6B                  13.6B                  10.8B                  7.2B
Architecture            Turing                 Turing                 Turing                 Pascal
Manufacturing Process   TSMC 12nm "FFN"        TSMC 12nm "FFN"        TSMC 12nm "FFN"        TSMC 16nm
Launch Date             09/27/2018             09/20/2018             10/2018                05/27/2016
Launch Price            $999 (FE: $1199)       $699 (FE: $799)        $499 (FE: $599)        $599 (FE: $699)

As we discussed at the announcement, one of the major breaks is that NVIDIA is introducing GeForce RTX as a full upper-tier x80 Ti/x80/x70 stack from the start, where it has previously tended to lead with the x80/x70 products, reserving the x80 Ti for a mid-cycle refresh or competitive response. More intriguingly, each GeForce card has its own distinct GPU (TU102, TU104, and TU106), with direct Quadro and now Tesla variants of TU102 and TU104. While we covered the Turing architecture in the preceding article, the takeaway is that each chip is proportionally cut down, including the specialized RT Cores and Tensor Cores; with clockspeeds roughly the same as Pascal's, architectural changes and efficiency enhancements will be largely responsible for performance gains, along with the greater bandwidth of 14Gbps GDDR6.
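The headline figures in the spec table fall out of two textbook formulas: peak FP32 throughput is shader count times boost clock times two (an FMA counts as two FLOPs), and memory bandwidth is per-pin data rate times bus width divided by eight. A quick back-of-the-envelope sketch, using the reference boost clocks from the table (this is the conventional arithmetic, not an official NVIDIA calculation):

```python
# Reference-clock figures from the spec table above:
# name: (CUDA cores, boost clock in MHz, memory data rate in Gbps, bus width in bits)
cards = {
    "RTX 2080 Ti": (4352, 1545, 14, 352),
    "RTX 2080":    (2944, 1710, 14, 256),
    "RTX 2070":    (2304, 1620, 14, 256),
    "GTX 1080":    (2560, 1733, 10, 256),
}

for name, (cores, boost_mhz, gbps, bus_bits) in cards.items():
    # One fused multiply-add per core per clock = 2 FLOPs
    tflops = cores * boost_mhz * 1e6 * 2 / 1e12
    # Per-pin rate (Gbps) x pins (bus width) / 8 bits per byte = GB/s
    bandwidth_gbs = gbps * bus_bits / 8
    print(f"{name}: {tflops:.1f} TFLOPS FP32, {bandwidth_gbs:.0f} GB/s")
```

Running this reproduces the table's 13.4/10.1/7.5/8.9 TFLOPS figures, and also shows where the 2080 Ti's 352-bit bus pays off: 616 GB/s versus 448 GB/s for the 256-bit GDDR6 cards and 320 GB/s for the GTX 1080.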

And as far as we know, Turing did not technically trickle down from a bigger compute chip a la GP100, though at the architectural level it is strikingly similar to Volta/GV100. Die size adds more color to the story: at 454mm², TU106, the smallest of the bunch, is frankly humongous for a FinFET die nominally dedicated to an x70 GeForce product, and comparable in size to the 471mm² GP102 inside the GTX 1080 Ti and the Pascal Titans. Even excluding the cost and size of the enabled RT Cores and Tensor Cores, a slab of FinFET silicon that large is unlikely to be packaged and priced like the popular $330 GTX 970 and still provide the margins NVIDIA is pursuing.

These observations are not so much pedantry as a way to sketch out GeForce Turing's positioning relative to Pascal. Having a separate GPU for each model is the most expensive approach in terms of research and development, testing, validation, extra fab tooling/capacity, and so on, and it raises interesting questions about binning, yields, and salvage parts. Though NVIDIA certainly has the spare funds to go this route, there is surely a better explanation than Turing being primarily designed as a premium-priced consumer product that cannot command the margins of professional parts. All of this points to the known Turing GPUs as oriented toward lower-volume products, and NVIDIA's quarterly financial reports indicate that GeForce product volume is a significant factor, not just ASP.

And on that note, the ‘reference’ Founders Edition models are no longer reference; the GeForce RTX 2080 Ti, 2080, and 2070 Founders Editions feature 90MHz factory overclocks and 10W higher TDP, and NVIDIA does not plan to productize a reference card themselves. But arguably the biggest change is the move from blower-style coolers with a radial fan to an open air cooler with dual axial fans. The switch in design improves cooling capacity and lowers noise, but with the drawback that the card can no longer guarantee that it can cool itself. Because the open air design re-circulates the hot air back into the chassis, it is ultimately up to the chassis to properly exhaust the heat. In contrast, a blower pushes all the hot air through the back of the card and directly out of the case, regardless of the chassis airflow or case fans.

All-in-all, NVIDIA is keeping the Founders Edition premium, which now runs $100 to $200 over the baseline 'reference' MSRP. Though AIB partner cards are also launching today, in practice the Founders Edition pricing is effectively the retail price until the launch rush has subsided.

The GeForce RTX 20 Series Competition: The GeForce GTX 10 Series

In the end, the preceding GeForce GTX 10 series ended up occupying an odd spot in the competitive landscape. After its arrival in mid-2016, only the lower end of the stack had direct competition, as AMD's Polaris-based Radeon RX 400 series addressed only the mainstream/entry segments. AMD's RX 500 series refresh in April 2017 didn't fundamentally change that, and it was not until August 2017 that the higher-end Pascal parts had direct competition from their generational equal in RX Vega. But by that time, the GTX 1080 Ti (not to mention the Pascal Titans) was unchallenged. And all the while, an Ethereum-led resurgence of mining cryptocurrency on video cards was wreaking havoc on GPU pricing and inventory, first for Polaris products, then general mainstream parts, and finally any and all GPUs.

Not that NVIDIA rested on its laurels in the face of Vega, releasing the GTX 1070 Ti in response anyhow. But what was constant was how the pricing models evolved under the Founders Edition schema: the $1200 Titan X (Pascal), then the $700 GTX 1080 Ti and $1200 Titan Xp. Even the $3000 Titan V maintained gaming cred despite diverging greatly from previous Titan cards, sitting firmly on the professional side of prosumer, which basically allowed the product to capture both prosumers and price-no-object enthusiasts. Ultimately, these moves coincided with the rampant cryptomining price inflation and were mostly subsumed by it.

So the higher end of gaming video cards has been Pascal competing with itself while moving up the price brackets. For Turing, the GTX 1080 Ti has become the closest competitor. RX Vega performance hasn't fundamentally changed, and the fallout appears to have snuffed out any further Vega 10 parts, as well as a 14nm+ (i.e. 12nm) Vega refresh. As a competitive response, AMD doesn't have many cards up its sleeve beyond the ones already played: game bundles (such as the current "Raise the Game" promotion), FreeSync/FreeSync 2, and other hardware (CPU, APU, motherboard) bundles. Beyond that, a DXR driver is in the works and a machine learning-focused 7nm Vega is on the horizon, but little else is known, including the fate of mobile discrete Vega. As for AMD graphics cards on shelves right now, RX Vega remains hampered by high prices and low inventory/selection, remnants of cryptomining.

For the GeForce RTX 2080 Ti and 2080, NVIDIA would like to sell you the RTX cards as your next upgrade regardless of what card you have now, essentially because no other card can do what Turing's features enable: real time ray tracing effects (and applied deep learning) in games. And because real time ray tracing offers graphical realism beyond what rasterization can muster, the cards are not comparable to an older but still performant card. Unfortunately, no games support Turing's features today, and they may not for some time. Of course, NVIDIA maintains that the cards will provide the expected top-tier performance in traditional gaming. Either way, while the Founders Editions are fixed at their premium MSRPs, custom cards are unsurprisingly listed at those same Founders Edition price points or higher.

Fall 2018 GPU Pricing Comparison
AMD                   Price       NVIDIA
                      $1199       GeForce RTX 2080 Ti
                      $799        GeForce RTX 2080
                      $709        GeForce GTX 1080 Ti
Radeon RX Vega 64     $569
Radeon RX Vega 56     $489        GeForce GTX 1080
                      $449        GeForce GTX 1070 Ti
                      $399        GeForce GTX 1070
Radeon RX 580 (8GB)   $269/$279   GeForce GTX 1060 6GB (1280 cores)
Meet The New Future of Gaming: Different Than The Old One
337 Comments

  • V900 - Thursday, September 20, 2018 - link

    That’s plain false.

    Tomb Raider is a title out now with RTX enabled in the game.

    Battlefield 5 is out in a month or two (though you can play it right now) and will also utilize RTX.

    Sorry to destroy your narrative with the fact that one of the biggest titles this year is supporting RTX.

    And that’s of course just one out of a handful of titles that will do so, just in the next few months.

    Developer support seems to be the last thing that RTX2080 owners need to worry about, considering that there are dozens of titles, many of them big AAA games, scheduled for release just in the first half of 2019.
  • Skiddywinks - Friday, September 21, 2018 - link

    Unless I'm mistaken, TR does not support RTX yet. Obviously, otherwise it would be showing up in reviews everywhere. There is a reason every single reviewer is only benchmarking traditional games; that's all there is right now.
  • Writer's Block - Monday, October 1, 2018 - link

    Exactly.
    Is supporting or enabled.
    However - neither actually has it now to see, to experience.
  • eva02langley - Thursday, September 20, 2018 - link

    These cards are nothing more than a cheap magic trick show. Nvidia knew about the performance being lackluster, and based their marketing on gimmicks to sidestep the competition by affirming that these will be the future of gaming and you will be missing out without it.

    Literally, they basically tried to create a need... and if you are defending Nvidia over this, you have just been drinking the Kool-Aid at this point.

    Quote me on this, this will be the next gameworks feature that devs will not bother touching. Why? Because devs are developing games on consoles and port them to PC. The extra time in development doesn't bring back any additional profit.
  • Skiddywinks - Friday, September 21, 2018 - link

    Here's the thing though, I don't think the performance is that lacklustre; the issue is we have this huge die and half of it does not do what most people want: give us more frames. If they had made the same size die with nothing but traditional CUDA cores, the 2080 Ti would be an absolute beast. And I'd imagine it would be a lot cheaper as well.

    But nVidia (maybe not mistakenly) have decided to push the raytracing path, and those of us who just want maximum performance for the price (me) and were waiting for the next 1080 Ti are basically left thinking "... oh well, skip".
  • eva02langley - Friday, September 21, 2018 - link

    Don't get me wrong, these cards are a normal generational performance jump; however, they are not the second coming that Nvidia is marketing.

    The problem here is Nvidia wants to corner AMD, and the tactic they chose is RTX. However, RTX is nothing more than a FEATURE. The gamble could cost them a lot.

    If AMD's gaming and 7nm strategy pays off, devs will develop on AMD hardware and port to PC architecture, leaving no incentive to put in the extra work for a FEATURE.

    The extra cost of the bigger die should have gone toward gaming performance, but Nvidia's strategy is to disrupt the competition and further their standing as a monopoly as much as they can.

    Physx didn't work, hairwork didn't work and this will not work. As cool as it is, this should have been a feature for pro cards only, not consumers.
  • mapesdhs - Thursday, September 27, 2018 - link

    That's the thing though, they aren't a "normal" upgrade performance jump, because the prices make no sense.
  • AnnoyedGrunt - Thursday, September 20, 2018 - link

    This reminds me quite a bit of the original GeForce 256 launch. Not sure how many of you were following Anandtech back then, but it was my go-to site then just as it is now. Here are links to some of the original reviews:

    GeForce256 SDR: https://www.anandtech.com/show/391
    GeForce256 DDR: https://www.anandtech.com/show/429

    Similar to the 20XX series, the GeForce256 was Nvidia's attempt to change the graphics card paradigm, adding hardware transformation and lighting to the graphics card (and relieving the CPU of those tasks). The card was faster than the contemporary cards, but also much more expensive, making the value questionable for many.

    At the time I was a young mechanical engineer, and I remember feeling that Nvidia was brilliant for creating this card. It let me run Pro/E R18 on my $1000 home computer, about as fast as I could on my $20,000 HP workstation. That card basically destroyed the market of workstation-centric companies like SGI and Sun, as people could now run CAD packages on a windows PC.

    The 20XX series gives me a similar feeling, but with less obvious benefit to the user. The cards are as fast or faster than the previous generation, but are also much more expensive. The usefulness is likely there for developers and some professionals like industrial designers who would love to have an almost-real-time, high quality, rendered image. For gamers, the value seems to be a stretch.

    While I was extremely excited about the launch of the original GeForce256, I am a bit "meh" about the 20XX series. I am looking to build a new computer and replace my GTX 680/i5-3570K, but this release has not changed the value equation at all.

    If I look at Wolfenstein, then a strong argument could be made for the 2080 being more future proof, but pretty much all other games are a wash. The high price of the 20XX series means that the 1080 prices aren't dropping, and I doubt the 2070 will change things much since it looks like it would be competing with the vanilla 1080, but costing $100 more.

    Looks like I will wait a bit more to see how that price/performance ends up, but I don't see the ray-tracing capabilities bringing immediate value to the general public, so paying extra for it doesn't seem to make a lot of sense. Maybe driver updates will improve performance in today's games, making the 20XX series look better than it does now, but I think like many, I was hoping for a bit more than an actual reduction in the performance/price ratio.

    -AG
  • eddman - Thursday, September 20, 2018 - link

    How much was a 256 at launch? I couldn't find any concrete pricing info but let's go with $500 to be safe. That's just $750 by today's dollar for something that is arguably the most revolutionary nvidia video card.
  • Ananke - Thursday, September 20, 2018 - link

    Yep, and it was also a novelty that did not sell well among "gamers"; it only became popular after falling under $100 a pop years later. Same here: financial analysts say the expected revenue from gaming products will drop in the near future, and Wall Street has already dropped NVidia. The product is good, but expensive; it is not going to sell in volume, and their revenue will drop in the imminent quarters.
    Apple's XS phone was the same, but Apple started a buy-one-get-one campaign on the very next day, plus an upfront discount and a solid buyback of iPhones. Yet it is not clear whether they will achieve volume and revenue growth within the priced-in expectations.
    These are public companies - they make money from Wall Street, and they /NVidia/ can lose much more and much faster on the capital markets than what they would gain in profitability from lower-volume, high-end boutique products. This was a relatively sh**y launch - NVidia actually didn't want to launch anything; they want to sell off their glut of GTX inventory first, but the silicon was already ordered and made at TSMC, and they couldn't just sit on it waiting...
