While it was roughly 2 years from Maxwell 2 to Pascal, the journey to Turing has felt much longer despite a similar 2-year gap. There’s some truth to the feeling: the past couple of years have brought basically every other possible development in the GPU space except next-generation gaming video cards, from Intel’s planned return to discrete graphics to NVIDIA’s Volta and cryptomining-specific cards. Finally, at Gamescom 2018, NVIDIA announced the GeForce RTX 20 series, built on TSMC’s 12nm “FFN” process and powered by the Turing GPU architecture. Launching today with full general availability is just the GeForce RTX 2080; the GeForce RTX 2080 Ti has been delayed a week to the 27th, while the GeForce RTX 2070 is due in October. So up for review today are the GeForce RTX 2080 Ti and GeForce RTX 2080.

But a standard new generation of gaming GPUs this is not. The “GeForce RTX” brand, ousting the long-lived “GeForce GTX” moniker in favor of their announced “RTX technology” for real time ray tracing, aptly underlines NVIDIA’s new vision for the video card future. As we saw last Friday, Turing and the GeForce RTX 20 series are designed around a set of specialized low-level hardware features and an intertwined ecosystem of supporting software currently in development. The central goal is a long-held dream of computer graphics researchers and engineers alike – real time ray tracing – and NVIDIA is aiming to bring that to gamers with their new cards, willing to break some traditions on the way.

NVIDIA GeForce Specification Comparison

| | RTX 2080 Ti | RTX 2080 | RTX 2070 | GTX 1080 |
|---|---|---|---|---|
| CUDA Cores | 4352 | 2944 | 2304 | 2560 |
| Core Clock | 1350MHz | 1515MHz | 1410MHz | 1607MHz |
| Boost Clock | 1545MHz (FE: 1635MHz) | 1710MHz (FE: 1800MHz) | 1620MHz (FE: 1710MHz) | 1733MHz |
| Memory Clock | 14Gbps GDDR6 | 14Gbps GDDR6 | 14Gbps GDDR6 | 10Gbps GDDR5X |
| Memory Bus Width | 352-bit | 256-bit | 256-bit | 256-bit |
| Single Precision Perf. | 13.4 TFLOPs | 10.1 TFLOPs | 7.5 TFLOPs | 8.9 TFLOPs |
| Tensor Perf. (INT4) | 430 TOPs | 322 TOPs | 238 TOPs | N/A |
| Ray Perf. | 10 GRays/s | 8 GRays/s | 6 GRays/s | N/A |
| "RTX-OPS" | 78T | 60T | 45T | N/A |
| TDP | 250W (FE: 260W) | 215W (FE: 225W) | 175W (FE: 185W) | 180W |
| GPU | TU102 | TU104 | TU106 | GP104 |
| Transistor Count | 18.6B | 13.6B | 10.8B | 7.2B |
| Architecture | Turing | Turing | Turing | Pascal |
| Manufacturing Process | TSMC 12nm "FFN" | TSMC 12nm "FFN" | TSMC 12nm "FFN" | TSMC 16nm |
| Launch Date | 09/27/2018 | 09/20/2018 | 10/2018 | 05/27/2016 |
| Launch Price | MSRP: $999, Founders $1199 | MSRP: $699, Founders $799 | MSRP: $499, Founders $599 | MSRP: $599, Founders $699 |
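As a quick sanity check on the spec sheet, the single-precision figures follow directly from the core counts and reference boost clocks: a sketch, assuming the standard peak of 2 FLOPs (one fused multiply-add) per CUDA core per clock.

```python
# Back-of-the-envelope reproduction of the FP32 throughput figures above.
# Assumes peak throughput of 2 FLOPs (one FMA) per CUDA core per clock,
# evaluated at the reference boost clock.

def peak_fp32_tflops(cuda_cores: int, boost_clock_mhz: int) -> float:
    """Peak single-precision throughput in TFLOPs."""
    return cuda_cores * 2 * boost_clock_mhz * 1e6 / 1e12

cards = {
    "RTX 2080 Ti": (4352, 1545),
    "RTX 2080":    (2944, 1710),
    "RTX 2070":    (2304, 1620),
    "GTX 1080":    (2560, 1733),
}

for name, (cores, mhz) in cards.items():
    print(f"{name}: {peak_fp32_tflops(cores, mhz):.1f} TFLOPs")
```

Run as-is, this lands on the same 13.4 / 10.1 / 7.5 / 8.9 TFLOPs the table quotes; note that Founders Edition clocks would bump each number slightly.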

As we discussed at the announcement, one of the major breaks is that NVIDIA is introducing GeForce RTX as a full upper-tier x80 Ti/x80/x70 stack from the start, where it has previously tended to lead with the x80/x70 products and hold the x80 Ti back as a mid-cycle refresh or competitive response. More intriguingly, each GeForce card has its own distinct GPU (TU102, TU104, and TU106), with direct Quadro and now Tesla variants of TU102 and TU104. While we covered the Turing architecture in the preceding article, the takeaway is that each chip is proportionally cut-down, including the specialized RT Cores and Tensor Cores; with clockspeeds roughly the same as Pascal, architectural changes and efficiency enhancements will be largely responsible for performance gains, along with the greater bandwidth of 14Gbps GDDR6.

And as far as we know, Turing technically did not trickle down from a bigger compute chip a la GP100, though at the architectural level it is strikingly similar to Volta/GV100. Die size brings more color to the story, because with TU106 at 454mm², the smallest of the bunch is frankly humongous for a FinFET die nominally dedicated to an x70 GeForce product, and comparable in size to the 471mm² GP102 inside the GTX 1080 Ti and Pascal Titans. Even excluding the cost and size of enabled RT Cores and Tensor Cores, a slab of FinFET silicon that large is unlikely to be packaged and priced like the popular $330 GTX 970 and still provide the margins NVIDIA is pursuing.

These observations are not meant to be pedantic so much as to sketch out GeForce Turing’s positioning in relation to Pascal. Having a separate GPU for each model is the most expensive approach in terms of research and development, testing, validation, extra needed fab tooling/capacity – the list goes on. And it raises interesting questions on the matter of binning, yields, and salvage parts. Though NVIDIA certainly has the spare funds to go this route, there’s surely a better explanation than Turing being primarily designed for a premium-priced consumer product that cannot command the margins of professional parts. These all point to the known Turing GPUs as oriented toward lower-volume products, and NVIDIA’s quarterly financial reports indicate that GeForce product volume is a significant factor, not just ASP.

And on that note, the ‘reference’ Founders Edition models are no longer reference; the GeForce RTX 2080 Ti, 2080, and 2070 Founders Editions feature 90MHz factory overclocks and 10W higher TDP, and NVIDIA does not plan to productize a reference card themselves. But arguably the biggest change is the move from blower-style coolers with a radial fan to an open air cooler with dual axial fans. The switch in design improves cooling capacity and lowers noise, but with the drawback that the card can no longer guarantee that it can cool itself. Because the open air design re-circulates the hot air back into the chassis, it is ultimately up to the chassis to properly exhaust the heat. In contrast, a blower pushes all the hot air through the back of the card and directly out of the case, regardless of the chassis airflow or case fans.

All in all, NVIDIA is keeping the Founders Edition premium, now $100 to $200 over the baseline ‘reference’ MSRP depending on the card. Though AIB partner cards are also launching today, in practice the Founders Edition pricing is effectively the retail price until the launch rush has subsided.

The GeForce RTX 20 Series Competition: The GeForce GTX 10 Series

In the end, the preceding GeForce GTX 10 series ended up occupying an odd spot in the competitive landscape. After its arrival in mid-2016, only the lower end of the stack had direct competition, because AMD’s Polaris-based Radeon RX 400 series targeted solely the mainstream and entry-level segments. AMD’s RX 500 series refresh in April 2017 didn’t fundamentally change that, and it was not until August 2017 that the higher-end Pascal parts had direct competition, with their generational equal in RX Vega. But by that time, the GTX 1080 Ti (not to mention the Pascal Titans) was unchallenged. And all the while, an Ethereum-led resurgence of mining cryptocurrency on video cards was wreaking havoc on GPU pricing and inventory, first on Polaris products, then general mainstream parts, and finally affecting any and all GPUs.

Not that NVIDIA rested on its laurels with Vega, releasing the GTX 1070 Ti anyhow. But what was constant was how the pricing models evolved with the Founders Editions schema, the $1200 Titan X (Pascal), and then the $700 GTX 1080 Ti and $1200 Titan Xp. Even the $3000 Titan V maintained gaming cred despite diverging greatly from previous Titan cards, sitting firmly on the professional side of prosumer, which basically allowed the product to capture both prosumers and price-no-object enthusiasts. Ultimately, these instances coincided with the rampant cryptomining price inflation and were mostly subsumed by it.

So the higher end of gaming video cards has been Pascal competing with itself and moving up the price brackets. For Turing, the GTX 1080 Ti has become the closest competitor. RX Vega performance hasn’t fundamentally changed, and the fallout appears to have snuffed out any further Vega 10 parts, as well as Vega 14nm+ (i.e. 12nm) refreshes. As a competitive response, AMD doesn’t have many cards up their sleeves except the ones already played – game bundles (such as the current “Raise the Game” promotion), FreeSync/FreeSync 2, and other hardware (CPU, APU, motherboard) bundles. Other than that, there’s a DXR driver in the works and a machine learning 7nm Vega on the horizon, but not much else is known, such as the fate of mobile discrete Vega. For AMD graphics cards on shelves right now, RX Vega is still hampered by high prices and low inventory/selection, remnants of cryptomining.

For the GeForce RTX 2080 Ti and 2080, NVIDIA would like to sell you the RTX cards as your next upgrade regardless of what card you may have now, essentially because no other card can do what Turing’s features enable: real time ray tracing effects (and applied deep learning) in games. And because real time ray tracing offers graphical realism beyond what rasterization can muster, the argument goes, these cards aren’t comparable to an older but still performant card. Unfortunately, no games support Turing’s features today, and they may not for some time. Of course, NVIDIA maintains that the cards will provide expected top-tier performance in traditional gaming. Either way, while Founders Editions are fixed at their premium MSRP, custom cards are unsurprisingly listed at those same Founders Edition price points or higher.

Fall 2018 GPU Pricing Comparison

| AMD | Price | NVIDIA |
|---|---|---|
| | $1199 | GeForce RTX 2080 Ti |
| | $799 | GeForce RTX 2080 |
| | $709 | GeForce GTX 1080 Ti |
| Radeon RX Vega 64 | $569 | |
| Radeon RX Vega 56 | $489 | GeForce GTX 1080 |
| | $449 | GeForce GTX 1070 Ti |
| | $399 | GeForce GTX 1070 |
| Radeon RX 580 (8GB) | $269/$279 | GeForce GTX 1060 6GB (1280 cores) |


Comments

  • eddman - Thursday, September 20, 2018

    It still doesn't justify their prices. Great cards, finally ray-tracing for games, horribly cutthroat prices.
  • Yojimbo - Saturday, September 22, 2018

    So don't buy it, eddman. In the end the only real justification for prices is what people are willing to pay. If one isn't able to make a product cheaply enough for it to be sold for what people are willing to pay then the product is a bad product.

    I don't understand why you are so worried about the price. Or why you think they are "cut-throat". A cut-throat price is a very low price, not a high one.
  • eddman - Sunday, September 23, 2018

    There is a wealthy minority who'd pay that much, and? It's only "justified" if you are an nvidia shareholder.

    The cards are overpriced compared to last gen and that's an absolute fact. Your constant defending of nvidia's pricing is certainly not a normal consumer behavior.
  • mapesdhs - Wednesday, September 26, 2018

    Yojimbo is right that an item is only ever worth what someone is willing to pay, so in that sense NVIDIA can do what it likes, in the end it's up to the market, to consumers, whether the prices "make sense", ie. whether people actually buy them. In this regard the situation we have atm is largely that made by gamers themselves, because even when AMD released competitive products (whether by performance, value, or both), people didn't buy them. There are even people saying atm they hope AMD can release something to compete with Turing just so NVIDIA will drop its prices and thus they can buy a cheaper NVIDIA card; that's completely crazy, AMD would be mad to make something if that's how the market is going to respond.

    What's interesting this time though is that even those who in the past have been happy to buy the more expensive cards are saying they're having major hesitation about buying Turing, and the street cred which used to be perceived as coming with buying the latest & greatest has this time largely gone, people are more likely to react like someone is a gullible money pumped moron for buying these products ("More money than sense!", as my parents used to say). By contrast, when the 8800 GTX came out, that was a huge leap over the 7800 and people were very keen to get one, those who could afford it. Having one was cool. Ditto the later series right through to Maxwell (though a bit of a dip with the GTX 480 due to heat/power). The GTX 460 was a particularly good release (though the endless rebranding later was annoying). Even Pascal was a good bump over what had come before.

    Not this time though, it's a massive price increase for little gain, while the headline features provide sub-60Hz performance at a resolution far below what NVIDIA themselves have been pushing as desirable for the last 5 years (the focus has been on high frequency monitors, 4K and VR); now NVIDIA is trying to roll back the clock, which won't work, especially since those who've gotten used to high frequency monitors physically cannot go back (ref New Scientist, changes in the brain's vision system).

    Thus, eddman is right that the cards are overpriced in a general sense, as they don't remotely match what the market has come to expect from NVIDIA based on previous releases. However, if gamers don't vote with their wallets then nothing will change. Likewise, if AMD releases something just as good, or better value, but gamers don't buy them, then again nothing will change, we'll be stuck with this new expensive normal.

    I miss the Fermi days, buy two GTX 460s to have better performance than a GTX 580, didn't cost much, games ran great, and the lesser VRAM didn't bother me anyway as I wasn't using an uber monitor. Now we have cards that cost many hundreds that don't even support multi-GPU. It's as daft as Intel making the cost entry point to >= 40 PCIe lanes much higher than it was with X79 (today it's almost 1000 UKP); an old cheapo 4820K can literally do things a 7820X can't. :D

    Alas though, again it boils down to individual choice. Some want the fastest possible and if they can afford it then that's up to them, it's their free choice, we don't have the right to tell people they shouldn't buy these cards. It's their money after all (anything else is communism). It is though an unfortunate reality that if the cards do sell well then NVIDIA will know they can maintain this higher priced and more feature restricted strategy, while selling the premium parts to Enterprise. Btw, it amazes me how people keep comparing the 2080 to the 1080 Ti even though the former has less RAM; how is that an upgrade in the product stack? (people will respond with ray tracing! Ray tracing! A feature which can't be used yet and runs too slow to be useful anyway, and with an initial implementation that's pretty crippled as well). And why doesn't the 2080 Ti have more than 11GB? It really should, unless NVIDIA figures that if they can indeed push people back to 1080p then 11GB is enough anyway, which would be ironic.

    I'm just going to look for a used 1080 Ti, more than enough for my needs. For those with much older cards, a used 980 Ti or 1070, or various AMD cards, are good options.

  • Yojimbo - Wednesday, September 19, 2018

    Yes, exactly. A very appropriate quote.
  • Skiddywinks - Thursday, September 20, 2018

    No reason Ford couldn't have done both though. There is no technological reason nVidia could not have released a GTX 2080 Ti as well. But they know they couldn't charge as much, and the vast majority of people would not buy the RTX version. Instead, it makes their 1080 Ti stock look much more appealing for value-oriented gamers, helping them shift that stock as well as charge a huge price for the new cards.

    It's really great business, but as a gamer and not a stockholder, I'm salty.
  • Spunjji - Friday, September 21, 2018

    Ford didn't invent the car, though. Ford invented a way to make them cheaper.

    Ford's strategy was not to make a new car that might do something different one day and then charge through the effing nose for it.
  • Gastec - Thursday, September 27, 2018

    That quote applies perfectly to our digital electronic world: we want to go faster from point A to point B. To do that, Henry Ford gave us a car (a faster "horse"). We want the same from GPUs and CPUs: to be faster. Prettier, sure, pink even. But first just make it fast.
  • Writer's Block - Monday, October 1, 2018

    Except there is no evidence he said that. It is a great statement though, and conveys the intended message well.
  • Hxx - Wednesday, September 19, 2018

    Overall disappointing performance. The RTX 2080 is a flat out bad buy at $800+ when 1080 Ti custom boards are as low as $600, and the RTX 2080 Ti is a straight up ripoff when consumers can easily surpass its performance with 2 x 1080 Tis. I agree with the conclusion though that you are buying hardware that you won't take advantage of yet, but still, if Nvidia wants to push this hardware to all gamers, they need to drop the pricing in line with its performance, otherwise not many will buy into the hype.
