Meet The New Future of Gaming: Different Than The Old One

Up until last month, NVIDIA had been pushing a different, more conventional future for gaming and video cards, perhaps best exemplified by their recent launch of 27-inch 4K G-Sync HDR monitors, courtesy of Asus and Acer. Those specifications and displays represented – and still represent – the aspirational capabilities of PC gaming graphics: 4K resolution, a 144 Hz refresh rate with G-Sync variable refresh, and high-quality HDR. The future was maxing out graphics settings in a visually demanding game, enabling HDR, and rendering at 4K with triple-digit average framerates on a large screen. That target was not achievable with current performance, certainly not by single-GPU cards. In the past, multi-GPU configurations were a stronger option provided that stuttering was kept in check, but recent years have seen AMD and NVIDIA take a step back from CrossFireX and SLI, respectively.

Particularly with HDR, NVIDIA was promising a qualitative rather than quantitative enhancement to the gaming experience. Faster framerates and higher resolutions were better-known quantities, easily demoed and with more intuitive benefits – even if 30fps was once perceived as 'cinematic' and 1080p still remains stubbornly popular – since higher resolution means more room for detail, and higher, more even framerates mean smoother gameplay and video. Variable refresh rate technology soon followed, resolving the screen-tearing versus V-Sync input lag dilemma, though it too took time to catch on to where it is now: nigh mandatory for a higher-end gaming monitor.

For gaming displays, HDR was substantively different from adding graphical detail or allowing smoother gameplay and playback, because it added a new dimension to gaming: 'more possible colors' and 'brighter whites and darker blacks.' But because HDR capability required support from the entire graphics chain, plus a high-quality HDR monitor and HDR content to fully take advantage of it, it was harder to showcase. Added to the other aspects of high-end gaming graphics, and pending the further development of VR, this was the future on the horizon for GPUs.

But today NVIDIA is switching gears, going after the fundamental way computer graphics are modelled in games. Among the more realistic rendering approaches, light can be emulated as rays emitted from their respective sources, but computing even a subset of those rays and their interactions (reflection, refraction, etc.) in a bounded space is so intensive that real time rendering was impossible. So to get the performance needed to render in real time, rasterization is used instead: it essentially boils 3D objects down into 2D representations to simplify the computations, significantly faking the behavior of light.
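
To make the rasterization-versus-ray-tracing distinction concrete, here is a minimal, purely illustrative CPU sketch – not NVIDIA's implementation, and nothing to do with RT Cores – of the ray-traced approach: a single sphere lit by one directional light, with one primary ray traced per pixel. Even this toy hints at the cost problem, since a real game scene multiplies the per-ray work by bounces, shadow and reflection rays, and millions of triangles; rasterization sidesteps it by projecting triangles to the screen and shading only the covered pixels.

    // toy_ray.cpp - one-sphere, one-light CPU ray tracer (illustrative toy only)
    // Build: g++ -O2 -std=c++17 toy_ray.cpp -o toy_ray && ./toy_ray > out.ppm
    #include <cmath>
    #include <cstdio>

    struct Vec { double x, y, z; };
    static Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec operator*(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
    static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec norm(Vec a) { return a * (1.0 / std::sqrt(dot(a, a))); }

    // Nearest positive hit of ray o + t*d against a sphere at c with radius r, or -1.
    static double hitSphere(Vec o, Vec d, Vec c, double r) {
        Vec oc = o - c;
        double b = dot(oc, d), disc = b * b - (dot(oc, oc) - r * r);
        if (disc < 0) return -1;
        double t = -b - std::sqrt(disc);
        return t > 1e-4 ? t : -1;
    }

    int main() {
        const int W = 256, H = 256;
        Vec eye{0, 0, 0}, center{0, 0, -3};
        Vec light = norm(Vec{1, 1, 1});                 // directional light
        std::printf("P3\n%d %d\n255\n", W, H);          // plain PPM image
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x) {
                // One primary ray per pixel through a simple pinhole camera.
                Vec dir = norm(Vec{(x - W / 2.0) / W, (H / 2.0 - y) / H, -1.0});
                double t = hitSphere(eye, dir, center, 1.0);
                int shade = 20;                         // background grey
                if (t > 0) {
                    Vec p = {eye.x + dir.x * t, eye.y + dir.y * t, eye.z + dir.z * t};
                    Vec n = norm(p - center);
                    shade = (int)(255 * std::fmax(0.0, dot(n, light)));  // Lambert term
                }
                std::printf("%d %d %d\n", shade, shade, shade);
            }
        return 0;
    }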

It’s on real time ray tracing that NVIDIA is staking its claim with GeForce RTX and Turing’s RT Cores. Covered more in-depth in our architecture article, NVIDIA’s real time ray tracing implementation takes all the shortcuts it can get, incorporating select ray tracing effects with significant denoising while keeping rasterization for everything else. Unfortunately, this hybrid rendering isn’t orthogonal to the goals above; it draws on the same performance budget. Now the ultimate experience would be hybrid-rendered 4K with HDR at high, steady, variable-refresh framerates, even though GPUs didn’t have enough performance to reach that point under traditional rasterization alone.
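
As a rough illustration of why the denoising step matters – this is a toy spatial filter, not Turing's actual denoisers, which NVIDIA has not publicly detailed at this level – the sketch below smooths a noisy one-sample-per-pixel estimate of a simple shading ramp with a 3-tap filter. The idea is to spend only a few rays per pixel and let a filter trade the residual noise for a small amount of blur.

    // toy_denoise.cpp - why "few rays per pixel + a filter" can stand in for many rays
    // Build: g++ -O2 -std=c++17 toy_denoise.cpp -o toy_denoise && ./toy_denoise
    #include <cstdio>
    #include <random>
    #include <vector>

    int main() {
        const int N = 16;                                 // one "scanline" of pixels
        std::mt19937 rng(42);
        std::normal_distribution<double> noise(0.0, 0.3); // per-pixel sampling error

        // Ground-truth shading ramp and a noisy one-sample-per-pixel estimate of it.
        std::vector<double> truth(N), noisy(N), filtered(N);
        for (int i = 0; i < N; ++i) {
            truth[i] = i / double(N - 1);
            noisy[i] = truth[i] + noise(rng);
        }

        // A 3-tap box filter as a crude stand-in for a real spatial denoiser.
        for (int i = 0; i < N; ++i) {
            double sum = 0.0;
            int count = 0;
            for (int j = i - 1; j <= i + 1; ++j)
                if (j >= 0 && j < N) { sum += noisy[j]; ++count; }
            filtered[i] = sum / count;
        }

        std::printf("pixel  truth  noisy  filtered\n");
        for (int i = 0; i < N; ++i)
            std::printf("%5d  %5.2f  %6.2f  %8.2f\n", i, truth[i], noisy[i], filtered[i]);
        return 0;
    }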

There’s still a performance cost incurred with real time ray tracing effects, except right now only NVIDIA and developers have a clear idea of what it is. What we can say is that utilizing real time ray tracing effects in games may require sacrificing some or all of the three: high resolution, ultra-high framerates, and HDR. HDR is limited by game support more than anything else, but the first two arguably have minimum standards for modern high-end PC gaming – anything under 1080p is completely unpalatable, and anything under 30fps, or more realistically 45 to 60fps, hurts playability. Variable refresh rates can mitigate the latter, and framedrops are temporary, but low resolution is forever.

Ultimately, real time ray tracing support needs to be implemented by developers via a supporting API like DXR – and many have been working hard on doing so – but currently there is no public timeline of application support for real time ray tracing, Tensor Core accelerated AI features, and Turing advanced shading. The list of games with support for Turing features – collectively called the RTX platform – will be available and updated on NVIDIA's site.
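
On the developer side, the usual first step with DXR is simply asking the Direct3D 12 runtime whether ray tracing is exposed at all before building any acceleration structures or ray tracing pipeline state. Below is a minimal sketch of that capability check – standard D3D12 calls, nothing Turing-specific, and it assumes a Windows SDK recent enough to define the DXR types.

    // dxr_check.cpp - probe the Direct3D 12 raytracing tier before using DXR (sketch)
    // Build (MSVC): cl /EHsc dxr_check.cpp d3d12.lib
    #include <windows.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>
    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device)))) {
            std::puts("No D3D12 device available.");
            return 1;
        }

        // DXR support is reported through the OPTIONS5 feature block.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &opts5, sizeof(opts5))) &&
            opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
            std::puts("DXR is exposed: safe to build acceleration structures and RT PSOs.");
        } else {
            std::puts("DXR not exposed: stick to pure rasterization.");
        }
        return 0;
    }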

Comments

  • mapesdhs - Thursday, September 27, 2018 - link

    It also glosses over the huge pricing differences and the fact that most gamers buy AIB models, not reference cards.
  • noone2 - Thursday, September 20, 2018 - link

    Not sure why people are so negative about these and the prices. Sell your old card and amortize the cost over how long you'll keep the new one. So maybe $400/year (less if you keep it longer).

    If you're a serious gamer, are you really not willing to spend a few hundred dollars per year on your hardware? I mean, the performance is there and it's somewhat future-proofed (assuming things take off for RT and DLSS).

    A bowling league (they still have those?) probably costs more per year than this card. If you only play Minecraft I guess you don't need it, but if you want the highest settings in the newest games and potentially the new technology, then I think it's worth it.
  • milkod2001 - Friday, September 21, 2018 - link

    Performance is not there. An actual performance boost of around 20% is not very convincing, especially given the much higher price. How can you be positive about it?
    The future-tech promise doesn't add that much, and it's not clear if game developers will bother.
    When someone spends $1000 on a GPU it has to deliver perfect, maxed-out 4K gaming, and NV charges ever more. This is a joke; NV is just testing how much they can squeeze out of us until we simply don't buy.
  • noone2 - Friday, September 21, 2018 - link

    The article clearly says that the Ti is 32% better on average.

    The idea with future tech is that you either do it and early adopters pay for it in hopes it catches on, or you never do it and nothing ever improves. Game developers don't really create technology and then ask hardware producers to support it or figure out how to do it. DICE didn't knock on NVIDIA's door and pay them to figure out how to do ray tracing in real time.

    My point remains, though: if this is a favorite hobby/pastime, then it's a modest price to pay for what will be hundreds of hours of entertainment, plus the potential that ray tracing and DLSS and whatever else catches on and you get to experience it sooner rather than later. You're saying this card is too expensive, yet I can find console players who think a $600 video game is too expensive too. Different strokes for different folks. $1100 is not terrible value. You're talking hundreds of dollars here, not tens of thousands of dollars. It's a drop in the bucket in the scope of life.
  • mapesdhs - Thursday, September 27, 2018 - link

    So Just Buy It then? Do you work for toms? :D
  • TheJian - Thursday, September 20, 2018 - link

    "Ultimately, gamers can't be blamed for wanting to game with their cards, and on that level they will have to think long and hard about paying extra to buy graphics hardware that is priced extra with features that aren't yet applicable to real-world gaming, and yet only provides performance comparable to previous generation video cards."

    So, I guess I can't read charts, because I thought they said the 2080 Ti was massively faster than anything before it. We also KNOW devs will take 40-100% perf improvements seriously (they've already said as much, and NV has 25 games in the works now with support for their tech coming soon) and will support NV's new tech, since they sell massively more cards than AMD.

    Even the 2080 vs. the 1080 is a great story at 4K, as the cards part by quite a margin in most titles.
    For example, in the Battlefield 1 4K test the 2080 FE scores 78.9fps vs. 56.4fps for the 1080 FE. That's a pretty big win; scoffing at it and calling it "comparable" is misleading at best, correct? Far Cry 5, same story: 57fps for the 2080 FE vs. 42fps for the 1080 FE. Again, a pretty sizable gain for $100 more. Ashes, 74fps to 61fps (2080 FE vs. 1080 FE). Wolfenstein II, 100fps for the 2080 FE vs. 60fps for the 1080 FE...LOL. Well, 40% is, uh, "comparable perf"...ROFL. OK, I could go on, but whatever dude. Would I buy one if I had a 1080 Ti? Probably not unless I had cash to burn, but many of those who usually do buy these things just laughed at the $100 premium...ROFL.

    Never mind what these cards are doing to the AMD lineup. There's no reason to move the new cards down the stack; I'd plop them on top of the old ones too, since they're the only competition. When you're competing with yourself you do HEDT-like stuff rather than pushing the old lines down. Stack on top for more margin and profit!

    $100 more for future tech, and a modest victory in everything or quite a bit more in some things, seems like a good deal to me for a chip we know is expensive to make (even the small one is Titan-sized).

    Oh, and I don't count that Folding@home crap or synthetic junk as actual benchmarks, because you gain nothing from running them but a high electric bill (and a hot room). If you can't make money from it or play it for fun (a game), it isn't worth benchmarking; it means nothing. How fast can you spit into the wind 100 times? Umm, who cares. Right. Same story with synthetics.
  • mapesdhs - Thursday, September 27, 2018 - link

    It's future tech that cannot deliver *now*, so what's the point? The performance just isn't there, and it's a pretty poor implementation of what they're boasting about anyway (I thought the demos looked generally awful, though visual realism is less of a concern for me these days; games need to be better in other ways). Fact is, the 2080 is quite a bit more expensive than a new 1080 Ti for a card with less RAM and no guarantee these supposed fancy features are going anywhere anyway. The 2080 Ti is even worse: it has the speed in some cases, but the price completely spoils the picture; where I am, the 2080 Ti is twice the cost of a 1080 Ti, with no VRAM increase either.

    NVIDIA spent the last 5 years pushing gamers toward high refresh rate displays, 4K, and VR. Now they're trying to do a total about-face. It won't work.
  • lenghui - Thursday, September 20, 2018 - link

    Thanks for rushing the review out. BTW, the auto-play video on every AT page has got to stop. You are turning into Tom's Hardware.
  • milkod2001 - Friday, September 21, 2018 - link

    They are both owned by Purch, the marketing company responsible for those annoying auto-play videos and the lowest-quality junk in the "From the web" section. Their motto: ad clicks over everything. Don't think it will change anytime soon. Anand sold his soul twice: to Apple, and his site to Purch.
  • mapesdhs - Thursday, September 27, 2018 - link

    One can use uBlock Origin to block those JW Player videos.
