Meet The GeForce RTX 2080 Ti & RTX 2080 Founders Editions Cards

Moving on to the design of the cards, we've already mentioned the biggest change: a new open air cooler design. Along with the Founders Edition specification changes, the cards might be considered 'reference' in that they remain a first-party video card sold direct by NVIDIA, but strictly speaking they are not, because they no longer carry reference specifications.

Otherwise, NVIDIA's industrial design language prevails, and the RTX cards bring a sleek flattened aesthetic over the polygonal shroud of the 10 series. The silver shroud now encapsulates an integrated backplate, and in keeping with the presentation, the NVLink SLI connectors have a removable cover.

Internally, the dual 13-blade fans accompany a full-length vapor chamber and component baseplate, connected to a dual-slot aluminum finstack. Looking to improve efficiency and granular power control, the 260W RTX 2080 Ti Founders Edition features a 13-phase iMON DrMOS power subsystem with a dedicated 3-phase system for the 14 Gbps GDDR6, while the 225W RTX 2080 Founders Edition weighs in with an 8-phase main and 2-phase memory arrangement.

As is typical with higher quality designs, NVIDIA is pushing overclocking, and for the 2080 Ti that means a dual 8-pin PCIe power configuration; on paper, this puts the maximum draw at 375W, though on the specifications side the TDP of the 2080 Ti Founders Edition is only 10W higher than the 1080 Ti Founders Edition's. The RTX 2080 Founders Edition has the more drastic jump, however, with 8+6 pins and a 45W increase over the 1080's lone 8-pin and 180W TDP. Ultimately, it's a steady increase from the power-sipping GTX 980's 165W.
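The 375W figure follows directly from the PCIe specification's per-source limits: 75W from the x16 slot, 75W per 6-pin connector, and 150W per 8-pin connector. A quick sketch of that arithmetic (spec values only; actual sustained draw is governed by the card's TDP, not these ceilings):

```python
# Per-source power limits from the PCIe CEM specification (watts)
SLOT_W = 75        # x16 slot
SIX_PIN_W = 75     # 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin auxiliary connector

def max_draw(eight_pins=0, six_pins=0):
    """Upper bound on board power for a given connector configuration."""
    return SLOT_W + eight_pins * EIGHT_PIN_W + six_pins * SIX_PIN_W

print(max_draw(eight_pins=2))              # RTX 2080 Ti FE (2x 8-pin): 375
print(max_draw(eight_pins=1, six_pins=1))  # RTX 2080 FE (8+6-pin): 300
```

By the same accounting, the 2080's 8+6-pin configuration leaves a 300W ceiling against its 225W TDP, a similar overclocking margin to the Ti's.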

One of the more understated changes comes with the display outputs, which, thanks to Turing's new display controller, now feature DisplayPort 1.4 and DSC support, the latter of which is part of the DP1.4 spec. The eye-catching addition is the VR-centric USB-C VirtualLink port, which also carries an associated 30W not included in the overall TDP.

Something to note is that this change in reference design, combined with the seemingly inherent low-volume nature of the Turing GPUs, cuts into an often overlooked but highly important aspect of GPU sales: big OEMs in the desktop and mobile space. Boutique system integrators will happily incorporate the pricier higher-end parts, but from the OEM's perspective, the GeForce RTX cards are not just priced into a new range beyond existing ones; they also bring higher TDPs and no longer come with blower-style coolers in their 'reference' implementation.

Given that OEMs often rely on the video card being fully self-exhausting by way of a blower, this would certainly preclude a lot of drop-in replacements or upgrades – at least not without further testing. It would be hard to slot into the standard OEM product cycle at the necessary prices, not to mention the added difficulty in marketing. In that respect, there is definitely more to the GeForce RTX 20 series story, and it's somewhat hard to see OEMs offering GeForce RTX cards – or even to see the RT Cores themselves existing below the RTX 2070, on the basis of the raw performance needed for real time ray tracing effects at reasonable resolutions and playable framerates. So it will be very interesting to see how the rest of NVIDIA's product stack unfolds.

337 Comments

  • mapesdhs - Thursday, September 27, 2018 - link

    It also glosses over the huge pricing differences and the fact that most gamers buy AIB models, not reference cards.
  • noone2 - Thursday, September 20, 2018 - link

    Not sure why people are so negative about these and the prices. Sell your old card and amortize the cost over how long you'll keep the new one. So maybe $400/year (less if you keep it longer).

    If you're a serious gamer, are you really not willing to spend a few hundred dollars per year on your hardware? I mean, the performance is there and it's somewhat future proofed (assuming things take off for RT and DLSS.)

    A bowling league (they still have those?) probably costs more per year than this card. If you only play Minecraft I guess you don't need it, but if you want the highest setting in the newest games and potentially the new technology, then I think it's worth it.
  • milkod2001 - Friday, September 21, 2018 - link

    Performance is not there. Around 20% actual performance boost is not very convincing, especially at a much higher price. How can you be positive about it?
    The future tech promise doesn't add that much, and it is not clear if game developers will bother.
    When one spends $1000 on a GPU it has to deliver perfect maxed-out 4K gaming, and NV charges ever more. This is a joke; NV is just testing how much they can squeeze out of us until we simply don't buy.
  • noone2 - Friday, September 21, 2018 - link

    The article clearly says that the Ti is 32% better on average.

    The idea about future tech is you either do it and early adopters pay for it in hopes it catches on, or you never do it and nothing ever improves. Game developers don't really create technology and then ask hardware producers to support it/figure out how to do it. Dice didn't knock on Nvidia's door and pay them to figure out how to do ray tracing in real time.

    My point remains though: If this is a favorite hobby/pastime, then it's a modest price to pay for what will be hundreds of hours of entertainment and the potential that ray tracing and DLSS and whatever else catches on and you get to experience it sooner rather than later. You're saying this card is too expensive, yet I can find console players who think a $600 video game console is too expensive too. Different strokes for different folks. $1100 is not terrible value. You're talking hundreds of dollars here, not tens of thousands of dollars. It's a drop in the bucket in the scope of life.
  • mapesdhs - Thursday, September 27, 2018 - link

    So Just Buy It then? Do you work for toms? :D
  • TheJian - Thursday, September 20, 2018 - link

    "Ultimately, gamers can't be blamed for wanting to game with their cards, and on that level they will have to think long and hard about paying extra to buy graphics hardware that is priced extra with features that aren't yet applicable to real-world gaming, and yet only provides performance comparable to previous generation video cards."

    So, I guess I can't read charts, because I thought they said 2080ti was massively faster than anything before it. We also KNOW devs will take 40-100% perf improvements seriously (already said such, and NV has 25 games being worked on now coming soon with support for their tech) and will support NV's new tech since they sell massively more cards than AMD.

    Even the 2080 vs. 1080 is a great story at 4k as the cards part by quite a margin in most stuff.
    IE, battlefield 1, 4k test: the 2080fe scores 78.9 vs. 56.4 for the 1080fe. That's a pretty big win; scoffing at it by calling it "comparable" is misleading at best, correct? Far Cry 5, same story: 57 for the 2080fe vs. 42 for the 1080fe. Again, a pretty massive gain for $100. Ashes, 74 to 61fps (2080fe vs. 1080fe). Wolf2, 100fps for the 2080fe vs. 60 for the 1080fe...LOL. Well, 40% is, uh, "comparable perf"...ROFL. OK, I could go on but whatever dude. Would I buy one if I had a 1080ti? Probably not, unless I had cash to burn, but many of the people who usually do buy these things just laughed at $100 premiums...ROFL.

    Never mind what these cards are doing to the AMD lineup. No reason to lower cards, I'd plop them on top of the old ones too, since they are the only competition. When you're competing with yourself you just do HEDT like stuff, rather than shoving down the old lines. Stack on top for more margin and profits!

    $100 for future tech and a modest victory in everything or quite a bit more in some things, seems like a good deal to me for a chip we know is expensive to make (even the small one is Titan size).

    Oh, I don't count that fold@home crap, synthetic junk as actual benchmarks because you gain nothing from doing it but a high electric bill (and a hot room). If you can't make money from it, or play it for fun (game), it isn't worth benchmarking something that means nothing. How fast can you spit in the wind 100 times. Umm, who cares. Right. Same story with synthetics.
  • mapesdhs - Thursday, September 27, 2018 - link

    It's future tech that cannot deliver *now*, so what's the point? The performance just isn't there, and it's a pretty poor implementation of what they're boasting about anyway (I thought the demos looked generally awful, though visual realism is less something I care about now anyway; games need to be better in other ways). Fact is, the 2080 is quite a bit more expensive than a new 1080 Ti for a card with less RAM and no guarantee these supposed fancy features are going to go anywhere anyway. The 2080 Ti is even worse; it has the speed in some cases, but the price completely spoils the picture. Where I am, the 2080 Ti is twice the cost of a 1080 Ti, with no VRAM increase either.

    NVIDIA spent the last 5 years pushing gamers into high frequency displays, 4K and VR. Now they're trying to do a total about-face. It won't work.
  • lenghui - Thursday, September 20, 2018 - link

    Thanks for rushing the review out. BTW, the auto-play video on every AT page has got to stop. You are turning into Tom's Hardware.
  • milkod2001 - Friday, September 21, 2018 - link

    They are both owned by Purch, the marketing company responsible for those annoying auto-play videos and the lowest crap possible in the "From the web" section. They go with the motto: ad clicks over everything. Don't think it will change anytime soon. Anand sold his soul twice: to Apple, and also his web site to Purch.
  • mapesdhs - Thursday, September 27, 2018 - link

    One can use Ublock Origin to prevent those jw-player vids.
