Meet The GeForce RTX 2080 Ti & RTX 2080 Founders Editions Cards

Moving on to the design of the cards, we've already mentioned the biggest change: a new open-air cooler design. Along with the Founders Edition specification changes, the cards might be considered 'reference' in the sense that they remain first-party video cards sold directly by NVIDIA, but strictly speaking they are not, as they no longer carry reference specifications.

Otherwise, NVIDIA's industrial design language prevails, and the RTX cards bring a sleek, flattened aesthetic in place of the 10 series' polygonal shroud. The silver shroud now encapsulates an integrated backplate, and in keeping with the presentation, the NVLink SLI connectors have a removable cover.

Internally, the dual 13-blade fans accompany a full-length vapor chamber and component baseplate, connected to a dual-slot aluminum fin stack. In the interest of efficiency and more granular power control, the 260W RTX 2080 Ti Founders Edition features a 13-phase iMON DrMOS power subsystem with a dedicated 3-phase system for the 14 Gbps GDDR6, while the 225W RTX 2080 Founders Edition weighs in with 8 main phases and 2 memory phases.
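To put those phase counts in perspective, here is a quick back-of-the-envelope sketch in Python. The GDDR6 power figure is our own rough assumption rather than a number NVIDIA has published, so treat the per-phase results as illustrative only.

    # Rough per-phase load estimate for the two Founders Edition boards.
    # ASSUMPTION: ~20W for the GDDR6, handled entirely by the dedicated
    # memory phases; NVIDIA has not published an official memory power figure.
    GDDR6_POWER_GUESS_W = 20

    def per_phase_watts(board_power_w, memory_power_w, main_phases):
        # Average load each main VRM phase carries at full board power
        return (board_power_w - memory_power_w) / main_phases

    print(f"RTX 2080 Ti: ~{per_phase_watts(260, GDDR6_POWER_GUESS_W, 13):.0f} W per main phase")
    print(f"RTX 2080:    ~{per_phase_watts(225, GDDR6_POWER_GUESS_W, 8):.0f} W per main phase")

More phases sharing the load means each DrMOS stage runs well inside its comfort zone, which is where the claimed efficiency and finer-grained power monitoring come from.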

As is typical of higher-quality designs, NVIDIA is pushing overclocking, and for the 2080 Ti that means a dual 8-pin PCIe power configuration; on paper, this puts the maximum draw at 375W (75W from the slot plus 150W from each 8-pin connector), though specifications-wise the TDP of the 2080 Ti Founders Edition is only 10W higher than that of the 1080 Ti Founders Edition. The RTX 2080 Founders Edition sees the more drastic jump, however, with 8+6 pins and a 45W increase over the 1080's lone 8-pin and 180W TDP. Ultimately, it's a steady increase from the power-sipping GTX 980's 165W.
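For reference, the 375W ceiling falls straight out of the PCIe electromechanical spec's per-source limits; a minimal sketch of that arithmetic:

    # Spec-maximum power delivery per the PCIe electromechanical spec:
    # the x16 slot supplies up to 75W, a 6-pin connector 75W, an 8-pin 150W.
    PCIE_SOURCE_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

    def max_board_power(*connectors):
        # The slot plus every auxiliary power connector on the board
        return PCIE_SOURCE_W["slot"] + sum(PCIE_SOURCE_W[c] for c in connectors)

    print(max_board_power("8-pin", "8-pin"))  # RTX 2080 Ti FE: 375W
    print(max_board_power("8-pin", "6-pin"))  # RTX 2080 FE:    300W

The gap between those spec maximums and the rated TDPs (260W and 225W respectively) is the headroom NVIDIA is leaving on the table for overclocking.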

One of the more understated changes comes with the display outputs, which thanks to Turing's new display controller now feature DisplayPort 1.4 and Display Stream Compression (DSC) support, the latter being part of the DP 1.4 spec. The eye-catching addition is the VR-centric USB-C VirtualLink port, which also carries an associated 30W that is not included in the overall TDP.
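As a rough illustration of where DSC starts to matter, the sketch below compares a display mode's bandwidth needs against DP 1.4's effective HBR3 rate. The 5% blanking overhead is an assumption approximating CVT reduced blanking, so borderline cases are indicative rather than exact.

    # DP 1.4 link: HBR3 at 8.1 Gbps/lane x 4 lanes, with 8b/10b encoding
    # leaving 25.92 Gbps of effective video bandwidth.
    DP14_EFFECTIVE_GBPS = 8.1 * 4 * (8 / 10)

    def needed_gbps(h, v, hz, bpp=24, blanking=1.05):
        # ASSUMPTION: ~5% blanking overhead, approximating CVT reduced blanking
        return h * v * hz * bpp * blanking / 1e9

    for name, mode in {"4K60": (3840, 2160, 60),
                       "4K120": (3840, 2160, 120),
                       "8K60": (7680, 4320, 60)}.items():
        bw = needed_gbps(*mode)
        verdict = "needs DSC" if bw > DP14_EFFECTIVE_GBPS else "fits uncompressed"
        print(f"{name}: {bw:.1f} Gbps -> {verdict}")

Under these assumptions 4K120 at 8 bits per channel just squeezes through uncompressed, while 8K60 is where DSC becomes mandatory.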

Something to note is that this change in reference design, combined with the seemingly inherent low-volume nature of the Turing GPUs, cuts into an often overlooked but highly important aspect of GPU sales: big OEMs in the desktop and mobile space. Boutique system integrators will happily incorporate the pricier higher-end parts, but from the OEMs' perspective, the GeForce RTX cards are not just priced into a new range beyond existing ones, but also bring higher TDPs and no longer come with blower-style coolers in their 'reference' implementation.

Given that OEMs often rely on the video card being fully self-exhausting by way of a blower, the new open-air design would certainly preclude a lot of drop-in replacements or upgrades – at least without further testing. It would be hard to slot these cards into the standard OEM product cycle at the necessary prices, not to mention the added difficulty in marketing. In that respect, there is definitely more to the GeForce RTX 20 series story, and it's somewhat hard to see OEMs offering GeForce RTX cards – or even the RT Cores themselves existing below the RTX 2070, simply on the basis of the raw performance needed for real-time ray tracing effects at reasonable resolutions and playable framerates. So it will be very interesting to see how the rest of NVIDIA's product stack unfolds.

337 Comments

  • dustwalker13 - Sunday, September 23, 2018 - link

    Way too expensive. If those cards were the same price as the 1080 / Ti it would be a generational change.

    This actually is a regression: the price has increased out of all proportion, even if you were to completely ignore that newer generations are EXPECTED to be a lot faster than the older ones for essentially the same price (plus a bit of inflation at most).

    paying 50% more for a 30% increase is a simple ripoff. no one should buy these cards ... sadly a lot of people will let themselves get ripped off once more by nvidia.

    and no: raytracing is not an argument here, this feature is not supported anywhere, and by the time it is adopted (if ever) years will have gone by and these cards will be old and obsolete. all of this is just marketing and hot air.
  • mapesdhs - Thursday, September 27, 2018 - link

    There are those who claim buying RTX to get the new features is sensible for future proofing; I don't understand why they ignore that the performance of said features on these cards is so poor. NVIDIA spent years pushing gamers toward high frequency monitors, 4K and VR; now they're trying to flip everyone round the other way and pretend that sub-60Hz 1080p is ok.

    And btw, it's a lot more than 50%. Where I am (UK) the 2080 Ti is almost 100% more than the 1080 Ti launch price. It's also crazy that the 2080 Ti does not have more RAM, it should have had at least 16GB. My guess is NVIDIA knows that by pushing gamers back down to lower resolutions, they don't need so much VRAM. People though have gotten used to the newer display tech, and those who've adopted high refresh displays physically cannot go back.
  • milkod2001 - Monday, September 24, 2018 - link

    I wonder if NV hiked prices this much on purpose, so they don't have to lower existing GTX prices as much. There is still a ton of GTX stock out there.
  • poohbear - Friday, September 28, 2018 - link

    Why the tame conclusion? Do us a favor will u? Just come out and say you'd have to be mental to pay these prices for features that aren't even available yet and you'd be better off buying previous gen video cards that are currently heavily discounted.
  • Luke212 - Wednesday, October 24, 2018 - link

    Why does the 2080ti not use the tensor cores for GEMM? I have not seen any benchmark anywhere showing it working. It would be a good news story if Nvidia secretly gimped the tensor cores.
