Back when the VESA announced the DisplayPort alt mode for USB Type-C, one of the most common questions was whether we’d see USB-C ports on a video card. As a standards body, the VESA couldn’t answer this question; that would ultimately be up to the manufacturers. Now at Computex, MSI is showing off what looks to be the first high-end video card with a USB-C port for display purposes: the MSI GTX 1080 Ti GAMING X 11G Graphics Card with USB Type C.

The USB-C equipped card is virtually identical to its regular GTX 1080 Ti GAMING X 11G counterpart, except that it drops the final DisplayPort for the USB-C port, leaving it with 2 DisplayPorts, a DVI-D port, an HDMI port, and the USB-C port. From a feature perspective, thanks to DisplayPort alt mode, a USB-C port is just as good as a DisplayPort for DP signaling purposes (though it should be noted that you lose DP++ dual-mode TMDS output, so passive DVI/HDMI adapters won’t work), which means the switch to a USB-C port doesn’t cost anything, but it also doesn’t necessarily gain anything.

What MSI isn’t commenting on right now is whether this USB-C port will offer anything besides DisplayPort alt mode functionality, such as USB 2.0 data or another alt mode. The USB-C alt mode standard requires that the 4 USB 2.0 pins remain untouched, but strictly speaking they don’t appear to be necessary for alt mode to work, since mode negotiation is handled by the CC pins. However, DisplayPort monitors that use USB-C for all of their connectivity, such as the LG UltraFine 21.5, will not appreciate the lack of USB data, so USB 2.0 support is required for maximum compatibility.
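For the curious, that negotiation happens entirely over the CC wire as USB Power Delivery structured VDMs. The toy Python sketch below walks through the handshake; the Port class and its send_vdm() interface are invented stand-ins for illustration, though the command sequence and the DisplayPort SVID (0xFF01) follow the VESA DP Alt Mode standard.

```python
DP_SVID = 0xFF01  # VESA's Standard ID for DisplayPort alt mode

class Port:
    """Fake USB-C sink that answers like a DP-alt-mode-capable monitor."""
    def send_vdm(self, command, svid=None, **kwargs):
        print(f"VDM -> {command}" + (f" (SVID {svid:#06x})" if svid else ""))
        if command == "Discover SVIDs":
            return [DP_SVID]                  # monitor advertises DP alt mode
        if command == "Discover Modes":
            return {"pin_assignments": "CD"}  # 4-lane (C) and 2-lane (D) capable
        return None

def enter_dp_alt_mode(port):
    # The whole negotiation is USB Power Delivery traffic on the CC pin;
    # the four USB 2.0 pins never participate, which is why DP alt mode
    # could in principle work on a port with no USB data wired up at all.
    port.send_vdm("Discover Identity")
    if DP_SVID not in port.send_vdm("Discover SVIDs"):
        return False  # sink doesn't do DP alt mode; stay in plain USB
    modes = port.send_vdm("Discover Modes", svid=DP_SVID)
    # Pin assignment C turns all four high-speed pairs into DP lanes;
    # D splits them: two pairs for DisplayPort, two left for USB 3.x.
    pins = "C" if "C" in modes["pin_assignments"] else "D"
    port.send_vdm("Enter Mode", svid=DP_SVID)
    port.send_vdm("DP Status Update", svid=DP_SVID)
    port.send_vdm("DP Configure", svid=DP_SVID, pins=pins)
    return True

enter_dp_alt_mode(Port())
```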

Ultimately, placing a USB-C port on a high-end video card serves a few different purposes for MSI. In the present, it makes the card fully compatible with USB-C monitors like the aforementioned UltraFine, along with any other display setups where you may want to quickly swap between said video card and a USB-C equipped laptop.

In the longer term, however, I suspect this may be the first move in a larger (and very long) transition to USB-C for all display connectivity. Part of the purpose for the USB-C standard – and why groups like the VESA embrace it – is that everyone wants to solve the increasingly difficult external bandwidth problem once, and then share the technology (and a common port) rather than each group implementing their own solution. The DisplayPort and its associated cabling are coming up on a decade old and have been through 3 revisions, with the latest standard supporting 32.4Gbps of raw cable bandwidth (~26Gbps after encoding overhead). Meanwhile the USB-C port and cabling system is intended to support 40Gbps (or more) of cable bandwidth. So while nothing has been officially announced at this time, USB-C on video cards may not just be a DisplayPort alternative, but may end up being the future of display connectivity itself.
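To put those cable-bandwidth figures in perspective, here is a back-of-the-envelope sketch. The link rates follow the DP 1.4 spec; the per-display math counts pixel data only and ignores blanking overhead, so real requirements run somewhat higher.

```python
def dp_effective_gbps(lanes, gbps_per_lane):
    # DP 1.3/1.4 links use 8b/10b coding: 10 bits on the wire per 8 data bits
    return lanes * gbps_per_lane * 8 / 10

def display_gbps(w, h, hz, bpp=24):  # 24 bits/pixel = 8-bit RGB
    return w * h * hz * bpp / 1e9

hbr3 = dp_effective_gbps(4, 8.1)  # DP 1.4 "HBR3": 4 lanes at 8.1 Gbps each
print(f"HBR3 payload: {hbr3:.1f} Gbps (of 32.4 Gbps raw)")

for name, mode in [("4K60", (3840, 2160, 60)),
                   ("5K60", (5120, 2880, 60)),
                   ("8K60", (7680, 4320, 60))]:
    need = display_gbps(*mode)
    verdict = "fits" if need < hbr3 else "exceeds HBR3"
    print(f"{name}: ~{need:.1f} Gbps of pixel data -> {verdict}")
```

As the output shows, even 5K60 squeezes under the current ceiling while 8K60 blows past it, which is exactly the bandwidth wall the shared USB-C cabling effort is meant to solve once for everyone.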

Steven Lynch contributed to this report

  • SodaAnt - Monday, June 05, 2017 - link

    The newer adapters are much better. I had one of the first gen ones which required USB for power and I had all sorts of issues, but I got another one recently which didn't require external power and was only about $30, and it works great.
  • Mr Perfect - Monday, June 05, 2017 - link

    That's true.

    It makes less sense on a $700 video card though. If someone with a $250 monitor can't afford an $80 adapter, how are they splashing out for a $700 1080 Ti?
  • DanNeely - Monday, June 05, 2017 - link

    A lot of people apparently do, though. According to the Steam HW survey, ~4.5% of gamers have a 980/1070 or better, while only 3% have a 2560x1440 or higher display. Less than 1% have a 4K monitor; if you add in people with 1440p ultrawides and assume almost as many have 2560x1600 screens (this size is bundled in with "other"), you're still only at about 1.3% of Steam gamers having what's likely a $700+ monitor, versus 2% having a 1080/980 Ti who spent at least that much on their GPU. That means a large fraction of high-end GPU gamers only have cheap monitors for whatever reason: at least 1/3rd of the total (even if every expensive monitor belongs to one of them, 0.7 of those 2 points are left over), and probably significantly more, since I took the largest possible guesstimate for 1600p users and at least some of the 1080/980 Ti gamers are people who bought them at release prices and just haven't upgraded yet.
  • npz - Monday, June 05, 2017 - link

    > If you have a DL-DVI monitor, you either have something new enough to have DP or HDMI2, or can afford an active DL-DVI adapter

    That is absolutely not true. Many older monitors greater than 1080p do not have other ports. This is especially true of workstation class monitors but also true of some cheaper monitors. I have this Samsung 2343BWX
    https://www.newegg.com/Product/Product.aspx?Item=N...
    - 2048 x 1152 and only DVI and VGA ports
  • DanNeely - Monday, June 05, 2017 - link

    AFAIK 2048x1152 should still work with a passive HDMI-to-SL-DVI adapter; it's the actual max that SL-DVI can support, vs 1920x1200 being the highest that is reasonably available and thus what often shows up on spec sheets (ballpark pixel-clock math in the sketch after the thread).
  • rtho782 - Monday, June 05, 2017 - link

    I have a 30" 2560x1600 Dell 3007WFP-HC as my 2nd monitor (my first is a ROG Swift).

    I wish the founders edition 1080ti (which I bought) had a DVI port. I started off with a random GT 620 in my other PCIe slot to drive the Dell, but that caused issues, so I bought a DL-DVI adaptor; that one occasionally doesn't resume from sleep properly, though, and I have to disconnect/reconnect it.

    DL-DVI is still useful.
  • eek2121 - Sunday, June 04, 2017 - link

    This is a top of the line GPU. Chances are that the target market for this product has decent monitors that either have Displayport or HDMI. Hell, you wouldn't buy a 1080ti for 1080p gaming...would you? (I own a 1080ti and I use it for 1440p gaming...even that is a bit of a waste.)
  • chrnochime - Sunday, June 04, 2017 - link

    I'd buy it for VR (is it even powerful enough for silky smooth gaming? I have no idea), and nothing else. I couldn't care less if my 24" screen is only 1080p instead of 4k, since I'll be 2' away from it, and I want my screen to not have freakishly tiny fonts.
  • Lord of the Bored - Thursday, June 08, 2017 - link

    Problem is... you can still buy a monitor today with only a VGA port, and the card's DVI is a DVI-D connector, so it can't even be passively adapted to analog the way a DVI-I could.
    Support all the monitors, indeed.

    (Lest this be confused for an actual complaint: we're well past the point where VGA SHOULD be a relevant standard, and I'm all for DVI-D disappearing as well. And I abhor HDMI, which is close to the worst possible video interconnect standard. I would be delighted if this thing was DP/USB-C only, or even straight USB-C. But if they're gonna include a DVI connector, it may as well be DVI-I.)
  • AllIDoIsWin - Monday, June 05, 2017 - link

    Noo.. that doesn't make sense. Nearly everyone I know is still using DVI. HDMI is for the birds. And nearly all monitors support DVI before HDMI, no?
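
Since a couple of the comments above hinge on single-link DVI's limits, here is a rough sanity check. The 165 MHz ceiling is from the DVI 1.0 spec; the blanking figures are loose CVT reduced-blanking approximations, so treat the resulting clocks as ballpark numbers only.

```python
SL_DVI_MAX_MHZ = 165  # a single TMDS link tops out at a 165 MHz pixel clock

def pixel_clock_mhz(w, h, hz, h_blank=160, v_blank=31):
    # CVT reduced blanking pads roughly 160 pixels per line, ~30 lines per frame
    return (w + h_blank) * (h + v_blank) * hz / 1e6

for name, (w, h) in [("1920x1200@60", (1920, 1200)),
                     ("2048x1152@60", (2048, 1152)),
                     ("2560x1600@60", (2560, 1600))]:
    clk = pixel_clock_mhz(w, h, 60)
    ok = "single-link OK" if clk <= SL_DVI_MAX_MHZ else "needs dual-link"
    print(f"{name}: ~{clk:.0f} MHz -> {ok}")
```

Both 1920x1200 and 2048x1152 land under 165 MHz, supporting DanNeely's point, while 2560x1600 comes out well above it, which is why rtho782's 30" Dell needs dual-link DVI or an active adapter.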
