When DisplayPort 1.4 Isn’t Enough: Chroma Subsampling

One of the key elements that even makes G-Sync HDR monitors possible – and yet still holds them back at the same time – is the amount of bandwidth available between a video card and a monitor. DisplayPort 1.3/1.4 increased this to just shy of 26Gbps of video data, which is a rather significant amount of data to send over a passive, two-meter cable. Still, the combination of high refresh rates, high bit depths, and HDR metadata pushes the bandwidth requirements much higher than DisplayPort 1.4 can handle.

All told, DisplayPort 1.4 was designed with just enough bandwidth to support 3840x2160 at 120Hz with 8bpc color, consuming 25.81Gbps of the 25.92Gbps of effective bandwidth available. Notably, this isn’t enough bandwidth for any higher refresh rates, particularly not 144Hz. Meanwhile, when using HDR paired with the P3 color space – where you’ll almost certainly want 10bpc color – there’s only enough bandwidth to drive the display at 98Hz, as the quick sketch below the table illustrates.

DisplayPort Bandwidth
Standard                      Raw        Effective
DisplayPort 1.1 (HBR1)        10.8Gbps   8.64Gbps
DisplayPort 1.2 (HBR2)        21.6Gbps   17.28Gbps
DisplayPort 1.3/1.4 (HBR3)    32.4Gbps   25.92Gbps
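
As a quick sanity check on those figures, the math is easy to model. The sketch below is only an approximation – the 80-pixel horizontal blank and ~460µs minimum vertical blank stand in for CVT-R2 reduced blanking timings, and the function name is mine – so expect results within a percent or so of the real numbers, with shipping monitors using slightly tighter timings.

```python
# Rough DisplayPort payload requirements for 3840x2160 RGB modes.
# Blanking is approximated after CVT-R2 reduced blanking: an 80-pixel
# horizontal blank plus a ~460us minimum vertical blanking interval.

def required_gbps(width, height, refresh_hz, bpc):
    bpp = 3 * bpc                       # one R, G, and B sample per pixel
    h_total = width + 80                # active width + horizontal blank
    v_blank_frac = 460e-6 * refresh_hz  # share of each frame spent in V-blank
    v_total = height / (1 - v_blank_frac)
    return h_total * v_total * bpp * refresh_hz / 1e9

DP14_PAYLOAD = 32.4 * 0.8  # 8b/10b line coding leaves 25.92Gbps for video

print(f"{required_gbps(3840, 2160, 120,  8):.2f}Gbps")  # ~25.81 -> just fits
print(f"{required_gbps(3840, 2160,  98, 10):.2f}Gbps")  # ~26.0  -> right at the cap
print(f"{required_gbps(3840, 2160, 144, 10):.2f}Gbps")  # ~39.2  -> far beyond DP 1.4
```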

As a result, for this first generation of monitors at least, NVIDIA has resorted to a couple of tricks to make a 144Hz 4K monitor work within the confines of current display technologies. Chief among these is support for chroma subsampling.

Chroma subsampling is a term that has become a little better known in the last few years, but odds are most PC users have never really noticed the technique. In a nutshell, chroma subsampling is a means of reducing the amount of chroma (color) data in an image, allowing images and video to either be stored in less space or transmitted over constrained links. I’ve seen it referred to as compression at some points, and while the concept is indeed similar, it’s important to note that chroma subsampling doesn’t try to recover lost color information, nor does it even intelligently discard color information, so it’s perhaps better thought of as a semi-graceful means of throwing out color data. In any case, the use of chroma subsampling is as old as color television; however, its use in anything approaching mainstream monitors is much newer.

So how does chroma subsampling work? To understand it, it’s important to first understand the Y'CbCr color space it operates on. As opposed to tried and true (and traditional) RGB – which stores the intensity of each color subpixel in a separate channel – Y'CbCr instead stores luma (light intensity) and chroma (color) separately. While the details of the transformation process aren’t important here, at the end of the day you have one channel of luma (Y') and two channels of chroma (Cb and Cr), which together add up to an image equivalent to RGB.
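
For the curious, here’s a minimal sketch of the BT.709 flavor of that transform, operating on gamma-encoded R'G'B' values in the 0–1 range. The coefficients are the standard BT.709 ones; the function itself is purely illustrative:

```python
# BT.709 R'G'B' -> Y'CbCr, with all inputs in the 0..1 range.
# Luma is a weighted sum of the gamma-encoded channels; Cb and Cr are
# scaled blue-difference and red-difference signals centered on zero.

def rgb_to_ycbcr(r, g, b):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b  # luma: dominated by green
    cb = (b - y) / 1.8556                     # blue-difference chroma
    cr = (r - y) / 1.5748                     # red-difference chroma
    return y, cb, cr

print(rgb_to_ycbcr(1.0, 1.0, 1.0))  # white: (1.0, 0.0, 0.0) -> all luma, no chroma
```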

Chroma subsampling, in turn, is essentially a hack on the human visual system. Humans are more sensitive to luma than chroma, so, the thinking goes, some chroma information can be discarded without significantly reducing the quality of an image.

The technique covers a range of different patterns, but by far the most common, in order of image quality, are 4:4:4, 4:2:2, and 4:2:0. 4:4:4 is a full chroma image, equivalent to RGB. 4:2:2 is a half chroma image that discards half of the horizontal color information, and requires just two-thirds as much data as 4:4:4/RGB. Finally, 4:2:0 is a quarter chroma image, which discards half of the horizontal and half of the vertical color information; in turn it achieves a full 50% reduction in the amount of data required versus 4:4:4/RGB.
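
Those percentages fall straight out of the J:a:b notation, which describes a two-row sampling block J pixels wide, with a chroma sample positions in the first row and b additional positions in the second. A quick sketch of the arithmetic (the helper function is mine, purely for illustration):

```python
# Average bits per pixel for a J:a:b chroma subsampling pattern.
# The pattern describes a 2-row block of J pixels: every pixel keeps
# its own luma sample, while Cb and Cr are each sampled at a positions
# in the first row and b positions in the second.

def bits_per_pixel(j, a, b, bpc=8):
    samples = j * 2 + 2 * (a + b)   # luma samples + (Cb + Cr) samples
    return samples * bpc / (j * 2)  # averaged over the 2*j pixels

for pattern in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    print(pattern, bits_per_pixel(*pattern))  # 24.0, 16.0, 12.0 -> 100%, 66%, 50%
```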


Wikipedia: diagram on chroma subsampling

In the PC space, chroma subsampling is primarily used for storage purposes. JPEG employs various modes to save on space, and virtually every video you’ve ever seen, from YouTube to Blu-rays, has been encoded with 4:2:0 chroma. In practice chroma subsampling is bad for text because of the fine detail involved – which is why PCs don’t use it for desktop work – but for images it works remarkably well.

Getting back to the matter of G-Sync then, the same principle applies to bandwidth savings over the DisplayPort connection. If DP 1.4 only delivers enough bandwidth to reach 98Hz with full RGB/4:4:4 color, then going down one level, to 4:2:2, can free up enough bandwidth to reach 144Hz.
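
Plugging the subsampled pixel sizes into the same blanking approximation as the earlier sketch makes the tradeoff concrete (again, real monitor timings are a touch tighter than this rough model):

```python
# 4K 10bpc: 4:4:4 carries 30 bits/pixel, 4:2:2 averages 20 (10 luma + 10 chroma).
def gbps(refresh_hz, bits_per_pixel):
    v_total = 2160 / (1 - 460e-6 * refresh_hz)  # ~460us minimum V-blank
    return (3840 + 80) * v_total * bits_per_pixel * refresh_hz / 1e9

print(f"{gbps( 98, 30):.1f}Gbps")  # ~26.0 -> the 4:4:4 ceiling on DP 1.4
print(f"{gbps(144, 30):.1f}Gbps")  # ~39.2 -> hopeless at 4:4:4
print(f"{gbps(144, 20):.1f}Gbps")  # ~26.1 -> within reach at 4:2:2
```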

Users, in turn, are given a choice between the two options. When using HDR, they can either stick with a 98Hz refresh rate and full 4:4:4 chroma, or drop to 4:2:2 for 144Hz.

In practice, for desktop usage most users are going to be running without HDR due to Windows’ shaky color management, so the issue is moot and they can run at 120Hz without any colorspace compromises. It’s in games and media playback where HDR will be used, and at that point the quality tradeoffs of 4:2:2 subsampling will be less obvious, or so NVIDIA’s reasoning goes. Adding an extra wrinkle, even on an RTX 2080 Ti few high-fidelity HDR-enabled games will be able to exceed 98fps to begin with, so the higher refresh rate isn’t likely to be needed right now. Still, if you want HDR and access to 120Hz+ refresh rates – or SDR and 144Hz for that matter – then there are tradeoffs to be made.

On that note, it’s worth pointing out that to actually go past 120Hz, the current crop of G-Sync HDR monitors require overclocking. This appears to be a limitation of the panel itself; with 4:2:2 subsampling there’s enough bandwidth for 144Hz even with HDR, so it’s not another bandwidth limitation that’s stopping these monitors at 120Hz. Rather the purpose of overclocking is to push the panel above its specifications (something it seems plenty capable of doing), allowing the panel to catch up with the DisplayPort connection to drive the entire device at 144Hz.

Meanwhile, on a quick tangent, I know a few people have asked why NVIDIA hasn’t used the VESA’s actual compression technology, Display Stream Compression (DSC). NVIDIA hasn’t officially commented on the matter, and I don’t really expect they will.

However, from talking to other sources, DSC had something of a rough birth. The version of the DSC specification used in DP 1.4 lacked support for some features manufacturers wanted, such as 4:2:0 chroma subsampling, while DP 1.4 itself lacked a clear definition of how Forward Error Correction would work with DSC. As a result, manufacturers have been holding off on supporting DSC. To that end, the VESA quietly released the DisplayPort 1.4a specification back in April to resolve the issue, with the latest standard essentially serving as the “production version” of DisplayPort with DSC. Consequently, DSC implementation and adoption is only now taking off.

As NVIDIA controls the entire G-Sync HDR ecosystem, they aren’t necessarily reliant on common standards. Nonetheless, if DSC wasn’t in good shape to use in 2016/2017 when G-Sync HDR was being developed, then that’s as good a reason as any I’ve heard for why we’re not seeing G-Sync HDR using DSC.

Comments

  • Flunk - Tuesday, October 2, 2018 - link

    I'd really like one of these, but I can't really justify $2000 because I know that in 6 months to a year competition will arrive that severely undercuts this price.
  • imaheadcase - Tuesday, October 2, 2018 - link

    That's just technology in general. But keep an eye out: around that time, this monitor is coming out with a revision that will remove the "gaming" features but still maintain the refresh rate and size.
  • edzieba - Tuesday, October 2, 2018 - link

    The big omission to watch out for is the FALD backlight. Without that, HDR cannot be achieved outside of an OLED panel (and even then, OLED cannot yet meet the peak luminance levels). You're going to see a lot of monitors that are effectively SDR panels with the brightness turned up, and sold as 'HDR'. If you're old enough to remember when HDTV was rolling out, remember the wave of budget 'HD' TVs that used SD panels but accepted and downsampled HD inputs? Same situation here.
  • Hixbot - Tuesday, October 2, 2018 - link

    Pretty sure edge-lit displays can hit the wider gamut by using a quantum dot filter.
  • DanNeely - Tuesday, October 2, 2018 - link

    Quantum dots increase the color gamut; HDR is about increasing the luminance range on screen at any time. Edge-lit displays only have a handful of dimming zones at most (there's no way to get more when your control consists of only 1 configurable value per row/column). You need backlighting where each small chunk of the screen can be controlled independently to get anything approaching a decent result. Per-pixel is best, but only doable with OLED or jumbotron-size displays (microLED – we can barely make normal LEDs small enough for this scale). OTOH, if costs can be brought down, microLED does have the potential to power a FALD backlight with an order of magnitude or more dimming zones than current LCDs can do; enough to largely make halo effects around bright objects a negligible issue.
  • Lolimaster - Tuesday, October 2, 2018 - link

    There is also miniLED, which will replace regular LEDs for the backlight.

    MicroLED = OLED competition.
    MiniLED = up to 50,000 zones (cheap "premium" phones will come with 48 zones).
  • crimsonson - Tuesday, October 2, 2018 - link

    I think you are exaggerating a bit. HDR is just a transform function. There are several standards that say what the peak luminance should be to be considered HDR10 or Dolby Vision, etc. But that itself is misleading.

    Define " (and even then OLED cannot yet meet the peak luminance levels)"
    Because OLED can def reach 600+ nits, which is one of the standards for HDR being proposed.
  • edzieba - Tuesday, October 2, 2018 - link

    "HDR is just a transform function"

    Just A transform function? [Laughs in Hybrid Log Gamma]

    Joking aside, HDR is also a set of minimum requirements. Claiming panels that do not even come close to meeting those requirements are also HDR is akin to claiming that 720x468 is HD, because "it's just a resolution". The requirements range far beyond just peak luminance levels, which is why merely slapping a big-ass backlight to a panel and claiming it is 'HDR' is nonsense.
  • crimsonson - Wednesday, October 3, 2018 - link

    "
    Just A transform function? [Laughs in Hybrid Log Gamma],"

    And HLG is again just a standard of how to handle HDR and SDR. It is not required or needed to display HDR images.

    "HDR is also a set of minimum requirements"

    No, there are STANDARDS that attempt to address HDR features across products and in video production. But that in itself does not mean violating those standards equates to a non-HDR image. Dolby Vision, for example, supports dynamic metadata. HDR10 does not. Does that make HDR10 NOT HDR?
    Eventually, the market and the industry will congregate behind 1 or 2 SETS of standards (since it is not only about 1 number or feature). But we are not there yet. Far from it.

    Since you like referencing these standards, you do know that VESA has HDR standards as low as 400 and 600 nits, right?

    And I think you are conflating wide gamut with dynamic range. FALD is not needed to achieve wide gamut.

    And using HD to illustrate your points shows you don't understand how standards work in broadcast and manufacturing.
  • edzieba - Thursday, October 4, 2018 - link

    "And HLG is again just a standard of how to handle HDR and SDR. It is not required or needed to display HDR images."

    The joke was that there are already at least 3 standards of HDR transfer functions, and some (e.g. Dolby Vision) allow for on-the-fly modification of the transfer function.

    "And I think you are conflating wide gamut vs Dynamic Range. FALD is not needed to achieve wide gamut."

    Nobody mentioned gamut. High Dynamic Range requires, as the name implies, a high dynamic range. LCD panels cannot achieve that high dynamic range on their own; they need a segmented backlight modulator to do so.
    As much as marketers would want you to believe otherwise, a straight LCD panel with an edge-lit backlight is not going to provide HDR.

    "And using HD to illustrate your points exemplifies you don't understand how standards work in broadcast and manufacturing."

    Remember how "HD ready" was brought in to address exactly the same problem of devices marketing capabilities they did not have? And how it brought complaints about allowing 720p devices to also advertise themselves as "HD Ready"? Is this not analogous to the current situation, where HDR is being erroneously applied to panels that cannot achieve it, and where VESA's DisplayHDR has drawn complaints that anything below DisplayHDR 1000 is basically worthless?
