From G-Sync Variable Refresh To G-Sync HDR Gaming Experience

The original FreeSync and G-Sync were solutions to a specific and longstanding problem: fluctuating framerates would cause either screen tearing or, with V-Sync enabled, stutter and input lag. The result of VRR has been a considerably smoother experience in the 30 to 60 fps range. An equally important benefit was compensating for dips and peaks across the wider ranges introduced by higher refresh rates like 144Hz. So both technologies were very much tied to a single specification that directly described the experience, even if the numbers sometimes didn’t do the experience justice.

Meanwhile, HDR for gaming is a whole suite of capabilities that essentially allows for greater brightness, deeper blacks, and better/more colors. More importantly, it requires developer support in applications and the production of HDR content. The end result is not nearly as uniform as VRR, as much depends on the game’s implementation – or in NVIDIA’s case, sometimes on Windows 10’s implementation. Done properly, even simply higher brightness can bring perceived gains in colorfulness and contrast: the Hunt effect and the Stevens effect, respectively.

So we can see why both AMD and NVIDIA are pushing the idea of a ‘better gaming experience’, though NVIDIA is explicit about this with G-Sync HDR. The downside is that the required specifications for both FreeSync 2 and G-Sync HDR certification are closed off and only discussed broadly, deferring to VESA’s DisplayHDR standards. The two companies’ situations, however, are very different. AMD’s explanations are a little more open, and outside of HDR requirements, FreeSync 2 also has a lot to do with standardizing SDR VRR quality through mandated LFC, a wider VRR range, and lower input lag. AMD has also stated that FreeSync 2’s color gamut, max brightness, and contrast ratio requirements are broadly comparable to those of DisplayHDR 600, though the HDR requirements do not overlap completely. And with FreeSync/FreeSync 2 support on Xbox One models and upcoming TVs, FreeSync 2 appears to be the more straightforward specification.

For NVIDIA, the push is much more general and holistic than any specific feature standard, and it is focused purely on the products themselves. At the same time, the company discussed the need for consumer education on the spectrum of HDR performance. While there are specific G-Sync HDR standards as part of the G-Sync certification process, those specifications are known only to NVIDIA and the monitor manufacturers. Nor was much detail provided on minimum requirements beyond HDR10 support, 1000 nits peak brightness, and unspecified DCI-P3 coverage for the 4K G-Sync HDR models; NVIDIA cited its certification process and deferred detailed capabilities to whatever other certifications a G-Sync HDR monitor may carry, in this case the UHD Alliance Premium and DisplayHDR 1000 certifications of the Asus PG27UQ. Which is to say that, at least for the moment, the only G-Sync HDR displays are those that adhere to some very stringent standards; there aren’t any monitors under this moniker that offer limited color gamuts or subpar dynamic contrast ratios.
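
To put those HDR10 and 1000 nit figures in more concrete terms, below is a minimal sketch of the SMPTE ST 2084 (PQ) EOTF that HDR10 signaling is built on, decoding a 10-bit code value into absolute luminance. The constants come from the published ST 2084 definition; the example code values are purely illustrative and show roughly where a 1000-nit peak like the PG27UQ’s lands on the 10-bit signal range.

    # Minimal sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10 signals,
    # mapping a full-range 10-bit code value to absolute luminance in nits.
    # Constants come from the ST 2084 definition; example values are rounded.
    M1 = 2610 / 16384          # 0.1593017578125
    M2 = 2523 / 4096 * 128     # 78.84375
    C1 = 3424 / 4096           # 0.8359375
    C2 = 2413 / 4096 * 32      # 18.8515625
    C3 = 2392 / 4096 * 32      # 18.6875

    def pq_to_nits(code, bits=10):
        """Decode a full-range PQ code value to luminance in cd/m^2 (nits)."""
        e = code / (2 ** bits - 1)          # normalize to 0..1
        ep = e ** (1 / M2)
        y = max(ep - C1, 0.0) / (C2 - C3 * ep)
        return 10000.0 * y ** (1 / M1)

    # code  520 -> ~100 nits (typical SDR reference white)
    # code  769 -> ~1000 nits (roughly the PG27UQ's certified peak)
    # code 1023 -> 10000 nits (the PQ ceiling, beyond any current display)
    for code in (520, 769, 1023):
        print(code, round(pq_to_nits(code)))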

At least with UHD Premium, the certification is specific to 4K resolution, so while the announced 65” 4K 120Hz Big Format Gaming Displays almost surely will be certified, the 35” curved 3440 × 1440 200Hz models won’t be. Practically speaking, all the capabilities of these monitors are tied to the AU Optronics panels inside them, and we know that NVIDIA worked closely with AUO as well as the monitor manufacturers. As far as we know, those AUO panels are only coupled with G-Sync HDR displays, and vice versa. No other standardized specification was disclosed; NVIDIA only referred back to its own certification process and the ‘ultimate gaming display’ ideal.

As much as NVIDIA mentioned consumer education on the HDR performance spectrum, the consumer is hardly any more educated on a monitor’s HDR capabilities by the G-Sync HDR branding. Detailed specifications are left to monitor certifications and manufacturers, which is the status quo. Without a specific G-Sync HDR page, NVIDIA lists G-Sync HDR features under the G-Sync page, and while those features are labeled as G-Sync HDR, there is no explanation of the full differences between a G-Sync HDR monitor and a standard G-Sync monitor. The NVIDIA G-Sync HDR whitepaper is primarily background on HDR concepts plus a handful of generalized G-Sync HDR details.

For all intents and purposes, G-Sync HDR is presented not as a specification or technology but as branding for a premium product family, and right now it is more useful for consumers to think of it that way.

Comments

  • crimsonson - Tuesday, October 2, 2018 - link

    Someone can correct me, but AFAIK there is no native 10-bit RGB support in games. A 10-bit panel would at least improve its HDR capabilities.
  • FreckledTrout - Tuesday, October 2, 2018 - link

    The games that say they are HDR should be using 10-bit color as well.
  • a5cent - Wednesday, October 3, 2018 - link

    Any game that supports HDR uses 10 bpp natively. In fact, many games use 10 bpp internally even if they don't support HDR officially.

    That's why an HDR monitor must support the HDR10 video signal (that's the only way to get the 10 bpp frame from the GPU to the monitor).

    OTOH, a 10-bit panel for gaming typically won't provide a perceptible improvement. In practice, 8-bit+FRC is just as good. IMHO it's only for editing HDR still imagery where real 10-bit panels provide benefits.
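
For what it's worth, here is a minimal sketch of what 8-bit + FRC (temporal dithering) does. This is a toy illustration of the idea rather than any panel vendor's actual algorithm: a 10-bit level is approximated by cycling slightly different 8-bit values across successive frames, which the eye averages over time.

    # Toy temporal-dithering (FRC) sketch: approximate a 10-bit level (0-1023)
    # with a repeating cycle of four 8-bit frames whose average carries the
    # extra two bits of precision.
    def frc_frames(level10, cycle=4):
        base, frac = divmod(level10, 4)     # 8-bit base value + 2-bit remainder
        return [min(base + (1 if i < frac else 0), 255) for i in range(cycle)]

    # e.g. 10-bit level 513 -> frames [129, 128, 128, 128]; their average,
    # 128.25, carries the two extra bits a single 8-bit frame cannot.
    print(frc_frames(513))
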
  • GreenReaper - Thursday, October 4, 2018 - link

    I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this, where the bandwidth is insufficient to have full resolution *and* colour depth *and* refresh rate at once?

    You run the risk of banding or flicker, but frankly that's similar for display FRC, and I imagine if the screen was aware of what was happening it might be able to smooth it out. It'd essentially improve the refresh rate at the expense of some precision, a trade-off some gamers might well be willing to take. Of course that's all moot if the card can't even play the game at the target refresh rate.
  • GreenReaper - Thursday, October 4, 2018 - link

    By client, of course, I mean card - it would send an 8-bit signal within the HDR colour gamut and the result would be a frequency-interpolated output hopefully similar to that possible now - but by restricting at the graphics card end you use less bandwidth, and hopefully it doesn't take too much power.
  • a5cent - Thursday, October 4, 2018 - link

    "I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this"

    It's an interesting idea, but I don't think it can work.

    The core problem is that the monitor then has no way of knowing whether, in such an FRC'ed image, a bright pixel next to a darker pixel correctly describes the desired content or is just an FRC artifact.

    Two neighboring pixels of varying luminance affect everything from how to control the individual LEDs in a FALD backlight, to when and how strongly to overdrive pixels to reduce motion blur. You can't do these things in the same way (or at all) if the luminance delta is merely an FRC artifact.

    As a result, the GPU would have to control everything that is currently handled by the monitor's controller + firmware, because only it has access to the original 10 bpp image. That would be counterproductive, because then you'd also have to transport all the signaling information (for the monitor's backlighting and pixels) from the GPU to the monitor, which would require far more bandwidth than the 2 bpp you set out to save 😕

    What you're thinking about is essentially a compression scheme to save bandwidth. Even if it did work, employing FRC in this way is lossy and nets you, at best, a 20% bandwidth reduction.

    However, the DP1.4(a) standard already defines a compression scheme. DSC is lossless and nets you about 30%. That would be the way to do what you're thinking of.

    4K DP1.4 gaming monitors in particular are in dire need of this. That nVidia and Acer/Asus would implement chroma subsampling 4:2:2 (which is also a lossy compression scheme) rather than DSC is shameful. 😳

    I wonder if nVidia's newest $500+ g-sync module is even capable of DSC. I suspect it is not.
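
The bandwidth arithmetic behind this exchange is easy to sketch. The rough numbers below ignore blanking intervals and any overhead beyond DisplayPort's 8b/10b line coding, so the real limits are slightly tighter, but they show why 4K 144Hz with 10-bit RGB overshoots DP 1.4 while dropping to 4:2:2 chroma subsampling (or to 98Hz) fits again:

    # Rough link-budget sketch for DisplayPort 1.4 (HBR3): 4 lanes x 8.1 Gbps,
    # with 8b/10b coding leaving ~25.92 Gbps for video. Blanking ignored.
    DP14_GBPS = 4 * 8.1 * 8 / 10

    def stream_gbps(width, height, hz, bits_per_pixel):
        return width * height * hz * bits_per_pixel / 1e9

    cases = [
        ("10-bit RGB 4:4:4",   30, 144),   # what you'd ideally want
        ("8-bit RGB 4:4:4",    24, 144),
        ("10-bit YCbCr 4:2:2", 20, 144),   # what the PG27UQ falls back to
        ("10-bit RGB 4:4:4",   30,  98),
    ]
    for label, bpp, hz in cases:
        need = stream_gbps(3840, 2160, hz, bpp)
        verdict = "fits" if need <= DP14_GBPS else "exceeds"
        print(f"{label:18s} @ {hz:3d} Hz: {need:5.2f} Gbps ({verdict} {DP14_GBPS:.2f} Gbps)")
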
  • Zoolook - Friday, October 5, 2018 - link

    DSC is not lossless, it's "visually lossless", which means that most of the time you shouldn't perceive a difference compared to an uncompressed stream.
    I'll reserve my judgement until I see some implementations.
  • Impulses - Tuesday, October 2, 2018 - link

    That Asus PA32UC wouldn't get you G-Sync or refresh rates over 60Hz and it's still $975 tho... It sucks that the display market is so fractured and people who use their PCs for gaming as well as content creation can't get anything approaching perfect or even ideal at times.

    There's a few 4K 32" displays with G-Sync or Freesync but they don't go past 60-95Hz AFAIK and then you don't get HDR, it's all a compromise, and has been for years due to competing adaptive sync standards, lagging connection standards, a lagging GPU market, etc etc.
  • TristanSDX - Tuesday, October 2, 2018 - link

    Soon there will be a new PG27UC, with a mini-LED backlight (10,000 diodes vs 384) and with DSC.
  • DanNeely - Tuesday, October 2, 2018 - link

    Eventually, but not soon. AUO is the only panel company working on 4K/high refresh/HDR, and they don't have anything with more dimming zones on their public road map (which nominally covers about a year out for their production; add a few months for monitor makers to package the panels and get them to retail once volume production starts).
