SDR Color Modes: sRGB and Wide Gamut

Pre-calibration and calibration of the monitor are done with SpectraCal's CalMAN 5 suite. For contrast and brightness we use the X-Rite i1Display Pro colorimeter, while for the actual color accuracy readings we use the X-Rite i1Pro spectrophotometer. Pre-calibration measurements were taken at 200 nits for both sRGB and Wide Gamut modes, with gamma set to 2.2.

The PG27UQ comes with two color modes for SDR input: 'sRGB' and 'Wide Gamut.' Though advertised as DCI-P3 coverage, the actual 'Wide Gamut' mode sits somewhere between DCI-P3 and BT.2020, which is right in line with the minimum coverage required by DisplayHDR 1000 and UHD Premium. That being the case, the setting isn't calibrated directly to a specific color gamut, unlike sRGB.
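To put the relative sizes of these gamuts in perspective, here is a quick sketch comparing the triangle areas spanned by each standard's published primaries in the CIE 1931 xy plane. This is only an approximation, since coverage figures are normally quoted in the more perceptually uniform CIE 1976 u'v' space:

    # Rough comparison of gamut sizes in the CIE 1931 xy plane, using the
    # shoelace formula for triangle area. Primaries are the published
    # chromaticity coordinates for each standard.

    def tri_area(pts):
        (x1, y1), (x2, y2), (x3, y3) = pts
        return abs(x1*(y2 - y3) + x2*(y3 - y1) + x3*(y1 - y2)) / 2

    gamuts = {
        "sRGB":    [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
        "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
        "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
    }

    areas = {name: tri_area(p) for name, p in gamuts.items()}
    for name, a in areas.items():
        print(f"{name:8s} area = {a:.4f}  ({a / areas['sRGB']:.2f}x sRGB)")
    # DCI-P3 works out to roughly 1.36x sRGB, BT.2020 to roughly 1.89x.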

Out of the box, the monitor defaults to 8 bits per color (8bpc), which can be changed in the NVIDIA Control Panel. Either way, sRGB accuracy is very good, as the monitor comes factory-calibrated. Note that 10bpc on the PG27UQ is achieved with dithering (8bpc+FRC).
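As a minimal sketch of how 8bpc+FRC approximates 10bpc, consider temporal dithering over a short frame sequence. Real panel FRC uses more sophisticated spatio-temporal patterns, but the averaging principle is the same:

    # Minimal sketch of temporal dithering (FRC): an 8-bit panel approximates
    # a 10-bit value by alternating between two adjacent 8-bit levels so that
    # the average over a short frame sequence matches the 10-bit target.

    def frc_frames(value_10bit, n_frames=4):
        """Return the 8-bit level shown on each of n_frames frames."""
        base, remainder = divmod(value_10bit, 4)   # 10-bit = 8-bit * 4 + 0..3
        # Show base+1 on `remainder` of the frames, base on the rest
        # (clamped at the top of the 8-bit range).
        return [min(base + 1, 255) if f < remainder else base
                for f in range(n_frames)]

    target = 513                          # 10-bit code between 8-bit levels 128 and 129
    frames = frc_frames(target)
    print(frames)                         # [129, 128, 128, 128]
    print(sum(frames) / len(frames) * 4)  # averages back to 513.0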


SpectraCal CalMAN sRGB color space for PG27UQ, with out-of-the-box default 8bpc (top) and default with 10bpc (bottom)

In either 8bpc or 10bpc, the average delta E is around 1.5, which corresponds well with the included factory calibration report of 1.62. For reference, a dE below 1.0 is generally imperceptible, while a dE below 3.0 is considered accurate.
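For illustration, the simplest form of the metric (CIE76) is just the Euclidean distance between measured and reference colors in CIELAB; CalMAN typically reports the more elaborate dE2000 variant, but the interpretation scale is similar. The patch values below are hypothetical:

    import math

    # Simplest form of delta E (CIE76): Euclidean distance between two
    # colors in CIELAB space. Below ~1.0 is imperceptible; below ~3.0 is
    # generally considered accurate.

    def delta_e_76(lab1, lab2):
        return math.dist(lab1, lab2)   # sqrt(dL^2 + da^2 + db^2)

    measured  = (54.2, -0.8, 1.1)      # hypothetical measured patch (L*, a*, b*)
    reference = (53.4,  0.0, 0.0)      # the target value for that patch
    print(f"dE76 = {delta_e_76(measured, reference):.2f}")   # ~1.58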

SpectraCal CalMAN DCI-P3 (above) and BT.2020 (below) color spaces for PG27UQ, on default settings with 10bpc and 'wide color gamut' enabled under SDR Input

The 'Wide Gamut' option is not mapped to either DCI-P3 or BT.2020, sitting somewhere in between; then again, it doesn't need to be tied to a specific gamut the way a professional or prosumer monitor would be.

Grayscale and Saturation

Looking at color accuracy more thoroughly, we turn to grayscale and saturation readings with respect to the sRGB gamut. Gamma tracking isn't perfect, with some dips, and the white points are a little on the warm side.
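To show what a gamma dip means in practice, here is a quick sketch of target versus off-target luminance at a few gray levels; the gamma 2.1 figure is hypothetical, just to illustrate the direction of the error (midtones come out brighter than the 2.2 target):

    # A gamma-2.2 display should map a normalized input level V to a
    # luminance of V**2.2 times peak luminance. A "dip" in measured gamma
    # means the effective exponent at that stimulus is lower, i.e. midtones
    # come out slightly brighter than the target curve.

    peak_nits = 200                    # the pre-calibration brightness target

    for level in (64, 128, 192):       # 8-bit gray stimulus values
        v = level / 255
        target = peak_nits * v ** 2.2
        dipped = peak_nits * v ** 2.1  # hypothetical dipped gamma
        print(f"{level:3d}/255: target {target:6.2f} nits, at gamma 2.1 -> {dipped:6.2f} nits")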

SpectraCal CalMAN sRGB color space grayscales with out-of-the-box default 8bpc (top) and default with 10bpc (bottom)

The saturation numbers are better, with dE around 1.4 to 1.5, which is impressive for a gaming monitor.

SpectraCal CalMAN sRGB color space saturation sweeps for PG27UQ, with out-of-the-box default 8bpc (top) and default with 10bpc (bottom)

 

Gretag Macbeth (GMB) and Color Comparator

The last color accuracy test is the most thorough, and again the PG27UQ shines with dE values of 1.53 and 1.63.

SpectraCal CalMAN sRGB color space GMB for PG27UQ, with out-of-the-box default 8bpc (top) and default with 10bpc (bottom)

Considering that this monitor was not designed for professional use, it's very well calibrated out of the box for gamers, and there's no pressing need for further calibration. If anything, users should be sure to select 10bpc in the NVIDIA Control Panel, though even then most games use 8bpc anyhow.

SpectraCal CalMAN sRGB relative color comparator graphs for PG27UQ, with out-of-the-box default 8bpc (top) and default with 10bpc (bottom). Each color column is split into halves; the top half is the PG27UQ's reproduction and the bottom half is the correct value

Comments

  • crimsonson - Tuesday, October 2, 2018 - link

    Someone can correct me, but AFAIK there is no native 10-bit RGB support in games. A 10-bit panel would at least improve its HDR capabilities.
  • FreckledTrout - Tuesday, October 2, 2018 - link

    The games that say they are HDR should be using 10-bit color as well.
  • a5cent - Wednesday, October 3, 2018 - link

    Any game that supports HDR uses 10 bpp natively. In fact, many games use 10 bpp internally even if they don't support HDR officially.

    That's why an HDR monitor must support the HDR10 video signal (that's the only way to get the 10 bpp frame from the GPU to the monitor).

    OTOH, a 10 bit panel for gaming typically won't provide a perceptible improvement. In practice, 8bit+FRC is just as good. IMHO it's only for editing HDR still imagery where real 10bit panels provide benefits.
  • GreenReaper - Thursday, October 4, 2018 - link

    I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this, where the bandwidth is insufficient to have full resolution *and* colour depth *and* refresh rate at once?

    You run the risk of banding or flicker, but frankly that's similar for display FRC, and I imagine if the screen was aware of what was happening it might be able to smooth it out. It'd essentially improve the refresh rate at the expense of some color precision. Which some gamers might well be willing to take. Of course that's all moot if the card can't even play the game at the target refresh rate.
  • GreenReaper - Thursday, October 4, 2018 - link

    By client, of course, I mean card - it would send an 8-bit signal within the HDR colour gamut and the result would be a frequency-interpolated output hopefully similar to that possible now - but by restricting at the graphics card end you use less bandwidth, and hopefully it doesn't take too much power.
  • a5cent - Thursday, October 4, 2018 - link

    "I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this"

    It's an interesting idea, but I don't think it can work.

    The core problem is that the monitor then has no way of knowing if in such an FRC'ed image, a bright pixel next to a darker pixel correctly describes the desired content, or if it's just an FRC artifact.

    Two neighboring pixels of varying luminance affect everything from how to control the individual LEDs in a FALD backlight, to when and how strongly to overdrive pixels to reduce motion blur. You can't do these things in the same way (or at all) if the luminance delta is merely an FRC artifact.

    As a result, the GPU would have to control everything that is currently handled by the monitor's controller + firmware, because only it has access to the original 10 bpp image. That would be counter productive, because then you'd also have to transport all the signaling information (for the monitor's backlighting and pixels) from the GPU to the monitor, which would require far more bandwidth than the 2 bpp you set out to save 😕

    What you're thinking about is essentially a compression scheme to save bandwidth. Even if it did work, employing FRC in this way is lossy and nets you, at best, a 20% bandwidth reduction.

    However, the DP1.4(a) standard already defines a compression scheme. DSC is lossless and nets you about 30%. That would be the way to do what you're thinking of.

    Particularly 4k DP1.4 gaming monitors are in dire need of this. That nVidia and Acer/Asus would implement chroma subsampling 4:2:2 (which is also a lossy compression scheme) rather than DSC is shameful. 😳

    I wonder if nVidia's newest $500+ g-sync module is even capable of DSC. I suspect it is not.
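For reference, a rough sketch of the bandwidth arithmetic behind this thread, assuming DP1.4 HBR3's ~25.92 Gbps effective payload after 8b/10b encoding and ignoring blanking overhead (so real limits are slightly tighter):

    # Back-of-the-envelope DisplayPort 1.4 bandwidth math for this panel.
    # HBR3 carries 32.4 Gbps raw; after 8b/10b encoding about 25.92 Gbps
    # is left for pixel data.

    DP14_PAYLOAD_GBPS = 25.92

    def needed_gbps(width, height, refresh_hz, bits_per_pixel):
        return width * height * refresh_hz * bits_per_pixel / 1e9

    # 10bpc RGB = 30 bpp; 8bpc RGB = 24 bpp; 10bpc 4:2:2 averages 20 bpp.
    for hz, bpp, label in [(144, 30, "144 Hz, 10bpc RGB"),
                           (144, 24, "144 Hz,  8bpc RGB"),
                           (98,  30, " 98 Hz, 10bpc RGB"),
                           (144, 20, "144 Hz, 10bpc 4:2:2")]:
        need = needed_gbps(3840, 2160, hz, bpp)
        fits = "fits" if need <= DP14_PAYLOAD_GBPS else "does NOT fit"
        print(f"{label}: {need:5.1f} Gbps -> {fits}")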
  • Zoolook - Friday, October 5, 2018 - link

    DSC is not lossless, it's "visually lossless", which means that most of the time you shouldn't perceive a difference compared to an uncompressed stream.
    I'll reserve my judgement until I see some implementations.
  • Impulses - Tuesday, October 2, 2018 - link

    That Asus PA32UC wouldn't get you G-Sync or refresh rates over 60Hz and it's still $975 tho... It sucks that the display market is so fractured and people who use their PCs for gaming as well as content creation can't get anything approaching perfect or even ideal at times.

    There's a few 4K 32" displays with G-Sync or Freesync but they don't go past 60-95Hz AFAIK and then you don't get HDR, it's all a compromise, and has been for years due to competing adaptive sync standards, lagging connection standards, a lagging GPU market, etc etc.
  • TristanSDX - Tuesday, October 2, 2018 - link

    Soon there will be a new PG27UC, with mini-LED backlight (10,000 diodes vs 384) and with DSC.
  • DanNeely - Tuesday, October 2, 2018 - link

    Eventually, but not soon. AUO is the only panel company working on 4K/high refresh/HDR, and they don't have anything with more dimming zones on their public road map (which is nominally about a year out for their production; add a few months for monitor makers to package the panels and get them to retail once volume production starts).
