Color Space

A color space is a subset of the entire visible light spectrum, and the reason it's a subset is that reproducing the full visible spectrum is beyond current display technology. Different color spaces target different subsets of colors, although they may overlap with each other. The range of colors a color space covers is also known as its color gamut.

The most common color space on a PC today is sRGB, and almost all laptops and monitors target this color range. But in the past several years, we’ve seen more and more devices able to target a gamut with a wider range of colors, such as Adobe RGB, and there are advantages and disadvantages to using these wider gamuts on a PC.


The sRGB Color Space - Source: SpectraCal

sRGB, or standard Red Green Blue, is the RGB color space that serves as the standard on PCs and the Internet. If you look at an image on the internet, it's more than likely encoded in the sRGB color space, and Windows defaults to using sRGB for everything.

Other common gamuts used in the PC industry are Adobe RGB, developed by Adobe in 1998, and the P3 color space, both of which are wider than sRGB, meaning they cover a larger range of colors than the sRGB gamut.


The Adobe RGB Color Space - Source: SpectraCal


The P3 D65 Color Space - Source: SpectraCal


The BT.2020 HDR Color Space - Source: SpectraCal

But wider isn't always better, especially when using Windows as your operating system. Windows defaults to sRGB for pretty much everything, and doesn't really have any sort of robust color management system (CMS) that can transform colors targeting one space into another. Applications can build and use their own CMS, but that works at the application level only, and requires the developer to do much of the legwork. Consider an sRGB color shown on an 8-bit display: to display 100% red, you'd set that pixel to 255,0,0, so the red channel is at maximum brightness (255, or 1.0f in floating point). But if Windows opens a photo that was created as sRGB on a display that covers the P3 gamut, it will still tell the display to use 100% red, zero green, zero blue. Since 100% red in sRGB corresponds to only about 82% red in linear light on a P3 display (roughly 92% of the encoded 8-bit value), the colors will be oversaturated and the image won't look correct.

Incorrect color on the left (Photos) vs. correct color on the right (Adobe Photoshop Elements)
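
To make the mismatch concrete, here is a minimal Python sketch of the transform a color management system has to perform: decode the sRGB value to linear light, apply the widely published sRGB-to-Display-P3 3x3 matrix, and re-encode. The function names are our own invention for illustration; treat this as a sketch rather than production color code.

    # What a CMS has to do: decode sRGB to linear light, convert gamuts
    # with a 3x3 matrix, then re-encode for the target display.

    def srgb_decode(c):
        # Encoded 0.0-1.0 value -> linear light
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def srgb_encode(c):
        # Linear light -> encoded 0.0-1.0 value (Display P3 reuses this curve)
        return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    # Widely published sRGB -> Display P3 linear-light matrix (both D65).
    SRGB_TO_P3 = [
        [0.822462, 0.177538, 0.000000],
        [0.033194, 0.966806, 0.000000],
        [0.017083, 0.072397, 0.910520],
    ]

    def srgb_to_display_p3(rgb8):
        linear = [srgb_decode(c / 255) for c in rgb8]
        p3 = [sum(m * c for m, c in zip(row, linear)) for row in SRGB_TO_P3]
        return [round(srgb_encode(c) * 255) for c in p3]

    print(srgb_to_display_p3([255, 0, 0]))  # sRGB pure red -> about [234, 51, 35]

Without that transform, the display simply interprets 255,0,0 as its own full-strength red, which is exactly the oversaturation described above.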

To Microsoft's credit, Windows is getting better, but progress here is slow due to legacy compatibility concerns and the need to avoid breaking existing software, so using wider color spaces on the desktop is still a dicey proposition.

The end result is that on a laptop, a wider-gamut display can cause problems for most workloads. Unless you do a lot of work in a color-managed application such as Adobe's suite, or your laptop can switch the gamut the display targets through software, a wider gamut is generally not ideal.

Finally, there are even wider gamuts available, such as the Rec. 2020 color space, but at the moment no laptop-sized display technology can reproduce all of the colors in that space.

White Point

The white point on a PC display is generally going to be D65, which corresponds roughly to midday sunlight in western or northern Europe. Expressed as a color temperature in Kelvin, D65 is close to 6504 K. A DCI-P3 display instead targets a white point of roughly 6300 K, sometimes called D63, which is greener than D65. In the PC space, you'll generally be dealing with the P3 D65 gamut rather than the DCI-P3 used in cinema, though a few PCs, such as the Surface Studio, can target either.

How many bits?

Most displays are 8 bits per channel, meaning each of the red, green, and blue channels can take 256 (2^8) steps, which provides 16,777,216 (256 × 256 × 256) different color combinations. That sounds like a lot of colors, and it is, but there is still a fairly large jump between adjacent steps on each channel, since red, for example, can only range from 0 to 255.
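
As a quick illustration of the arithmetic, here's a tiny Python sketch computing levels per channel and total color combinations at common panel bit depths:

    # Levels per channel and total color combinations at common bit depths.
    for bits in (6, 8, 10):
        levels = 2 ** bits
        print(f"{bits}-bit: {levels} levels per channel, {levels ** 3:,} colors")

    # 6-bit:  64 levels per channel, 262,144 colors
    # 8-bit:  256 levels per channel, 16,777,216 colors
    # 10-bit: 1024 levels per channel, 1,073,741,824 colors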

Less expensive displays may reduce this even further, to 6-bit with Frame Rate Control (FRC). FRC uses temporal dithering, rapidly alternating a pixel between two adjacent colors to approximate the shade in between and simulate the full range of 8-bit levels. TN displays are often 6-bit with FRC, and lower-priced IPS variants such as e-IPS can also be 6-bit.
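
Here's a hedged Python sketch of the idea behind FRC: to show an 8-bit level on a 6-bit panel, alternate between the two nearest 6-bit levels so the time-averaged output lands on the target. The frame count and error-accumulation scheme are illustrative assumptions, not how any particular panel implements it in hardware.

    # FRC sketch: approximate an 8-bit level on a 6-bit panel by flickering
    # between the two nearest 6-bit levels from frame to frame.

    def frc_frames(target_8bit, frames=60):
        lo, rem = divmod(target_8bit, 4)  # 6-bit level below; remainder in quarters
        hi = min(lo + 1, 63)              # next 6-bit level up
        shown, acc = [], 0.0
        for _ in range(frames):
            acc += rem / 4                # fraction of frames that must show 'hi'
            if acc >= 1.0:
                shown.append(hi)
                acc -= 1.0
            else:
                shown.append(lo)
        return shown

    seq = frc_frames(130)                 # 8-bit 130 sits between 6-bit 32 and 33
    print(sum(seq) / len(seq) * 4)        # 130.0 - the eye averages the flicker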

Meanwhile, with the wider color gamuts available now, the jump in color for each step from 0-255 can be even larger. Lower-bit panels are therefore more susceptible to color banding, which, as the name implies, is when distinct bands of color appear where a smooth gradient was intended. It occurs when two adjacent colors are supposed to be slightly different but end up looking either identical or more different than intended, because the display can't create the color in between. To counteract this, more bits are needed to represent smaller gradations from one step to the next, so a wider-gamut display will often be 10-bit, or 8-bit with FRC. A true 10-bit display offers 1024 levels for each channel, which creates 1,073,741,824 different color options.
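
To see why fewer bits produce banding, this small Python sketch quantizes a smooth gradient at different bit depths and counts the distinct bands that result (assuming simple round-to-nearest quantization with no dithering):

    # Quantize a smooth 0.0-1.0 gradient and count the distinct output
    # levels ("bands"): fewer levels means wider, more visible bands.

    def count_bands(bit_depth, samples=4096):
        levels = 2 ** bit_depth - 1
        gradient = (i / (samples - 1) for i in range(samples))
        return len({round(v * levels) for v in gradient})

    for bits in (6, 8, 10):
        print(f"{bits}-bit gradient: {count_bands(bits)} distinct bands")

    # 6-bit: 64 bands, 8-bit: 256, 10-bit: 1024. Stretched over a wider
    # gamut, each band also covers a larger color difference, making the
    # steps easier to see.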

Although all modern GPUs support 10-bit output, NVIDIA currently restricts 10-bit OpenGL output to its Quadro lineup, so applications that rely on OpenGL, such as Adobe's suite, require a professional GPU for full 10-bit support.

High Dynamic Range

High dynamic range, or HDR, is one of the best new features to come to displays in quite some time. HDR brings a much wider range of brightness levels, and requires a higher bit depth as well to preserve the detail in darker scenes. Displays certified for HDR also have to achieve higher brightness levels than traditional displays, so it can be a large benefit.

Source: Samsung

VESA has announced the DisplayHDR specification, which has three levels: DisplayHDR 400, 600, and 1000, with each level corresponding to the peak brightness of the display in nits.

There are two main competing HDR transport standards right now, although only one is available on the PC. HDR10 is a 10-bit format with static metadata for the HDR information, while Dolby Laboratories' Dolby Vision uses a 12-bit color depth with dynamic metadata, which allows the mapping of pixel data to luminance to be adjusted frame by frame. PCs only support HDR10, and not only do you need an HDR monitor, you also need at least a Kaby Lake iGPU, an NVIDIA GeForce 900 series or newer GPU, or an AMD Radeon RX 400 series or newer, as well as DisplayPort 1.4 or HDMI 2.0.
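
HDR10 encodes its 10-bit values with the SMPTE ST 2084 "PQ" transfer function, which maps code values to absolute luminance up to 10,000 nits. Here's a minimal Python sketch of that EOTF; the constants come from the ST 2084 specification, and the snippet is illustrative rather than a full HDR pipeline:

    # SMPTE ST 2084 (PQ) EOTF: map a normalized 0.0-1.0 HDR10 signal
    # value to absolute luminance in nits.

    M1 = 2610 / 16384        # ~0.1593
    M2 = 2523 / 4096 * 128   # ~78.8438
    C1 = 3424 / 4096         # ~0.8359
    C2 = 2413 / 4096 * 32    # ~18.8516
    C3 = 2392 / 4096 * 32    # ~18.6875

    def pq_eotf(signal):
        e = signal ** (1 / M2)
        y = max(e - C1, 0.0) / (C2 - C3 * e)
        return 10000 * y ** (1 / M1)  # cd/m² (nits)

    # A 10-bit code value of 769 decodes to roughly 1000 nits, the peak
    # of DisplayHDR 1000, while code value 1023 reaches 10,000 nits.
    print(round(pq_eotf(769 / 1023)))  # ~1000

The curve devotes most of its code values to low luminance, which is part of why HDR needs the extra bits to avoid banding in dark scenes.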

There are only a couple of laptops on the market offering an HDR display at the moment, but with the eDP connector and complete control over the product stack, HDR in a laptop is likely to be easier for the consumer to use. As of Windows 10 version 1803, the built-in display needs a resolution of 1920x1080 or higher, with a recommended brightness of 300 nits or more. The device has to have integrated graphics with PlayReady hardware DRM for protected HDR, and the correct codecs for video, which is generally going to be HEVC; Kaby Lake or newer supports all of this. The nice thing is that consumers won't have to worry about any of it, since it will be up to the OEM to ensure everything is in order, rather than with a desktop system where the user has to make sure all of the i's are dotted and the t's are crossed to enable HDR.

There are a lot of new HDR products coming to market soon, and as HDR becomes mainstream it will be a welcome benefit.

Comments

  • ikjadoon - Tuesday, July 10, 2018 - link

    Excellent overview, Brett. I will be linking this many weeks onward.

    I’m curious how you were able to measure the SB2’s display power usage—that sounds incredibly handy as panel efficiency seems to be the name of the game here. Is this through software or hardware, like clamping or voltage measurements?

    I had high hopes for IGZO penetrating and overtaking a-Si, but it seems like it’s the forgotten middle child sans one or two poster models like the Razer Blade.

    Seeing LTPS proliferate, though, is welcome: Lenovo’s using it on their X1 Yoga HDR display and Huawei’s MateBook has won a lot of hearts (and eyes).
  • MajGenRelativity - Tuesday, July 10, 2018 - link

    I enjoyed this article very much. I didn't know VA was a different technology, and assumed it was some subtype of IPS, so I'm glad that was cleared up.

    I look forward to in-depth articles about other components!
  • Brett Howse - Tuesday, July 10, 2018 - link

    Thanks!
  • Ehart - Tuesday, July 10, 2018 - link

    Really nice article, but you're falling into some common confusion on HDR10. HDR10 is really only defined as a 'media profile', and for a display it means that it accepts at least 10 bits to support that profile. For PC displays, they often can accept a 12 bit signal. (I'm using one right now.)
  • DanNeely - Tuesday, July 10, 2018 - link

    Is "3k" eg 3200x1800 going out of favor on 13" laptops? I'd be rather disappointed if it is.

    At 280 DPI it's equivalent to 4k on a 15.6" panel, and on anything that doesn't have broken DPI scaling it's high enough resolution that you can pick whatever scaling factor you want and still have sharp text; you can't come close to seeing the pixels anymore. Going higher, e.g. 4k at 330 DPI, doesn't really get you anything except higher power consumption and lower battery life IMO.
  • Brett Howse - Tuesday, July 10, 2018 - link

    Seems to be fewer options for 3200x1800 these days.
  • CaedenV - Tuesday, July 10, 2018 - link

    "so the loss of 16:10 was mourned by many."
    Yeah... I miss my 1200p 16:10 display. It wasn't the best quality... but man was it useful!
  • keg504 - Tuesday, July 10, 2018 - link

    If nit is not an SI unit, why not use lux, which is, and is the same quantity (from my understanding)?
  • Death666Angel - Tuesday, July 10, 2018 - link

    It isn't, though. The SI unit for nits would be candela per square meter [cd/m²]. Lux = lumen per square meter [lm/m²] has an additional light source component and a distance component in it, because it is used to measure the light that hits a certain point, not the source itself. Most non-US based tech reviewers I frequent use cd/m².
  • Amoro - Tuesday, July 10, 2018 - link

    What about adaptive refresh rate technologies?
