Final Words

QD Vision's Color IQ tech certainly delivers. In the case of the Philips 276E6, it actually delivers more than it promises. Technically the monitor doesn't meet the Adobe RGB spec, but this is mostly due to it exceeding the spec rather than falling short. The gamut it achieves is clear evidence that this quantum dot technology can bring a wide color gamut to relatively low-priced monitors, something that until now has existed only on a small group of premium monitors. Given that the Philips 276E6 shows oversaturation with its red primary, I would be interested in seeing Color IQ used in a monitor targeting the DCI-P3 color space, which is about to become the canonical color space for UltraHD content as we move away from sRGB on a path toward eventual Rec. 2020 support.
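
To put some rough numbers on how these gamuts compare, the sketch below computes the area of the triangle spanned by each standard's red, green, and blue primaries in CIE 1931 xy space using the shoelace formula. Triangle area in xy is only a crude proxy for perceived gamut size, but it illustrates why Adobe RGB and DCI-P3 are both meaningful steps up from sRGB.

```python
# Compare gamut sizes via the CIE 1931 xy chromaticities of each
# standard's primaries. The xy triangle area is a crude proxy for
# gamut size, but it shows the relative coverage clearly.

def triangle_area(primaries):
    """Shoelace formula for the triangle spanned by three (x, y) primaries."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Red, green, blue primaries per the respective standards (CIE 1931 xy).
srgb   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
adobe  = [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)]
dci_p3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

for name, prim in [("sRGB", srgb), ("Adobe RGB", adobe), ("DCI-P3", dci_p3)]:
    area = triangle_area(prim)
    print(f"{name:>9}: area {area:.4f} ({area / triangle_area(srgb):.2f}x sRGB)")
```

Note that Adobe RGB and DCI-P3 come out at roughly 1.35x the area of sRGB; they are similarly sized gamuts that trade green coverage (Adobe RGB) against a deeper red (DCI-P3).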

While the Philips 276E6 is certainly a successful demonstration of QD Vision's quantum dot technology, it's difficult to say that it's successful as a product in its own right. This is a combination of two factors, with one of them being primarily an external factor.

From the last two pages it's pretty clear that Windows is not at all ready for a transition to a world beyond sRGB, and even within that gamut it makes basic color corrections quite a pain. Vendors shipping wide gamut displays identified this issue many years ago and provided a fix of sorts in the form of an sRGB monitor mode, which constrains the gamut so that the vast majority of content on the web renders correctly without relying on proper color management. As a new entrant to this space, Philips seems to have either been unaware of such a feature or underestimated its necessity. I want to stress that this isn't something any of these vendors should have to do, but when the most widely used desktop operating system doesn't do a good job with color management, it's something you need to do to provide a good experience for your users. Philips compounds the problem by not even providing a proper ICC color profile, so applications that actually are color managed won't work correctly either.
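
The reason un-managed content looks wrong on a wide gamut panel comes down to simple math: sRGB's pure red occupies only part of Adobe RGB's red range, so a color managed pipeline would send the panel a code value well below 255, while an un-managed application sends 255 and gets the panel's full, wider red instead. A minimal sketch, using the standard sRGB and Adobe RGB (1998) conversion matrices:

```python
# What code value does an Adobe RGB panel need to reproduce sRGB red?
# A color managed path does this conversion; an un-managed app skips it
# and sends 255, which is why reds look oversaturated.

def srgb_to_linear(c):
    """Invert the sRGB transfer function (IEC 61966-2-1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def xyz_from_srgb(r, g, b):
    """Linear-light sRGB -> CIE XYZ (D65), standard matrix."""
    r, g, b = map(srgb_to_linear, (r, g, b))
    return (0.4124 * r + 0.3576 * g + 0.1805 * b,
            0.2126 * r + 0.7152 * g + 0.0722 * b,
            0.0193 * r + 0.1192 * g + 0.9505 * b)

def adobe_from_xyz(x, y, z):
    """CIE XYZ (D65) -> Adobe RGB (1998), gamma ~2.2 encoded."""
    lin = (2.04159 * x - 0.56501 * y - 0.34473 * z,
           -0.96924 * x + 1.87597 * y + 0.04156 * z,
           0.01344 * x - 0.11836 * y + 1.01517 * z)
    return tuple(max(0.0, c) ** (1 / 2.19921875) for c in lin)

# Fully saturated sRGB red, expressed in Adobe RGB code values:
r, g, b = adobe_from_xyz(*xyz_from_srgb(1.0, 0.0, 0.0))
print(f"sRGB (255, 0, 0) == Adobe RGB ({round(r*255)}, {round(g*255)}, {round(b*255)})")
```

The correct Adobe RGB code value for sRGB red comes out around 219, not 255, which is the gap an sRGB monitor mode (or a working OS color management system) exists to bridge.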

The second factor is that the Philips 276E6 is just not where it needs to be in terms of uniformity and color accuracy. Based on my measurements of two of these monitors, it's clear that Philips is allowing a large degree of variance between units. Out of the box, it's hard to tell which settings provide the most accurate image. On my original unit it was the Adobe RGB preset, while on the second unit it's the 6500K preset. Regardless of which I chose, both monitors exhibited concerning issues with color accuracy, and both had an oversaturated red primary. I'm not the only reviewer to find this, so it's probably true of all units, which is very disappointing.

Post-calibration numbers were better in some respects, but not others. The big shift in saturation accuracy with the 200 nit calibration was very surprising to me, and it may be best to not tweak the white point at the monitor level at all. Unfortunately, making more corrections through greyscale calibration means you reduce the tonal range of the monitor further by limiting the number of distinct levels for the red, green, and blue components, which can introduce color banding. On top of that, the fact that the Philips 276E6 is trying to target a low price point means that the idea of calibrating with a $1500 spectrophotometer using $3000 software is quite absurd, and the very large variance between panels means there's not even any point in me providing an ICC profile to be used in a general manner.
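
The banding tradeoff is easy to see if you think of greyscale calibration on an 8-bit panel as a per-channel lookup table: any correction curve with a gain below 1.0 maps 256 input codes onto fewer than 256 output codes, and the lost levels show up as visible steps in gradients. A small sketch (the 0.86 gain is an illustrative number, not a measured value from this monitor):

```python
# Greyscale calibration on an 8-bit panel is effectively a per-channel
# lookup table. A gain below 1.0 collapses some adjacent input codes
# onto the same output code, reducing the number of distinct levels.
# The 0.86 gain here is illustrative, not a measured correction.

gain = 0.86
lut = [round(code * gain) for code in range(256)]

distinct = len(set(lut))
print(f"{distinct} distinct output levels remain out of 256")
```

With this curve only 220 distinct levels survive, which is exactly the kind of tonal range loss that introduces color banding after software calibration.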

In the end, I think Philips simply has some things to learn about the wide gamut monitor market, and I'm still quite interested in seeing what future products they release, along with what future products will come from other manufacturers. Philips should definitely be applauded for taking the first step toward low-cost wide gamut displays, and if they can improve their panel accuracy and include a proper sRGB color mode they'll have a very compelling product on their hands. If 1920x1080 is the target resolution to manage cost I would probably opt for a smaller panel size as well, as the pixel density is just too low on a 27" panel.

On the software side, the companies that currently ship operating systems with essentially non-functioning color management need to get their act together. Wide gamut displays are coming, and not handling multiple color standards properly is incredibly detrimental to the user experience.

As for Color IQ, I think the technology has a bright future. It's clear that it can be added to displays with only a minimal impact on price to the consumer, and the advantages are significant. The tech can clearly push a wider red primary than the Adobe RGB standard specifies, so I'd like to see some DCI-P3 monitors using the tech so consumers can take advantage of upcoming UltraHD content that will support the wider color space. As a technology, I think Color IQ will probably exist alongside film-based quantum dot technologies, as edge-lit LCDs are never going to properly support HDR standards because of their inability to do proper local dimming, and a film solution is the only feasible way to use the tech in smartphones and tablets. However, with the majority of the TV and monitor market using edge-lit displays there's a huge opportunity here to bring wider color gamuts to the masses. While I cannot really recommend buying the Philips 276E6 in its current state, I'm looking forward to future products that use QD Vision's Color IQ technology, both from Philips and from other vendors that I anticipate will adopt this technology soon.

Comments

  • jlabelle - Friday, April 29, 2016 - link

    - In the second corner we have Android. Not clear to me how much better off they are. They have handled DPI a lot better, which is a good start -

    If you are speaking of Android, you should compare it with Windows Store apps.
    For those, the scaling is just perfect, and they handle ANY screen size / resolution / orientation perfectly.
    The only scaling issue is with Win32 programs that don't use the hiDPI API released years ago with Windows 7 (at a time when Android was not a thing).

    - As far as I know there is still no color correction built into Android -

    Android is the worst on this front; you have virtually zero color management.

    bq. In the third corner we have Apple which seems perfectly positioned for all this (meaning that they will likely drive it).

    Again, this is misleading.
    For instance, iOS's way of handling color management (see the test on the iPad Pro) makes wide gamut screens virtually useless (for now), as there is no way for a developer to take advantage of them. What it seems to do is basically apply an ICC profile targeting the sRGB color space.
    Scaling is not really a question since resolutions are pretty much hard coded, but again, Windows apps scale perfectly.

    OS X has some "native" applications that are color managed (like Safari), but the same issue occurs: the program needs to be color managed, otherwise you have the same problem.
    For scaling, it is exactly like Windows, with hiDPI APIs having existed practically forever; developers just need to use them. Maybe more applications are using them, but that's it.
    From an OS point of view, OS X does not really have an inherent advantage over Windows in color management / hiDPI screens.

    bq. they're now pushing color accuracy both on the camera side (TrueTone flash, high dynamic range sensors)

    Actually, Apple is using a 1/3" camera sensor, one of the smallest in the industry (otherwise only found in low-end phones like the Lumia 640XL...), and therefore the dynamic range is more limited than the competition's (because it is mainly linked directly to sensor size).

    - and the screen side -
    Nothing exclusive to Apple. For instance, speaking of Windows here, the Surface and the Lumia 950 both have more color-accurate screens than the various iPads and iPhones (albeit all are VERY good in color accuracy).

    bq. "Our colors look good, and look correct, across ALL our devices --- photos on your iPhone look exactly the same on your iMac. Good luck getting that consistency with photo from your Android phone on your Windows screen."

    It is not luck. Just pick the right product. If you pick a Surface and a Lumia 950, for instance, you will have the same great experience. A Samsung S6/S7 or another color-accurate Android phone will give you the same.

    It seems advertising is indeed working, for people to believe that Apple has an inherent advantage here.

    - the relevance and interest of QD technology is whether it allows larger gamut to move to iPhone this year or at least soon.

    Until developers can take advantage of it, it has no advantage for the end user. So as good as the color gamut of the iPad Pro is, it is useless from an end user's point of view.
  • Brandon Chester - Friday, April 29, 2016 - link

    I've already addressed why your understanding of the situation on the iPad is incorrect in my article specifically about it. Please do not spread serious misinformation in the comments or I will have to remove them; this is already an issue that is confusing to many people.
  • theduckofdeath - Friday, April 29, 2016 - link

    I don't get what bigger picture I'm missing here. Yes, LCD tech has evolved a lot over the years. But, it's just the faux marketing these manufacturers always stoop to, to give the impression that they're selling something better than LCD. A few years ago it was LED now it's Quantum Dots. Both insinuating that the backlight isn't the usual old flawed edge lit design.
  • alphasquadron - Thursday, April 28, 2016 - link

    As a Windows user (not by choice, but because it supports a lot of software and games), it is tiring to see the slow pace at which Windows fixes problems. When are they going to get 4K scaling done correctly? And I remember getting my new computer and going through the same confusing ICC sub-menus to get the actual settings.

    Also, what were Philips or QD Vision thinking when they sent a reviewer at a tech site that is testing their monitor for color accuracy a fake sRGB mode? He had just mentioned that there was no sRGB mode on the monitor, so what did they think the first thing he would test on the new monitor would be? I'm still confused about whether the mode actually changed something, or if they are just that dumb (or they think reviewers are that dumb).
  • Murloc - Thursday, April 28, 2016 - link

    maybe they messed up while doing a quick fix. I hope.
  • Brandon Chester - Thursday, April 28, 2016 - link

    For the record, I spent a long time trying to prove to myself that it did do something. Unfortunately, if it truly were constraining the gamut it would be so completely obvious upon toggling it that you wouldn't even need to make measurements. I did measure anyway, and it truly didn't change the output at all.
  • Guspaz - Thursday, April 28, 2016 - link

    All this talk of colour management... It all works so easily on my MacBook (load the profile Anand made and everything looks correct), but on my main PC it's a mess...

    I've got a Dell U2711 running Windows 10. That's a wide-gamut display, and I do have an ICC profile for it. The display was also factory-calibrated (it shipped with a printed report on the results).

    If I want the most trouble-free setup where most stuff looks correct, which of these is the correct approach:

    1) Set monitor to default profile and set Windows to ICC profile
    2) Set monitor to sRGB profile and set Windows to ICC profile
    3) Set monitor to default profile and set Windows to sRGB profile
    4) Set monitor to sRGB profile and set Windows to sRGB profile

    I'm guessing option 1 is correct for wide-gamut use, but the crappy Windows colour management would mess everything up. So if I want to just go for sRGB, it seems to me that option 4 is probably correct? Or is option 2 what I want?

    This is all so confusing. On my Mac I just set the ICC profile and everything works immediately and perfectly.
  • Murloc - Thursday, April 28, 2016 - link

    yeah MacOS got this down unlike Windows.

    I wonder how amenable Linux is in this regard.
  • tuxRoller - Thursday, April 28, 2016 - link

    Pretty much as good as the Mac, actually.
    Check out my comments on the recent 9.7" iPad review (the one that dealt with color management).
  • jlabelle - Friday, April 29, 2016 - link

    See my answer in page 2. I was in your EXACT same case.

    1) I guess you have an ICC profile either because you calibrated the screen yourself with a probe, or because you took a generic profile from a Dell review (which means you are not accounting for production variation and variation over time)? This is the theoretical ideal situation to take advantage of a wide gamut screen… except I do not advise it, for the reasons described below.
    2) Hassle-free solution: same as above, but you constrain yourself to the sRGB color space. You will have good color accuracy in color managed applications. And even in non color managed applications, and even if your ICC profile is not very good, you will have no problems with oversaturated or washed out colors.
    3) Makes no sense at all! It means you are telling the OS that the Dell is perfectly accurate to the sRGB color space and gamut. Obviously that could not be further from the truth, so you will end up with all your colors (EVEN in color managed applications) oversaturated. No, no, NO!
    4) This is the equivalent of what the article advises for the Philips: you put the screen in sRGB mode. You do not have any display ICC profile (because you do not have the necessary calibration equipment), so you assume it is correctly calibrated and tell the OS that your display is perfect sRGB. Actually, this is the default, and you do not need to do anything to be in this situation.

    The preferred solution is by far the number 2.

    To understand why, let's reverse the discussion and ask why people think they would benefit from a wide gamut screen:
    • To surf the web? No, because websites target sRGB anyway.
    • To view pictures received by email or taken by you? In most cases no, because mobile phones, compact cameras, and even most DSLRs are set up to take sRGB pictures.
    • To view films? It is slightly more complicated, but (to keep things simple) there is no wide gamut content, and no consumer video software would manage it anyway. So you would end up with permanently oversaturated colors. Unless that is your thing…

    So in which case would you see any benefit?
    IF you have your own DSLR/mirrorless camera, and IF you set it to aRGB mode, and IF you always make sRGB duplicates of every single picture that you want to share, display on the web, or bring or send for printing.

    And even if all those "IF"s are fulfilled, you will end up with oversaturated colors in most of your applications, when surfing the web, when viewing other people's pictures… All that just to be able to see, on your own pictures, maybe a tiny difference in a side-by-side comparison in 0.001% of cases (I am not making this number up; it is the proportion of pictures where I was able to spot a difference).

    Long story short: a wide gamut screen currently makes NO sense. There is a reason it is said to only make sense for professionals in very specific applications, and those people do not come here to ask whether it makes sense, because they are already aware of all this.

    Bottom line : choose option 2.
