Contrast, Brightness, and Gamut

Of course, while QD Vision's Color IQ technology looks good on paper, what's equally important is how well it translates to the real world. QD Vision's goal of bringing down the cost of backlighting suitable for wide gamut displays is unabashedly aggressive, and like any other effort to reduce manufacturing costs, the cost cutting needs to be carefully balanced to exploit the advantages of the new technology without unnecessarily sacrificing quality. To that end QD Vision's first outing in the PC display market is an especially interesting one, as we can see and evaluate first-hand the true gamut capabilities of Color IQ as implemented in Philips' 276E6 monitor.

Starting things off with a look at some more basic monitor metrics, Philips rates the 276E6 as having a peak brightness of 300 nits and a contrast ratio of 1000:1. This is in line with what you'll typically find on displays at this price, and in practice you probably won't be running a monitor anywhere close to 300 nits in a typical workspace. As always, white and black level measurements are done with an i1DisplayPro, as it is more accurate than the i1Pro 2 at levels below 0.2 nits.

White Level - i1DisplayPro

Peak luminance on the Philips 276E6 is relatively high. While we have seen some monitors reach beyond 400 nits, in practice this isn't terribly important, because the ambient lighting around a monitor is usually constant and rarely requires a brightness above 100-200 nits to overcome reflections. As for the minimum brightness, in its standard Adobe RGB mode the display dropped to 77 nits. This is actually fairly bright, and if it were any brighter it would be difficult to perform our 80 nit calibration later in the review.

Black Level - i1DisplayPro

Black levels on the Philips 276E6 are quite good as well. At maximum brightness its black level is one of the better results on record, which pairs well with its high peak brightness. At minimum brightness the gap between it and the next best display is quite a bit larger, but it's important to remember that the Philips 276E6's minimum brightness is also higher than that of most displays, so the contrast ratio should end up being quite good in both cases.

Contrast Ratio - i1DisplayPro

As far as contrast ratio goes, the Philips 276E6 does as well as you'd expect based on its white and black levels. There are now displays on the market that use photoalignment technology to achieve contrast ratios around 1800:1, but 1000:1 is a good result for a $300 display, and it's right in line with what Philips advertises, so you're definitely getting what you paid for.
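For readers who want to sanity-check the charts, static contrast ratio is simply the measured white level divided by the measured black level at the same backlight setting. Below is a minimal sketch of that arithmetic in Python; the luminance readings are hypothetical stand-ins in the right ballpark, not our actual measurements.

```python
# Contrast ratio is peak white luminance divided by black luminance, with both
# readings taken at the same backlight setting. The numbers below are
# hypothetical examples, not our measured values.

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Return the static contrast ratio for a pair of luminance readings."""
    if black_nits <= 0:
        raise ValueError("black level must be a positive luminance in nits")
    return white_nits / black_nits

# A panel landing near its 1000:1 rating at both ends of its brightness range:
print(f"{contrast_ratio(308.0, 0.31):.0f}:1")  # maximum brightness example (~990:1)
print(f"{contrast_ratio(77.0, 0.08):.0f}:1")   # minimum brightness example (~960:1)
```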


Philips 276E6 gamut DeltaE with default settings

Philips 276E6 gamut DeltaE with OSD tweaks

Gamut accuracy on the Philips 276E6 can vary pretty wildly. On both of the units I tested there were two common errors. The first concerns the primaries, with blue being undersaturated while red is significantly oversaturated. The second problem is the white point. If your white point is wrong your colors will also be wrong, and on both monitors it took a lot of fiddling with the OSD to get an accurate white point; you can see the result of that in the images above. Out of the box, the white point error on our second unit causes the DeltaE to approach four.

Generally you'll be able to eyeball it and figure out which of the white presets is the most neutral, but even if you do, you're still dealing with significant errors in blue, red, and the secondary colors that rely on them. Considering that the Color IQ tech is supposed to be highly tunable, I really don't know how such large errors could exist in the shipping product, and it's not clear whether this is a limitation of the technology or a decision on Philips' part. In any case, it doesn't look good to have large errors on the most basic of our tests.
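For context on what those DeltaE figures mean, each one is a distance in CIELAB color space between the color a test pattern should produce and the color the panel actually produced, so larger numbers mean more visible error. Here is a minimal sketch using the simple CIE76 (Euclidean) formula and made-up Lab values; the charts in this review may well be based on a more sophisticated DeltaE variant.

```python
# DeltaE expresses how far a measured color landed from its target in CIELAB
# space. This sketch uses the basic CIE76 (straight Euclidean) formula with
# hypothetical Lab values; the review's charts may use a different variant.
import math

def delta_e_76(target_lab, measured_lab):
    """CIE76 color difference between a target Lab value and a measured one."""
    return math.sqrt(sum((t - m) ** 2 for t, m in zip(target_lab, measured_lab)))

# Hypothetical example of an oversaturated red patch:
target_red   = (54.0, 81.0, 70.0)   # where the red patch should have landed
measured_red = (56.0, 86.0, 74.0)   # a noticeably more saturated measurement
print(f"dE76 = {delta_e_76(target_red, measured_red):.1f}")  # ~6.7
```

As a rough rule of thumb, errors below 1.0 are generally considered imperceptible, while values above roughly 3.0 start to become noticeable, which is why a white point error approaching four is a real problem.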

Comments

  • jlabelle - Friday, April 29, 2016

    - In the second corner we have Android. Not clear to me how much better off they are. They have handled DPI a lot better, which is a good start -

    If you are speaking of Android, you should compare it with Windows apps from the Windows Store.
    For those, the scaling is just perfect, handling ANY screen size / resolution / orientation.
    The only issue with scaling is Win32 programs that don't use the hiDPI API released 9 years ago with Windows 7 (at a time when Android was not a thing).

    - As far as I know there is still no color correction built into Android -

    Android is the worst on this front because you have virtually 0 color management.

    - In the third corner we have Apple which seems perfectly positioned for all this (meaning that they will likely drive it) -

    Again, this is misleading.
    For instance, iOS's way of handling color management (see the test on the iPad Pro) makes the use of a wide gamut screen virtually useless (for now), as there is no way for a developer to take advantage of it. What it seems to do is basically apply an ICC profile to target the sRGB color space.
    Scaling is not really a question, as resolutions are pretty much hard coded, but again, Windows apps scale perfectly.

    OS X has some "native" applications that are color managed (like Safari), but the same issue occurs: the program needs to be color managed, otherwise you have the same problem.
    For scaling, this is exactly like Windows, with the hiDPI API having existed forever and developers just needing to use it. Maybe there are more applications using it, but that's it.
    OS X does not really have (from an OS point of view) an inherent advantage compared to Windows on color management / hiDPI screens.

    - they're now pushing color accuracy both on the camera side (TrueTone flash, high dynamic range sensors) -

    Actually, Apple is using a 1/3" camera sensor, one of the smaller sizes in the industry (otherwise only found in low end phones like the Lumia 640XL...), and therefore the dynamic range is more limited than the competition (because it is mainly linked directly to sensor size).

    - and the screen side -
    Nothing exclusive to Apple. For instance, speaking of Windows here and therefore of the Surface or the Lumia 950, both have more color accurate screens than all the various iPads and iPhones (albeit all are VERY good in color accuracy).

    bq. "Our colors look good, and look correct, across ALL our devices --- photos on your iPhone look exactly the same on your iMac. Good luck getting that consistency with photo from your Android phone on your Windows screen."

    It is not luck. Just pick the right product. If you pick a Surface and a Lumia 950, for instance, you will have the same great experience. And using a Samsung S6/S7 or another color accurate Android phone will give you the same.

    It seems the advertising is indeed working, since people believe that Apple has an inherent advantage here.

    - the relevance and interest of QD technology is whether it allows larger gamut to move to iPhone this year or at least soon -

    Until developers can take advantage of it, it has no advantage for the end user. So as good as the color gamut of the iPad Pro is, it is useless from an end user point of view.
  • Brandon Chester - Friday, April 29, 2016

    I've already addressed why your understanding of the situation on the iPad is incorrect in my article specifically about it. Please do not spread serious misinformation in the comments or I will have to remove them; this is already an issue that is confusing to many people.
  • theduckofdeath - Friday, April 29, 2016

    I don't get what bigger picture I'm missing here. Yes, LCD tech has evolved a lot over the years. But it's just the faux marketing these manufacturers always stoop to in order to give the impression that they're selling something better than LCD. A few years ago it was LED, now it's Quantum Dots, both insinuating that the backlight isn't the usual old flawed edge-lit design.
  • alphasquadron - Thursday, April 28, 2016

    As a Windows user (not by choice, but because it supports a lot of software and games), it is tiring to see the slow pace at which Windows fixes problems. When are they going to get 4K scaling done correctly? And I remember getting my new computer and going through the same confusing ICC sub-menus to get the right settings.

    Also, what were Philips or QD Vision thinking when they sent a reviewer at a tech site that tests monitors for color accuracy a fake sRGB mode? I mean, he had just mentioned that there was no sRGB mode on the monitor, so what do you think the first thing he is going to test on the new monitor is? I'm still confused about whether the mode actually changed something or if they are just that dumb (or think reviewers are that dumb).
  • Murloc - Thursday, April 28, 2016

    Maybe they messed up while doing a quick fix. I hope.
  • Brandon Chester - Thursday, April 28, 2016

    For the record, I spent a long time trying to prove to myself that it did do something. Unfortunately, if it truly were constraining the gamut it would be so completely obvious upon toggling it that you wouldn't even need to make measurements. I did measure anyway, and it truly didn't change the output at all.
  • Guspaz - Thursday, April 28, 2016

    All this talk of colour management... It all works so easily on my MacBook (load the profile Anand made, and everything looks correct), but on my main PC, it's a mess...

    I've got a Dell U2711 running Windows 10. That's a wide-gamut display, and I do have an ICC profile for it. The display was also factory-calibrated (it shipped with a printed report on the results).

    If I want the most trouble-free setup where most stuff looks correct, which of these is the correct approach:

    1) Set monitor to default profile and set Windows to ICC profile
    2) Set monitor to sRGB profile and set Windows to ICC profile
    3) Set monitor to default profile and set Windows to sRGB profile
    4) Set monitor to sRGB profile and set Windows to sRGB profile

    I'm guessing option 1 is correct for wide-gamut use, but the crappy Windows colour management would mess everything up. So if I want to just go for sRGB, it seems to me that option 4 is probably correct? Or is option 2 what I want?

    This is all so confusing. On my Mac I just set the ICC profile and everything works immediately and perfectly.
  • Murloc - Thursday, April 28, 2016

    Yeah, MacOS has this down, unlike Windows.

    I wonder how amenable Linux is in this regard.
  • tuxRoller - Thursday, April 28, 2016

    Pretty much as good as Mac, actually.
    Check out my comments on the recent 9.7" iPad review (the one that dealt with color management).
  • jlabelle - Friday, April 29, 2016

    See my answer on page 2. I was in your EXACT same situation.

    1) I guess you have an ICC profile, meaning either you are able to calibrate the screen yourself with a probe, or you have a generic ICC profile from a Dell review (which means you do not account for production variation and variation over time)? This is the theoretically ideal situation for taking advantage of a wide gamut screen… except I do not advise it, for the reasons described below.
    2) Hassle-free solution: same as above, but you constrain yourself to the sRGB color space. You will have good color accuracy in color managed applications. And even in non color managed applications, and even if your ICC profile is not very good, you will not have problems with oversaturation or washed out colors.
    3) Makes no sense at all! It means you are saying that the Dell is perfectly accurate according to the sRGB color space and gamut. Obviously, that could not be further from the truth, so you will end up with all your colors (EVEN in color managed applications) oversaturated. No, no, NO!
    4) This is the equivalent of what the article advises for the Philips: you put the screen in sRGB mode. You do not have any ICC display profile (because you do not have the necessary calibration equipment), so you are assuming that it is correctly calibrated and telling the OS that your display is perfect according to sRGB. Actually, this is the default, and you do not need to do anything to be in this situation.

    The preferred solution is by far number 2.

    To understand why, let's reverse the discussion and ask why you (or people in general) think you would benefit from a wide gamut screen:
    • To surf the web? No, because websites target sRGB anyway.
    • To view pictures received by email or taken by you? In most cases no, because mobile phones, compact cameras and even most DSLRs are set up to take sRGB pictures.
    • To watch films? It is slightly more complicated, but there is essentially no wide gamut content (to keep things simple) and no consumer video software that would manage it anyway. So you would end up with permanently oversaturated colors. Unless that is your thing…

    So then, in which case would you see any benefit?
    IF you have your own DSLR/mirrorless camera, and IF you set it up in aRGB mode, and IF you always make sRGB duplicates of every single picture that you want to share, display on the web, or send to printing.

    And even if all those "IFs" are fulfilled, you will end up with oversaturated colors in most of your applications, when surfing the web, when viewing other people's pictures… All that just to be able to see, on your own pictures, maybe a tiny difference in a side-by-side comparison in 0.001% of cases (I am not making this number up; it is the proportion of my pictures where I was able to spot a difference).

    Long story short: a wide gamut screen makes NO sense currently. There is a reason it is said to only make sense for professionals with very specific applications, and those people do not come here to ask whether it makes sense, because they are already aware of all this.

    Bottom line: choose option 2.
