Color Quality

We report two main quality metrics in our display reviews: color accuracy (Delta E) and color gamut. Color gamut refers to the range of colors a display can represent relative to some reference color space. Here our reference is the AdobeRGB (1998) color space, which is larger than the sRGB color space, so our percentages are reported as coverage of AdobeRGB, and larger is generally better.
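To make that percentage concrete, here is a minimal sketch (not our actual measurement workflow) of how a gamut figure can be approximated: treat each gamut as the triangle formed by its red, green, and blue primaries in CIE 1931 xy chromaticity space and compare areas. Real coverage figures intersect the measured gamut with the reference, and published numbers often use the CIE u'v' diagram instead, so treat this as illustrative only.

```python
# Illustrative sketch: approximate gamut "size" relative to AdobeRGB (1998)
# by comparing triangle areas in CIE 1931 xy chromaticity space.
# Real coverage figures intersect the two gamuts; this simple ratio
# just compares total areas.

# (x, y) chromaticities of the red, green, and blue primaries
ADOBE_RGB = [(0.6400, 0.3300), (0.2100, 0.7100), (0.1500, 0.0600)]
SRGB      = [(0.6400, 0.3300), (0.3000, 0.6000), (0.1500, 0.0600)]

def triangle_area(primaries):
    """Shoelace formula for the area of a three-primary gamut triangle."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

def gamut_percentage(display, reference=ADOBE_RGB):
    """Display gamut area as a percentage of the reference gamut area."""
    return 100.0 * triangle_area(display) / triangle_area(reference)

print(f"sRGB relative to AdobeRGB: {gamut_percentage(SRGB):.0f}%")  # ~74%
```

Run against the sRGB primaries, this lands in the low-to-mid 70s, which is why sRGB-class panels show up around the "73%" mark in our gamut charts.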

Color accuracy (Delta E) refers to the display's ability to reproduce the color requested by the GPU and OS. The difference between the color the display actually shows and the color requested is the Delta E, and lower is better here. In practice, a Delta E under 1.0 is effectively perfect: the chromatic sensitivity of the human eye is not great enough to distinguish the difference. Moving up, a Delta E of 2.0 or less is generally considered fit for use in a professional imaging environment; it isn't perfect, but it's hard to gauge the difference. Finally, a Delta E of 4.0 and above is considered visible to the human eye. Of course, the big consideration here is frame of reference: unless you have another monitor or some print samples (a color checker card) to compare your display with, you probably won't notice. That is, until you print or view media on another monitor. Then the difference will no doubt be apparent.
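For the curious, the simplest Delta E formula (CIE76) is just the Euclidean distance between the requested and measured colors in CIELAB space; modern workflows typically use the more elaborate CIEDE2000, but the idea is the same. A minimal sketch with made-up Lab readings:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 Delta E: Euclidean distance between two CIELAB colors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical readings: the L*a*b* value the GPU requested vs. what the
# colorimeter actually measured off the panel.
requested = (53.2, 80.1, 67.2)
measured  = (52.8, 81.0, 66.5)

print(f"Delta E = {delta_e_76(requested, measured):.2f}")  # ~1.21
```

The Delta E numbers in our charts are averages of this kind of difference over many test patches, not a single reading.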

As I mentioned in our earlier reviews, we've updated our display test bench. We've retired the Monaco Optix XR Pro colorimeter in favor of an X-Rite i1D2, since there are no longer up-to-date XR Pro drivers for modern platforms.

For these tests, we calibrate the display and try to obtain the best Delta E we can at both 200 nits of brightness for normal use and 100 nits for print work. We target a 6500K white point and a gamma of 2.2, but sometimes the best performance lies at the native color temperature and a different gamma, so we try to find the display's absolute best performance. We also take an uncalibrated measurement to show out-of-the-box performance using either the manufacturer-supplied color profile or a generic one with no LUT data. For all of these, dynamic contrast is disabled.
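As an illustration of what the gamma target means in practice, a pure 2.2 gamma curve maps each gray level to an expected luminance, scaled to the chosen peak. A toy sketch (the 200-nit peak and the handful of check levels are assumptions for illustration, not our calibration software):

```python
# Toy sketch: expected luminance at each 8-bit gray level under a pure
# gamma-2.2 curve, scaled to a 200-nit peak (use 100.0 for the print target).

PEAK_NITS = 200.0
GAMMA = 2.2

def target_luminance(level, peak=PEAK_NITS, gamma=GAMMA):
    """Expected luminance in nits for an 8-bit gray level."""
    return peak * (level / 255.0) ** gamma

for level in (0, 64, 128, 192, 255):
    print(f"gray {level:3d} -> {target_luminance(level):6.1f} nits")
```

Calibration then measures how far the panel's actual grayscale deviates from this curve (and from the 6500K white point) and builds LUT corrections to close the gap.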

[Chart: Color Tracking - XR Pro and Xrite i1D2]

Uncalibrated, the display's color accuracy isn't very good. I found the 27-inch LED Cinema Display to be way too blue and green out of the box; calibrated, the display did much better:

[Chart: Color Tracking - XR Pro and Xrite i1D2]

The 27-inch LED Cinema Display isn't going to win any awards for color reproduction, but it's good enough when calibrated.

[Chart: Color Tracking - XR Pro and Xrite i1D2]

Curiously enough, dropping brightness down to 100 nits caused a noticeable reduction in color tracking. The average Delta E went up to 2.2, while most of the 27's competitors remained about the same. The 27-inch behaves very differently depending on its brightness setting.

[Chart: LCD Color Quality]

Apple managed to do relatively well with the WLED backlight, but it's still no match for the color gamut you get from any of the CCFL-backlit displays. Note that my old 30-inch hasn't aged well; it's only able to cover roughly 73% today.

Comments

  • burgerace - Tuesday, September 28, 2010 - link

    Wide color gamut is, for most non-professional users, a horrible drawback. Operating systems, web browsers and sites, images from my SLR camera, games, movies -- content is created for the traditional color gamut!

    At the recommendation of tech sites like this one, I bought two WCG Dell monitors, a 2408 and a 2410. They exhibit garish red push, and distorted colors in general. ATI drivers can use EDID to adjust the color temperature, reducing red push to a manageable level. But when I "upgraded" to an NVIDIA 460, I lost that option.

    Anand, do you actually look at the monitors with your eyes? Can you see how bad WCG looks? Forget the tables full of misleading numbers from professional image editing software, please.
  • 7Enigma - Tuesday, September 28, 2010 - link

    I think your problem is that most people spending this chunk of change on an LCD also have it properly calibrated. As mentioned in this exact review, the uncalibrated picture was quite bad. This LCD might have even been cherry-picked for the review unit (I don't know if this was sent by Apple for review or if Anand purchased it for personal use). So WYSIWYG doesn't apply once calibration is performed.
  • burgerace - Tuesday, September 28, 2010 - link

    WCG monitors are NOT capable of displaying a greater number of colors than a traditional monitor. They display the same 24-bit color, but it's spread over a greater range of wavelengths.

    ALL mainstream content is designed to use only the 73% gamut. There is no way to "calibrate" a monitor to make mainstream content look good. Either the monitor displays the content within the correct, limited gamut -- thereby using less than 24-bit color to render the image and throwing out visual information -- or it spreads it out over the wide gamut, causing inaccurate colors.
  • Pinkynator - Tuesday, September 28, 2010 - link

    Finally someone who knows what they're talking about!

    I've finally registered here to say the exact same thing as you, but instead I'll give you my full support.

    People just don't seem to understand that wide gamut is probably the second worst thing that happened to computer displays, right after TN monitors. It's bad - it's seriously bad.

    Things might change a very long time from now, in a distant future, *IF* we get graphics cards with more bits per channel and monitors capable of understanding that (along with proper software support), but right now it's just something that is being pushed by marketing. Even tech review sites like Anandtech managed to fall for that crap, misleading monitor buyers into thinking that bigger gamut equals a better picture. In fact, it's exactly the opposite.

    To go into a serious theoretical hyperbole for those who do not understand the implications of a stretched wide gamut with 8BPC output, a monitor with a 1000000000% gamut would only be capable of displaying one single shade of red, green or blue. Everything at 0 would be black, and everything from 1..255 would be eye-scorchingly red, green or blue. (Actually, the shades would technically differ, but the human eye would not be able to discern them.)

    Your options with wide gamut are as follows:

    1) Display utterly inaccurate colours

    2) Emulate sRGB and throw out colour information, lowering the dynamic range and picture quality

    That's it. Nothing else. Wide gamut, as it stands right now, DESTROYS the displayed image.

    If you like wide gamut, that's fine - there are people who like miss Justine Bieber, too, but that doesn't make her good.
  • vlado08 - Tuesday, September 28, 2010 - link

    I don't understand sRGB emulation.
    But presumably on the input of the monitor you have 8 bits per color, and through processing they change it to 10 bits to drive the panel? This way you may not lose dynamic range. Well, the color information will be less than 10 bits per color, but you don't have that color information in the input to begin with. Tell me if I'm wrong.
  • Pinkynator - Wednesday, September 29, 2010 - link

    Example:

    Pure red (255,0,0) on a wide gamut monitor is more intense than pure red on a normal gamut monitor (which content is created for, thus ends up looking incorrect on WG).

    That means (255,0,0) should actually be internally transformed by the monitor to something like (220,0,0) if you want the displayed colour to match that of the normal monitor and show the picture accurately. It also means that when the graphics card gives the monitor (240,0,0), the monitor would need to transform it to (210,0,0) for proper display - as you can see, it has condensed 15 steps of red (240-255) into only 10 (210-220).

    To put it differently, if you display a gradient on a wide gamut monitor performing sRGB emulation, you get banding, or the monitor cheats and does dithering, which introduces visible artifacts (a quick numeric sketch of this appears after the comment thread below).

    Higher-bit processing is basically used only because the gamut does not stretch linearly. A medium grey (128,128,128) would technically be measured as something like (131, 130, 129) on the WG monitor, so there are all kinds of fancy transformations going on to keep such things from being visibly apparent.

    Like I said, if we ever get more bits in the entire display path, this whole point becomes moot, but for now it isn't.
  • andy o - Tuesday, September 28, 2010 - link

    If you have your monitor properly calibrated, it's not a problem. You don't have to "spread" sRGB's "73%" (of what? I assume you mean Adobe RGB). You create your own content in its own color gamut. A wider-gamut monitor can ensure that its colors overlap those of other devices like printers, so proofing becomes more accurate.

    Wide gamut monitors are great for fairly specialized calibrated systems, but I agree they're not for movie watching or game use.
  • teng029 - Tuesday, September 28, 2010 - link

    Still not compliant, I'm assuming...
  • theangryintern - Tuesday, September 28, 2010 - link

    Grrrrrr for it being a glossy panel. I *WAS* thinking about getting this monitor, but since I sit at my desk with my back to a large window, glossy doesn't cut it. That and the fact that I HATE glossy monitors, period.
  • lukeevanssi - Wednesday, September 29, 2010 - link

    I haven't used it myself, but a close friend did and said it works great - he has two monitors hooked up to his 24" iMac. I have, however, ordered stuff from OWC before (I get all my Apple RAM there since it's a lot cheaper than the Apple Store and it's all Apple-rated RAM) and they are awesome.
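To put rough numbers on the shade-condensing effect described in the thread above, here is a quick sketch assuming a hypothetical uniform 220/255 red-channel correction (real monitor LUTs are nonlinear): scaling 8-bit values and rounding collapses pairs of input shades onto single output shades, which is exactly where gradient banding comes from.

```python
from collections import Counter

# Hypothetical sRGB-emulation sketch: scale the 8-bit red channel so that
# an input of 255 maps to ~220 out, then round back to integers.
SCALE = 220 / 255

outputs = [round(v * SCALE) for v in range(256)]

print("distinct output shades:", len(set(outputs)))        # 221 of 256
collisions = sum(1 for n in Counter(outputs).values() if n > 1)
print("output shades shared by two inputs:", collisions)   # 35 collapsed pairs
```

With only 8 bits end to end, 256 input shades squeeze into 221 outputs; a 10-bit panel pipeline gives the transform enough headroom to avoid most of that loss, which is the point both commenters circle around.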
