The Experience

I’ve been using a 30-inch display for nearly as long as they’ve been available in the consumer market, and I went the multi-monitor route before then. I typically have enough active windows to fill a single 2560-pixel-wide desktop quite well, but with two smaller panels my desktop usage just isn’t as efficient: one monitor ends up largely unused except for a couple of applications, while the other feels far too cramped. Balancing between the two never worked well for me, so a single high-resolution display made the most sense.

The problem with the 30-inch is that it’s just huge. The resolution is awesome, but the size is more of a pain while gaming, particularly in first-person shooters; I end up sitting too close, and the display is almost too big.

Moving to the 27-inch panel I noticed several things. The display is much more compact; it doesn’t feel too small, and it doesn’t feel too big. Dare I say it’s just right. The change in aspect ratio is strange but not a deal breaker. Admittedly I wasn’t doing too much with the extra 160 lines of resolution I had on the 30” display.

The display feels a bit sharper than my old 30-inch. The pixel density has gone up about 8%, from ~101 PPI to ~109 PPI, so if you felt text was too small on a 30-inch panel, things aren’t going to get any better here. As a side effect of the display being physically smaller, I can actually sit closer to it than I could to the 30-incher without feeling like I’m being totally overwhelmed by the panel.
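The pixel density figures above follow directly from each panel’s resolution and diagonal. A quick sketch, assuming the standard 2560×1600 resolution for the 30-inch and 2560×1440 for the 27-inch:

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

old = ppi(2560, 1600, 30)  # 30-inch Cinema HD Display
new = ppi(2560, 1440, 27)  # 27-inch LED Cinema Display

print(f"30-inch: {old:.0f} PPI")           # ~101 PPI
print(f"27-inch: {new:.0f} PPI")           # ~109 PPI
print(f"increase: {(new / old - 1):.0%}")  # ~8%
```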

Viewing angles are great. The IPS panel works its magic as well as you’d expect.

The backlight honestly takes the most getting used to. My 30-inch display is the original Apple Cinema HD Display from 2004, which used a CCFL backlight. The white-LED-backlit 27-inch panel just seems too cool, even when properly calibrated. The whites are very bright, but they feel a bit too harsh to me; if I dim the display, the rest of the colors get too dim as a result, and I can’t seem to find a happy medium. I hear the situation is near perfect with RGB LEDs, but Apple and most other manufacturers still use WLEDs for their backlights. You’ll see the impact this has on color gamut later on in the review. I spend most of my time in front of a CCFL-backlit screen, but if you’re used to something LED backlit it’ll be less of a shock.

I feel like there are two significant features missing that ultimately prevent this from being a truly great monitor: an RGB LED backlight and 120Hz support. The former makes the shift from a CCFL-backlit LCD more of a tradeoff. The latter is honestly just wishful thinking at this point, but after Brian’s experience with the ASUS 120Hz panel I want it.

The rest of the package works relatively well. I’m happy with the webcam image quality, and the integrated mic seems to work well. I love the convenience of the integrated MagSafe power connector, and Mini DisplayPort is a nicely compact interface; unfortunately, these two features only matter if you happen to have a notebook that can take advantage of them.

The New Cinema Display Color Quality
93 Comments

  • burgerace - Tuesday, September 28, 2010 - link

    Wide color gamut is, for most non-professional users, a horrible drawback. Operating systems, web browsers and sites, images from my SLR camera, games, movies -- content is created for traditional color gamut!

    At the recommendation of tech sites like this one, I bought two WCG Dell monitors, a 2408 and a 2410. They exhibit garish red push and distorted colors in general. ATI drivers can use EDID to adjust the color temperature, reducing the red push to a manageable level, but when I "upgraded" to an NVIDIA 460, I lost that option.

    Anand, do you actually look at the monitors with your eyes? Can you see how bad WCG looks? Forget the tables full of misleading numbers from professional image editing software, please.
  • 7Enigma - Tuesday, September 28, 2010 - link

    I think your problem is that most people spending this chunk of change on an LCD also have it properly calibrated. As mentioned in this exact review, the uncalibrated picture was quite bad. This LCD might even have been cherry-picked for the review (I don't know whether it was sent by Apple or purchased by Anand for personal use). So WYSIWYG doesn't apply when calibration is performed.
  • burgerace - Tuesday, September 28, 2010 - link

    WCG monitors are NOT capable of displaying a greater number of colors than a traditional monitor. They display the same 24-bit color, but it's spread over a greater range of wavelengths.

    ALL mainstream content is designed to use only the 73% gamut. There is no way to "calibrate" a monitor to make mainstream content look good. Either the monitor displays the content within the correct, limited gamut -- thereby using less than 24-bit color to render the image and throwing out visual information -- or it spreads it out over the wide gamut, causing inaccurate colors.
  • Pinkynator - Tuesday, September 28, 2010 - link

    Finally someone who knows what they're talking about!

    I've finally registered here to say the exact same thing as you, but instead I'll give you my full support.

    People just don't seem to understand that wide gamut is probably the second worst thing that happened to computer displays, right after TN monitors. It's bad - it's seriously bad.

    Things might change a very long time from now, in a distant future, *IF* we get graphics cards with more bits per channel and monitors capable of understanding that (along with proper software support), but right now it's just something that is being pushed by marketing. Even tech review sites like Anandtech managed to fall for that crap, misleading monitor buyers into thinking that bigger gamut equals a better picture. In fact, it's exactly the opposite.

    To go into a serious theoretical hyperbole for those who do not understand the implications of a stretched wide gamut with 8BPC output, a monitor with a 1000000000% gamut would only be capable of displaying one single shade of red, green or blue. Everything at 0 would be black, and everything from 1..255 would be eye-scorchingly red, green or blue. (Actually, the shades would technically differ, but the human eye would not be able to discern them.)

    Your options with wide gamut are as follows:

    1) Display utterly inaccurate colours

    2) Emulate sRGB and throw out colour information, lowering the dynamic range and picture quality

    That's it. Nothing else. Wide gamut, as it stands right now, DESTROYS the displayed image.

    If you like wide gamut, that's fine - there are people who like miss Justine Bieber, too, but that doesn't make her good.
  • vlado08 - Tuesday, September 28, 2010 - link

    I don't understand sRGB emulation.
    But probably on the input of the monitor you have 8 bits per color and through processing they cange it to 10 bits to drive the panel? This way you may not lose dynamic range. Well the color information will be less than 10 bits per color but you dont have this color in the input to begin with. Tell me if I'm wrong.
  • Pinkynator - Wednesday, September 29, 2010 - link

    Example:

    Pure red (255,0,0) on a wide gamut monitor is more intense than pure red on a normal gamut monitor (which is what content is created for, so it ends up looking incorrect on WG).

    That means (255,0,0) should actually be internally transformed by the monitor to something like (220,0,0) if you want the displayed colour to match that of the normal monitor and show the picture accurately. It also means that when the graphics card sends the monitor (240,0,0), the monitor would need to transform it to roughly (210,0,0) for proper display - as you can see, it has condensed 16 shades of red (240-255) into only 11 (210-220).

    To put it differently, if you display a gradient on a wide gamut monitor performing sRGB emulation, you get banding, or the monitor cheats and does dithering, which introduces visible artifacts.

    Higher-bit processing is basically used only because the gamut does not stretch linearly. A medium grey (128,128,128) would technically be measured as something like (131, 130, 129) on the WG monitor, so there's all kinds of fancy transformations going on in order to not make such things apparently visible.

    Like I said, if we ever get more bits in the entire display path, this whole point becomes moot, but for now it isn't.
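The shade-condensing effect described above is easy to demonstrate numerically. A minimal sketch, assuming a simple linear sRGB emulation that maps the 8-bit input range 0-255 onto a reduced drive range topping out at 220 (the peak value from the example; real monitors use non-linear LUTs, so the exact mapping differs):

```python
def emulate_srgb(v: int, peak: int = 220) -> int:
    """Linearly compress an 8-bit input level into a reduced drive range."""
    return round(v * peak / 255)

# Map every possible 8-bit input level through the emulation.
outputs = [emulate_srgb(v) for v in range(256)]

# 256 distinct inputs collapse into only 221 distinct outputs...
print(len(set(outputs)))  # 221

# ...so 35 pairs of neighbouring input levels become indistinguishable,
# which shows up as banding in smooth gradients.
collisions = sum(1 for v in range(255) if outputs[v] == outputs[v + 1])
print(collisions)  # 35
```

Driving the panel with more bits internally, as the question above suggests, restores those missing in-between levels, which is exactly why higher-bit processing matters for sRGB emulation.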
  • andy o - Tuesday, September 28, 2010 - link

    If you have your monitor properly calibrated, it's not a problem. You don't have to "spread" sRGB's "73%" (73% of what? I assume you mean Adobe RGB coverage). You create your own content in its own color gamut. A wider-gamut monitor can ensure that its colors overlap those of other devices like printers, so proofing becomes more accurate.

    Wide gamut monitors are great for fairly specialized calibrated systems, but I agree they're not for movie watching or game use.
  • teng029 - Tuesday, September 28, 2010 - link

    Still not compliant, I'm assuming.
  • theangryintern - Tuesday, September 28, 2010 - link

    Grrrrrr for it being a glossy panel. I *WAS* thinking about getting this monitor, but since I sit at my desk with my back to a large window, glossy doesn't cut it. That, and the fact that I HATE glossy monitors, period.
  • lukeevanssi - Wednesday, September 29, 2010 - link

    I haven't used it myself, but a close friend did and said it works great - he has two monitors hooked up to his 24" iMac. I have, however, ordered stuff from OWC before (I get all my Apple RAM there since it's a lot cheaper than the Apple Store and it's all Apple-rated RAM) and they are awesome.
