Final Words

I think the more compact 27-inch form factor is the right package for resolutions beyond 1080p. Thirty-inch monitors are great if you need more than 1920 x 1200 on a single panel, but they’re bulky and don’t have a particularly great pixel density. The 27-inch 16:9 panel in the new LED Cinema Display is a nice alternative.

The styling is impeccable; however, Apple made two sacrifices in order to design such a pretty display. The first sacrifice is the glass-covered panel. It looks great, but glare can be a problem. Apple has generally avoided the problems associated with glare by outfitting its glass displays with ridiculously bright backlights/panels; the 27-inch LED Cinema Display is no exception. Glare is actually even less of a problem indoors since it’s easier to control light, and the bright display is more than enough to compensate. The real issue with glare has to do with watching dark scenes in movies. You’ll see your reflection in dark scenes, or even in dark objects like a suit jacket in an otherwise well-lit scene. It’s very bothersome at first, but you can get used to it if you absolutely must. While I don’t mind Apple’s glossy MacBook Pro screens, I’m less sold on their use for a desktop. Perhaps this is because I don’t watch a lot of TV/movies on my notebook and more on my desktop.

The second sacrifice is the lack of a height-adjustable stand. You can tilt the Cinema Display but you can’t move it up or down. Apple even has the gall to suggest simply adjusting the height of your workspace if your monitor is too high/low. This wasn’t a problem for me because I actually bought a height-adjustable desk a while ago (a properly adjusted desk helps fend off carpal tunnel in a major way), but I recognize that the vast majority of desks out there don’t let you change their height. Whether or not the lack of height adjustment will bother you really depends on your choice of desk.

The integrated speakers are a nice touch. They’re good enough to get the job done if you’re space constrained, and a significant step above what you get in a notebook. Compared to a good set of desk speakers, however, they obviously fall short.

Cable management is beautifully handled. The single cable carrying MagSafe power, USB/audio and video keeps desk clutter to a minimum. Being able to charge your MacBook/MacBook Pro/MacBook Air is awesomely convenient. This is the sort of proprietary design Apple has employed for decades; the difference is that now Apple has the market share for it to actually be useful. The cable length is a bit limiting to how you can set up your desk, so keep that in mind before getting too excited.

As a monitor the 27-inch LED Cinema Display is very bright. Black levels are average for a high end panel, and as a result we noted middle-of-the-road contrast on the display. Color reproduction out of the box isn’t that great, but once calibrated the display performs well.

Color gamut is the bigger issue thanks to the LED backlight. You get a power efficient display, but you also lose a chunk of the AdobeRGB 1998 color gamut. RGB LEDs would solve this problem but they are costly (and power hungry) to implement. Apple wanted a thin display (ruling out CCFL) and presumably wanted to stay below $1000, which ruled out RGB LEDs for the backlight.

If you’re used to notebook displays, the 27-inch LED Cinema Display will still be a step above. But if you’re moving from a high end desktop panel you may actually take a step back in color quality. Coming from using mostly CCFL lit panels, I found the whites to be too harsh on the 27. Color and brightness uniformity are both very good.

Overall the new 27-inch LED Cinema Display isn’t the knockout I had hoped it would be. You get 90% of the resolution of a 30-inch display, in a more compact package. The ability to charge your notebook (if you’re a modern Apple user) is a nice convenience as well. And at $999 it’s actually more affordable than most 30-inch LCDs. With a 120Hz panel and RGB LED backlighting it could have been both forward looking and near perfect, instead what we have is a display that’s good, but not great.


93 Comments


  • burgerace - Tuesday, September 28, 2010 - link

    Wide color gamut is, for most non-professional users, a horrible drawback. Operating systems, web browsers and sites, images from my SLR camera, games, movies -- content is created for traditional color gamut!

    At the recommendation of tech sites like this one, I bought two WCG Dell monitors, a 2408 and a 2410. They exhibit garish red push, and distorted colors in general. ATI drivers can use EDID to adjust the color temperature, reducing red push to a manageable level. But when I "upgraded" to an NVIDIA 460, I lost that option.

    Anand, do you actually look at the monitors with your eyes? Can you see how bad WCG looks? Forget the tables full of misleading numbers from professional image editing software, please.
    Reply
  • 7Enigma - Tuesday, September 28, 2010 - link

    I think your problem is that most people spending this chunk of change on an LCD also have them properly calibrated. As mentioned in this exact review the uncalibrated picture was quite bad. This LCD might have even been cherry-picked for the review unit (don't know if this was sent by Apple for review or Anand purchased it for personal use). So WYSIWYG doesn't apply when calibration is performed. Reply
  • burgerace - Tuesday, September 28, 2010 - link

    WCG monitors are NOT capable of displaying a greater number of colors than a traditional monitor. They display the same 24 bit color, but it's spread over a greater range of wavelengths.

    ALL mainstream content is designed to use only the 73% gamut. There is no way to "calibrate" a monitor to make mainstream content look good. Either the monitor displays the content within the correct, limited gamut -- thereby using less than 24bit color to render the image and throwing out visual information -- or it spreads it out over the wide gamut, causing inaccurate colors.
    Reply
  • Pinkynator - Tuesday, September 28, 2010 - link

    Finally someone who knows what they're talking about!

    I've finally registered here to say the exact same thing as you, but instead I'll give you my full support.

    People just don't seem to understand that wide gamut is probably the second worst thing that happened to computer displays, right after TN monitors. It's bad - it's seriously bad.

    Things might change a very long time from now, in a distant future, *IF* we get graphics cards with more bits per channel and monitors capable of understanding that (along with proper software support), but right now it's just something that is being pushed by marketing. Even tech review sites like Anandtech managed to fall for that crap, misleading monitor buyers into thinking that bigger gamut equals a better picture. In fact, it's exactly the opposite.

    To go into a serious theoretical hyperbole for those who do not understand the implications of a stretched wide gamut with 8BPC output, a monitor with a 1000000000% gamut would only be capable of displaying one single shade of red, green or blue. Everything at 0 would be black, and everything from 1..255 would be eye-scorchingly red, green or blue. (Actually, the shades would technically differ, but the human eye would not be able to discern them.)

    Your options with wide gamut are as follows:

    1) Display utterly inaccurate colours

    2) Emulate sRGB and throw out colour information, lowering the dynamic range and picture quality

    That's it. Nothing else. Wide gamut, as it stands right now, DESTROYS the displayed image.

    If you like wide gamut, that's fine - there are people who like miss Justine Bieber, too, but that doesn't make her good.
    Reply
  • vlado08 - Tuesday, September 28, 2010 - link

    I don't understand sRGB emulation.
    But probably on the input of the monitor you have 8 bits per color, and through processing they change it to 10 bits to drive the panel? This way you may not lose dynamic range. Well, the color information will be less than 10 bits per color, but you don't have this color in the input to begin with. Tell me if I'm wrong.
    Reply
  • Pinkynator - Wednesday, September 29, 2010 - link

    Example:

    Pure red (255,0,0) on a wide gamut monitor is more intense than pure red on a normal gamut monitor (which content is created for, thus ends up looking incorrect on WG).

    That means (255,0,0) should actually be internally transformed by the monitor to something like (220,0,0) if you want the displayed colour to match that of the normal monitor and show the picture accurately. It also means that when the graphics card gives the monitor (240,0,0), the monitor would need to transform it to (210,0,0) for proper display - as you can see, it has condensed 16 shades of red (240-255) into only 11 (210-220).

    To put it differently, if you display a gradient on a wide gamut monitor performing sRGB emulation, you get banding, or the monitor cheats and does dithering, which introduces visible artifacts.

    Higher-bit processing is basically used only because the gamut does not stretch linearly. A medium grey (128,128,128) would technically be measured as something like (131, 130, 129) on the WG monitor, so there's all kinds of fancy transformations going on in order to not make such things apparently visible.

    Like I said, if we ever get more bits in the entire display path, this whole point becomes moot, but for now it isn't.
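    The shade-condensation effect described above can be sketched in a few lines of Python. This is a toy model, not what any real monitor does: it assumes a simple linear remap with an invented anchor value of 220 (borrowed from the hypothetical example above) standing in for the real, non-linear per-channel transform, and it ignores gamma entirely.

    ```python
    # Toy sketch of sRGB emulation on a wide-gamut panel. The panel's pure
    # red is more saturated than sRGB red, so sRGB (255,0,0) must be mapped
    # down to a lower panel value (220 here, a hypothetical anchor). With
    # only 8 bits per channel, distinct input shades collapse into fewer
    # distinct panel shades, which is where gradient banding comes from.
    SRGB_RED_ON_PANEL = 220  # hypothetical panel value matching sRGB's max red

    def emulate_srgb_red(r: int) -> int:
        """Linearly remap an 8-bit sRGB red value onto the panel's scale."""
        return round(r * SRGB_RED_ON_PANEL / 255)

    # All 256 input shades survive as only 221 distinct panel shades,
    # so some neighboring input shades become indistinguishable.
    distinct = {emulate_srgb_red(r) for r in range(256)}
    print(f"256 input shades -> {len(distinct)} distinct panel shades")
    ```

    Losing those in-between code values is exactly the lowered dynamic range mentioned earlier; a 10-bit internal pipeline just gives the remap more target values to land on, hiding the collapse.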
    Reply
  • andy o - Tuesday, September 28, 2010 - link

    If you have your monitor properly calibrated, it's not a problem. You don't have to "spread" sRGB's "73%" (of what? I assume you mean Adobe RGB). You create your own content in its own color gamut. A wider gamut monitor can ensure that the colors in it overlap other devices like printers, thus proofing becomes more accurate.

    Wide-gamut monitors are great for fairly specialized calibrated systems, but I agree they're not for movie watching or game use.
    Reply
  • teng029 - Tuesday, September 28, 2010 - link

    still not compliant, i'm assuming.. Reply
  • theangryintern - Tuesday, September 28, 2010 - link

    Grrrrrr for it being a glossy panel. I *WAS* thinking about getting this monitor, but since I sit at my desk with my back to a large window, glossy doesn't cut it. That and the fact that I HATE glossy monitors, period. Reply
  • lukeevanssi - Wednesday, September 29, 2010 - link

    I haven't used it myself, but a close friend did and said it works great - he has two monitors hooked up to his 24" iMac. I have, however, ordered stuff from OWC before (I get all my Apple RAM there since it's a lot cheaper than the Apple Store and it's all Apple-rated RAM) and they are awesome.
    Reply
