The Display

When it was first announced, I shrugged off the 21.5-inch iMac. At the time I was using a 27-inch Thunderbolt Display and couldn’t see myself using anything smaller or lower resolution. But with the new 27-inch iMac looking a lot like last year’s model with evolutionary upgrades to its internals, I was drawn to the new 21.5-inch system because of its Intel Iris Pro 5200 graphics. As a result, I ended up with the first sub-3MP desktop display I’d used since the release of the first 30-inch 2560 x 1600 panels years ago.

Given how much time I spend on notebook displays these days, now was as good a time as any to go back to a 1080p desktop display. While I’d prefer something with an insanely high resolution, it’s still too early for a 21.5-inch 4K panel (or a 27-inch 5K panel), which is likely the route Apple would take to bring Retina displays to its desktops.
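
For a rough sense of the pixel math here, the sketch below compares the panel in this iMac to the other sizes mentioned above. The 4K and 5K figures assume straight 2x panels (3840 x 2160 and 5120 x 2880); that's an assumption on my part, not anything Apple has announced.

    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixel density: diagonal pixel count divided by diagonal size in inches.
        return math.hypot(width_px, height_px) / diagonal_in

    panels = {
        "21.5-inch 1920 x 1080 (this iMac)": (1920, 1080, 21.5),
        "27-inch 2560 x 1440 iMac":          (2560, 1440, 27.0),
        "21.5-inch 4K (assumed 2x panel)":   (3840, 2160, 21.5),
        "27-inch 5K (assumed 2x panel)":     (5120, 2880, 27.0),
    }

    for name, (w, h, d) in panels.items():
        print(f"{name}: {w * h / 1e6:.1f} MP, {ppi(w, h, d):.0f} PPI")

The 1080p panel works out to roughly 2.1MP at around 102 PPI, versus 3.7MP for the 27-inch model, which is where the sub-3MP comment above comes from; the assumed 2x panels would land at roughly 205 and 218 PPI respectively.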

There are two reasons to opt for the 21.5-inch iMac over the larger model: cost and size. At a bare minimum you’re looking at a $500 price difference between the 21.5 and 27-inch iMacs, which is pretty substantial to begin with. The size argument is just as easy to understand. The 27-inch iMac occupies a considerable amount of space on my desk, and I’ve come to realize that not everyone likes to be surrounded by that much display. Either way, there’s clearly a market for a computer of this size, with this sort of resolution. So how does the display fare?

In short: it’s nearly perfect.

Brian and I were comparing notes on the reviews we’re both working on at the moment. He sent me some CIE diagrams showing color accuracy for the displays he’s testing, and I responded with this:


21.5-inch iMac (Late 2013) Saturations

The boxes show what’s expected; the circles inside them show what the display actually delivers. The 21.5-inch iMac is spot on out of the box, without any calibration required. Brian’s response:

WOW
is that out of the box?

The iMac’s display does extremely well in all of our tests, always turning in a delta E of less than 2. It’s just incredible. I'm borrowing the graphs below from our tablet bench data, but I've tossed in the 2013 MacBook Air as a reference point.

CalMAN Display Performance - White Point Average

CalMAN Display Performance - Saturations Average dE 2000

CalMAN Display Performance - Gretag Macbeth Average dE 2000

CalMAN Display Performance - Grayscale Average dE 2000

CalMAN Display Performance - Gamut Average dE 2000
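
To put those delta E numbers in context: CalMAN reports dE2000, which weights lightness, chroma, and hue differences in a fairly involved way, but the underlying idea is simply measuring how far each measured color lands from its target in Lab space. Below is a minimal sketch using the much older CIE76 formula (plain Euclidean distance in Lab); the patch values are purely illustrative, not measurements from this review unit.

    import math

    def delta_e_76(lab_ref, lab_meas):
        # CIE76 delta E: Euclidean distance between two Lab colors.
        # CalMAN's dE2000 metric is more elaborate, but the intent is the same.
        return math.sqrt(sum((r - m) ** 2 for r, m in zip(lab_ref, lab_meas)))

    # Hypothetical target/measured Lab pairs for a few saturation patches.
    # The targets are the usual sRGB primaries in Lab; the "measured" values
    # are made up for illustration only.
    patches = [
        ((53.2,  80.1,   67.2), (53.0,  79.0,   66.5)),   # red
        ((87.7, -86.2,   83.2), (87.5, -85.0,   82.0)),   # green
        ((32.3,  79.2, -107.9), (32.0,  78.0, -106.5)),   # blue
    ]

    errors = [delta_e_76(ref, meas) for ref, meas in patches]
    print("per-patch dE:", [round(e, 2) for e in errors])
    print("average dE:", round(sum(errors) / len(errors), 2))

An average dE of roughly 2 or below is generally considered small enough that the error is hard to spot without a side-by-side reference, which is why the numbers in the charts above are so impressive for an uncalibrated panel.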

Although I doubt Apple’s intended audience for the entry-level 21.5-inch iMac is imaging professionals, they could very well use the system and be perfectly happy with it. Literally all that’s missing is a 2x resolution model, but my guess is it’ll be another year before we see that.

I have to point out that Apple does source its display panels from multiple providers (typically 2 or 3), not to mention panel variance within a lot. I don’t anticipate finding many panels better than the one in my review sample, but it’s always possible that there will be worse examples in the market. I haven’t seen huge variance in color accuracy from Apple panels, so I think it’s a pretty safe bet that what you’re going to get with any new iMac is going to be awesome.

127 Comments

  • rootheday3 - Monday, October 7, 2013 - link

    I don't think this is true. See the die shots here:
    http://wccftech.com/haswell-die-configurations-int...

    I count 8 different die configurations.

    Note that the reduction in LLC (CPU L3) on Iris Pro may be because some of the LLC is used to hold tag data for the 128MB of eDRAM. Mainstream Intel CPUs have 2MB of LLC per CPU core, so the die has 8MB of LLC natively. The i7-4770R has all 8MB enabled, but 2MB is used for eDRAM tag RAM, leaving 6MB for the CPU/GPU to use directly as cache (which is how it's reported on the spec sheet). The i5s generally have 6MB natively (for die recovery and/or segmentation reasons), but if 2MB is used for eDRAM tag RAM, that leaves 4MB for direct cache usage.

    Given that you get 128MB of eDRAM in exchange for the 2MB LLC consumed as tag ram, seems like a fair trade.
  • name99 - Monday, October 7, 2013 - link

    HT adds a pretty consistent 25% performance boost across an extremely wide variety of benchmarks. 50% is an unrealistic value.

    And, for the love of god, please stop with this faux-naive "I do not understand why Intel does ..." crap.
    If you do understand the reason, you are wasting everyone's time with your lament.
    If you don't understand the reason, go read a fscking book. Price discrimination (and the consequences thereof INCLUDING lower prices at the low end) are hardly deep secret mysteries.

    (And the same holds for the "Why oh why do Apple charge so much for RAM upgrades or flash upgrades" crowd. You're welcome to say that you do not believe the extra cost is worth the extra value to YOU --- but don't pretend there's some deep unresolved mystery here that only you have the wit to notice and bring to our attention; AND don't pretend that your particular cost/benefit tradeoff represents the entire world.

    And heck, let's be equal opportunity here --- the Windows crowd have their own version of this particular fool, telling us how unfair it is that Windows Super Premium Plus Live Home edition is priced at $30 more than Windows Ultra Extra Pro Family edition.

    I imagine there are the equivalent versions of these people complaining about how unfair Amazon S3 pricing is, or the cost of extra Google storage. Always with this same "I do not understand why these companies behave exactly like economic theory predicts; and they try to make a profit in the bargain" idiocy.)
  • tipoo - Monday, October 7, 2013 - link

    Wow, the gaming performance gap between OSX and Windows hasn't narrowed at all. I had hoped, two major OS releases after the Snow Leopard article, it would have gotten better.
  • tipoo - Monday, October 7, 2013 - link

    I wonder if AMD will support OSX with Mantle?
  • Flunk - Monday, October 7, 2013 - link

    Likely not, I don't think they're shipping GCN chips in any Apple products right now.
  • AlValentyn - Monday, October 7, 2013 - link

    Look up Mavericks; it supports OpenGL 4.1, while Mountain Lion is still at 3.2.

    http://t.co/rzARF6vIbm

    Good overall improvements in the Developer Previews alone.
  • tipoo - Monday, October 7, 2013 - link

    ML supports a higher OpenGL spec than Snow Leopard, but that doesn't seem to have helped lessen the real world performance gap.
  • Sm0kes - Tuesday, October 8, 2013 - link

    Got a link with real numbers?
  • Hrel - Monday, October 7, 2013 - link

    The charts show the Iris Pro take a pretty hefty hit any time you increase quality settings. HOWEVER, you're also increasing resolution. I'd be interested to see what happens when you increase resolution but leave detail settings at low-med.

    In other words, is the bottleneck the processing power of the GPU (I think it is) or the memory bandwidth? I suspect we could run Mass Effect or something similar at 1080p with medium settings.
  • Kevin G - Monday, October 7, 2013 - link

    "OS X doesn’t seem to acknowledge Crystalwell’s presence, but it’s definitely there and operational (you can tell by looking at the GPU performance results)."

    I bet OS X does but not in the GUI. Type the following in terminal:

    sysctl -a hw.

    There should be a line about the CPU's full cache hierarchy among other cache information.
