I have a confession to make. For the past year I’ve been using a 27-inch iMac as my primary workstation. I always said that if I had a less mobile lifestyle, the iMac would probably be the machine I’d end up with (that was prior to the announcement of the new Mac Pro, of course). This past year has been my most insane in terms of travel, so it wasn’t a lack of mobility that kept me on the iMac but rather a desire to test Apple’s new Fusion Drive over the long haul.

It’s entirely possible to mask the overwhelmingly bad experience of a hard drive in a high-performance machine by sampling only at the beginning of the journey. When the OS is a clean install, the drive is mostly empty and thus operating at its peak performance. Obviously Apple’s Fusion Drive is designed to mitigate the inevitable performance degradation, and my initial take on it after about a month of use was very good - but would it last?

I’m happy to report that it actually did. So today’s confession is really a two-parter: I’ve been using an iMac for the past year, and I’ve been using a hard drive as a part of my primary storage for the past year. Yeesh, I never thought I’d do either of those things.

Apple 2013 iMac

| Configuration | 21.5-inch iMac | 21.5-inch Upgraded iMac | 27-inch iMac | 27-inch Upgraded iMac |
| --- | --- | --- | --- | --- |
| Display | 21.5-inch 1920 x 1080 | 21.5-inch 1920 x 1080 | 27-inch 2560 x 1440 | 27-inch 2560 x 1440 |
| CPU (Base/Turbo) | Intel Core i5-4570R (2.7GHz/3.2GHz) | Intel Core i5-4570S (2.9GHz/3.6GHz) | Intel Core i5-4570 (3.2GHz/3.6GHz) | Intel Core i5-4670 (3.4GHz/3.8GHz) |
| GPU | Intel Iris Pro 5200 | NVIDIA GeForce GT 750M (1GB GDDR5) | NVIDIA GeForce GT 755M (1GB GDDR5) | NVIDIA GeForce GTX 775M (2GB GDDR5) |
| RAM | 8GB DDR3-1600 | 8GB DDR3-1600 | 8GB DDR3-1600 | 8GB DDR3-1600 |
| Storage | 1TB 5400RPM | 1TB 5400RPM | 1TB 7200RPM | 1TB 7200RPM |
| WiFi | 802.11ac | 802.11ac | 802.11ac | 802.11ac |
| I/O | 4 x USB 3.0, 2 x Thunderbolt, 1 x GigE, SDXC reader, headphone jack | (same) | (same) | (same) |
| Starting Price | $1299 | $1499 | $1799 | $1999 |

This year the iMacs get incrementally better. Displays and resolutions stay the same, but the silicon options are a bit quicker, 802.11ac comes standard, and the SSDs all move to PCIe (including Fusion Drive). As tempted as I was to begin my first look at the 2013 iMac by evaluating the impact of the faster storage, it was the entry-level model that grabbed my attention first, thanks to a little piece of silicon we’ve come to know as Crystalwell.

The CPU: Haswell with an Optional Crystalwell

The entry-level 21.5-inch iMac is one of the most affordable options in Apple’s lineup. At $1299, Apple will typically sell you a dual-core notebook of some sort, but here you get no less than a quad-core, 65W Core i5-4570R. That’s four cores running at 2.7GHz, capable of hitting up to 3.2GHz. In practice I pretty much always saw the cores running at 3.0GHz regardless of workload, with some excursions up to 3.1GHz, so for the most part you’re effectively buying a 3GHz Haswell system.

The R at the end of the SKU denotes something very special. Not only do you get Intel’s fastest GPU configuration (40 EUs running at up to 1.15GHz), but you also get 128MB of on-package eDRAM. The combination of the two gives you a new brand: Intel’s Iris Pro 5200.

The Iris Pro 5200 is a GPU configuration option I expect to see on the 15-inch MacBook Pro with Retina Display, and its presence on the iMac tells us how it’ll be done. In last year’s iMacs, Apple picked from a selection of NVIDIA discrete GPUs. This year, the entry-level 21.5-inch model gets Iris Pro 5200 while the rest feature updated NVIDIA Kepler discrete GPUs; I expect the same bifurcation on the 15-inch rMBP. As we found in our preview of Intel’s Iris Pro 5200, even in its fastest implementation the GPU isn’t enough to outperform NVIDIA’s GeForce GT 650M (found in the 2012 15-inch rMBP). Apple’s engineers aren’t particularly fond of performance regressions, so the NVIDIA GPUs stick around for those who need them, and for the first time we get a truly decent integrated option from Intel.

Most PC OEMs appear to have gone the opposite route, choosing NVIDIA’s low-end discrete graphics over Intel’s Iris Pro. The two end up being fairly similar in cost (with Intel seemingly getting a slight edge). With NVIDIA you get better performance, while Intel should deliver somewhat lower power consumption and an obvious reduction in board area. I suspect Iris Pro came in a bit slower than even Apple expected, but given that Apple asked Intel to build the thing, it probably felt compelled to use it somewhere. Plus there’s the matter of believing in the strategy itself. If Apple could shift most of its designs exclusively to processor graphics, it would realize board and power savings that would have a real impact on industrial design. We’re just not there yet. Whether we ever get there depends on just how aggressive Intel is on the graphics front.

I already went through what the 128MB of eDRAM (codename Crystalwell) gets you, but in short, that massive on-package memory acts as an L4 cache for both the CPU and GPU. You get 50GB/s of bandwidth in each direction, and an access latency that falls somewhere between an L3 cache hit and a trip out to main memory.
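A quick way to see an extra cache level like this for yourself is a pointer-chasing microbenchmark: chase dependent loads through progressively larger working sets and watch the per-load latency step up at each level of the hierarchy. Below is a minimal sketch of my own (illustrative only, not something from this review); on a Crystalwell part you’d expect an extra latency plateau between the L3 step (past 4MB on this SKU) and the final jump to DRAM (past 128MB).

```c
/* Minimal pointer-chase latency sketch (illustrative, not a rigorous
 * benchmark). Each load depends on the previous one, which defeats the
 * prefetchers, so ns/load approximates memory latency at that working
 * set size. Build: cc -O2 chase.c -o chase */
#include <stdio.h>
#include <stdlib.h>
#include <sys/time.h>

int main(void) {
    /* working sets from 256KB up to 256MB */
    for (size_t ws = (size_t)1 << 18; ws <= (size_t)1 << 28; ws <<= 1) {
        size_t n = ws / sizeof(void *);
        void **buf = malloc(n * sizeof(void *));
        size_t *idx = malloc(n * sizeof(size_t));

        /* Fisher-Yates shuffle -> one random cycle through the buffer */
        for (size_t i = 0; i < n; i++) idx[i] = i;
        for (size_t i = n - 1; i > 0; i--) {
            size_t j = (size_t)rand() % (i + 1);
            size_t t = idx[i]; idx[i] = idx[j]; idx[j] = t;
        }
        for (size_t i = 0; i < n; i++)
            buf[idx[i]] = &buf[idx[(i + 1) % n]];

        /* chase: every load's address comes from the previous load */
        void **p = &buf[idx[0]];
        const size_t iters = (size_t)1 << 24;
        struct timeval t0, t1;
        gettimeofday(&t0, NULL);
        for (size_t i = 0; i < iters; i++) p = (void **)*p;
        gettimeofday(&t1, NULL);

        double ns = ((t1.tv_sec - t0.tv_sec) * 1e6 +
                     (t1.tv_usec - t0.tv_usec)) * 1e3;
        /* printing p keeps the chase from being optimized away */
        printf("%7zu KB: %6.1f ns/load (%p)\n", ws >> 10, ns / iters, (void *)p);
        free(buf);
        free(idx);
    }
    return 0;
}
```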

OS X doesn’t seem to acknowledge Crystalwell’s presence, but it’s definitely there and operational (you can tell by looking at the GPU performance results). Some very specific workloads can benefit handsomely from the large eDRAM. At boot I suspect key parts of the OS end up cached on-package as well, something that’ll have big implications for power usage in mobile. Unfortunately my review sample came with a hard drive, which hampered the user experience, and these new iMacs aren’t easy to break into (not to mention that Apple frowns upon that sort of behavior with its review samples), so swapping in an SSD wasn’t an option. OS X continues to do a good job of keeping things cached in memory, and the iMac’s 8GB default configuration helps tremendously there. Whenever I was working with data and apps in memory, the system felt quite snappy. I’ll get to the benchmarks in a moment.

The non-gaming experience with Iris Pro under OS X seemed fine. I noticed a graphical glitch under Safari in 10.8.5 (tearing while scrolling down a long list of iCloud tabs), but everything else looked good.

iMac (Late 2013) CPU Options

| | 21.5-inch Base | 21.5-inch Upgraded | 21.5-inch Optional | 27-inch Base | 27-inch Upgraded | 27-inch Optional |
| --- | --- | --- | --- | --- | --- | --- |
| Intel CPU | i5-4570R | i5-4570S | i7-4770S | i5-4570 | i5-4670 | i7-4771 |
| Cores / Threads | 4 / 4 | 4 / 4 | 4 / 8 | 4 / 4 | 4 / 4 | 4 / 8 |
| Base Clock | 2.7GHz | 2.9GHz | 3.1GHz | 3.2GHz | 3.4GHz | 3.5GHz |
| Max Turbo | 3.2GHz | 3.6GHz | 3.9GHz | 3.6GHz | 3.8GHz | 3.9GHz |
| L3 Cache | 4MB | 6MB | 8MB | 6MB | 6MB | 8MB |
| TDP | 65W | 65W | 65W | 84W | 84W | 84W |
| VT-x / VT-d | Y / Y | Y / Y | Y / Y | Y / Y | Y / Y | Y / Y |
| TSX-NI | N | Y | Y | Y | Y | Y |

In typical Intel fashion, you get nothing for free. The 128MB of eDRAM comes at the expense of a smaller L3 cache, in this case 4MB shared by all four cores (and the GPU). Note that this tradeoff also exists on the higher-end Core i7 R-series SKU, although 6MB of L3 is somehow less bothersome than 4MB. That’s the least L3 cache per core of any modern Intel Core series processor. The 128MB eDRAM likely more than makes up for the reduction, and I do wonder if this isn’t a sign of things to come from Intel. A shift towards smaller, even lower latency L3 caches might make sense if you’ve got a massive eDRAM array backing it all up.

Comments

  • elian123 - Monday, October 7, 2013 - link

    Anand, could you perhaps indicate when you would expect higher-res iMac displays (as well as PC displays in general, not only all-in-ones)?
  • solipsism - Monday, October 7, 2013 - link

    Before that happens Apple will likely need to get their stand-alone Apple display "high-res". I don't expect it to go 2x like every other one of their displays; instead I suspect it will be 4K, which is exactly 1.5x the current 27" display's resolution. Note that Apple mentioned 4K many times when previewing the Mac Pro.

    Also, the most common size for quality 4K panels appears to be 31.5", so I wouldn't be surprised to see it move to that size. When the iMacs get updated, I think each would then most likely use a slightly larger display panel as well.
  • mavere - Monday, October 7, 2013 - link

    ~75% of the stock desktop wallpapers in OS X 10.9 are 5120x2880.

    It's probably the biggest nudge-nudge-wink-wink Apple has ever given for unannounced products.
  • name99 - Monday, October 7, 2013 - link

    Why does Apple have to go to exactly 4K? We all understand the point, and the value, of going to 2x resolution. The only value in going to exactly 4K is cheaper screens (but cheaper screens mean crappy, lousy-looking screens, so Apple doesn't care).
  • jasonelmore - Monday, October 7, 2013 - link

    4K is a 16:9 ratio; to do 16:10 right, they would have to do 5K.
  • repoman27 - Monday, October 7, 2013 - link

    iMacs have been 16:9 since 2009, and 3840x2400 (4K 16:10) panels have been produced in the past and work just fine.
  • repoman27 - Monday, October 7, 2013 - link

    Apple is pretty locked into the current screen sizes and 16:9 aspect ratio by the industrial design, and I can only imagine they will stick with the status quo for at least one more generation in order to recoup some of their obviously considerable design costs there.

    Since Apple sells at best a couple million iMacs of each form factor in a year’s time, they kinda have to source panels for which there are other interested customers—we’re not even close to iPhone or iPad numbers here. Thus I’d reckon we’ll see whatever panels they intend to use in future generations in the wild before those updates happen. As solipsism points out, the speculation that there will be a new ATD with a 31.5”, 3840x2160 panel released alongside the new Mac Pro makes total sense because other vendors are already shipping similar displays.

    I actually made a chart to illustrate why a Retina iMac was unlikely anytime soon: http://i.imgur.com/CfYO008.png

    I listed the size and resolution of previous LCD iMacs, as well as possible higher resolutions at 21.5" and 27". Configurations that truly qualify as "Retina" are highlighted in green, and it looks as though pixel doubling will be Apple's strategy when they make that move. I also highlighted configurations that require two DP 1.1a links or one DP 1.2 link in yellow, and those that demand four DP 1.1a links or two DP 1.2 links in red, for both standard CVT and CVT with reduced blanking. Considering Apple has yet to ship any display that requires more than a single DP 1.1a link, the fact that all of the Retina options at 27" land in the red is probably reason enough that such a device doesn't exist yet (rough numbers in the sketch after this comment).

    I also included the ASUS PQ321Q 31.5" 3840x2160 display, and the Retina MacBook Pros as points of comparison to illustrate the pricing issues that Retina iMacs would face. While there are affordable GPU options that could drive these displays and still maintain a reasonable degree of UI smoothness, the panels themselves either don't exist or would be prohibitively expensive for an iMac.
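    To put rough numbers on those yellow/red categories, here's a back-of-the-envelope sketch (my own arithmetic, not repoman27's chart). It assumes simplified CVT-RB timing (a fixed 160-pixel horizontal blank and a ~460µs minimum vertical blank) and usable DP link rates of 8.64Gbps (four HBR lanes) and 17.28Gbps (four HBR2 lanes) after 8b/10b coding:

```c
/* Back-of-the-envelope DisplayPort link budget check. Assumptions:
 * CVT-RB adds a fixed 160-pixel horizontal blank and a ~460us minimum
 * vertical blank; effective link rate = raw lane rate x 0.8 (8b/10b). */
#include <stdio.h>

static double cvt_rb_gbps(int w, int h, double hz, int bpp) {
    double htotal = w + 160;                  /* CVT-RB horizontal blank */
    double vtotal = h / (1.0 - 460e-6 * hz);  /* ~460us vertical blank  */
    return htotal * vtotal * hz * bpp / 1e9;  /* uncompressed Gbps      */
}

int main(void) {
    const double dp11a = 4 * 2.7 * 0.8;  /* 4 HBR lanes:  8.64 Gbps usable */
    const double dp12  = 4 * 5.4 * 0.8;  /* 4 HBR2 lanes: 17.28 Gbps usable */
    const struct { const char *name; int w, h; } modes[] = {
        { "2560x1440 (27\" today) ", 2560, 1440 },
        { "3840x2160 (4K UHD)     ", 3840, 2160 },
        { "5120x2880 (27\" Retina)", 5120, 2880 },
    };
    for (int i = 0; i < 3; i++) {
        double need = cvt_rb_gbps(modes[i].w, modes[i].h, 60.0, 24);
        printf("%s needs %5.2f Gbps  DP1.1a:%s DP1.2:%s\n", modes[i].name,
               need, need <= dp11a ? "ok" : "no", need <= dp12 ? "ok" : "no");
    }
    return 0;
}
```

    At 24bpp this puts 2560x1440 @ 60Hz comfortably inside a single DP 1.1a link (~5.8Gbps), 3840x2160 @ 60Hz inside a single DP 1.2 link (~12.8Gbps), and 5120x2880 @ 60Hz (~22.5Gbps) beyond even a single DP 1.2 link, hence the red zone in the chart.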
  • name99 - Monday, October 7, 2013 - link

    OR what your chart tells us is that these devices will be early adopters of the mythical (but on its way) DisplayPort 1.3?
    Isn't it obvious that part of the slew of technologies to arrive when 4K hits the mainstream (as opposed to its current "we expect you to pay handsomely for something that is painful to use" phase) will be an updated DisplayPort spec?
  • repoman27 - Monday, October 7, 2013 - link

    Unlike HDMI 1.4, DisplayPort 1.2 can handle 4K just fine. I'd imagine DP 1.3 should take us to 8K.

    What baffles me is that every Mac Apple has shipped thus far with Thunderbolt and either a discrete GPU or Haswell has been DP 1.2 capable, but the ports are limited to DP 1.1a by the Thunderbolt controller. So even though Intel is supposedly shipping Redwood Ridge which has a DP 1.2 redriver, and Falcon Ridge which fully supports DP 1.2, we seem to be getting three generations of Macs where only the Mac Pros can actually output a DP 1.2 signal.

    Furthermore, I don't know of any panels out there that actually support eDP HBR2 signaling (introduced in the eDP 1.2 specification in May 2010, clarified in eDP 1.3 in February 2011, and still going strong in eDP 1.4 as of January this year). The current crop of 4K displays appears to be driven by converting a DisplayPort 1.2 HBR2 signal that uses MST to treat the display as two separate regions into a ridiculously wide 8-channel LVDS signal. Basically, for now, driving a display at more than 2880x1800 seems to require multiple outputs from the GPU.

    And to answer your question about why 4K: the problem really has more to do with creating a panel with a pixel pitch somewhere in the no man's land between 150 and 190 PPI (see the quick math below). Apple does a lot of work to make scaling decent even with straight-up pixel doubling, but the in-between pixel densities would be really tricky, and probably not huge sellers in the Windows market. Apple needs help with volume in this case; they can't go it alone and expect anything short of ludicrously expensive.
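    For concreteness, pixel density is just the diagonal pixel count divided by the diagonal size in inches. A tiny sketch (my own, for illustration) shows where the panels discussed in this thread land relative to that 150-190 PPI dead zone:

```c
/* PPI for the panel options discussed above:
 * diagonal pixel count divided by diagonal size in inches. */
#include <math.h>
#include <stdio.h>

static double ppi(int w, int h, double inches) {
    return sqrt((double)w * w + (double)h * h) / inches;
}

int main(void) {
    printf("27\"   2560x1440: %5.1f PPI (shipping iMac)\n",       ppi(2560, 1440, 27.0));
    printf("27\"   3840x2160: %5.1f PPI (150-190 dead zone)\n",   ppi(3840, 2160, 27.0));
    printf("31.5\" 3840x2160: %5.1f PPI (ASUS PQ321Q class)\n",   ppi(3840, 2160, 31.5));
    printf("27\"   5120x2880: %5.1f PPI (2x pixel doubling)\n",   ppi(5120, 2880, 27.0));
    return 0;
}
```

    A 27" 4K panel works out to roughly 163 PPI, squarely inside the awkward zone, while straight 2x pixel doubling at 27" lands around 218 PPI.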
  • name99 - Tuesday, October 8, 2013 - link

    My bad. I had in mind the fancier forms of 4K like 10-bit (just possible) and 12-bit (not possible) color at 60Hz, or 8-bit at 120Hz; not your basic 8-bit at 60Hz. I should have filled in my reasoning.
