Driving the Retina Display: A Performance Discussion

As I mentioned earlier, there are quality implications to choosing the higher-than-best resolution options in OS X. At 1680 x 1050 and 1920 x 1200 the screen is drawn with 4x the number of pixels, UI elements are scaled appropriately, and the result is downscaled to 2880 x 1800. The quality impact is negligible, however, especially if you actually need the added real estate. As you'd expect, there is also a performance penalty.
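
To put numbers on that, here is a small sketch, in Swift, of the arithmetic behind these scaled modes (the resolutions come from this article; the code itself is purely illustrative):

    import Foundation

    // Each "looks like" resolution is rendered at 2x per axis (4x the pixels),
    // then the result is downsampled to the panel's native 2880 x 1800.
    let panel = (w: 2880, h: 1800)
    let looksLike = [(1440, 900), (1680, 1050), (1920, 1200)]

    for (w, h) in looksLike {
        let (bw, bh) = (w * 2, h * 2)                  // backing resolution actually rendered
        let megapixels = Double(bw * bh) / 1_000_000   // pixels the GPU has to produce
        let downscale = Double(bw) / Double(panel.w)   // downsampling ratio to the panel
        print(String(format: "looks like %d x %d -> rendered at %d x %d (%.1f MP), scaled %.2fx down to %d x %d",
                     w, h, bw, bh, megapixels, downscale, panel.w, panel.h))
    }

That works out to 5.2 MP at the default setting, 7.1 MP at 1680 x 1050 and 9.2 MP at 1920 x 1200, which is where the 9.2MP figure later in this section comes from.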

At the default setting, either Intel's HD 4000 or NVIDIA's GeForce GT 650M already has to render and display far more pixels than either GPU was ever intended to. At the 1680 and 1920 settings, however, the GPUs are doing more work than even their high-end desktop counterparts are used to. While writing this article it finally dawned on me exactly what has been happening at Intel over the past few years.

Steve Jobs set a path to bringing high resolution displays to all of Apple's products, likely beginning several years ago. There was a period of time when Apple kept hiring ex-ATI/AMD graphics CTOs, first Bob Drebin and then Raja Koduri (although less publicly, Apple also hired chief CPU architects from AMD and ARM among other companies - but that's another story for another time). You typically hire smart GPU guys if you're building a GPU; the alternative is to hire them if you need to be able to work with existing GPU vendors to deliver the performance necessary to fulfill your dreams.

In 2007 Intel promised to deliver a 10x improvement in integrated graphics performance by 2010.

In 2009 Apple hired Drebin and Koduri.

In 2010 Intel announced that the curve had shifted. Instead of 10x by 2010 the number was now 25x. Intel's ramp was accelerated, and it stopped providing updates on just how aggressive it would be in the future. Paul Otellini's keynote from IDF 2010 gave us all a hint of what was to come:

But there has been a fundamental shift since 2007. Great graphics performance is required, but it isn't sufficient anymore. If you look at what users are demanding, they are demanding an increasingly good experience, robust experience, across the spectrum of visual computing. Users care about everything they see on the screen, not just 3D graphics. And so delivering a great visual experience requires media performance of all types: in games, in video playback, in video transcoding, in media editing, in 3D graphics, and in display. And Intel is committed to delivering leadership platforms in visual computing, not just in PCs, but across the continuum.

Otellini’s keynote would set the tone for the next few years of Intel’s evolution as a company. Even after this keynote Intel made a lot of adjustments to its roadmap, heavily influenced by Apple. Mobile SoCs got more aggressive on the graphics front as did their desktop/notebook counterparts.

At each IDF I kept hearing about how Apple was the biggest motivator behind Intel’s move into the GPU space, but I never really understood the connection until now. The driving factor wasn’t just the demands of current applications, but rather a dramatic increase in display resolution across the lineup. It’s why Apple has been at the forefront of GPU adoption in its iDevices, and it’s why Apple has been pushing Intel so very hard on the integrated graphics revolution. If there’s any one OEM we can thank for having a significant impact on Intel’s roadmap, it’s Apple. And it’s just getting started.

Sandy Bridge and Ivy Bridge were both good steps for Intel, but Haswell and Broadwell are the designs that Apple truly wanted. As fond as Apple has been of using discrete GPUs in notebooks, it would rather get rid of them if at all possible. For many SKUs Apple has already done so. Haswell and Broadwell will allow Apple to bring integration to even some of the Pro-level notebooks.

To be quite honest, the hardware in the rMBP isn't enough to deliver a consistently smooth experience across all applications. At 2880 x 1800 most interactions are smooth, but things like zooming windows or scrolling on certain web pages are clearly sub-30fps. At the higher scaled resolutions, since the GPU has to render as much as 9.2MP, even UI performance can be sluggish. There's simply nothing that can be done at this point - Apple is pushing the limits of the hardware we have available today, far beyond what any other OEM has done. Future iterations of the Retina Display MacBook Pro will have faster hardware with embedded DRAM that will help mitigate this problem. But there are other limitations: many elements of screen drawing are still done on the CPU, and as a largely serial architecture its ability to scale performance with dramatically higher resolutions is limited.

Some elements of drawing in Safari, for example, aren't handled by the GPU. Quickly scrolling up and down on the AnandTech home page will peg one of the four IVB cores in the rMBP at 100%.

The GPU has an easy time with its part of the process but the CPU’s workload is borderline too much for a single core to handle. Throw a more complex website at it and things get bad quickly. Facebook combines a lot of compressed images with text - every single image is decompressed on the CPU before being handed off to the GPU. Combine that with other elements that are processed on the CPU and you get a recipe for choppy scrolling.
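
To illustrate the split described above, here is a minimal, hypothetical Swift sketch of the pattern (my own illustration, not Safari's actual code path): the compressed image is decoded into a bitmap on the CPU, and only the finished bitmap is handed to a GPU-composited Core Animation layer.

    import Foundation
    import ImageIO
    import QuartzCore

    // Hypothetical helper: decode a compressed image into a bitmap on the CPU.
    // The function name and file path are illustrative, not Safari internals.
    func decodeOnCPU(_ url: URL) -> CGImage? {
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
        // Ask ImageIO to decode immediately so the expensive work happens here, on the CPU.
        let options = [kCGImageSourceShouldCacheImmediately: true] as CFDictionary
        return CGImageSourceCreateImageAtIndex(source, 0, options)
    }

    let layer = CALayer()                                   // composited by the GPU
    if let bitmap = decodeOnCPU(URL(fileURLWithPath: "/tmp/photo.jpg")) {
        // A 2x asset holds 4x the pixels, so the CPU-side decode takes roughly
        // 4x as long before the GPU ever sees the result.
        layer.contents = bitmap
    }

Scroll quickly through a page full of such images and those decodes pile up on a single core, which matches the behavior described above.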

To quantify exactly what I was seeing I measured frame rate while scrolling as quickly as possible through my Facebook news feed in Safari on the rMBP as well as my 2011 15-inch High Res MacBook Pro. While last year’s MBP delivered anywhere from 46 - 60 fps during this test, the rMBP hovered around 20 fps (18 - 24 fps was the typical range).


Scrolling in Safari on a 2011 High Res MBP - 51 fps


Scrolling in Safari on the rMBP - 21 fps

Remember, at 2880 x 1800 there are simply more pixels to push and more work to be done by both the CPU and the GPU. It's even worse in those applications that have higher-quality assets: the CPU now has to decode images at 4x the resolution it's used to. Future CPUs will take this added workload into account, but it'll take time to get there.

The good news is Mountain Lion provides some relief. At WWDC Apple mentioned the next version of Safari is ridiculously fast, but it wasn't specific about why. It turns out that Safari leverages Core Animation in Mountain Lion and is more GPU accelerated as a result. Facebook is still a challenge because of the mixture of CPU-decoded images and a standard web page, but the experience is a bit better. Repeating the same test as above I measured anywhere from 20 - 30 fps while scrolling through Facebook on ML's Safari.
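
For context, "leveraging Core Animation" in a Mac app generally means opting views into layer-backed, GPU-composited drawing. The sketch below is a generic, hypothetical AppKit example of that mechanism, not anything from Safari's actual source: once a view is layer-backed, its rendered content is cached as a texture and scrolling becomes compositing work for the GPU instead of repeated CPU redraws.

    import Cocoa

    // Hypothetical layer-backed view; TileView is an illustrative name, not a Safari class.
    final class TileView: NSView {
        override init(frame frameRect: NSRect) {
            super.init(frame: frameRect)
            wantsLayer = true                              // back the view with a CALayer
            layerContentsRedrawPolicy = .onSetNeedsDisplay // re-render only when invalidated
        }
        required init?(coder: NSCoder) { fatalError("not used in this sketch") }

        // Draw via updateLayer() instead of draw(_:) so content lands directly in the layer.
        override var wantsUpdateLayer: Bool { true }

        override func updateLayer() {
            // Rendered once into the backing layer; afterwards, scrolling mostly
            // re-composites the cached contents on the GPU.
            layer?.backgroundColor = NSColor.windowBackgroundColor.cgColor
        }
    }

Content that still has to be decoded and rasterized on the CPU, like Facebook's image-heavy feed, doesn't get the full benefit of this path, which fits the 20 - 30 fps result above.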

Whereas I would consider the rMBP experience under Lion to be borderline unacceptable, everything is significantly better under Mountain Lion. Don't expect buttery smoothness across the board; you're still asking a lot of the CPU and GPU, but it's a lot better.

Comments

  • OCedHrt - Sunday, June 24, 2012

    He missed another important point. All of that was in 3 lbs. Now, the current generation starting from last summer has external discrete graphics and an optical drive connected via a Thunderbolt-based connector (because Apple had exclusivity), with the laptop itself weighing only 2.5 lbs.

    This isn't going to compare to the Retina MacBook Pro though - at 15 inches, 4.5 lbs is pretty impressive, although I think if Sony wanted to they could do 4 lbs or less.
  • deathdemon89 - Saturday, June 23, 2012

    I agree completely, I'm a proud owner of the old Z, and even today it doesn't feel the least bit dated. And the 1080p screen is holding up admirably well, with no signs of pixellation at normal viewing distances. This device was truly innovative for its time. I still don't understand why it received such mixed reviews by the press.
  • Spunjji - Monday, June 25, 2012

    Mainly the price. Only Apple are allowed to charge that much for a laptop. Also, only Apple can have hot systems. Repeat ad infinitum.
  • mlambert890 - Wednesday, November 28, 2012

    Really ridiculous comment. I can see you are bitter, as is the other mega Z fan, but come on already. I worked for Sony for 5 years and love the company. I have owned probably a dozen Vaios, including the top-of-the-line last-gen Z (with the SSD RAID).

    Instead of ranting and raving you need to ask yourself *why it is* that "only Apple can charge so much" and why "Anand only gives a free pass to Apple".

    You feel what exactly? That there is some grand conspiracy in play? Do you realize how ridiculous that sounds?

    WHY has Sony *lost the ability to charge a premium*? In other words WHY have they *burned all customer loyalty and good will*? I left the company back in 1999 because I saw the writing on the wall.

    You (and the other Z guy) are no different than any other apologist. Companies don't bleed market share and fail to sell cancer-curing products (you guys are presenting the Z as "truly revolutionary", right?) for no reason. Sorry to tell you, there is no "big conspiracy".

    Sony sells super high priced products into a super commoditized market and then they layer on a CRAP TON of bloatware dragging the machine to a stop, make idiot decisions like the HDMI one, and push proprietary crap *worse* than Apple ever has. All of that into the Wintel space which, sorry to tell you, was *always* driven by the cheapest possible parts at the best possible price.

    The PC industry grew *because it was cheap*. Apple *always* occupied a premium niche. I vividly remember the initial release of the Apple I, the Lisa, the Mac 128. These were always premium products and the competition at the time (be it Commodore, Atari, TI, or the Wintel ecosystem) *always* captured share by being cheap.

    That may annoy you for some reason, but Apple has pretty much *always* held a core premium audience. The only exception was the period of insanity when Jobs had been pushed out and Sculley destroyed the company. Even then, the core fans stayed.

    You two make it sound like poor Sony is a victim because the world doesn't all run out and buy the Vaio Z.

    Even *without Apple* Sony would be going under, hate to tell you. Sony's problems are Sony's own, and with a laptop the whole is *not* simply the sum of its parts.
  • solipsism - Saturday, June 23, 2012

    None of that makes sense and is, in fact, rubbish.

    Sony added 1080p because it was popular, not because it made sense. You have a 168 PPI display on a 13" machine, which makes text too small to be a good experience for most users.

    They also didn't use a good quality display or add anything to the SW to make the experience good (unlike what Anand talked about in this review); they just added the single metric that was trending because of HDTVs.

    Blu-ray in a notebook has always been a silly option for most users. There is a reason BRD adoption failed on PCs and it's not because everyone is stupid... except you. ODDs are long overdue for removal since they take up 25% of the chassis, require placement at an edge (eating over 5" of port real estate and restricting design), require a lot of power, are noisy, are more prone to break due to the many moving parts, are slow, are just too expensive to be feasible, and add nothing visually that most users trying to watch a movie can discern.

    Quad-SSDs? Really? That's a sensible solution for a consumer notebook?
  • EnzoFX - Saturday, June 23, 2012

    And that really is what people don't get. It isn't just about raw specs. The package needs to be complete, polished, what have you. In the case of high-DPI screens, that means good scaling support, and Apple did it. Support on the software side is something they never get credit for from the Apple haters. All they can see is numbers and think "I've seen numbers like that before".
  • mabellon - Saturday, June 23, 2012

    No, Apple didn't do it. Just like on the iPad, they increased resolution by doubling width and height. Their software simply doesn't scale well to arbitrary higher resolutions. If it was done right then Chrome would work out of the box - instead the OS 2x scales everything without increasing resolution/quality.

    To the consumer, the choice means a good experience without breaking apps. But claiming that Apple was successful simply because of software? HA!
  • Ohhmaagawd - Saturday, June 23, 2012

    Did you actually read the retina part of the review?

    Chrome doesn't work right because they do their own text rendering. Read the review. If an app uses the native text rendering, the app will look good (at least the text portion). They will have to update the graphical assets of course.

    BTW, Chrome Dev builds have this issue fixed.

    The Windows DPI setting isn't on by default, so few people use or even know about the setting, and devs haven't made sure their apps work properly at high DPI settings.

    Apple has made a move that will be short-term painful in having apps that aren't updated look a bit fuzzy. But since they made it default, this will force devs to update.
  • OCedHrt - Sunday, June 24, 2012

    What do you mean the Windows DPI setting isn't default? You can change it in a few clicks, but the same thing applies - if your app does not read the DPI values, then Windows can't help you. This is because the Windows UI is not vector based (I don't know about now, but older apps definitely weren't) and many applications back then were coded with hard-coded pixel counts.

    When the DPI is changed, Windows scales the text, but the UI dimensions are controlled by the application's implementation.
  • KitsuneKnight - Saturday, July 7, 2012

    On Windows, changing the DPI will generally mean a huge number of applications become simply unusable.

    On this Retina MBP, the worst case appears to be slightly blurry text (which was quickly updated).

    Apple's solution is a good one, because it does things in a way that should keep existing apps working fine, while allowing future developers to leverage new APIs to take advantage of the increased resolution.
