Driving the Retina Display: A Performance Discussion

As I mentioned earlier, there are quality implications to choosing the higher-than-best resolution options in OS X. At 1680 x 1050 and 1920 x 1200 the screen is drawn with 4x the number of pixels, elements are scaled appropriately, and the result is downscaled to 2880 x 1800. The quality impact is negligible, however, especially if you actually need the added real estate. As you’d expect, there is also a performance penalty.
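The arithmetic behind those scaled modes is easy to sketch. The snippet below (Python, with illustrative names - nothing here is an Apple API) computes how many pixels each setting actually asks the GPU to render:

```python
# OS X renders scaled modes at 2x in each dimension (4x the pixels),
# then downsamples the result to the panel's native 2880 x 1800.
PANEL = (2880, 1800)

def backing_pixels(logical_w, logical_h, scale=2):
    """Total pixels rendered for a scaled mode before downsampling."""
    return (logical_w * scale) * (logical_h * scale)

for w, h in [(1440, 900), (1680, 1050), (1920, 1200)]:
    mp = backing_pixels(w, h) / 1e6
    print(f"{w} x {h} mode: renders {w*2} x {h*2} ({mp:.1f} MP), "
          f"downscaled to {PANEL[0]} x {PANEL[1]}")
```

The 1920 x 1200 setting works out to roughly 9.2 MP rendered per frame, nearly double the panel's native 5.2 MP.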

At the default setting, either Intel’s HD 4000 or NVIDIA’s GeForce GT 650M already has to render and display far more pixels than either GPU was ever intended to drive. At the 1680 and 1920 settings, however, the GPUs are doing more work than even their high-end desktop counterparts are used to. In writing this article it finally dawned on me exactly what has been happening at Intel over the past few years.

Steve Jobs set a path to bringing high resolution displays to all of Apple’s products, likely beginning several years ago. There was a period of time when Apple kept hiring ex-ATI/AMD graphics CTOs, first Bob Drebin and then Raja Koduri (although less publicly, Apple also hired chief CPU architects from AMD and ARM, among other companies - but that’s another story for another time). You typically hire smart GPU guys if you’re building a GPU; the alternative is to hire them when you need to work with existing GPU vendors to deliver the performance your plans demand.

In 2007 Intel promised to deliver a 10x improvement in integrated graphics performance by 2010.

In 2009 Apple hired Drebin and Koduri.

In 2010 Intel announced that the curve had shifted. Instead of 10x by 2010, the number was now 25x. Intel’s ramp was accelerated, and it stopped providing updates on just how aggressive it would be in the future. Paul Otellini’s keynote from IDF 2010 gave us all a hint of what was to come (emphasis mine):

But there has been a fundamental shift since 2007. Great graphics performance is required, but it isn't sufficient anymore. If you look at what users are demanding, they are demanding an increasingly good experience, robust experience, across the spectrum of visual computing. Users care about everything they see on the screen, not just 3D graphics. And so delivering a great visual experience requires media performance of all types: in games, in video playback, in video transcoding, in media editing, in 3D graphics, and in display. And Intel is committed to delivering leadership platforms in visual computing, not just in PCs, but across the continuum.

Otellini’s keynote would set the tone for the next few years of Intel’s evolution as a company. Even after this keynote Intel made a lot of adjustments to its roadmap, heavily influenced by Apple. Mobile SoCs got more aggressive on the graphics front as did their desktop/notebook counterparts.

At each IDF I kept hearing about how Apple was the biggest motivator behind Intel’s move into the GPU space, but I never really understood the connection until now. The driving factor wasn’t just the demands of current applications, but rather a dramatic increase in display resolution across the lineup. It’s why Apple has been at the forefront of GPU adoption in its iDevices, and it’s why Apple has been pushing Intel so very hard on the integrated graphics revolution. If there’s any one OEM we can thank for having a significant impact on Intel’s roadmap, it’s Apple. And it’s just getting started.

Sandy Bridge and Ivy Bridge were both good steps for Intel, but Haswell and Broadwell are the designs that Apple truly wanted. As fond as Apple has been of using discrete GPUs in notebooks, it would rather get rid of them if at all possible. For many SKUs Apple has already done so. Haswell and Broadwell will allow Apple to bring integration to even some of the Pro-level notebooks.

To be quite honest, the hardware in the rMBP isn’t enough to deliver a consistently smooth experience across all applications. At 2880 x 1800 most interactions are smooth, but things like zooming windows or scrolling on certain web pages are clearly sub-30fps. At the higher scaled resolutions, since the GPU has to render as much as 9.2MP, even UI performance can be sluggish. There’s simply nothing that can be done at this point - Apple is pushing the limits of the hardware available today, far beyond what any other OEM has done. Future iterations of the Retina Display MacBook Pro will have faster hardware with embedded DRAM that will help mitigate this problem. But there are other limitations: many elements of screen drawing are still done on the CPU, and since CPUs are largely serial architectures, their ability to scale performance with dramatically higher resolutions is limited.
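To put that 9.2MP figure in perspective, here is a rough comparison of per-frame pixel counts. The comparison panels are typical 2012-era examples of my choosing, and no rendering or driver overhead is modeled:

```python
# Per-frame fill workload for common panels vs. the rMBP's modes.
panels = {
    "mainstream notebook (1366 x 768)":   1366 * 768,
    "30-inch desktop (2560 x 1600)":      2560 * 1600,
    "rMBP native (2880 x 1800)":          2880 * 1800,
    "rMBP 1920 scaled (3840 x 2400)":     3840 * 2400,
}
base = panels["mainstream notebook (1366 x 768)"]
for name, px in panels.items():
    print(f"{name}: {px / 1e6:.1f} MP ({px / base:.1f}x)")
```

At the 1920 scaled setting the GPU is filling nearly nine times the pixels of a mainstream notebook panel of the day.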

Some elements of drawing in Safari, for example, aren’t handled by the GPU. Quickly scrolling up and down on the AnandTech home page will peg one of the four IVB cores in the rMBP at 100%.

The GPU has an easy time with its part of the process but the CPU’s workload is borderline too much for a single core to handle. Throw a more complex website at it and things get bad quickly. Facebook combines a lot of compressed images with text - every single image is decompressed on the CPU before being handed off to the GPU. Combine that with other elements that are processed on the CPU and you get a recipe for choppy scrolling.

To quantify exactly what I was seeing I measured frame rate while scrolling as quickly as possible through my Facebook news feed in Safari on the rMBP as well as my 2011 15-inch High Res MacBook Pro. While last year’s MBP delivered anywhere from 46 - 60 fps during this test, the rMBP hovered around 20 fps (18 - 24 fps was the typical range).
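However the frames are captured, turning per-frame timestamps into an fps figure is simple arithmetic. A minimal sketch, with synthetic timestamps standing in for real measurements:

```python
def fps_stats(timestamps):
    """Average and worst-case fps from monotonically increasing
    per-frame timestamps (in seconds)."""
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg = len(deltas) / (timestamps[-1] - timestamps[0])
    worst = 1.0 / max(deltas)
    return avg, worst

# A synthetic trace with one frame every ~48 ms, i.e. roughly the
# ~21 fps behavior observed on the rMBP.
frames = [i * 0.048 for i in range(22)]
avg, worst = fps_stats(frames)
print(f"avg {avg:.1f} fps, worst frame {worst:.1f} fps")
```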


Scrolling in Safari on a 2011 High Res MBP - 51 fps


Scrolling in Safari on the rMBP - 21 fps

Remember, at 2880 x 1800 there are simply more pixels to push and more work to be done by both the CPU and the GPU. It’s even worse in applications that have higher quality assets: the CPU now has to decode images at 4x the resolution it’s used to. Future CPUs will take this added workload into account, but it’ll take time to get there.
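Rough frame-budget math shows why CPU-side decoding hurts so quickly. The per-image decode time below is an assumed, illustrative figure, not a measurement from my testing:

```python
# At 60 fps there is a ~16.7 ms budget per frame. A Retina asset has
# 2x the width and height, so ~4x the pixels to decode on the CPU.
FRAME_BUDGET_MS = 1000 / 60

decode_1x_ms = 5.0               # assumed decode time for one image at 1x
decode_2x_ms = decode_1x_ms * 4  # same image at Retina resolution

print(f"frame budget: {FRAME_BUDGET_MS:.1f} ms")
print(f"one 2x image decode: {decode_2x_ms:.1f} ms "
      f"({decode_2x_ms / FRAME_BUDGET_MS:.2f} frame budgets)")
```

Under these assumptions a single Retina-resolution image decode already overruns an entire 60fps frame budget, which is consistent with scrolling dropping to the 20fps range on image-heavy pages.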

The good news is that Mountain Lion provides some relief. At WWDC Apple mentioned that the next version of Safari is ridiculously fast, but it wasn’t specific about why. It turns out that Safari leverages Core Animation in Mountain Lion and is more GPU accelerated as a result. Facebook is still a challenge because of the mixture of CPU-decoded images and a standard web page, but the experience is a bit better. Repeating the same test as above, I measured anywhere from 20 - 30 fps while scrolling through Facebook in ML’s Safari.

Whereas I would consider the rMBP experience under Lion borderline unacceptable, everything is significantly better under Mountain Lion. Don’t expect buttery smoothness across the board; you’re still asking a lot of the CPU and GPU, but it’s a lot better.

471 Comments


  • wfolta - Sunday, June 24, 2012 - link

    I insisted on a 17" laptop since the 17" MBP came out, and used them right up until I got the 15" rMBP last week. I'll never turn back. I've got it turned up to "1920x1200 equivalent" right now, and so I get as much screen real estate in a machine that's way, way, WAY smaller and lighter. And the display is so good, I'm impressed every single time I use it.

    At first, that seemed too small. I started with 1440x900, and the next day tried the next denser step and it was okay, and the next day went 1920x1200, and it's something you get used to fairly easily. Obviously, you couldn't use 1920x1200 on a 6" screen even if you had the pixel density to pull it off, but I really don't see the need for a 17" screen anymore.

    (And I use a 30" screen at work, which this laptop could easily drive if that's what I want. Heck, I've read about people driving the internal screen, an HDMI screen, and two Thunderbolt screens with a video running on each simultaneously. You don't need a huge built-in screen.)
  • yvesluther - Sunday, June 24, 2012 - link

    I am wondering how I should connect my two Thunderbolt displays to my new Retina MacBook Pro?

    a) Should I chain both displays and use one Thunderbolt port

    or

    b) Should I hook each display to its own port?

    Thanks for any advice.
  • wfolta - Sunday, June 24, 2012 - link

    I'm pretty sure you will have to put each display on its own port. You'll be able to daisy chain other devices (disk drives, etc), but I think it's one display each for this laptop at least.
  • Constructor - Thursday, June 28, 2012 - link

    The "classic" MBPs support both displays on the same TB port, so I would expect that to work here as well. It's mostly just a question of convenient cabling, since the displays have TB daisy-chaining outputs anyway.
  • SJdead - Sunday, June 24, 2012 - link

    One issue with the "retina display" (FYI, 226 PPI is not comparable to the human eye, which at 20/20 vision works out to 426 PPI - that's why I don't like Apple, because they treat people like they're dumb) is that there is so much focus on the PPI.

    Ya, it's innovative (in a sense), as Apple knows how to market to the masses. But what about LED-IPS? That's a downgrade from older IPS displays. What about color contrast? Blacks/whites? What about billions of colors instead of millions? What about color accuracy and Adobe RGB/sRGB? What about glossy screens that cause over-saturated colors? Response time in ms?

    All the above mentioned are glaring oversights to the, "Best display I've ever seen..." comment. If that is the case, you should check out a variety of other displays.

    My point is that display resolution isn't everything; there are a lot of other factors that go into a good display. There are far better displays from 5+ years ago that will outperform and look more gorgeous than Apple's current 2012 model. All you have to do is go to Apple's website to see that good specifications are highlighted while poor specifications are not even mentioned.
  • wfolta - Sunday, June 24, 2012 - link

    You do realize that you can't make any statements at all about "retina" qualities based solely on PPI, right? "Retina display" is based on an angular measurement, so you need PPI at a specified viewing distance. And the rMBP meets that at its typical viewing distance.

    (On an anecdotal note, I work with video as a profession, so am very sensitive to pixels, etc, and this screen is gorgeous, density-wise.)

    You mention all of those "far better" displays from 5+ years ago... Were they in laptops, or were they expensive desktop options? What manufacturer has continued to make them since then? In fact, most everyone except Apple has headed towards the consumerized, 16:9 1920x1080 (1080p) screen size, which is 15% shorter than Apple's displays. Contrary to popular opinion, Apple has been holding out for the more useful, professional aspect ratio and resolution, while its competitors have chased the checkbox of "Play Bluray DVDs at HD resolution".

    In terms of your actual comments, the rMBP's display is in the top two or three laptop screens for contrast and blacks, basically falling a bit short (top 6-8) in white because Apple just couldn't pump up the brightness without compromising battery life too much. Color accuracy has dropped a bit from older MBP's. Still, the combination of perceivable features is the best most of us have ever seen outside of a calibrated desktop setup.
  • wfolta - Sunday, June 24, 2012 - link

    One thing to think about is that Apple's auto-switching between dedicated GPU and Intel graphics is a bit of a mystery. Some programs trigger the dedicated GPU and it's not clear why.

    For example, Eaglefiler (a terrific program) seems to trigger it, even though it's not a graphics heavyweight and even when it's hidden. With a program that you leave running all the time like that, it will drag down your battery life. The author is looking into this, but it's not clear what Apple API is the cause.

    In light of that, I'd highly recommend gfxCardStatus, which monitors which programs trigger discrete graphics and which don't. It'll at least give you a clue that a program you wouldn't otherwise suspect may be shortening your battery life.
  • designerfx - Sunday, June 24, 2012 - link

    okay, so we're looking at a screenshot. 15 FPS in the most uncluttered situations with D3. Can you even imagine the FPS in a normal situation during the game, even on normal difficulty?

    Hint: we're looking at FPS in the 1-3 range, maybe 4 if you're lucky.

    Apple can deliver great displays, but if we don't have the performance to match then the only main use is Photoshop.
  • Fx1 - Monday, June 25, 2012 - link

    If you have any brains you will Boot Camp Windows and game in there, with usually a 40% increase in FPS. No one games in OS X. Even OS X fanboys have better sense.
  • wfolta - Monday, June 25, 2012 - link

    I've said repeatedly in this thread that I get 20+ FPS in Normal Act III, with two or three dozen mobs onscreen. I've read at least one other report of 20+ FPS (perhaps this article mentions that in the text). And this is with settings mostly at max. Cut the res down to 1440x900 and drop a few settings to Medium and I think you could double the frame rate.

    So your speculation is wrong.
