Driving the Retina Display: A Performance Discussion

As I mentioned earlier, there are quality implications of choosing the higher-than-best resolution options in OS X. At 1680 x 1050 and 1920 x 1200 the screen is drawn with 4x the number of pixels, elements are scaled appropriately, and the result is downscaled to 2880 x 1800. The quality impact is negligible however, especially if you actually need the added real estate. As you’d expect, there is also a performance penalty.
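For concreteness, the pixel arithmetic behind those scaled modes can be sketched in a few lines of Python (the mode list and the render-at-2x-then-downscale pipeline come from the description above; `backing_store` and `megapixels` are just illustrative helper names):

```python
# HiDPI scaled modes on the rMBP: OS X renders the desktop at 2x the
# selected resolution, then downsamples the result to the native panel.
# At the default 1440 x 900 setting the backing store is exactly native,
# so no downscale pass is needed.
NATIVE = (2880, 1800)  # panel resolution

def backing_store(scaled):
    """Size of the offscreen buffer the GPU actually renders (2x each axis)."""
    w, h = scaled
    return (2 * w, 2 * h)

def megapixels(size):
    w, h = size
    return w * h / 1_000_000

for mode in [(1440, 900), (1680, 1050), (1920, 1200)]:
    buf = backing_store(mode)
    print(f"{mode[0]}x{mode[1]} mode -> render {buf[0]}x{buf[1]} "
          f"({megapixels(buf):.1f} MP), downscale to {NATIVE[0]}x{NATIVE[1]}")
```

At the 1920 x 1200 setting the backing store works out to 3840 x 2400, or roughly 9.2 megapixels per frame, which is where the UI performance numbers later in this section come from.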

At the default setting, either Intel’s HD 4000 or NVIDIA’s GeForce GT 650M already has to render and display far more pixels than either GPU was ever intended to drive. At the 1680 and 1920 settings, however, the GPUs are doing more work than even their high-end desktop counterparts are used to. In writing this article it finally dawned on me exactly what has been happening at Intel over the past few years.

Steve Jobs set a path to bringing high resolution displays to all of Apple’s products, likely beginning several years ago. There was a period of time when Apple kept hiring ex-ATI/AMD graphics CTOs, first Bob Drebin and then Raja Koduri (although less public, Apple also hired chief CPU architects from AMD and ARM among other companies - but that’s another story for another time). You typically hire smart GPU guys if you’re building a GPU; the alternative is to hire them if you need to be able to work with existing GPU vendors to deliver the performance necessary to fulfill your dreams of GPU dominance.

In 2007 Intel promised to deliver a 10x improvement in integrated graphics performance by 2010:

In 2009 Apple hired Drebin and Koduri.

In 2010 Intel announced that the curve had shifted. Instead of 10x by 2010 the number was now 25x. Intel’s ramp was accelerated, and it stopped providing updates on just how aggressive it would be in the future. Paul Otellini’s keynote from IDF 2010 gave us all a hint of what’s to come (emphasis mine):

But there has been a fundamental shift since 2007. Great graphics performance is required, but it isn't sufficient anymore. If you look at what users are demanding, they are demanding an increasingly good experience, robust experience, across the spectrum of visual computing. Users care about everything they see on the screen, not just 3D graphics. And so delivering a great visual experience requires media performance of all types: in games, in video playback, in video transcoding, in media editing, in 3D graphics, and in display. And Intel is committed to delivering leadership platforms in visual computing, not just in PCs, but across the continuum.

Otellini’s keynote would set the tone for the next few years of Intel’s evolution as a company. Even after this keynote Intel made a lot of adjustments to its roadmap, heavily influenced by Apple. Mobile SoCs got more aggressive on the graphics front as did their desktop/notebook counterparts.

At each IDF I kept hearing about how Apple was the biggest motivator behind Intel’s move into the GPU space, but I never really understood the connection until now. The driving factor wasn’t just the demands of current applications, but rather a dramatic increase in display resolution across the lineup. It’s why Apple has been at the forefront of GPU adoption in its iDevices, and it’s why Apple has been pushing Intel so very hard on the integrated graphics revolution. If there’s any one OEM we can thank for having a significant impact on Intel’s roadmap, it’s Apple. And it’s just getting started.

Sandy Bridge and Ivy Bridge were both good steps for Intel, but Haswell and Broadwell are the designs that Apple truly wanted. As fond as Apple has been of using discrete GPUs in notebooks, it would rather get rid of them if at all possible. For many SKUs Apple has already done so. Haswell and Broadwell will allow Apple to bring integration to even some of the Pro-level notebooks.

To be quite honest, the hardware in the rMBP isn’t enough to deliver a consistently smooth experience across all applications. At 2880 x 1800 most interactions are smooth, but things like zooming windows or scrolling on certain web pages are clearly sub-30fps. At the higher scaled resolutions, since the GPU has to render as much as 9.2MP, even UI performance can be sluggish. There’s simply nothing that can be done at this point - Apple is pushing the limits of the hardware we have available today, far beyond what any other OEM has done. Future iterations of the Retina Display MacBook Pro will have faster hardware with embedded DRAM that will help mitigate this problem. But there are other limitations: many elements of screen drawing are still done on the CPU, and as largely serial architectures, CPUs are limited in their ability to scale performance with dramatically higher resolutions.

Some elements of drawing in Safari, for example, aren’t handled by the GPU. Quickly scrolling up and down on the AnandTech home page will peg one of the four IVB cores in the rMBP at 100%:

The GPU has an easy time with its part of the process but the CPU’s workload is borderline too much for a single core to handle. Throw a more complex website at it and things get bad quickly. Facebook combines a lot of compressed images with text - every single image is decompressed on the CPU before being handed off to the GPU. Combine that with other elements that are processed on the CPU and you get a recipe for choppy scrolling.
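As a rough illustration of why that CPU-side image decode matters, consider the per-frame time budget. The decode throughput below is an assumed round number for the sake of the arithmetic, not a measurement from this review:

```python
# Back-of-the-envelope model of why CPU-side image decode hurts scrolling.
# DECODE_RATE_MP_PER_S is an illustrative, assumed figure for one core.
FRAME_BUDGET_MS = 1000 / 60          # ~16.7 ms per frame at 60 fps
DECODE_RATE_MP_PER_S = 100           # assumed JPEG decode throughput, one core

def decode_time_ms(width, height):
    """Time one core spends decoding an image of the given pixel size."""
    mp = width * height / 1_000_000
    return mp / DECODE_RATE_MP_PER_S * 1000

# A 1x photo vs. its 2x Retina version: 4x the pixels, 4x the decode work.
t1 = decode_time_ms(480, 360)    # ~1.7 ms
t2 = decode_time_ms(960, 720)    # ~6.9 ms
print(f"1x decode: {t1:.1f} ms, 2x decode: {t2:.1f} ms "
      f"of a {FRAME_BUDGET_MS:.1f} ms frame budget")
```

Even under these generous assumptions, a handful of Retina-sized images landing on the same frame will blow through the 16.7 ms budget on a single core, and scrolling drops frames.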

To quantify exactly what I was seeing I measured frame rate while scrolling as quickly as possible through my Facebook news feed in Safari on the rMBP as well as my 2011 15-inch High Res MacBook Pro. While last year’s MBP delivered anywhere from 46 - 60 fps during this test, the rMBP hovered around 20 fps (18 - 24 fps was the typical range).


Scrolling in Safari on a 2011 High Res MBP - 51 fps


Scrolling in Safari on the rMBP - 21 fps
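The fps figures here are simply frame counts over elapsed time. A minimal sketch of that computation, independent of however the frame timestamps are actually captured:

```python
def fps_from_timestamps(ts):
    """Average frames per second given a sorted list of frame timestamps (seconds)."""
    if len(ts) < 2:
        raise ValueError("need at least two frames")
    return (len(ts) - 1) / (ts[-1] - ts[0])

# 61 frames spaced 1/60 s apart -> a smooth 60 fps
smooth = [i / 60 for i in range(61)]
print(round(fps_from_timestamps(smooth), 1))   # 60.0

# 21 frames over one second -> ~20 fps, the choppy Facebook case
choppy = [i / 20 for i in range(21)]
print(round(fps_from_timestamps(choppy), 1))   # 20.0
```

Averaging over a whole scroll hides momentary stutters, which is why the text reports a range (18 - 24 fps) rather than a single number.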

Remember that at 2880 x 1800 there are simply more pixels to push and more work to be done by both the CPU and the GPU. It’s even worse in those applications that have higher quality assets: the CPU now has to decode images at 4x the resolution it’s used to. Future CPUs will take this added workload into account, but it’ll take time to get there.

The good news is Mountain Lion provides some relief. At WWDC Apple mentioned that the next version of Safari is ridiculously fast, but it wasn’t specific about why. It turns out that Safari leverages Core Animation in Mountain Lion and is more GPU accelerated as a result. Facebook is still a challenge because of the mixture of CPU decoded images and a standard web page, but the experience is a bit better. Repeating the same test as above I measured anywhere from 20 - 30 fps while scrolling through Facebook on ML’s Safari.

Whereas I would consider the rMBP experience under Lion to be borderline unacceptable, everything is significantly better under Mountain Lion. Don’t expect buttery smoothness across the board; you’re still asking a lot of the CPU and GPU, but it’s a lot better.

Comments

  • iCrunch - Sunday, June 24, 2012 - link

    Agreed. This review is awesome. I hope he's right about the 4K Retina Thunderbolt display, which I'd buy in a heartbeat. One thing I don't get is why so many people and reviewers alike consider the $2,200 price tag extreme and oftentimes too expensive. You're getting the latest and the greatest processors, both CPU and GPU-wise, a generous 8GB of RAM, and I find the 256GB SSD to be plenty. After my two 180GB Intel 520 SSDs, this is the largest single SSD that I have ever owned. The upgrades are fair as well, as far as doubling the RAM for $200 is concerned. At Apple no less! A few months ago, any setup of 16GB of RAM in 2 SODIMMs was well over $300, and if you go back a few more months, that amount of RAM set you back over a full grand! As in $1,000+

    I couldn't justify the $600+ price difference for an extra 300MHz in CPU clock and an additional 256GB Flash, though. If the GPU had come with 2GB GDDR5, then maybe, but not as it stands today.
  • hyrule4927 - Sunday, June 24, 2012 - link

    First of all, the 650M may be one of the latest mobile GPUs, but it is pretty far from the greatest. It is a midrange GPU forced to drive an insane resolution with only 1GB of VRAM. And 8GB of RAM isn't "generous"; to have any less in a laptop this expensive would be ridiculous. Paying $200 to upgrade to 16GB is a scam, especially considering Apple made the decision to prevent consumers from simply purchasing and installing more RAM on their own (you can find 2x8GB SODIMMs for a bit over $100; no idea what planet you were shopping on where that would cost $1,000 at any time in the past year).
  • EnerJi - Sunday, June 24, 2012 - link

    I'm sure you can find 8GB of no-name stuff on sale somewhere, but for one example of name-brand memory, Crucial memory goes for $86.99 per 8GB ($173.98 total):

    http://www.crucial.com/store/listmodule/DDR3/list....

    Also, Apple uses low-voltage DDR3-1600, which is lower volume and may be slightly more expensive as a result.

    In that light, while $200 to upgrade to 16GB isn't exactly a bargain, it isn't the typical rapacious Apple upgrade prices.
  • Fx1 - Monday, June 25, 2012 - link

    ARE YOU KIDDING? The 650M is running at 900MHz stock! People are clocking this bad boy well over 1050-1100MHz.

    Those are ABOVE 660M GTX Speeds.

    I'd say Apple has packed in the BEST GPU possible given the thermal limits and size of this notebook.

    In Windows this MBP will run games at very nice settings, and it may be the first notebook that isn't as thick as an encyclopaedia yet can run games on high settings.

    Most will never use 8GB of RAM, and 16GB is an option, so I don't see the issue. It's also custom made, which means Walmart RAM isn't compatible.
  • hyrule4927 - Monday, June 25, 2012 - link

    No, I'm not kidding. Nice capitalization though, it really does wonders for the credibility of your statements. Here are the flaws in your logic. You say that the 650M is the "BEST GPU possible given the thermal limits" with carefully placed capitalization in order to play down the qualifying terms in your statement. Then you suggest overclocking it. If Apple chose this GPU because they were fighting thermal limits, do you really expect it to handle overclocking well? And sure it can run Half Life 2 and Diablo III (an old game, and a game that is hardly demanding by modern standards) at standard resolutions, but users are going to want to game at native resolution on their new retina screen. Too bad Diablo runs at 18 frames per second. It is ludicrous to consider that a playable framerate, and if it can't handle Diablo, it won't be able to handle much of anything at that resolution. Again, that VRAM limitation is a killer. Considering that many popular current games, such as Skyrim, easily consume more than 1GB at 1080p, memory capacity is going to be an enormous bottleneck even when you are nowhere near native resolution. No matter how you want to look at it, a GPU like the 680M is much better suited for that screen, and the 650M doesn't even hold a candle to the performance of that chip.

    As for the system RAM, while I am sure that you enjoy shopping at Walmart, perhaps you should look on Newegg where you can find a 16GB kit from the manufacturer of your choice for just over $100. Of course you have probably never bought a single computer component in your life, so you can be forgiven for not knowing that. And you describe the "custom" RAM as if it is a selling point. Because everyone knows that proprietary format soldered RAM was included with the best interests of the consumer in mind . . .
  • iCrunch - Sunday, June 24, 2012 - link

    Hi guys, does anyone have this new rMBP (love the abbreviation) with Toshiba SSD "Flash storage"? You can find this in "System Information" under Serial-ATA, where it will say "Apple SSD TS256E" for a Toshiba drive/Flash storage. If you have a Samsung, it will say "Apple SSD SM256E". Naturally, if you have a 512GB drive, it'll display Apple SSD SM(or TS)512E.

    This should be interesting.
  • iCrunch - Sunday, June 24, 2012 - link

    Thank you, Anand, for the single best and most exhaustive review of this gorgeous new powerhouse. I picked one up from an Apple Store, so naturally, I only have the 8GB RAM. I have a 2nd one coming, also a 2.3, but with 16GB, and then I will sell this one. That is, if I decide that I truly need and want 16GB. At $200 before any discounts, it sure seems like a worthwhile upgrade either way. There had better not be ANYTHING wrong with ANY other part of my new rMBP, though. lol...
  • pxavierperez - Sunday, June 24, 2012 - link

    It's funny how Anand went to great lengths describing, even posting an image as an example, how OS X's DPI scaling implementation was superior compared to Windows', which really was his point and the point that really mattered to end users, and yet we have Apple haters getting all fumed up just because Anand made one simple typo on the numeric value of a Windows 7 DPI setting.

    Sure, you can set Windows 7 to scale 200% (2x), but it's flaky: dialog boxes break, objects render inconsistently, and it's just all around not usable. It's not just about features; it's also about how they are wrapped together to work so seamlessly. Here Apple did a far, far better job than Windows. Which was Anand's point.
  • spronkey - Sunday, June 24, 2012 - link

    All around, it's not a bad review. But I'm disappointed that you still decided to give it an award despite the massive issues:

    #1 - The soldered RAM.
    #2 - The nonstandard SSD form factor.
    #3 - The price. Not so much of the machine, but of the upgrades more than anything else.

    I'm also disappointed that I didn't see (though may have missed) a bashing of the new MagSafe 2 connector. What a waste of time - just make the chassis ever so slightly thicker. Or, do what other manufacturers do and mould a port around it. Then make it look good.

    However. For a Pro machine to be so bastardised... 8GB is not plenty of RAM. Look at the rate we've been increasing RAM requirements over the past few years - it's speeding up, not slowing down. In a year's time, 8GB will probably be standard on half of new machines, and in 2 years it'll be very limiting.

    I'm also disappointed that these points above aren't also factored in to a good bashing about Apple's very minimal warranty, and very expensive AppleCare product.

    I've owned Macs for years - in fact all bar 1 of my portables have been Apple machines; the software/hardware integration just runs circles around the Windows slabs, but I'm really starting to get pissed off with Apple's blatant lockdowns and price gouges. It's anticompetitive and in bad taste.
  • Ohhmaagawd - Sunday, June 24, 2012 - link

    I wish it had socketed RAM and a standard SSD too.

    But the fact is most people don't upgrade their machines (although pro users are much more likely to). Apple really wanted the thinnest laptop possible with the longest battery life possible. Those goals conflict with upgradability. And I would guess Apple just doesn't care about upgradability. They don't want people opening their cases.

    The future for Apple laptops is clear: you won't be able to upgrade anything. So you'd better buy what you need to start with.

    If you can't deal with that, buy elsewhere.
