Battery Life

For much of the past year I haven’t been pleased with just how good Apple’s web caching has become on both OS X and iOS. Aggressively caching our test web pages produces artificially inflated battery life numbers, and that’s no fun for anyone. I’m happy to say that I’ve fixed that problem in our OS X battery life tests.
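One common way to defeat this kind of caching in a scripted test is to make every page request unique, for example by appending a throwaway query parameter. A minimal sketch of that idea (the URL below is a placeholder, not one of our actual test pages):

```python
import time

def cache_busted(url):
    """Append a unique query parameter so each load bypasses the page cache."""
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}nocache={int(time.time() * 1000)}"

# cache_busted("http://example.com/testpage")
# -> "http://example.com/testpage?nocache=<millisecond timestamp>"
```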

The suite is completely redone, although conceptually it’s quite similar to what I’ve run in the past. I have three separate workloads: light, medium and heavy. Each one represents a different stress level on the machine, and together the three give you a decent idea of the dynamic range of battery life you can expect from one of these notebooks. All three tests are run with the display set to 100 nits (a little above the halfway brightness point on most MacBook Pros).

The light and medium suites are inherently related - they use the same workload and simply vary its aggressiveness. The light test hits four different websites every minute, pausing for nearly the entire time to simulate reading. Flash is enabled and present on three of the sites. The long pause between page loads is what really makes this a light test. Web browsing may be the medium for the test, but if all you’re doing is typing, watching Twitter update and maybe lazily doing some other content consumption, this is a good representation of the battery life you’ll see. It’s a great way of estimating battery life if you’re going to use your notebook as a glorified typewriter (and likely a conservative estimate for that usage model).

The medium test hits the same web pages (Flash and all) but far more aggressively. Here there’s less than 10 seconds of reading time before moving on to the next page. It sounds like a small change, but the impact on battery life is tremendous.
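Conceptually, the light and medium runs boil down to the same loop with a different pause. A minimal sketch of that idea (the URLs and exact timings here are placeholders, not the actual test script):

```python
import time
import webbrowser

# Placeholder stand-ins for the four test pages.
TEST_PAGES = [
    "http://example.com/news",
    "http://example.com/video",
    "http://example.com/social",
    "http://example.com/blog",
]

def run_web_workload(reading_time_s, duration_hours=24):
    """Cycle through the test pages until the timer (or the battery) runs out.

    A long reading_time_s approximates the light test, where the machine sits
    idle for most of each minute; something under 10 seconds approximates the
    medium test.
    """
    deadline = time.time() + duration_hours * 3600
    while time.time() < deadline:
        for url in TEST_PAGES:
            webbrowser.open(url)          # load the page in the default browser
            time.sleep(reading_time_s)    # simulated reading time

# run_web_workload(reading_time_s=50)   # light-style cadence
# run_web_workload(reading_time_s=8)    # medium-style cadence
```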

Both the light and medium tests are run in their default state (processor graphics only) as well as with the discrete GPU forced on. I include the dGPU-on runs because all too often a single application open in the background will fire up the dGPU and contribute to draining your battery. The goal here is to deliver useful numbers, after all.

The final test is very similar to our old heavy multitasking battery life test, but with some updates. Here I’m downloading large files at a constant 1MB/s from a dedicated server and playing back a looped 1080p H.264 movie (the Skyfall trailer), all while running the medium battery life test. The end result is a workload that gives you a good idea of what a heavy multitasking usage model will do to battery life. I’ve found that OS X tends to fire up the dGPU on its own while running this workload, so I saw no reason to run a separate set of numbers for processor and discrete graphics.
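Holding a download at a roughly constant rate is straightforward; a minimal sketch of a ~1MB/s throttle (the URL is a placeholder for the dedicated server):

```python
import time
import urllib.request

CHUNK_BYTES = 64 * 1024   # read in 64KB chunks
TARGET_BPS = 1_000_000    # ~1MB/s sustained

def throttled_download(url):
    """Stream a file while sleeping between reads to hold the average rate."""
    start = time.time()
    received = 0
    with urllib.request.urlopen(url) as response:
        while True:
            chunk = response.read(CHUNK_BYTES)
            if not chunk:
                break
            received += len(chunk)
            # Sleep until the elapsed time matches what the target rate implies.
            pause = received / TARGET_BPS - (time.time() - start)
            if pause > 0:
                time.sleep(pause)

# throttled_download("http://example.com/largefile.bin")
```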

Light Workload Battery Life

Medium Workload Battery Life

Heavy Workload Battery Life

Overall the rMBP pretty much behaves as expected. Apple claims up to 7 hours of battery life, and with our light workload we see a bit more than that. Fire up the dGPU and even a light workload gets cut down to around 5.5 hours. Moderate usage drops battery life to around 5 hours, and with the dGPU on you’ll see that fall to about 3.5 hours. The heavy multitaskers in the audience will get a bit over 2 hours out of a single charge. Note that all of these numbers are at 100 nits; drive the 2880 x 1800 panel at its full brightness and you can expect a tangible reduction in battery life.

The rMBP’s integrated 95Wh battery is ginormous by today’s standards, but it’s really necessary to feed both the silicon and that impressive panel. Subjectively, I found the rMBP lasted longer than last year’s MacBook Pro despite the similar maximum battery life ratings, and our test results echo that experience.

I suspect most users will see around 5 hours of battery life out of the system, compared to a bit under 4 hours out of last year’s machine. At minimum brightness, typing a long document (similar to what I’m doing right now), you can significantly exceed Apple’s 7 hour estimate. As always, it really depends on your usage model. Professional users doing a lot of photo and video editing aren’t going to see anywhere near the maximum battery life, while writers and general users will be quite happy.

One trick to maximizing battery life with light or moderate workloads is to keep an eye on what the discrete GPU is doing. I still find that OS X will wake up the discrete GPU far too frequently, even when, in my opinion, its services aren’t needed. As always, I turn to Cody Krieger’s excellent gfxCardStatus app to keep an eye on which GPU is driving the panel. The app has been updated and is now fully compatible with the rMBP.
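If you’d rather check from a script than from the menu bar, the active GPU also shows up in system_profiler output: only the GPU that is actually driving a display reports a Displays section. A rough sketch, assuming the usual SPDisplaysDataType layout:

```python
import subprocess

def active_gpu():
    """Return the chipset name of whichever GPU currently has a display attached."""
    output = subprocess.check_output(
        ["system_profiler", "SPDisplaysDataType"]
    ).decode("utf-8", "ignore")
    current_chipset = None
    for line in output.splitlines():
        stripped = line.strip()
        if stripped.startswith("Chipset Model:"):
            current_chipset = stripped.split(":", 1)[1].strip()
        elif stripped.startswith("Displays:") and current_chipset:
            return current_chipset
    return None

# print(active_gpu())  # e.g. "Intel HD Graphics 4000" or "NVIDIA GeForce GT 650M"
```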

Comments

  • OCedHrt - Sunday, June 24, 2012 - link

He missed another important point: all of that was in a 3 lb machine. The current generation, starting from last summer, has an external discrete GPU and optical drive connected via a Thunderbolt-based connector (because Apple had exclusivity on Thunderbolt), with the laptop itself weighing only 2.5 lbs.

This isn't directly comparable to the Retina MacBook Pro, though - at 15 inches, 4.5 lbs is pretty impressive, although I think if Sony wanted to they could do 4 lbs or less.
  • deathdemon89 - Saturday, June 23, 2012 - link

I agree completely. I'm a proud owner of the old Z, and even today it doesn't feel the least bit dated. The 1080p screen is holding up admirably well, with no signs of pixelation at normal viewing distances. This device was truly innovative for its time. I still don't understand why it received such mixed reviews from the press.
  • Spunjji - Monday, June 25, 2012 - link

    Mainly the price. Only Apple are allowed to charge that much for a laptop. Also, only Apple can have hot systems. Repeat ad infinitum.
  • mlambert890 - Wednesday, November 28, 2012 - link

Really ridiculous comment. I can see you are bitter, as is the other mega Z fan, but come on already. I worked for Sony for 5 years and love the company. I have owned probably a dozen Vaios, including the top-of-the-line last-gen Z (with the SSD RAID).

Instead of ranting and raving, you need to ask yourself *why it is* that "only Apple can charge so much" and why "Anand only gives a free pass to Apple".

    You feel what exactly? That there is some grand conspiracy in play? Do you realize how ridiculous that sounds?

WHY has Sony *lost the ability to charge a premium*? In other words, WHY have they *burned all customer loyalty and goodwill*? I left the company back in 1999 because I saw the writing on the wall.

You (and the other Z guy) are no different from any other apologist. Companies don't bleed market share and fail to sell cancer-curing products (you guys are presenting the Z as "truly revolutionary", right?) for no reason. Sorry to tell you, there is no "big conspiracy".

Sony sells super-high-priced products into a super-commoditized market, then layers on a CRAP TON of bloatware that drags the machine to a stop, makes idiotic decisions like the HDMI one, and pushes proprietary crap *worse* than Apple ever has. All of that in the Wintel space which, sorry to tell you, was *always* driven by the cheapest possible parts at the best possible price.

The PC industry grew *because it was cheap*. Apple *always* occupied a premium niche. I vividly remember the initial release of the Apple I, the Lisa, the Mac 128K. These were always premium products, and the competition at the time (be it Commodore, Atari, TI, or the Wintel ecosystem) *always* captured share by being cheap.

That may annoy you for some reason, but Apple has pretty much *always* held a core premium audience. The only exception was the period of insanity when Jobs had been pushed out and Sculley destroyed the company. Even then, the core fans stayed.

You two make it sound like poor Sony is a victim because the world doesn't all run out and buy the Vaio Z.

Even *without Apple*, Sony would be going under, hate to tell you. Sony's problems are Sony's own, and with a laptop the whole is *not* just the sum of its parts.
  • solipsism - Saturday, June 23, 2012 - link

None of that makes sense, and it is, in fact, rubbish.

Sony added 1080p because it was popular, not because it made sense. You have a 168 PPI display on a 13" machine, which makes text too small to be a good experience for most users.

They also didn't use a good-quality display or add anything to the SW to make the experience good (unlike what Anand talked about in this review); they just added the single metric that was trending because of HDTVs.

Blu-ray in a notebook has always been a silly option for most users. There is a reason Blu-ray adoption failed on PCs, and it's not because everyone is stupid... except you. ODDs are long overdue for removal: they take up 25% of the chassis, have to be placed at an edge (eating over 5" of port real estate and restricting design), require a lot of power, are noisy, are more prone to breaking due to their many moving parts, are slow, are just too expensive to be feasible, and add nothing visually that most users trying to watch a movie can discern.

    Quad-SSDs? Really? That's a sensible solution for a consumer notebook?
  • EnzoFX - Saturday, June 23, 2012 - link

And that really is what people don't get. It isn't just about raw specs. The package needs to be complete, polished, what have you. In the case of high-DPI screens, that means good scaling support, and Apple delivered it. Support on the software side is something the Apple haters never give them credit for. All they can see is numbers, and they think "I've seen numbers like that before".
  • mabellon - Saturday, June 23, 2012 - link

No, Apple didn't do it. Just like on the iPad, they increased resolution by doubling width and height. Their software simply doesn't scale well to arbitrary higher resolutions. If it were done right, Chrome would work out of the box - instead the OS scales everything 2x without increasing resolution/quality.

To the consumer, that choice means a good experience without breaking apps. But claiming that Apple was successful simply because of software? HA!
  • Ohhmaagawd - Saturday, June 23, 2012 - link

    Did you actually read the retina part of the review?

Chrome doesn't work right because it does its own text rendering. Read the review. If an app uses the native text rendering, it will look good (at least the text portion). Developers will have to update their graphical assets, of course.

    BTW, Chrome Dev builds have this issue fixed.

The Windows DPI setting isn't the default, so few people use or even know about it, and devs haven't made sure their apps work properly at high DPI settings.

Apple has made a move that will be painful in the short term, in that apps that aren't updated will look a bit fuzzy. But since they made it the default, it will force devs to update.
  • OCedHrt - Sunday, June 24, 2012 - link

What do you mean the Windows DPI setting isn't the default? You can change it in a few clicks, but the same thing applies - if your app does not read the DPI values, then Windows can't help you. This is because the Windows UI is not vector-based (I don't know about now, but older apps definitely weren't) and many applications back then were coded with hard-coded pixel counts.

When the DPI is changed, Windows scales the text, but the UI dimensions are controlled by the application's implementation.
  • KitsuneKnight - Saturday, July 7, 2012 - link

On Windows, changing the DPI generally means a huge number of applications become simply unusable.

On the Retina MBP, the worst case appears to be slightly blurry text (which was quickly addressed by app updates).

    Apple's solution is a good one, because it does things in a way that should keep existing apps working fine, while allowing future developers to leverage new APIs to take advantage of the increased resolution.
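A tiny sketch of the trade-off this thread is describing: under fractional DPI scaling the system text a legacy app requests grows while its hardcoded pixel layout does not, so controls end up clipped, whereas Apple's integer 2x approach scales the whole coordinate system uniformly, so old layouts keep working and the worst case is fuzzy bitmaps. All of the pixel values below are made up purely for illustration:

```python
def label_fits(box_height_px, text_height_px, dpi_scale, app_is_dpi_aware):
    """Does a label still fit its container after the user raises the DPI?

    A DPI-unaware app keeps its hardcoded pixel layout while the system font
    it requested comes back larger, which is how controls end up clipped.
    """
    scaled_text = text_height_px * dpi_scale
    scaled_box = box_height_px * dpi_scale if app_is_dpi_aware else box_height_px
    return scaled_text <= scaled_box

print(label_fits(16, 13, 1.25, app_is_dpi_aware=False))  # False: text overflows the old layout
print(label_fits(16, 13, 1.25, app_is_dpi_aware=True))   # True: the layout grew with the text
print(label_fits(16, 13, 2.0,  app_is_dpi_aware=False))  # False: even worse at 200%
```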
