Battery Life

I'll begin this section with an admission: we need to update our battery life suite. With the introduction of the very first iPhone I put together a web page loading test that simply cycled through a bunch of web pages, pausing on each one to simulate reading time (I measured how long it took me to read a typical content page and used that as the reading time). Our web browsing battery life test is largely dominated by the power consumption of the display, but it also causes the CPU to wake up from its low power states and hits the WiFi/cellular stacks as well. The test served us reasonably well over the years; however, it's getting a bit long in the tooth, especially given that mobile browsers have become more aggressive about caching content. The move to iOS 5 in particular hurt our web browser test: it caches so much of the content of each page that our cellular results now closely mirror our WiFi results on the iPhone 4/4S. There's still a bit of a penalty to be paid over 3G, but not nearly as much as there would be in the real world. The test data is still valid; it's simply no longer representative of real world web browsing battery life, but rather a more academic look at very light (but continuous) smartphone usage.

Thankfully we do have other tools at our disposal until we update the web browsing suite. Brian Klug devised a hotspot test that really stresses the cellular baseband of these phones by constantly streaming content over the Internet, via the phone being tested, to a tethered notebook. Between our hotspot, web browsing and call tests we should be able to get a good idea of how the iPhone 4S performs on battery.

Before we get to the results, let's talk a little bit about what we should see architecturally. As Brian already mentioned at the start of the review, battery capacity is up slightly in the iPhone 4S. The increase is marginal at best, on the order of 1%, so it shouldn't have a tangible impact on battery life.

The display is a major consumer of power, but with its specs unchanged from the iPhone 4, the 4S' panel shouldn't consume any more power than its predecessor's. This leaves the A5 SoC and the Qualcomm MDM6610 baseband as the primary influences on power consumption.

Process technology hasn't changed going from the A4 to the A5; as far as we know, both chips are built on Samsung's 45nm process. At the core level, a single ARM Cortex A9 core is about 10 - 50% faster than a Cortex A8 at the same frequency. Thankfully Apple kept frequency constant with the move to the A5 in the 4S, making this comparison a bit easier to make.

NVIDIA originally told me that the Cortex A9 was more power efficient than the A8 it replaced. The A9 has a shorter, more efficient pipeline and, in the case of the A5, isn't pushing ridiculous frequencies. Based on Apple's frequency targets alone I'd say that it's probably a safe bet that we're looking at a 45nm LP implementation.

To claim the A9 is more power efficient than the A8 isn't enough, however. If we look at Larrabee and Intel's Atom, it's clear that when faced with the ultimate goal of minimizing power consumption, an in-order core is the way to go. In the ARM space, the recently announced Cortex A7 offers an additional data point: when ARM needed a low power core, it picked an in-order design with an 8-stage pipeline. The additional hardware required by an OoO architecture consumes significant power, and the gains in performance aren't always enough to offset the corresponding increase in power.

Why would being faster make a microprocessor use less power? The concept is called race to sleep. At idle, the CPU in an SoC is mostly clock gated if not power gated entirely; in this deep sleep state, power draw is on the order of a few milliwatts. Under full load, however, power consumption can be well above a watt. If a faster processor consumes more power under load but can get to sleep quicker, the energy saved while asleep may give it an advantage over a slower processor. Consider the following examples:

Here we have two hypothetical CPUs, one with a max power draw of 1W and another with a max power draw of 1.3W. The 1.3W chip is faster under load, but it draws 30% more power. Running this completely made-up workload, the 1.3W chip completes the task in 4 seconds vs. 6 for its lower power counterpart, and thus overall energy consumed is lower. Another way of quantifying this is to say that over the full 10 seconds, CPU A (the 1.3W chip) consumes 5.5 joules vs. 6.2J for CPU B (assuming both chips have the same 0.05W idle power consumption).

Now let's take the same two hypothetical CPUs and present them with a workload that doesn't scale nearly as well on the faster part:

Despite being faster, the 1.3W CPU isn't fast enough to overcome its 30% higher power draw. Here CPU A consumes 9.25J vs. 8.1J for CPU B. Perhaps the faster CPU has more cores and the workload isn't well threaded, or maybe the workload is better optimized for the slower architecture; regardless of the reason, this is just as valid a scenario.
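
Those figures are easy to reproduce numerically. Below is a minimal Python sketch of both cases; the wattages, the 10 second window and the 0.05W idle draw come straight from the text above, while the 7 and 8 second task times in the second scenario are my back-calculation from the stated 9.25J/8.1J totals:

    # Energy over a fixed window: full load power while busy, idle draw afterwards.
    IDLE_W = 0.05  # idle power, assumed identical for both chips (per the text)

    def energy_joules(load_w, busy_s, window_s=10):
        return load_w * busy_s + IDLE_W * (window_s - busy_s)

    # Scenario 1: the faster chip wins the race to sleep.
    print(round(energy_joules(1.3, 4), 2))  # CPU A (1.3W, done in 4s) -> 5.5 J
    print(round(energy_joules(1.0, 6), 2))  # CPU B (1.0W, done in 6s) -> 6.2 J

    # Scenario 2: the workload doesn't scale as well on the faster part.
    print(round(energy_joules(1.3, 7), 2))  # CPU A -> 9.25 J
    print(round(energy_joules(1.0, 8), 2))  # CPU B -> 8.1 J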

Albeit overly simplified, these two cases are examples of what could happen between the iPhone 4 and iPhone 4S. ARM hasn't published much data comparing the Cortex A8 to the A9, but it has publicly stated that a single A9 core can consume 10 - 20% more power than a single A8 core. If we assume those numbers apply under max load, then the A9 simply needs to be more than 10 - 20% faster than the A8 in order to come out ahead. As we've already seen from some of our benchmarks, that's not too difficult, particularly in web browsing. In other tests, the advantage is more marginal.
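
As a sanity check on that break-even point: energy per task is just load power multiplied by time on task, so the required speedup tracks the power increase directly. A quick sketch (this ignores idle power, which only helps the faster core):

    # Relative load energy of a higher power, higher performance core.
    # A ratio below 1.0 means the faster core comes out ahead on energy.
    for power_increase in (0.10, 0.20):
        for speedup in (0.10, 0.20, 0.30, 0.50):
            ratio = (1 + power_increase) / (1 + speedup)
            print(f"+{power_increase:.0%} power, +{speedup:.0%} perf: {ratio:.2f}x the energy")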

The comparison becomes more complex when you take into account that there are two Cortex A9s in Apple's A5 SoC vs. a single Cortex A8 in Apple's A4. This is potentially an advantage, as a well threaded app could run both cores at a lower voltage/frequency combination (power scales roughly with the square of voltage, so the savings add up quickly) while the single core would have to run at its maximum voltage/frequency.
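
To see why spreading work across two cores can pay off, here's a rough sketch using the standard dynamic power relation (power scales with capacitance x voltage squared x frequency). The voltage/frequency pairs below are invented for illustration and are not Apple's actual operating points:

    # Dynamic power ~ C * V^2 * f (arbitrary units)
    def dynamic_power(voltage, freq_mhz, capacitance=1.0):
        return capacitance * voltage**2 * freq_mhz

    one_core_flat_out = dynamic_power(voltage=1.1, freq_mhz=800)      # single core, max V/f
    two_cores_lower   = 2 * dynamic_power(voltage=0.9, freq_mhz=500)  # two cores, lower V/f
    print(round(one_core_flat_out))  # 968
    print(round(two_cores_lower))    # 810 -- more aggregate throughput for less power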

It's also possible that two cores would consume more power, but for that to happen you'd have to run a heavily threaded app at full frequency for a considerable amount of time. To date I haven't seen many smartphone apps that would create such a scenario, but it's akin to looping Cinebench on a quad-core vs. a dual-core part and noting a reduction in battery life for the quad-core CPU. Although the faster part is quicker to complete each pass, the fact that you're looping the workload indefinitely prevents its speed from ever being an advantage for battery life.
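
To put the looping analogy in numbers: if the workload repeats for the entire window, both parts sit at max power the whole time, and the faster chip's head start buys it nothing (hypothetical figures again):

    # A looped benchmark keeps both chips at a 100% duty cycle, so energy over a
    # fixed window is just load power x time; race to sleep never enters into it.
    window_s = 3600            # one hour of looping
    print(1.0 * window_s)      # slower chip: 3600 J
    print(1.3 * window_s)      # faster chip: 4680 J, 30% more energy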

I crudely measured power consumption on the iPhone 4 and 4S (both on AT&T) doing a variety of tasks. The granularity of my measurements is what makes them crude: I was limited to a resolution of 0.1W. While this data would've been far more useful at 0.01W resolution, it's still enough to get a general idea of power consumption on these two phones. I briefly contemplated inserting a multimeter in-line with the battery, but I chickened out, not wanting to risk damage to my phone or review device. I've highlighted the obvious power advantages, although keep in mind some of these advantages may be smaller (or larger) than they appear due to the 0.1W resolution of my measurements:

Power Consumption Comparison
Task                                             Apple iPhone 4 (AT&T)   Apple iPhone 4S (AT&T)
Idle                                             0.7W                    0.7W
Launch Safari                                    0.9W                    0.9W
Load AnandTech.com                               1.0W                    1.1W
Maps (Determine Current Location via GPS/WiFi)   1.3W                    1.4W

Power at idle and during application launches was essentially unchanged between the two devices, which is to be expected. The 4S did draw measurably more power loading web pages. As we've already seen, however, the average performance gain in our web page loading tests was over 30%, easily making up for the increase in power draw here. Maps, on the other hand, pulled more power on the 4S.
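
The web browsing case is worth a quick back-of-the-envelope calculation. Using the measured 1.0W/1.1W figures and the roughly 30% average page loading advantage, energy per page actually drops on the 4S; the 5 second baseline load time below is an arbitrary assumption for illustration:

    # Energy per page load: slightly higher power, but meaningfully less time.
    load_time_4  = 5.0        # seconds, arbitrary baseline
    load_time_4s = 5.0 / 1.3  # ~30% faster on average
    print(round(1.0 * load_time_4, 1))   # iPhone 4:  5.0 J per page
    print(round(1.1 * load_time_4s, 1))  # iPhone 4S: ~4.2 J per page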

What does all of this mean? The iPhone 4S has the potential to have slightly better, equal or much worse battery life than the iPhone 4. It really depends on your workload. If you're mostly browsing the web, the 4S should be about equal to if not slightly better than the 4. Our numbers seem to back that up:

Smartphone Web Browsing Battery Life

Even though the 3G results are skewed by an unrealistic amount of caching, the CPU still has to work to render and display each page. Since the workload remains the same between the iPhone 4 and 4S, the latter simply enjoys a performance improvement (pages load quicker) while extending battery life a bit thanks to being asleep for longer.

WiFi Web Browsing Battery Life

There is one caveat to web browsing battery life: the 4S will only last longer if you do the same amount of work on it. Typically, if web pages load quicker, you end up browsing more on the faster device than you would on the slower one. If you do browse more on the 4S as a result of its speed improvements, battery life won't be as good as it was on the 4. There's nothing you can do about this - faster CPUs and faster Internet connections have always encouraged more browsing - but it's something to keep in mind if you make the upgrade.

3D Gaming Battery Life

Power Consumption Comparison
Task                                           Apple iPhone 4 (AT&T)   Apple iPhone 4S (AT&T)
Launch Infinity Blade                          2.2W                    2.6W
Infinity Blade (Opening Scene, Steady State)   2.0W                    2.2W

Infinity Blade is a GPU intensive 3D game, which obviously causes the GPU transistors to fire up on both SoCs. Given the beefier GPU in the 4S, much higher power consumption here isn't unexpected. Since battery capacity hasn't really changed, and the 4S draws significantly more power under heavy GPU load (even when limited by Vsync), you can expect lower battery life when running GPU intensive 3D games. To put some real world numbers to the data, I ran a loop of Epic's Citadel demo on both the 4 and 4S until both phones died:

3D Gaming Battery Life - Epic Citadel Demo

The iPhone 4 lasted around 30% longer in our GPU test than the iPhone 4S. This is actually a trend we've seen before: with the move to the 3GS, we noted a similar impact on battery life compared to the previous iPhone 3G. If you're going to do any heavy 3D gaming, expect the iPhone 4S to burn through your battery quicker, although you will have a better experience on the 4S thanks to a smoother frame rate. Note that for sufficiently light 3D workloads (e.g. where the iPhone 4 is already bumping into Vsync), it's unlikely that you'll see much of a difference in battery life between the two phones. Citadel is simply too strenuous a test for the 4; what really penalizes the 4S is its ability to run at nearly 2x the frame rate of the 4.

Power Consumption Comparison
Task                                             Apple iPhone 4 (AT&T)   Apple iPhone 4S (AT&T)
Launch iBooks                                    1.3W                    1.2W
iBooks Page Turning Animation (Rapid Movement)   1.6W                    1.5W

If you're concerned that GPU acceleration throughout the OS will penalize the 4S, I wouldn't be too worried. The data above shows power consumption while running iBooks. For the second test I took a book page and quickly moved it left/right to trigger the ever impressive page turning animation. Doing so drove power consumption up, but the 4S consistently pulled less power than the iPhone 4. If you're going to be at the forefront of 3D gaming on iOS, the 4S won't last as long as its predecessor. For casual use, you should be just fine.

3G/WiFi Battery Life

I ran several speed tests in the same location on both 3G and WiFi to see whether the baseband and WiFi stack in the 4S are more power efficient than those in the 4. The results unanimously agree: the 4S is more power efficient when uploading/downloading at the limits of 3G and WiFi:

Power Consumption Comparison
Task                            Apple iPhone 4 (AT&T)   Apple iPhone 4S (AT&T)
Speed Test (3G, Downstream)     2.8W                    2.4W
Speed Test (3G, Upstream)       3.0W                    2.8W
Speed Test (WiFi, Downstream)   1.5W                    1.4W
Speed Test (WiFi, Upstream)     1.6W                    1.4W

Our tethered test gives us a good idea of how quickly the 4S will die under moderate cellular data load. Apple's power advantages under iOS are due to wonderful management of idle time, similar to what we've seen with OS X vs. Windows 7. Under load, however, Apple is bound by the same physical realities as its competitors, and the question of battery life becomes one of battery capacity divided by peak power draw. Here the iPhone 4S does very well, but it's outpaced by the upper echelon of Android phones:

WiFi Hotspot Battery Life Time

It is surprising that despite the peak power advantages above, we didn't see any improvement in our WiFi hotspot test. The only explanation I have is that the power advantage may not be as pronounced if we're not pushing the limits of the wireless interfaces.
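
For reference, the capacity-divided-by-draw math is easy to run against the speed test numbers above. The ~5.3 Wh figure is the commonly cited capacity of the 4S battery rather than something we measured here, and these are rough ceilings since they ignore display state and conversion losses:

    # Load-bound battery life estimate: runtime = capacity / average draw.
    CAPACITY_WH = 5.3  # approximate iPhone 4S pack capacity (assumed)
    for label, draw_w in (("3G downstream", 2.4), ("WiFi downstream", 1.4)):
        print(label, round(CAPACITY_WH / draw_w, 1), "hours")  # ~2.2 and ~3.8 hours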

Call time, on the other hand, improves tangibly compared to the iPhone 4. As the screen is off and the CPU mostly idle during this test, it really just echoes the numbers we saw above. Qualcomm's MDM6610 seems to outclass the outgoing Infineon X-Gold baseband when it comes to power efficiency:

3G Talk Time Battery Life

Based on the data we have here, I'd say Apple's claim of 8 hours of battery life is fairly realistic under some sort of continuous use/load. If you're constantly pulling data, don't expect to see more than 5 hours; if you're mostly reading/watching/consuming content, you'll get closer to 10 hours on the iPhone 4S. Call time falls at the longer end of the spectrum, but be warned: run a demanding 3D title and you'll see barely over 3 hours of use out of the iPhone 4S. It looks like any serious 3D gaming is going to have to happen tethered to, or at least near, a power outlet. The move to 28/32nm should buy us some more power headroom, but then again there are even faster GPUs just around the corner.

Based on our data, concerns about the iPhone 4S' battery life seem unrelated to hardware. The raw power consumption numbers show a platform that's competitive with its predecessor in most areas, only really hurting when it comes to heavy 3D workloads. If you're seeing worse battery life on the 4S, the cause would appear to be software related. If you're seeing higher than normal power consumption, your best course of action is to wipe the device, set it up from scratch (no restore), remove and re-add all accounts, and reset network settings.

Moving forward, I wouldn't be too surprised to see battery life remain around this level for the near future without significant advancements in battery or process technology. The next generation of microprocessor architectures will simply be more robust out-of-order designs. As we've learned from the move to multi-core on the PC side, however, continued gains in single threaded performance become increasingly difficult to come by - particularly without expending a lot of energy. There is hope for an increase in efficiency via heterogeneous multiprocessing, but just how much that will buy us remains to be seen. Process technology and architecture are going to become even more important over the coming years in the mobile space.

Comments

  • metafor - Tuesday, November 1, 2011

    When you say power efficiency, don't you mean perf/W?

    I agree that perf/W varies depending on the workload, exactly as you explained in the article. However, the perf/W is what makes the difference in terms of total energy used.

    It has nothing to do with race-to-sleep.

    That is to say, if CPU B takes longer to go to sleep but has better perf/W, it will use less energy. In fact, I think this is what you demonstrated with your second example :)

    The total energy consumption is directly related to how power-efficient a CPU is. Whether it's a slow processor that runs for a long time or a fast processor that runs for a short amount of time; whichever one can process more instructions per second vs joules per second wins.

    Or, when you take seconds out of the equation, whichever can process more instructions/joule wins.

    Now, I assume you got this idea from one of Intel's people. The thing their marketing team usually forgets to mention is that when they say race-to-sleep is more power efficient, they're not talking about the processor, they're talking about the *system*.

    Take the example of a high-performance server. The DRAM array and storage can easily make up 40-50% of the total system power consumption.
    Let's then say we had two hypothetical CPUs with different efficiencies: CPU A being faster but less power efficient, and CPU B being slower but more power efficient.

    The total power draw of DRAM and the rest of the system remains the same. And on top of that, the DRAM and storage can be shut down once the CPU is done with its processing job but must remain active (DRAM refreshed, storage controllers powered) while the CPU is active.

    In this scenario, even if CPU A draws more power processing the job compared to CPU B, the system with CPU B has to keep the DRAM and storage systems powered for longer. Thus, under the right circumstances, the system containing CPU A actually uses less overall power because it keeps those power-hungry subsystems active for a shorter amount of time.

    However, how well this scenario translates into a smartphone system, I can't say. I suspect not as well.
  • Anand Lal Shimpi - Tuesday, November 1, 2011

    I believe we're talking about the same thing here :)

    The basic premise is that you're able to guarantee similar battery life, even if you double core count and move to a power hungry OoO architecture without a die shrink. If your performance gains allow your CPU/SoC to remain in an ultra low power idle state for longer during those workloads, the theoretically more power hungry architecture can come out equal or ahead in some cases.

    You are also right about platform power consumption as a whole coming into play. Although with the shift from LPDDR1 to LPDDR2, an increase in effective bandwidth, and a number of other changes, it's difficult to deal with them independently.

    Take care,
    Anand
  • metafor - Tuesday, November 1, 2011

    "If your performance gains allow your CPU/SoC to remain in an ultra low power idle state for longer during those workloads, the theoretically more power hungry architecture can come out equal or ahead in some cases."

    Not exactly :) The OoOE architecture has to perform more tasks per joule. That is, it has to have better perf/W. If it had worse perf/W, it doesn't matter how much longer it remains idle compared to the slower processor. It will still use more net energy.

    It's total platform power that may see savings, despite a less power-efficient and more power-hungry CPU. That's why I suspect that this "race to sleep" situation won't translate to the smartphone system.

    The entire crux relies on the fact that although the CPU itself uses more power per task, it saves power by allowing the rest of the system to go to sleep faster.

    But smartphone subsystems aren't that power hungry, and CPU power consumption generally increases with the *square* of performance. (Granted, this wasn't the case going from the A8 to the A9, but you can bet it will be going from the A9 to the A15.)

    If the increase in CPU power per task is greater than the savings of having the rest of the system active for shorter amounts of time, it will still be a net loss in power efficiency.

    Put it another way. A9 may be a general power gain over A8, but don't expect A15 to be so compared to A9, no matter how fast it finishes a task :)
  • doobydoo - Tuesday, November 1, 2011

    You are both correct, and you are also both wrong.

    Metafor is correct because any chip, given a set number of tasks to do over a fixed number of seconds, regardless of how much faster it can perform, will consume more energy than an equally power efficient but slower chip. In other words, being able to go to sleep quicker never means a chip becomes more power efficient than it was before. It actually becomes less so.

    This is easily logically provable by splitting the energy into two sections. If 2 chips are both equally power efficient (as in they can both perform the same number of 'tasks' per joule), and one is twice as fast, it will draw twice the power while active but complete in half the time, so that element will ALWAYS be equal for both chips. However, the chip which finished sooner will then have to sit idle for LONGER, so the idle expense of energy will always be higher for the faster chip. This assumes, as I said, that the idle power draw of both chips is equal.

    Anand is correct because if you DO have a more power efficient chip with a higher maximum power draw, race-to-sleep is OFTEN (assuming reasonable idle times) the reason it can actually use less energy. Consider 2 chips: one consumes 1.3W (max) and can carry out '2' tasks per second, while a second consumes 1W (max) and can carry out '1' task per second (so it is less power efficient). Now consider a world without race-to-sleep. To carry out '10' tasks over a 10 second period, chip one would take 5 seconds but would remain at full power for the full 10 seconds, thereby using 13J. Chip two would take 10 seconds and would use a total of 10J over that period. Thus, the more power efficient chip actually proved less efficient overall.

    Now if we factor in race-to-sleep, the first chip can draw 1.3W for the first 5 seconds, then drop to 0.05W for the last 5, consuming 6.75J. The second chip would still consume the same 10J.

    Conclusion:

    If the chip is not more power efficient, it can never consume less energy, with or without race-to-sleep. If the chip IS more power efficient but doesn't have the sleep facility, it may not use less energy in all scenarios.

    In other words, for a higher powered chip to reduce energy use in ALL situations, it needs to a) be fundamentally more power efficient, and b) be able to sleep (race-to-sleep).
  • djboxbaba - Monday, October 31, 2011

    Well done on the review Brian and Anand, excellent job as always. I was resisting the urge to tweet you about the ETA of the review, and of course I ended up doing it the same day as you released the review :).
  • Mitch89 - Monday, October 31, 2011

    "This same confidence continues with the 4S, which is in practice completely usable without a case, unlike the GSM/UMTS iPhone 4. "

    Every time I read something like this, I can't help but compare it to my experience with iPhone 4 reception, which was never a problem. I'm on a very good network here in Australia (Telstra), and never did I have any issues with reception when using the phone naked. Calls in lifts? No problem. Way outside the suburbs and cities? Signal all the way.

    I never found the iPhone 4 to be any worse than other phones when I used it on a crappy network either.

    Worth noting, battery life is noticeably better on a strong network too...
  • wonderfield - Tuesday, November 1, 2011

    Same here. It's certainly possible to "death grip" the GSM iPhone 4 to the point where it's rendered unusable, but this certainly isn't the typical use case. For Brian to make the (sideways) claim that the 4 is unusable without a case is fairly disingenuous. Certainly handedness has an impact here, but considering 70-90% of the world is right-handed, it's safe to assume that 70-90% of the world's population will have few to no issues with the iPhone 4, given it's being used in an area with ample wireless coverage.
  • doobydoo - Tuesday, November 1, 2011

    I agree with both of these. I am in a major capital city which may make a difference, but no amount or technique of gripping my iPhone 4 ever caused dropped calls or stopped it working.

    Very much an over-stated issue in the press, I think
  • ados_cz - Tuesday, November 1, 2011

    It was not over-stated at all, and the argument that most people are right handed does not hold ground. I live in a small town in Scotland and my usual signal strength is 2-3 bars. If I'm browsing the net over 3G without a case and holding the iPhone 4 naturally with my left hand (using the right hand for touch commands), I lose signal completely.
  • doobydoo - Tuesday, November 1, 2011

    Well the majority of people don't lose signal.

    I have hundreds of friends who have iPhone 4's who've never had any issue with signal loss at all.

    The point is you DON'T have to be 'right handed' for them to work; I have left handed friends who also have no issues.

    You're the exception, rather than the rule - which is why the issue was overstated.

    For what it's worth, I don't believe you anyway.
