Battery Life

At this point, it goes without saying that battery life can make or break the experience of a smartphone. The anxiety that comes with running out of battery is one of the worst parts of using one, which is why good battery life matters so much. In theory, a phone should never run out of charge in a single day no matter the use case, but battery life is a complex problem to address. It’s common to see people assume that battery capacity and battery life are closely correlated, but this completely ignores total system power draw. Last year, one of the best examples of this was the One M8 compared against the Galaxy S5: despite its smaller battery, the One M8 held a slight edge in battery life.

The Galaxy S6 and S6 edge are in a similarly peculiar situation. For the past few years, it has been a given that battery capacity would increase from year to year, but for the first time Samsung has gone backwards in this regard. The Galaxy S6 has a 2550 mAh, 3.85V battery, which holds 91% of the energy of the Galaxy S5’s 2800 mAh battery. If we looked at this metric alone, it would be trivial to write off the Galaxy S6 as worse than the S5 in battery life. As previously mentioned, though, this is a simplistic view of the situation that ignores the other half of the battery life equation.
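
For reference, converting both packs to watt-hours makes the gap concrete. A minimal calculation, assuming the S5’s published 2800 mAh rating and the same 3.85 V nominal voltage for both:

    # Nominal battery energy; 3.85 V is used as the nominal voltage for both packs
    s6_energy_wh = 2.550 * 3.85   # 2550 mAh -> ~9.8 Wh
    s5_energy_wh = 2.800 * 3.85   # 2800 mAh -> ~10.8 Wh
    print(round(s6_energy_wh / s5_energy_wh, 2))   # ~0.91, i.e. ~91% of the S5's energy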

As a result, we must test battery life holistically, which is far from trivial in most cases. One of the first problems we encounter in trying to test battery life is display brightness, which can significantly affect the results of a battery life test. Although it’s common to standardize at 50% brightness, that setting can be as low as 80 nits or as high as 250 nits depending upon the brightness curve that the OEM sets. In order to avoid this pitfall, we test battery life with the display calibrated to 200 nits when displaying a white image. In addition, it’s necessary to have tests that cover the full curve of performance and power, ranging from a display-bound web browsing use case to sustained, intense CPU and GPU loads.

As with most reviews, our first battery life test is the standard web browsing workload, which loads a set of webpages at a fixed interval, with sufficient time between each page load to ensure that the modem and SoC can reach an idle state. This helps to ensure that faster SoCs aren’t penalized in this test. This test doesn’t exactly match real-world web browsing patterns, but it will give a good idea of relative performance per watt at a constant level of performance.
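
As a rough sketch of how such a test is structured (the URLs, interval, and iteration count below are placeholders, not our actual harness):

    import time
    import webbrowser

    PAGES = ["https://example.com/a", "https://example.com/b"]   # placeholder page set
    IDLE_BETWEEN_LOADS_S = 60   # long enough for the SoC and modem to return to idle

    def run_browsing_loop(iterations=100):
        # Fixed work per unit time: each page load is followed by an idle window,
        # so a faster SoC simply races to idle rather than being given more work.
        for _ in range(iterations):
            for url in PAGES:
                webbrowser.open(url)                 # trigger a page load
                time.sleep(IDLE_BETWEEN_LOADS_S)     # idle period between loads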

Web Browsing Battery Life (WiFi)

In web browsing, the Galaxy S6 manages to keep WiFi battery life at approximately the same level as the Galaxy S5. It’s likely that the combination of the newer Broadcom BCM4358, the upgraded AMOLED display, and the Exynos 7420 kept total power consumption roughly flat despite the smaller battery, which works out to a 10-15% increase in overall power efficiency in this test. We’re likely looking mostly at differences in display efficiency when comparing the 1440p panel of the S6 to the 1080p panel of the S5. It’s definitely impressive that Samsung has pulled this off, but I do wonder what the result would be if Samsung had stayed at 1080p.
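
That efficiency figure follows directly from the capacity change: if the runtime is roughly flat while the battery holds ~91% of the S5’s energy, average platform power must have fallen by a similar margin. A back-of-the-envelope check, assuming equal runtimes:

    energy_ratio = (2.550 * 3.85) / (2.800 * 3.85)   # S6 energy / S5 energy ~= 0.91
    runtime_ratio = 1.0                              # assume roughly equal runtimes
    power_ratio = energy_ratio / runtime_ratio       # average power ~9% lower on the S6
    efficiency_gain = 1 / power_ratio - 1            # runtime per Wh improves ~10%
    print(round(efficiency_gain * 100))              # ~10 (%), the low end of 10-15%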

Web Browsing Battery Life (4G LTE)

On LTE, we see a pattern that generally mirrors devices like the iPhone 6 with its external MDM9x25 Gobi modem. The Shannon 333 modem and Samsung’s other RF front-end components seem to be competitive with Qualcomm’s implementations, but given just how close WiFi and LTE battery life were with the Snapdragon 801 generation, I suspect Qualcomm still holds an edge in average RF system power. The difference isn’t massive here, so it’s possible that this is simply the difference between an external and an integrated modem, but we’ll have to do a deeper investigation of power to be sure.

While web browsing is one of the crucial use cases, it only captures one point on the performance/power curve. In order to get a better idea of battery life in less display-bound use cases, we’ll look at PCMark’s Work Battery Life test. Although it isn’t a fixed-workload-per-unit-time test like our web browsing test, it avoids strongly emphasizing display power in high-APL scenarios, tends to be more CPU and GPU intensive, and runs through more mixed APL content.

PCMark - Work Battery Life

In this test, the Galaxy S6’s runtime is pretty close to the One M8 and One M9, but the major point of differentiation compared to both is that the score throughout the test is significantly higher. It’s also important to note that the reported battery temperature during the test stays much lower on the Galaxy S6 than on the One M9, which means the SoC stayed in a more efficient operating regime throughout the test, as leakage power rises with die temperature.
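
To illustrate why temperature matters, a toy leakage model is sketched below; the reference power and the doubling interval are illustrative placeholders rather than measured values, but they capture the direction of the effect:

    def leakage_power(p_ref_w, temp_c, ref_temp_c=40.0, doubling_c=20.0):
        # Toy model: static (leakage) power roughly doubles every `doubling_c` degrees.
        return p_ref_w * 2 ** ((temp_c - ref_temp_c) / doubling_c)

    # The same workload costs more total power on a hotter die, so a phone that keeps
    # its reported battery/skin temperature lower can sustain a load more efficiently.
    print(leakage_power(0.3, 40))   # 0.3 W at the reference temperature
    print(leakage_power(0.3, 60))   # ~0.6 W after a 20 C rise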

Now that we have a good idea of battery life in display-bound and balanced workloads, we can look at SoC-bound workloads, namely GFXBench and Basemark OS II. These tests are almost always limited by how much heat the case of the phone can carry away, and they can often reveal weaknesses in how OEMs control throttling if a phone or tablet fails to complete either test. We’ll start by looking at GFXBench, which strongly stresses the GPU, similar to an intense 3D game.

GFXBench 3.0 Battery Life

GFXBench 3.0 Performance Degradation

The Galaxy S6 ends up performing at around the same level as the One M9 in terms of overall runtime, and the sustained frame rate ends up relatively similar as well. However, it’s critical to add context here, as the Galaxy S6 is running the test at 1440p while the One M9 renders at 1080p. What this means is that the Mali T760 in the Galaxy S6 is sustaining a higher level of performance than the Adreno 430 in the One M9 in this workload, even if that performance is “wasted” on rendering more pixels per frame. The one major issue visible in the FPS vs. time graph is that Samsung continues to struggle with graceful throttling: the GPU always targets maximum performance, which causes a strong rise and fall in frame rate as the GPU cycles through periods of high and low clock speeds set by the thermal governor.
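
A toy simulation of the two governor strategies makes the difference in the FPS-versus-time behavior easier to see; every constant here is an illustrative placeholder rather than a measured value for either phone:

    # Toy thermal/governor model contrasting two throttling strategies.
    def simulate(policy, steps=400, t_limit=70.0, t_ambient=25.0):
        temp, freq, trace = t_ambient, 1.0, []
        for _ in range(steps):
            freq = policy(temp, freq, t_limit)
            power = 2.0 * freq ** 2                          # crude power ~ f^2 proxy
            temp += 0.5 * power - 0.02 * (temp - t_ambient)  # heating minus cooling
            trace.append(freq)
        return trace

    def max_performance(temp, freq, t_limit):
        # Always target peak clocks; back off hard at the limit and jump straight
        # back once there is headroom -> the rise-and-fall seen in the FPS graph.
        if temp >= t_limit:
            return 0.4
        if temp <= t_limit - 10:
            return 1.0
        return freq

    def graceful(temp, freq, t_limit):
        # Nudge clocks up or down toward whatever level the chassis can sustain.
        step = 0.005 if temp < t_limit - 5 else -0.005
        return max(0.4, min(1.0, freq + step))

    oscillating = simulate(max_performance)   # sawtooth between 1.0 and 0.4
    steady = simulate(graceful)               # settles near a sustainable clock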

BaseMark OS II Battery Life

BaseMark OS II Battery Score

The final battery life test is Basemark OS II’s sustained CPU load test. Although it appears that the Galaxy S6 is comparable to the One M9 in this test, logging CPU frequencies over time reveals that the Exynos 7420 manages to keep the A57 cluster online throughout the course of the test at around 1.2 GHz, while the One M9 is forced to shut off the A57 cluster completely as the phone reaches skin temperature limits. Although both are kept at similar levels of normalized CPU load and run through the test for similar amounts of time, the Galaxy S6 manages to keep the CPU at a significantly higher performance level throughout the test. In general, it’s likely that the Exynos 7420 will be able to sustain overdrive frequencies for longer periods of time due to the massive process node advantages that come with Samsung’s 14LPE process.

Charge Time

Broadly speaking, much of the discourse around battery life has centered on time off the charger. We can talk about how many hours of screen-on time or total time a phone can last on a battery, but charging time is often just as critical to maintaining mobility. Removable batteries might help with this problem, but if it’s easy to forget to charge a phone overnight, it’s just as easy to forget to charge a spare battery. Charge rate is crucial for this reason, which is why we test it: we measure the time it takes to charge from a fully discharged battery to 100%, either measured at the wall or as indicated by the charging LED. The Galaxy S6 retains the same fast charge protocol as the Note 4, which appears to be Quick Charge 2.0, as the AC adapter negotiates fast charging with phones like the LG G Flex 2 and One M8.

Charge Time

When using the included USB charger, the Galaxy S6 charges incredibly quickly. However, the wireless charger is noticeably slower than the wired charger, which is due to the inefficiencies associated with wireless charging and the limited charge rate, which appears to be capped at 1.5 amps at 5 volts. It’s a bit surprising that there is no option to disable fast charging the way there was on the Note 4, given that the battery is now sealed and rather difficult to replace, but I suspect most users won’t notice a difference in battery lifetime unless the phone is used for more than 2-3 years.
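
For a rough sense of the gap, compare the input power of the two chargers; the 9 V / 1.67 A wired figure is the adapter’s commonly cited fast-charge rating and should be treated as an assumption here:

    wired_w = 9.0 * 1.67      # ~15 W at the wall adapter in fast-charge mode (assumed rating)
    wireless_w = 5.0 * 1.5    # ~7.5 W into the wireless charging pad
    print(round(wireless_w / wired_w, 2))   # ~0.5: half the input power before coil and
                                            # conversion losses, so longer charge times follow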
