Battery Life

Both the Exynos and Snapdragon versions of the Note 4 sport the same 3220mAh, 3.85V battery. The difference lies in the internal components: the Snapdragon variant runs on Qualcomm's Snapdragon 805 platform, which includes Qualcomm's own PMIC, audio solution, and modem IC, while Samsung provides its own PMIC but relies on an audio solution from Wolfson Microelectronics and a modem from Ericsson.

Web Browsing Battery Life (WiFi)

Unfortunately the battery disadvantage for the Exynos version is quite large in our web browsing benchmark: here the Exynos comes out 79 minutes short of the Qualcomm version. I commented on the A57 page on how poor Samsung's software power management is on the Exynos 5430 and 5433, which run a quite outdated version of GTS to control big.LITTLE migration in the Linux kernel scheduler. I see this result as a direct consequence of an inferior CPUIdle driver configuration and a lack of big.LITTLE optimization. As I mentioned in the A53 power analysis, the kernel also lacks several power management features such as "Small Task Packing".
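For readers who want to poke at this themselves, the idle states a given kernel actually exposes can be inspected through Linux's standard CPUIdle sysfs interface. The sketch below is purely illustrative and not part of our test methodology; the sysfs paths are the stock Linux layout, but which states show up on any particular Note 4 depends entirely on Samsung's kernel:

```python
import glob
import os

# List the CPUIdle states the kernel exposes for each core, plus how
# often each state has been entered. The sysfs layout is standard
# Linux; the states themselves are vendor/kernel dependent.
for cpu in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*")):
    states = []
    for state in sorted(glob.glob(os.path.join(cpu, "cpuidle", "state*"))):
        with open(os.path.join(state, "name")) as f:
            name = f.read().strip()
        with open(os.path.join(state, "usage")) as f:
            usage = f.read().strip()
        states.append(f"{name} ({usage} entries)")
    print(f"{os.path.basename(cpu)}: {', '.join(states)}")
```

A well-configured CPUIdle driver will expose deeper states (e.g. cluster power-down) and enter them often; a shallow or rarely-entered state list is exactly the kind of configuration deficit described above.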

In the time between starting this article and publishing it, I was able to confirm this and improve power efficiency by applying a handful of patches to the stock kernel. I hope to do a more in-depth article on user customization of device firmware in the future, especially on the topics of overclocking and undervolting in the mobile space.

BaseMark OS II Battery Life

BaseMark OS II Battery Score

The BaseMark OS II Battery benchmark is a CPU run-down test. Here we're seeing the devices limited by their thermal drivers and power allocation mechanisms. The Exynos and Qualcomm variants are neck-and-neck in this benchmark, both representing around 3.85W of total device TDP. The Exynos is able to do more work in the same period, earning it a slightly higher battery score.
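As a quick sanity check, the rated battery capacity together with the ~3.85W figure implies a rundown on the order of three hours; a back-of-the-envelope version of that arithmetic (ignoring discharge inefficiencies):

```python
# Nominal battery energy vs. sustained device power draw.
capacity_mah = 3220
nominal_v = 3.85
tdp_w = 3.85  # approximate total device power during the rundown

energy_wh = capacity_mah / 1000 * nominal_v   # ~12.4 Wh
runtime_h = energy_wh / tdp_w                 # ~3.2 h
print(f"{energy_wh:.1f} Wh / {tdp_w} W = {runtime_h:.1f} h")
```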

Since we are adding PCMark to our 2015 suite, I decided it would also make a good candidate for testing overall device power consumption. For this I measured the power draw during the various subtests. These figures portray total device power, meaning the screen is included. The video test in particular is interesting, as we have stopped publishing video playback battery life now that modern devices routinely exceed 12 hours of runtime.

Although we can't do much with the device power figures without comparison devices, we can see the power difference between running the device in normal versus power-savings mode. The power-savings mode limits the A57 cluster to 1.4GHz and disables all boost logic in the scheduler, meaning threads are no longer aggressively scheduled onto the big cluster through lowered migration thresholds.
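For context, the 1.4GHz cap corresponds to the kind of limit the standard Linux cpufreq interface exposes. The sketch below is a hypothetical illustration rather than Samsung's actual power-savings code, and it assumes cpu4-cpu7 enumerate the A57 cluster (a common but not universal big.LITTLE ordering):

```python
# Hypothetical sketch: cap the big cluster's maximum frequency through
# the standard cpufreq sysfs interface (requires root). This mimics only
# the frequency-limiting half of the power-savings mode; the scheduler
# boost changes happen elsewhere and aren't modeled here.
A57_CPUS = range(4, 8)     # assumed A57 cluster enumeration
CAP_KHZ = 1_400_000        # 1.4 GHz

for cpu in A57_CPUS:
    path = f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/scaling_max_freq"
    with open(path, "w") as f:
        f.write(str(CAP_KHZ))
```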

It's interesting to see perf/W go up in power-savings mode. I commented in the A57 section on how Samsung's default scheduler settings are much too aggressive for normal usage and may have detrimental effects on power consumption. The figures presented by PCMark corroborate my suspicions: real-world loads take a hit in power efficiency.

I hope to continue using PCMark as a power consumption test on future devices whenever possible, so that we can paint a better picture of general device power consumption.

GFXBench 3.0 Battery Life

In the GFXBench battery rundown the Exynos leads the Snapdragon by half an hour. It's worth mentioning that I ran this test several times while Josh ran his review unit under different ambient temperature conditions; we found the score can fluctuate considerably depending on how much heat the phone is allowed to dissipate.

GFXBench 3.0 Performance Degradation

The performance degradation metric is exceptionally bad on the Exynos version. While the Snapdragon has its own thermal throttling issues, it degrades much more gracefully than the Exynos. Here the Mali T760 is limited to half its maximum frequency for most of the benchmark run, spiking back to full frequency momentarily before throttling again. I'm disappointed to see such throttling behavior; it would have made much more sense to first scale down through the higher frequency steps rather than dropping immediately to 350MHz.
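To make the complaint concrete, here is a toy model of the two policies: stepping down through the DVFS table one level at a time versus the all-or-nothing drop we actually observe. The frequency ladder and the single-trip-point logic are assumptions for illustration, not Samsung's thermal driver:

```python
# Illustrative only: assumed Mali T760 DVFS ladder (MHz).
GPU_FREQS_MHZ = [700, 600, 550, 500, 420, 350]

def throttle_stepwise(current_mhz: int, over_temp: bool) -> int:
    """Preferred behavior: move one DVFS step per thermal poll."""
    i = GPU_FREQS_MHZ.index(current_mhz)
    if over_temp and i < len(GPU_FREQS_MHZ) - 1:
        return GPU_FREQS_MHZ[i + 1]   # one step down
    if not over_temp and i > 0:
        return GPU_FREQS_MHZ[i - 1]   # one step back up
    return current_mhz

def throttle_binary(current_mhz: int, over_temp: bool) -> int:
    """Observed behavior: full frequency or half, nothing in between."""
    return GPU_FREQS_MHZ[-1] if over_temp else GPU_FREQS_MHZ[0]
```

The stepwise version would let the GPU settle at whatever frequency the thermal envelope can actually sustain, instead of oscillating between 700MHz and 350MHz.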

The Exynos rendered 202K frames while the Snapdragon achieved 252K, 24.5% more. On the other hand, the Exynos ran 18% longer. I suspect that if both devices had similar throttling policies we would see much closer scores in the end, once screen power overhead is taken into account.
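Normalizing the cited figures makes the gap clearer. The runtimes below are relative units derived from the "18% longer" remark, not absolute measurements:

```python
# Work done per unit of battery time, from the figures cited above.
exynos_frames, snapdragon_frames = 202_000, 252_000
snapdragon_runtime = 1.00   # relative units
exynos_runtime = 1.18       # "ran 18% longer"

exynos_rate = exynos_frames / exynos_runtime
snapdragon_rate = snapdragon_frames / snapdragon_runtime
print(f"Snapdragon throughput advantage: {snapdragon_rate / exynos_rate:.2f}x")
# ~1.47x -- the heavily throttled Exynos spends much more screen-on time
# (and thus screen energy) per frame rendered, which skews the comparison.
```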

All in all, it seems the Qualcomm version has superior power management and is able to outpace the Exynos in day-to-day usage. I think this difference is not a matter of hardware efficiency but rather of software oversights. The question is whether Samsung is willing to invest the time and effort to fix its own SoC's software, or whether we'll see this pattern continue for the foreseeable future.

Display Power

Until now we never really had the opportunity to take an in-depth look at a mobile phone's display power. This is a particular problem when trying to characterize the power consumption and battery life of emissive screen technologies such as OLED. I went ahead and measured the device's power consumption at all of its available brightness points and at several APL and grey-scale levels, totaling 191 data points.

The x-axis represents the luminance measured at 100% APL across the device's 65 hardware brightness points; the sunlight boost mode available with auto brightness couldn't be included in this test. At 100% APL, meaning a full white screen, we see power consumption rise to 1.7W. The phone has a base power consumption of around 440mW when displaying a black screen, meaning the screen emission power needed to display white at 336 cd/m² on this particular device comes in at about 1.25W. The measurements were done in the sRGB-accurate "Movie" screen mode; more saturated modes should increase power usage due to the higher drive intensities on the OLEDs.
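The emission figure falls straight out of the two measured endpoints; reconstructing it:

```python
# Screen emission power = total device power at full white minus the
# black-screen baseline, using the measurements quoted above.
total_white_w = 1.70    # full white at maximum brightness
base_black_w = 0.44     # black screen (SoC, radios, display logic)

emission_w = total_white_w - base_black_w          # ~1.26 W
luminance_nits = 336                               # measured at this point
print(f"Emission power: {emission_w:.2f} W")
print(f"Efficacy: {luminance_nits / emission_w:.0f} cd/m² per watt")
```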

Power consumption scales fairly linearly with luminance when using the device's brightness settings. What is extremely interesting, however, is how power scales with APL (Average Picture Level) and grey-scale level. In theory, an OLED screen's power should scale identically with APL and grey-scale level, as a 70% grey-scale image has the same APL as an artificial 70% APL pattern. In our measurements we use APL samples consisting of pure white and pure black areas to achieve a target APL over the whole screen area.

Instead of our theoretical scenario, we see a huge discrepancy between a 70% APL pattern and a 70% grey-scale image. This comes down to how primary brightness and the secondary luminance of the color components are controlled in Samsung's AMOLED screens. I mentioned that power scales more or less linearly when luminance is altered via the device's brightness controls. This is because the panel increases brightness by increasing the PWM duty cycle of all pixels on the panel.
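A minimal model of that brightness mechanism, reusing the figures measured above: device power scales linearly between the black-screen base and the full-white emission figure as the PWM duty cycle rises. The linear APL term is the idealization; the grey-scale discrepancy discussed next is exactly where real panels deviate from it:

```python
BASE_W = 0.44          # black-screen device baseline, from above
EMISSION_MAX_W = 1.26  # full-white emission at maximum brightness

def panel_power_w(duty_cycle: float, apl: float = 1.0) -> float:
    """Idealized device power for a PWM duty cycle and APL (both 0..1).

    Real panels draw less for a grey-scale image than for a white/black
    APL pattern of equal APL, because per-subpixel drive voltage is
    non-linear -- the discrepancy measured above.
    """
    return BASE_W + duty_cycle * apl * EMISSION_MAX_W

print(panel_power_w(1.0))         # ~1.70 W: full white, max brightness
print(panel_power_w(0.5, 0.70))   # half duty cycle, 70% APL pattern
```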

In fact, with a high-speed camera or a DSLR with a fast enough shutter speed, one can capture the rolling-shutter-like effect of how the panel is lit up: multiple horizontal stripes of pixels rolling from the top of the device towards the bottom, much like a CRT's electron beam sweeping along one axis. The brightness depends on the thickness of the stripes, in other words how much of the panel area is powered up versus inactive. From a single pixel's perspective, this amounts to a typical PWM pulse.

What remains to be answered is why we see such a large power difference between grey-scale images and APL patterns of equal APL. Although primary brightness is controlled by PWM, the actual color output of each OLED is controlled via voltage. In past research I've managed to retrieve a display driver IC's programmed voltages to see how the sub-pixels, and thus color, are actually controlled. Although I haven't looked into the Note 4's driver yet (it's an extremely time-consuming procedure), I had saved the table from my old Galaxy S4:

If the Note 4's voltage curve behaves anywhere near the same, it's clear why an image containing white consumes more power than a homogeneous grey-scale picture of the same APL: there's a steep voltage increase towards the last few levels of each color component (white being 255 on all three RGB channels).

I am holding off on comparisons with LCD screens and past AMOLED generations, as I still lack enough data points to make any educated statement on the power efficiency of the different technologies and panel generations. What can be said for certain is that increasingly high APL values, and especially the white interfaces of Android, are AMOLED's Achilles' heel. There's no doubt we could see very large power efficiency gains if UX design minimized the use of colors nearing full intensity, which lends some justification to the grayscale rendering used by extreme power saving modes.

Charge Time

While Josh already tested the charge time on the Qualcomm version of the device, I wanted to add a charge graph comparing the two charge modes and a few comments in regard to fast charging.

The Exynos model supports the same fast-charging capabilities as the Qualcomm version. In fact, many people have been calling this a feature enabled by Qualcomm's Snapdragon platform, or "Quick Charge", but that is only half correct. Both Note 4 variants' fast-charging mechanisms are enabled by Maxim Integrated's MAX77823 charger IC. The IC has been around for a while but has seen limited use, appearing only in Japanese models of the Galaxy S5.

Samsung has not used Qualcomm's charger ICs in any of its recent products and has always relied on Maxim solutions, even in combination with Snapdragon SoCs. What Qualcomm markets as Quick Charge is actually the proprietary signalling between the charger and a dedicated voltage switcher IC on the phone. What seems to be happening here is that although the Qualcomm device is advertised as QC2.0 compatible, it doesn't use it; instead, the stock charger appears to implement Qnovo's Adaptive Fast Charging technology, which provides a similar solution.

The phone is able to detect when a 9V charger is plugged in and enables this charge mode if it is turned on in the battery settings screen; you need to re-plug the cable for the new mode to take effect. The included charger supplies up to 1.67A at 9V, or 15W. Otherwise, normal charging takes place at 2A on 5V, or 10W.

The fast-charging graph shows the phone being delivered just short of 11W of power up until the 75% capacity mark, after which the charger IC sharply limits input power and gradually switches over to trickle charging. The phone charges to 75% in just 50 minutes, which is quite amazing; a mere 15 minutes on the fast charger can fill up to 25% of the battery. The total charging time in this mode is 138 minutes.
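Those numbers are internally consistent. Assuming the device sustains roughly 11W of input for the whole constant-current phase, the energy delivered at the device input is in the same ballpark as the nominal energy needed for 75% of the battery (actual cell energy is somewhat lower once charging losses, which I'm ignoring here, are accounted for):

```python
# Rough cross-check of the fast-charge figures quoted above.
battery_wh = 3.220 * 3.85            # ~12.4 Wh nominal capacity
needed_wh = 0.75 * battery_wh        # energy for the first 75%: ~9.3 Wh

input_w, minutes = 11, 50            # sustained input power, time to 75%
delivered_wh = input_w * minutes / 60   # ~9.2 Wh at the device input
print(f"needed ~{needed_wh:.1f} Wh, delivered ~{delivered_wh:.1f} Wh")
```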

In the "slow" charging mode over a normal 5V charger, the device gets delivered around 7W of power up until the ~85% mark. Again the charger IC slows down the power gradually until the device reaches true 100% capacity. You will notice that the fuel-gauge will report 100% capacity much earlier than the input power reaching zero, but power continues for another 30 minutes after that. This is due to modern fuel-gauges faking the top few percent of a battery's capacity. Here it takes about 50 minutes to reach the 50% battery capacity mark, and a total of 171 minutes to truly fill up the battery.

Comments

  • habbakuk87 - Wednesday, February 11, 2015

    I'd like to thank the writers for a great in-depth article; this is the kind of thing that has kept me coming back to this site over the many years.
    Keep up the good work.
  • Arbie - Wednesday, February 11, 2015

    Yes, this is a great article - knowledgeable and in-depth. Work like this is what keeps Anandtech way above the crowd. Thanks.
  • joe_dude - Wednesday, February 11, 2015

    What it looks like to me is that they need to cap the 4-core big-cluster load at 1.6 GHz, which would keep the thermals under control. Going by the chart, 1 core @ 1.9 GHz, 2 cores @ 1.8 GHz, 3 cores @ 1.7 GHz would work nicely for power consumption/heat as well. It seems Samsung and Qualcomm are setting the max frequency too high. That last 100 to 200 MHz requires a lot more voltage.
  • joe_dude - Wednesday, February 11, 2015

    Look at the power consumption at 1.9 GHz vs. 1.6 GHz: 7.39W vs. 4.44W. That's the crux of the problem right there. 4 cores should not be running at such a high clock speed (and the voltage needed to support it).
  • PrinceGaz - Wednesday, February 11, 2015

    "So the question is, is it still worth to try and get an Exynos variant over the Snapdragon one? I definitely think so. In everyday usage the Exynos variant is faster. The small battery disadvantage is more than outweighed by the increased performance of the new ARM cores."

    That made me laugh as it is easily the most out of touch comment I've read in an AnandTech review.

    It may be true for you as a reviewer sitting at home with the phone plugged into a USB charger running benchmarks that you feel the extra performance is worthwhile, but it will count for nothing when you use it in the real world and find your battery is dead earlier. Almost everyone with a smartphone today would rather have longer battery life instead of higher performance, because just like with PCs for some time now, the performance is already good enough.

    Longer battery life trumps extra performance in smartphones now. I don't know why you want to put a positive spin on these higher performance but lower efficiency A57/A53 cores; I doubt ARM are paying you, but it seems they are a step backwards for people who use their phone primarily on the move, which includes most people.
  • patrickjchase - Wednesday, February 11, 2015

    From long experience with a variety of microarchitectures (including ARMs), I would guess that the latency/bandwidth "oddities" reflect differences in hardware prefetch. That's the most logical culprit among the list of changes that ARM provided.

    The hypothesis that the capability to dual-issue loads/stores impacts bandwidth to L2 and/or DDR seems questionable, because 4xA7 already has enough ld/st bandwidth to saturate both. Memory-limited code shouldn't see much impact due to issue-rule relaxation.
  • aryonoco - Wednesday, February 11, 2015

    Andrei and Ryan, thank you. I haven't been this impressed with anything AnandTech has published since the original HTC One M7 review by Brian.

    I believe you guys have just published the most thorough, detailed, comprehensive review of every aspect of an ARM SoC. Short of working at a chip maker's lab, I don't think anyone is going to have any better exposure to the ARM ecosystem than what you guys have presented here.

    Huge thanks for finally paying attention to the SoCs that don't make it to the North American market. I've been fascinated by their performance and power consumption metrics, it's great to finally have an authoritative view of them. I would love to have your take on some other SoCs in this regard as well, especially Cortex A17. There is not much coverage of Cortex A17 and I think, given the situation with big.LITTLE, a quad core well optimised Cortex A17 might actually be a hidden weapon that no one seems to be using.

    Also very much looking forward to your coverage of overclocking/undervolting mobile devices. You guys are truly bringing AnandTech to the mobile industry.

    Once again, thank you. To be honest, I've been a bit worried since Anand and Brian's departure about the direction of AT, and it's so good to see it thrive in such capable hands now.
  • Andrei Frumusanu - Wednesday, February 11, 2015

    I'm planning on reviewing the A17, but still in the process of securing a device. Hopefully in the near future.
  • aryonoco - Wednesday, February 11, 2015

    On the subject of Javascript benchmarks and Chrome vs Stock browsers:

    Are we sure that all of the difference is indeed due to the optimized libraries that Samsung has developed, and that there is no benchmark-targeting going on? After all, we saw what happened with SunSpider (and thanks for dropping it); is it really impossible that they are targeting Kraken and Octane as well?

    Would it be feasible for Anandtech to develop its own proprietary javascript benchmark? It could answer a few of these questions.
  • tuxRoller - Wednesday, February 11, 2015

    This was an excellent read.
    Good detail, and enlightening investigation.
    With regards to the power-aware governor, you have to realize that it's a very hard problem, one that no one has managed to solve yet for general loads (IIRC, Huawei has claimed that their kernel has such a scheduler, but you've seen how well it works). Yes, there are many ideas, but as you've surmised, board implementations can drastically change assumptions.
    BTW, power collapse appears to just be the ARM term for the state under which cpuidle takes over for that core. So it's not actually powered off, thus hotplug would still be needed.
    For an overview of the domain see: http://lwn.net/Articles/482344/

    Some useful links:
    https://rt.wiki.kernel.org/index.php/Energy_Aware_...
    https://lkml.org/lkml/2014/5/23/621
