Battery Life

One of the things Qualcomm promised with Snapdragon 800 (8974), and by extension the process improvement to 28nm HPM, was lower power consumption, especially versus Snapdragon 600 (8064). Improvements elsewhere in the Snapdragon 800 platform help as well: the newer PMIC (PM8941) and the newer onboard modem block both contribute, and overall platform power goes down in the lower performance states. In addition, the G2 has a few unique power saving features of its own, including display GRAM (Graphics RAM), which enables the equivalent of panel self refresh. When the displayed image is static, the G2 can power down parts of the display subsystem and AP, which LG claims improves the mixed use battery life case by 10 percent overall, and by 26 percent compared to an equivalent display that refreshes actively. Finally, the G2 has a fairly sizable 3000 mAh, 3.8V (11.4 watt-hour) battery which is stacked to get the most out of the rounded shape of the device, and utilizes LG's new SiO+ anode for increased energy density compared to a conventional graphite anode. 
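The watt-hour figure quoted above follows directly from the pack's rated capacity and nominal voltage. A quick sanity check of that arithmetic (the function name here is just illustrative):

```python
def watt_hours(capacity_mah: float, nominal_voltage: float) -> float:
    """Convert a battery rating in mAh at a nominal voltage to watt-hours."""
    return capacity_mah / 1000.0 * nominal_voltage

# G2's rated pack: 3000 mAh at 3.8 V nominal
print(round(watt_hours(3000, 3.8), 1))  # 11.4 Wh
```

This is why capacity comparisons in mAh alone can mislead: two packs with the same mAh rating but different nominal voltages store different amounts of energy.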

Our battery life test is unchanged: we calibrate the display to exactly 200 nits and run the device through a controlled workload consisting of a dozen or so popular pages and articles, with pauses in between, until the device dies. This is repeated on cellular and WiFi. Since we have an international model of the G2 that lacks the LTE bands used in the USA, the cellular result here is 3G WCDMA on AT&T's Band 2 network. I've been testing 3G battery life alongside LTE for a while now, so we still have some meaningful comparisons. The most interesting ones are against the previous-generation Optimus G (APQ8064) and HTC One (APQ8064T).

AT Smartphone Bench 2013: Web Browsing Battery Life (3G/2G)

AT Smartphone Bench 2013: Web Browsing Battery Life (WiFi)

Cellular Talk Time

The LG G2's battery life is shockingly good throughout our tests, and in subjective use. The combination of a larger battery, GRAM for panel self refresh, the new HPM (high-k metal gate) process, and changes to the architecture dramatically improves things for the G2 over the Optimus G. While running the two web browsing tests I suspected the G2 might be the first phone in my call test to break 24 hours; it doesn't quite get there, but it comes tantalizingly close at 23.5 hours. I'm very impressed with the G2's battery life.

Device Charge Time - 0 to 100 Percent

The G2 also charges very quickly for its battery size. I've been profiling charging behavior and current draw on devices for a while now, since I strongly believe battery life and charging speed are complementary problems: you should charge your smartphone opportunistically, so being able to draw as much power as possible while you have access to an outlet is critical. The G2 negotiates a 2A charge rate on my downstream charge port controller and charges very quickly in that mode. The PM8941 PMIC also includes some new features that Qualcomm has given Quick Charge 2.0 branding.
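As a rough illustration of why negotiating 2A matters, a constant-current, no-loss approximation (a deliberate simplification; real lithium-ion charging tapers off near full, so actual times run longer) puts a floor under the charge time:

```python
def min_charge_time_hours(capacity_mah: float, charge_current_ma: float) -> float:
    """Lower bound on charge time: assumes constant current, no taper, no losses."""
    return capacity_mah / charge_current_ma

# 3000 mAh pack at a sustained 2 A vs. a typical 1 A charger
print(min_charge_time_hours(3000, 2000))  # 1.5 hours
print(min_charge_time_hours(3000, 1000))  # 3.0 hours
```

Even with the constant-voltage taper phase added on top, starting from a 2A bulk rate roughly halves the time spent in the fastest portion of the charge curve.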

Comments

  • ijozic - Saturday, September 7, 2013 - link

    Would love to see some audio quality tests and a comment on the volume levels (maybe in the full review?), as LG is usually below average in this area.
  • Impulses - Saturday, September 7, 2013 - link

    Seconded
  • BoneAT - Saturday, September 7, 2013 - link

    It's interesting that the G2 either slightly over-exposes most situations, or its dynamic range is tighter than on the S4 Octa's Exmor RS and/or the Lumia 1020; this applies to photos and videos alike. Otherwise I'm highly impressed with the camera performance: very natural, even slightly under-saturated results like the S4 Octa (which, in the 808 comparison, shows that it's everybody else over-saturating). I'd only set exposure correction half a step lower and let everything else be done by the device.

    Brian, what is the maximum exposure time you could get automatically or manually off a single shot? What is the highest ISO value?
  • Jon Tseng - Saturday, September 7, 2013 - link

    Hmmm. So it looks like the trade-off vs the Nexus 5 (2300 mAh) will be great battery life vs OS updates. Tough one!

    On rear buttons I'm cool with that - I used to have an Atrix with a rear power button/fingerprint sensor and had no problems at all with day to day use.
  • andykins - Saturday, September 7, 2013 - link

    Don't forget the biggest difference (imo): price. The Nexus 5 should be around half the price.
  • Alketi - Saturday, September 7, 2013 - link

    This actually bodes *very* well for the Nexus 5, as it also packs a Snapdragon 800 chipset.

    It's not too much of a stretch to expect better battery life than the Nexus 4, which was already decent. Plus, there's a good chance of an upside surprise, if it also packs the panel self-refresh and gets gains from Android 4.4.
  • Spunjji - Monday, September 9, 2013 - link

    One thing worth bearing in mind is that even with ~75% of the battery capacity the G2 would still have class-leading battery life. So, the Nexus 5 is hardly going to stink in that regard!
  • Krysto - Saturday, September 7, 2013 - link

    Great to see those battery efficiency improvements from Qualcomm. You're following the right path here, Qualcomm. Please don't change.

    Nvidia is stupid for following the "pure performance" path. That strategy has lost them most customers, especially since they followed that strategy to the point where they were making only "tablet chips", which is code-word for "our chips aren't efficient enough for smartphones".

    I've said it before, chip makers should think about making "smartphone chips" first and foremost, and THEN use the same chips, maybe with a little extra clock speed, in tablets too. If they think about making "tablet chips", they will blow it, because they will make the chip too inefficient and won't be able to "downscale" it as easily to put it in smartphones.

    So yeah, Qualcomm, please continue doing your own thing. If Nvidia, Samsung and others keep following the "performance/benchmark" path, then the joke is on them, and they will ultimately fail (as they have so far, and most devices are using Qualcomm's chips). I do hope they wake up to it sooner rather than later though, because I don't want Qualcomm to become another monopolistic Intel.
  • UpSpin - Saturday, September 7, 2013 - link

    Qualcomm is the only Android SoC maker that designs its own cores and does not rely on finished ARM CPU designs.

    If you take a close look, you'll see that the Tegra 4 and the Exynos 5 are the only A15 processors at the moment, and it probably took much longer than expected for ARM to release the design and for NVIDIA and Samsung to finalize their chips. They also had no other option than A15 to get some improvement over A9 and to remain competitive with future Qualcomm SoCs.

    Qualcomm, on the other hand, was able to release minor updates the whole time, with processors that sit between A9 and A15.
    Samsung will do the same next year, so expect some stronger competition for Qualcomm.
    Qualcomm also has the big radio advantage, which NVIDIA addressed with the i500 and which might make them competitive with Qualcomm again next year.

    Neither Samsung nor NVIDIA followed a performance-only strategy. They had no other choice than using A15; Tegra, as always, used its 4+1 arrangement, while Samsung had to use big.LITTLE to make A15 usable in a smartphone. But big.LITTLE wasn't fully ready yet, so they had no other choice than using an octa-core setup.

    And also remember that MIPS (a competitive contender to ARM, but so far mostly used in low end applications) got bought by Imagination Technologies, which I strongly believe will try everything they can to push MIPS into the high end sector.

    So I think it's safe to assume there won't be a monopoly; Qualcomm just had a big advantage for one year because of the A9 to A15 gap and its integrated radios. Both Samsung and NVIDIA learned from this, and a new competitor to ARM is coming up, too.
    So it will get really interesting.
  • Impulses - Saturday, September 7, 2013 - link

    That one year advantage in design turned into two solid years of device wins for Qualcomm tho... It's gonna get interesting next year for sure.
