Battery Life

Battery life remains perhaps the single largest differentiator between devices these days, and it's of huge concern to enthusiasts and ordinary shoppers alike. We've already caught a glimpse of how well 8974 fares from a power perspective inside the LG G2, a device that posted some seriously impressive battery numbers. The Note 3 we're looking at is also 8974-based since it's a T-Mobile model, and thus we expect the same kind of battery life.

With this generation of Note, the battery gets even larger. The original Note started with a then quite large 9.25 watt-hour battery, the Note 2 moved to 11.78 watt-hours, and the Note 3 now moves to a very large 12.16 watt-hour battery, with of course the newest 3.8 V chemistry and everything that comes along with it. Display size goes up, but the additional power draw is offset by gains elsewhere.
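
For readers who think in milliamp-hours rather than watt-hours, the conversion is just a division by the nominal cell voltage. A quick sketch in Python; the nominal voltages (3.7 V for the original Note, 3.8 V for the Note 2 and Note 3) are my assumptions rather than figures from this review:

# Convert the watt-hour ratings above into the mAh figures more commonly
# quoted on spec sheets. Nominal cell voltages are assumed values.
batteries = {
    "Galaxy Note":   (9.25, 3.7),
    "Galaxy Note 2": (11.78, 3.8),
    "Galaxy Note 3": (12.16, 3.8),
}

for name, (watt_hours, volts) in batteries.items():
    print(f"{name}: {watt_hours} Wh / {volts} V = {watt_hours / volts * 1000:.0f} mAh")

With those assumed voltages the three batteries work out to roughly 2500, 3100, and 3200 mAh respectively.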

After we talked about the panel self-refresh features in the G2, a few people reached out and let me know that this feature has been shipping for a while in some phones, and that it's easy to check for. If we look under the display subsystem we can see that the same MIPI_CMD_PANEL type 9 is used, which refers to this type of interface.

 

Qualcomm HWC state:
 MDPVersion=500
 DisplayPanel=9

define MIPI_CMD_PANEL '9'
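
If you want to run the same check yourself, something like the following works. This is a sketch of my own rather than a tool from the review: it assumes the Qualcomm hwcomposer state (the dump quoted above) appears in the SurfaceFlinger dump on a device connected over adb, and that the field is named DisplayPanel as it is here.

import subprocess

MIPI_CMD_PANEL = 9  # command-mode panel type, reported as DisplayPanel=9 above

# Pull the SurfaceFlinger dump, which includes the hardware composer state
# on these Qualcomm devices (an assumption; field names vary by firmware).
dump = subprocess.run(["adb", "shell", "dumpsys", "SurfaceFlinger"],
                      capture_output=True, text=True).stdout

for line in dump.splitlines():
    if "DisplayPanel=" in line:
        panel = int(line.split("DisplayPanel=")[1].split()[0])
        if panel == MIPI_CMD_PANEL:
            print("Command mode (panel self refresh capable) interface")
        else:
            print(f"Panel type {panel}")
        break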

Our battery life tests are unchanged: a set of popular webpages is loaded on a schedule with the display set to exactly 200 nits, and the cycle repeats until the battery runs out and the device dies, on both WiFi and cellular data connections. In this case cellular means T-Mobile LTE, which is 10 MHz FDD in my market. I haven't had a chance to run the Note 3 on HSPA+ yet, or to complete the call test (which is starting to get ridiculous, and probably breaks 24 hours in the case of the Note 3).
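
For the curious, the shape of such a rundown test is simple enough to sketch. The following is not our actual harness, just a minimal illustration: it assumes adb access, a display already calibrated to 200 nits, and placeholder URLs and timing standing in for the real page set and schedule.

import subprocess
import time

PAGES = [
    "http://www.example.com/",   # placeholder URLs, not the real test pages
    "http://www.example.org/",
]
INTERVAL_S = 30                  # seconds between page loads (assumed)

def adb(*args):
    return subprocess.run(["adb", "shell", *args],
                          capture_output=True, text=True).stdout

def battery_level():
    # dumpsys battery reports a "level: NN" line while the device is alive
    for line in adb("dumpsys", "battery").splitlines():
        if line.strip().startswith("level:"):
            return int(line.split(":")[1])
    return None                  # no response: the device has died

start = time.time()
page = 0
while True:
    if battery_level() is None:
        break
    adb("am", "start", "-a", "android.intent.action.VIEW",
        "-d", PAGES[page % len(PAGES)])
    page += 1
    time.sleep(INTERVAL_S)

print(f"Run time: {(time.time() - start) / 3600:.2f} hours")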

AT Smartphone Bench 2013: Web Browsing Battery Life (4G LTE)

On LTE the Note 3 does very well, coming in just shy of the pack of iPhones at just over 8 hours. Interestingly enough, it's just north of the G2s as well, which have a smaller battery but also a smaller display. The Note 3 is also the first device to ship with Qualcomm's QFE1100 envelope tracker from the RF360 front end portfolio, which lowers power consumption by up to 20 percent and heat dissipation by up to 30 percent by allowing the power amplifier supply to follow the desired output waveform. There's more on that later in the cellular section.
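
The intuition behind envelope tracking is easy to show numerically. The toy model below is my own simplification, not Qualcomm's implementation, and the supply and current figures are made up: the PA rail either sits at a fixed maximum voltage or follows the signal envelope with a small headroom, and since wasted power is whatever the supply delivers beyond what the output needs, tracking the envelope cuts the energy drawn.

import math
import random

V_MAX = 3.4        # fixed PA supply rail (V) -- assumed value
HEADROOM = 0.4     # margin the tracker keeps above the envelope (V) -- assumed
I_MAX = 0.5        # peak drain current (A) -- assumed
SAMPLES = 10_000

random.seed(0)
fixed_energy = 0.0
tracked_energy = 0.0
for n in range(SAMPLES):
    # crude high peak-to-average envelope, normalized to 0..1
    envelope = abs(math.sin(0.01 * n)) * random.uniform(0.2, 1.0)
    i_drain = I_MAX * envelope                     # current follows the signal
    fixed_energy += V_MAX * i_drain                # rail pinned at V_MAX
    v_tracked = min(V_MAX, HEADROOM + V_MAX * envelope)
    tracked_energy += v_tracked * i_drain          # rail follows the envelope

saving = 1.0 - tracked_energy / fixed_energy
print(f"Supply energy saved by tracking the envelope: {saving:.0%}")

The absolute number the toy model prints doesn't mean much; the point is simply that the less time the rail spends far above what the instantaneous output actually requires, the less power gets burned as heat in the power amplifier.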

AT Smartphone Bench 2013: Web Browsing Battery Life (WiFi)

On WiFi the Note 3 does better by 22 percent, but that's not the kind of huge jump I'm used to seeing between cellular and WiFi testing. This tells me the Note 3's battery life is really gated by the display, which is almost always the largest consumer of power in a device. That said, the Note 3 does very well all things considered, especially in comparison to the APQ8064 (Fusion 3) phones that came before it, like the SGS4. The new silicon and new process inside MSM8974 definitely help move battery life forward here in the race-to-sleep game.

Charging is an interesting story on the Note 3, primarily because of what doesn't change. The Note 3 continues to use Samsung's tablet charging specification and charger, which has a maximum output of 2 amps. Like other Samsung devices, the Note 3 draws 2 amps over a considerable portion of the charging curve (the linear part of the charge curve). USB 3.0 doesn't change things here quite yet; the new supported charge voltages are coming eventually with the power delivery specification.
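
As a back-of-envelope sanity check (my own rough model, not the measured result plotted below), the 2 amp limit and the 12.16 Wh capacity bound how quickly the Note 3 can possibly fill; the constant-current fraction and taper factor here are guesses.

CAPACITY_WH = 12.16
NOMINAL_V = 3.8
CHARGE_CURRENT_A = 2.0
CC_FRACTION = 0.85    # assumed share of capacity taken at the full 2 A
TAPER_FACTOR = 2.0    # assumed slowdown over the constant-voltage tail

capacity_ah = CAPACITY_WH / NOMINAL_V             # about 3.2 Ah
cc_hours = capacity_ah * CC_FRACTION / CHARGE_CURRENT_A
cv_hours = capacity_ah * (1 - CC_FRACTION) / CHARGE_CURRENT_A * TAPER_FACTOR
print(f"Estimated 0-100% charge time: {cc_hours + cv_hours:.1f} hours")

The chart below shows what we actually measured.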

Device Charge Time - 0 to 100 Percent

The Note 3 does charge faster overall compared to the SGS4, however, thanks in part to the new PMIC (PM8941) that is part of the overall 8974 platform story.

 

Comments

  • doobydoo - Saturday, October 19, 2013 - link

    'The chipset IS in fact performing as the benchmark indicates'

    No, it isn't. The chipset CAN'T reach the same speeds for any non-benchmark application, for reasons such as battery life and heat.

    Your argument that bickering about performance is redundant is also stupid, because if it were, Samsung wouldn't feel the need to cheat at the benchmarks.
  • esterhasz - Wednesday, October 2, 2013 - link

    I would be much in favor of standardized qualitative testing. Have a set panel of five people use the phone for a day in their normal workflow and use a questionnaire for performance rating. Sure, it's subjective, but users are subjects, last time I looked.
  • Demigod79 - Thursday, October 3, 2013 - link

    Although I too would like to see a cheat vs non-cheat result in benchmarks (perhaps mark out cheat results in a different color or something), Anand did state clearly that this was cheating. There was no glossing over this fact; he laid it out explicitly and said that he wanted this practice to stop (for all OEMs that do it).

    He also mentioned that it's unlikely that OEMs will stop doing this. It's easy for the OEMs to do and makes their products look better (and frankly, it's only technical geeks who care about things like this, and we only represent a tiny segment of smartphone buyers). If it sells more products, then they will do it (although I find it about as frivolous as the Nvidia and then-ATI battle to have the fastest GPU, simply for the sake of wearing the performance crown for a couple of months).

    That being said though, these are benchmarks we're talking about here. Benchmarks do not represent real-world usage, and never have. All you have to do is look at 3DMark, which was criticized for some time as being too artificial (CPU speeds hardly mattered, whereas in real life CPU speed matters greatly). Benchmarks are, by nature, highly artificial tests meant to measure performance in a specific area. Although you can complain that cheating in benchmarks gives a false impression of performance compared to other devices, you cannot say that such cheating misrepresents real-world usage, since benchmarks don't represent real-world usage in the first place.
  • DanNeely - Tuesday, October 1, 2013 - link

    "The impact is likely small since most of these tests should drive CPU frequencies to their max state regardless (at least on the CPU side), but I'm going to make it a point to call out this behavior whenever I see it from now on."

    Unfortunately this isn't the case. By decompiling benchmarks and changing package names to disable the cheat function, Ars Technica discovered that the GN3 is inflating benchmark scores by 20-50%. Most got a 20% boost; Linpack was an outlier at 50%.

    http://arstechnica.com/gadgets/2013/10/galaxy-note...
  • Anand Lal Shimpi - Tuesday, October 1, 2013 - link

    This is unfortunately something we've seen on a lot of devices, not just Samsung. Google Experience devices aren't affected, but we've seen it on the SGS4 and HTC One among others.

    Linpack isn't a very consistent test and it's too short to drive frequencies up consistently, which is why I'm guessing it's an outlier. The 20% end is higher than expected, it's entirely possible that Samsung is lifting a thermal limit as well as driving CPU frequencies up.

    I don't like any of it and I do want to see companies stop doing it. I was hoping we would see an end to it with the Note 3 but it looks like that was wishful thinking.

    Take care,
    Anand
  • Wojciech - Tuesday, October 1, 2013 - link

    Have you thought about doing an article about 'fixing' benchmark scores by other OEMs?
    If you're saying that HTC is doing the same with the One, then maybe LG is doing something similar, and maybe even Sony.
    Normal behavior by Google Experience devices would explain their often lower scores compared to customized devices running on the same hardware platform.

    Don't you think that would be an interesting topic to examine?
    Right now I fear that more and more OEMs are going to start doing the same thing, and the whole idea of using benchmarks to determine real-life performance will be completely lost.
  • xype - Wednesday, October 2, 2013 - link

    You don't like any of it? But you still put up the graphs and numbers with an "Oh my."? People come here because AnandTech has a reputation of providing in-depth, honest reviews. Most people scan the text and go right to the graphs. Their takeaway will be a marketing lie that you didn't bother to correct because "A lot of companies do that."? Seriously?
  • Spunjji - Tuesday, October 8, 2013 - link

    If you skim the text and go to the graphs you will never, EVER get a representative review of anything. People come to Anandtech for analysis and they got that with this review. If they missed that then they might as well have gone to any of the other sites.
  • doobydoo - Saturday, October 19, 2013 - link

    Na, most people come to Anandtech because they know the graphs will have been produced in an objective and logical way. I would bet that the vast majority of readers don't read the text associated with such images.

    And that doesn't mean that they should go to other sites.
  • Squuiid - Tuesday, October 1, 2013 - link

    "It's also interesting to note that the Galaxy Note 3 appears to outperform all other Snapdragon 800 smartphones we've tested thus far. There's a couple of potential explanations here."
    You missed an explanation: Samsung cheat.
    From Ars:
    "The two functions applied to this list seem to be "PACKAGES_FOR_BOOST_ALL_ADJUSTMENT" which is no doubt the CPU booster, and "PACKAGES_FOR_LCD_FRAME_RATE_ADJUSTMENT" which makes it sound like they are also changing the display frame rate."
