Phone Efficiency & Battery Life

While not directly related to the Google Tensor, I also finished running the various battery tests for the Pixel 6 and Pixel 6 Pro, and there are some remarks to be made regarding the power efficiency of the devices, and how the new SoC ends up in relation to the competition.

As a reminder, the Pixel 6 comes with a 4614mAh battery and a 6.4” 1080p 90Hz OLED screen, while the Pixel 6 Pro features a 5003mAh battery and a 6.71” 1440p 120Hz OLED display, with variable refresh rate from 10-120Hz.

Web Browsing Battery Life 2016 (WiFi) 60Hz

Starting off with the 60Hz web browsing results, both Pixel phones end up extremely close in longevity, at around 14 hours of runtime. The regular Pixel 6 is hard to compare directly, as we don’t have many recent phones with 90Hz displays in our results set; however, the Pixel 6 Pro should be a direct comparison point to the S21 Ultras, as both feature 5000mAh-class batteries and similar display characteristics. The P6Pro here ends up slightly ahead of the Exynos 2100 S21 Ultra, which isn’t too surprising given that the Tensor chip runs at somewhat lower CPU power levels, even if performance is lower. It’s still quite a bit behind the Snapdragon 888 variant of the S21 Ultra – which is again representative of the SoC efficiency differences.

Web Browsing Battery Life 2016 (WiFi) Max Refresh

Running the phones at their respective maximum refresh rates, both devices see larger drops, with the Pixel 6 Pro taking the more substantial hit. This time around, the 6 Pro ends up significantly behind the Exynos 2100 S21 Ultra, which had only seen a minor drop going from 60Hz to 120Hz.

PCMark Work 3.0 - Battery Life (60Hz)

Shifting over to PCMark at 60Hz, there’s a larger difference in favour of the Pixel 6, with the Pixel 6 Pro trailing it in longevity by almost two hours. The 6 Pro still ends up in line with the E2100 S21U, however that device posts significantly higher performance numbers in the test, which serves both as a responsiveness benchmark and a battery life test.

PCMark Work 3.0 - Battery Life (Max Refresh)

At 120Hz, the 6 Pro ends up worse than the E2100 S21U, and significantly worse than the S888 S21U.

When I was investigating the phones, the 6 Pro’s power behaviour struck me as odd: I saw best-case baseline power figures of around 640mW, but sometimes this would inexplicably end up at 774mW or even higher. This reminded me of the power behaviour of the OnePlus 9 Pro, which also suffered from extremely high baseline power figures. Both the 6 Pro and the 9 Pro advertise LTPO OLED panels, but neither behaves the way we’ve seen on the Note20 Ultra or S21 Ultra phones. The 6 Pro also only reaches 750 nits 100% APL peak brightness in auto-brightness mode under bright ambient light, significantly lower than the S21U’s 942 nits. I think what’s happening here is that the Pixel 6 Pro simply doesn’t have the most state-of-the-art display, and is thus quite a bit less efficient than what we find on the competition. That does make some sense for the price point of the phone, and it also explains some of the battery behaviour.
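To put those baseline figures in perspective, here is a rough sketch of what a ~134mW higher power floor costs over a full discharge. The 1.4 W average total test power and the 3.85 V nominal cell voltage are assumptions for illustration, not measured figures:

```python
# Estimate the runtime impact of the Pixel 6 Pro's elevated baseline power.
# Assumptions (not measured): ~1.4 W average total device power during the
# web-browsing test, and a 3.85 V nominal cell voltage.
BATTERY_WH = 5003 * 3.85 / 1000  # Pixel 6 Pro: ~19.3 Wh

def runtime_hours(avg_power_w: float) -> float:
    return BATTERY_WH / avg_power_w

best_case = runtime_hours(1.4)          # 640 mW baseline
elevated  = runtime_hours(1.4 + 0.134)  # 774 mW baseline adds ~134 mW
print(f"{best_case:.1f} h vs {elevated:.1f} h")
```

Under these assumptions, the higher floor alone costs over an hour of estimated runtime, which is the order of magnitude of the gaps seen in the charts.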

Naturally, the Tensor SoC also just doesn’t appear to be as efficient. In particular, many UI workloads run on the A76 cores of the chip, which outright have a ~30% perf/W disadvantage. The phone ends up OK in terms of absolute battery life, however performance metrics are lower than on other devices.
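That perf/W deficit translates directly into more energy per unit of work, since the energy for a fixed task is the work divided by perf/W. A minimal sketch of the arithmetic:

```python
# Energy cost of a fixed amount of work scales inversely with perf/W:
# at 70% of a competitor's perf/W, the same work costs ~1.43x the energy.
def relative_energy(perf_per_watt_ratio: float) -> float:
    return 1.0 / perf_per_watt_ratio

print(f"{relative_energy(0.70):.2f}x energy for the same UI work")
```

In other words, a 30% efficiency disadvantage means the A76 cores burn roughly 43% more energy than the competition for the same UI work, regardless of how fast they complete it.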

I think the regular Pixel 6 here is just a much better device, as it doesn’t seem to have any particular issues in display efficiency, even if it’s just a 1080p 90Hz panel. There are naturally experience compromises, but it’s also a $599 phone, so the value here is very good.

US readers who are used to Qualcomm phones might also encounter efficiency regressions on cellular data – we abandoned testing here many years ago due to the impossibility of getting consistent test environments.


108 Comments


  • Speedfriend - Thursday, November 4, 2021 - link

    The average laptop costs $500, and most expensive laptops are bought by enterprises, where macOS has a limited share. While the MacBooks are great devices, they are hobbled by poor monitor support at the Air end and crazy prices at the MacBook Pro end. For most users the difference between the performance of a MacBook Pro and a $1000 laptop is unnoticeable except in their wallet!
  • dukhawk - Tuesday, November 2, 2021 - link

    The chip is very closely related to the Exynos design. Looking through the kernel source, there are a ton of Exynos-named files.
  • dukhawk - Tuesday, November 2, 2021 - link

    https://android.googlesource.com/device/google/rav...
  • defaultluser - Tuesday, November 2, 2021 - link

    If anyone wants to know why Nvidia is most interested in purchasing ARM, it's in order to put the inefficient Mali out of its misery - and simultaneously replace it with their own licensable GeForce cores!

    Since ARM Corp started throwing in the GPU for free, they've had to cut GPU research (to pay for the increasingly complex CPU cores, all of which come out of the same revenue box!) But Nvidia has the massive Server Revenue to handle this architecture-design mismatch; they will keep the top 50% of the engineers, and cut the other cruft loose!
  • melgross - Tuesday, November 2, 2021 - link

    That may be a side effect. But the reason for purchasing it would be making money, and controlling the market. Yes, it’s true that Nvidia wants to control all graphics and to turn the GPU into the main programming aim.
  • TheinsanegamerN - Tuesday, November 2, 2021 - link

    If nvidia wanted to do that they could simply license ARM and make their own superior chip. The fact they have fallen flat on their face every time they have tried speaks volumes.

    they want ARM for patents and $$$, nothing more.
  • defaultluser - Wednesday, November 3, 2021 - link

    When a place like Rockchip can sell an ARM chip bundled with Mali for peanuts, you can understand why a superior GPU wasn't enough to win phone customers!

    You also need integrated modem if you ever want to compete with Qualcomm (not something Nvidia was willing to do).

    But that bundling system has been shorting ARM Mali development for years (Qualcomm, Apple, and soon Samsung (via AMD) are all bringing better high-end options into the field) - you know your performance/watt must be pathetic when a company like Samsung is desperate enough to pay the cost of porting an AMD GPU over to the ARM architecture.
  • Kvaern1 - Sunday, November 7, 2021 - link

    "If nvidia wanted to do that they could simply license ARM and make their own superior chip."

    'simply'

    No, no one can simply do that anymore and only two companies can. NVidia just bought one of them.
  • melgross - Tuesday, November 2, 2021 - link

    I’m wondering about several things here.

    I don’t see the reason for using the A76 cores being one of time. This is a very new chip. The competitors on the Android side have been out for a while, and they use A78 cores. Samsung uses A78 cores. So time doesn’t seem to be the issue here; after all, it does use the X1. So I wonder if the reason isn’t the size of the core on this already large, crowded chip, and possibly cost. If the newer cores take up more area, they would cost slightly more. These chips are going to be bought in fairly small numbers. Estimates have it that last year Google sold between 4 and 7 million phones, and that they’re doubling this year’s order. Either would still be small, and give Google no advantage in volume pricing compared to other chip makers.

    The second is that you have to wonder if Google is following the Apple road here. Apple, of course, designs many chips, all for their own use. Will Google keep their chips for their own use, assuming they’re as successful in selling phones as Google hopes, or will they, after another generation, or two, when the chip is more of their own IP, offer them to other Android phone makers, and if so, how will Samsung feel about that, assuming their contract allows it?
  • SonOfKratos - Tuesday, November 2, 2021 - link

    I think they went for the A76 cores because of cost; like you said, Tensor is already huge, and the A78 or A77 cores would be more power efficient, but they are also much bigger than the A76 on a 5nm process. Even if they were to clock an A78 lower, it would just be a waste of money and die space for them. They probably had a specific budget for the chip, which meant a specific die size. This is not Apple, which is willing to throw as much money as it can at getting the best performance per watt.

    The display was rumored to be an E5 panel from Samsung Display, which is what's in their latest displays, so I don't know why Google is not pushing for higher brightness - though it could be because of heat dissipation as well... I highly doubt Samsung gave Google their garbage displays, lol. Also, Google does not utilize the variable refresh rate very well, and it's terrible for battery life. I have also seen a lot of janky scrolling at 120Hz in apps like Twitter... it has hiccups scrolling through the timeline compared to my Pixel 3.

    The modem is very interesting, probably more so than Tensor; this is the first competition for Qualcomm in the US, at least. A lot of people have been saying that the modem is integrated in Tensor, but why would Google integrate a modem that does not belong to them in "their" chip? That's like asking Apple to integrate Qualcomm modems in their chip. Also, Samsung pays Qualcomm royalties for 5G, so they probably have a special agreement surrounding the sale and implementation of the modem. It is definitely not as power efficient as Qualcomm's implementation, but it's a good start. I got 400+ Mbps on T-Mobile 5G UC outdoors and 200 Mbps indoors (I don't know which band). It surprisingly supports the n258 band like the iPhone.
