Conclusion

Samsung's System LSI business has had a rough two years, as its decision to go with ARM's big.LITTLE SoC architecture cost it a lot of market share, thanks in part to immature software and implementation issues. In the past, Samsung's own Exynos SoCs were usually regarded as the more performant variant when the alternative was one of Qualcomm's Scorpion CPU based solutions. This changed when the Exynos 5410 came out with a malfunctioning CCI, crippling the chip to the least power-efficient operating mode of big.LITTLE.

Qualcomm's Snapdragon 800 capitalized on the then-new 28nm HPM manufacturing process, along with the advantage of offering an integrated modem solution, and has dominated the market ever since. It's only now that Samsung is able to recover, as the new 20nm manufacturing process has allowed it to catch up and start offering its own Exynos SoCs in more variants of its products, a trend I expect to continue in Samsung's future lineup.

The Note 4 with the Exynos 5433 is the first of a new generation, taking advantage of ARM's new ARMv8 cores. On the CPU side, there's no contest. The A53 and A57 architectures don't hold back in terms of performance and routinely outperform the Snapdragon 805 by a considerable amount. This gap could widen further as the ecosystem adopts ARMv8-native applications, and if Samsung decides to update the phone's software to an AArch64 stack. I still think the A57 is a tad too power hungry in this device, but as long as thermal management is able to keep the phone's temperatures in check, which it seems to do, there's no real disadvantage to running the cores at such high clocks. The question is whether efficiency is where it should be. ARM promises that we'll see much improved numbers in the future as licensees get more experience with the IP, something we're looking forward to testing.

On the GPU side, things are not as clear-cut. The Mali T760 makes great strides towards catching up with the Adreno 420 but stops just short of doing so, leaving the Qualcomm chip with a very small advantage. I still find it surprising that the Mali T760 is able to keep up at all with only half the available memory bandwidth; things will get interesting once LPDDR4 devices arrive in the next few months to equalize things again between competing SoCs. ARM also surprised us with quite a boost in GPU driver efficiency, something I didn't expect and which may have real-world performance implications that our synthetic benchmarks might not capture.

It's the battery life aspect that is most disappointing to me. It's a pity that Samsung didn't put more effort into optimizing the software stack in this regard. When you are able to take advantage of vertical integration and possess multi-billion dollar semiconductor manufacturing plants along with what seem to be talented SoC design teams, it's critical not to skimp on software. I might be a bit harsh here given that the battery disadvantage was just 12% in our web-browsing test and might be smaller in real-world usage, and GPU battery efficiency seems neck-and-neck. Still, it's the wasted potential from a purely technical perspective that is disheartening.

This is definitely a wake-up call to ARM and its partners as well. If the software situation of big.LITTLE isn't improved soon, I fear that ship will have sailed, as both Samsung and Qualcomm are working on their own custom ARMv8 cores.

So the question is, is it still worth trying to get the Exynos variant over the Snapdragon one? I definitely think so. In everyday usage the Exynos variant is faster, and the small battery life disadvantage is more than outweighed by the increased performance of the new ARM cores.

Comments

  • MrSpadge - Tuesday, February 10, 2015 - link

    Excellent work, guys! But let me point out something:

    1. Your Geekbench numbers show spectacular gains for the A53. They're actually so good that the multi-threading speed-up almost always exceeds the number of cores (judging by eye, I haven't run the numbers). For the A7 the speedup factor is a bit less than 4, right where it should be. It certainly looks like the big cores were kicking in, at least partly.

    2. You wonder why power scales sub-linearly with more cores being used. Actually this is just what one would expect in the case of non-ideal multi-thread scaling (as the A7 shows in Geekbench). The load of all cores may be the same, percentage-wise, but when running many threads these are competing for L2 cache and memory bandwidth. There will be additional wait times under full load (otherwise the memory subsystem is vastly oversized), so each core is busy but working a little less hard.

    3. The lower performance per W of the A53: this may well be true. But if the big cores did interfere, it could certainly explain the significantly increased power draw.

    4. You hope for the A53 at 1.5 - 1.7 GHz. I agree for CPUs which use only A53s, but as little cores this is not necessary. If you push a design to higher frequencies on the same technology node you always pay in terms of power efficiency (unless you start from a bad/unoptimized design). Better to let the little cores do what they're best at: being efficient.

    ... I'm off to the next page :)
    MrS
  • Andrei Frumusanu - Tuesday, February 10, 2015 - link

    Regarding your concerns about the big cores turning on: they did not. I made sure that they were always offline during the little-core testing. As to why GeekBench scaled like that, I can't answer.
  • jjj - Tuesday, February 10, 2015 - link

    Geekbench scales like that sometimes; TR looked at this phone last week and they noticed the Shield tablet scaling better than 4x in some tests.
    As for the A53's better scaling over the A7, maybe it's the memory subsystem? No clue how heavy GB is on it, but the A53 does have more memory bandwidth and maybe faster NAND (you don't have storage tests for the Alpha).
  • MrSpadge - Tuesday, February 10, 2015 - link

    Sometimes one gets such results if the MT code path uses newer libraries to get multi-threading, but these also introduce other optimizations the tester is not aware of. Or they have to rewrite the code/algorithm to make it multi-threaded and thereby create a more efficient version "by accident". I don't know if any of this applies to GB, though.
  • tipoo - Tuesday, February 10, 2015 - link

    It's done that before; I think the cache is almost certainly involved. Maybe it has good cache reusability for multicore workloads in its testing.
  • SydneyBlue120d - Tuesday, February 10, 2015 - link

    Excellent article, thanks a lot for that.
    I'd like to dig deeper into why Samsung is ruining the wonderful Wolfson DAC. I remember that even the legendary François Simond shared the issue with Samsung developers without getting a response; maybe you will be able to get one :) Also, I remember an audio quality test suite, why didn't you use it in this review? Thanks a lot.
  • Andrei Frumusanu - Tuesday, February 10, 2015 - link

    We realized testing audio without having the proper equipment is a futile exercise and does not really portray the audio quality of a device. Only Chris has the professional equipment to objectively test audio, such as was done in the iPhone 6 review: http://www.anandtech.com/show/8554/the-iphone-6-re...
  • PC Perv - Tuesday, February 10, 2015 - link

    Thank you so much for ditching SunSpider, and thank you for explaining why. It has gone on way too long.
  • PC Perv - Tuesday, February 10, 2015 - link

    I know you are trying to be nice to your colleagues and give them the benefit of the doubt, but to my mercilessly critical eyes your colleagues (Joshua Ho & Brandon Chester) have shown time and time again that they are not qualified to review Android products. I hope Mr. Frumusanu and Mr. Smith will take charge of this area in the future and limit those two to Apple product reviews.

    I would love to read quality reviews and analysis like this instead of their temper tantrums.
  • PC Perv - Tuesday, February 10, 2015 - link

    For example, in the comment section here,

    http://www.anandtech.com/comments/8795/understandi...

    [Q]The increase in brightness for AMOLEDs at 80% APL rather than 100% APL is not very significant, and changing testing to accommodate AMOLED's idiosyncrasies doesn't seem like a good idea either. To put it in perspective, even if I had tested the Nexus 6 at 80% APL in the review my conclusion about the brightness being sub-par would have been exactly the same.[/Q]

    That is what Brandon Chester had to say about the brightness of the Nexus 6's screen. So arrogant, so biased. But according to the scientific data provided by this article (pg. 10), APL can indeed make a huge difference in an AMOLED screen's brightness at a given power target. For example, at a display power target of 1W, 70% APL raises the brightness by approximately 40%. At 50% APL, max brightness practically doubles.

    I was appalled by the disparaging and arrogant attitude Brandon Chester shows when it comes to correctly evaluating Android devices. He also made similarly nonsensical "arguments" in his tablet recommendation article for the holidays.
