64-Bit Support

ART was designed with modularity in mind for the various target architectures it is supposed to run on. As such, it provides a multitude of compiler backends targeting today’s most common architectures, such as ARM, x86, and MIPS. In addition, it offers 64-bit support for ARM64 and x86-64, with MIPS64 support not yet implemented.

While we went into more depth on the advantages and implications of switching over to 64-bit architectures in the iPhone 5s review, the main points to take away are an increased address space, generally increased performance, and vastly improved cryptographic capabilities and performance, all while maintaining full 32-bit compatibility with existing apps.

An important difference in Google’s approach compared to Apple’s, at least inside the VM runtime, is the use of reference compression to avoid the usual memory bloat that comes with the switch to 64-bit: even on 64-bit architectures, the VM retains simple 32-bit references.
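
To illustrate the general idea, here is a minimal C++ sketch of reference compression, assuming a managed heap that fits within a 4 GB window. This is not ART's actual implementation (which also has to cooperate with the garbage collector and handle nulls); it only shows the general technique of storing references as 32-bit offsets from a heap base instead of full 64-bit pointers.

    #include <cassert>
    #include <cstdint>

    // Minimal sketch of reference compression in a 64-bit VM (not ART's
    // actual code): object references are stored as 32-bit offsets from a
    // fixed heap base instead of full 64-bit pointers.
    class Heap {
     public:
      explicit Heap(void* base) : base_(reinterpret_cast<uintptr_t>(base)) {}

      // Compress a native pointer into a 32-bit offset from the heap base.
      uint32_t Compress(void* ptr) const {
        uintptr_t offset = reinterpret_cast<uintptr_t>(ptr) - base_;
        assert(offset <= UINT32_MAX);  // the managed heap must fit in 4 GB
        return static_cast<uint32_t>(offset);
      }

      // Expand a compressed reference back into a usable pointer.
      void* Decompress(uint32_t ref) const {
        return reinterpret_cast<void*>(base_ + ref);
      }

     private:
      uintptr_t base_;  // start address of the managed heap
    };

    // Object fields hold 4-byte references rather than 8-byte pointers, so
    // moving the runtime to 64-bit does not double the size of the heap.
    struct Object {
      uint32_t klass_;   // compressed reference to the class
      uint32_t fields_;  // compressed reference to instance data
    };

    int main() {
      static char arena[1024];  // stand-in for the managed heap
      Heap heap(arena);
      void* obj = arena + 64;
      uint32_t ref = heap.Compress(obj);
      assert(heap.Decompress(ref) == obj);
      static_assert(sizeof(Object) == 8, "two compressed references fit in 8 bytes");
      return 0;
    }

The trade-off in a scheme like this is that a single managed heap is limited to a 4 GB window, which is not a practical constraint for mobile apps, while native code and the process as a whole still benefit from the full 64-bit address space.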

Google has made available some preview benchmarks showcasing the performance gains on both x86 and ARM platforms. The x86 benchmarks were executed on an Intel Bay Trail system and show a 2x to 4.5x speedup in various RenderScript benchmarks. On the ARM side, crypto performance gains over 32-bit were showcased on an A57/A53 system. Both of these are relatively unrepresentative of what one should expect in real-world use cases, so they are not that useful as a performance prediction.

However, Google also made some interesting numbers available from one of its internal build systems, called Panorama. Here we can see a 13 to 19% increase in performance simply by switching the ABI. It is also good to see that ARM’s Cortex A53 gains more from AArch64 mode than the A57 cores do.

Google claims that 85% of all current Play Store apps are immediately ready to switch over to 64-bit, which means that only 15% of applications ship some kind of native code that needs targeted recompiling by the developer to make use of 64-bit architectures. This is a great win for Google, and I expect the shift to 64-bit to be very fast once silicon vendors start shipping 64-bit SoCs in the coming year.
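
For that remaining 15%, the work usually amounts to adding the 64-bit ABIs to the NDK build and fixing any code that assumes pointers are 32 bits wide. A hypothetical example of the kind of pointer-size assumption that breaks under a 64-bit ABI (the function names here are illustrative, not taken from any real app):

    #include <cstdint>
    #include <cstdio>

    // Hypothetical native code that silently assumed 32-bit pointers. It
    // worked on armeabi-v7a and x86, but truncates addresses when the same
    // source is rebuilt for arm64-v8a or x86_64.
    uint32_t BrokenHandleFromPointer(void* p) {
      return static_cast<uint32_t>(reinterpret_cast<uintptr_t>(p));  // truncates on 64-bit
    }

    // Portable version: uintptr_t is wide enough to hold a pointer on every
    // ABI, so the same source recompiles cleanly for 32-bit and 64-bit.
    uintptr_t HandleFromPointer(void* p) {
      return reinterpret_cast<uintptr_t>(p);
    }

    int main() {
      int x = 0;
      std::printf("pointers are %zu bytes on this ABI\n", sizeof(void*));
      std::printf("handle = %llx\n",
                  static_cast<unsigned long long>(HandleFromPointer(&x)));
      return 0;
    }

In ndk-build terms, the targeted recompile is typically just a matter of listing arm64-v8a and x86_64 alongside the existing 32-bit targets in APP_ABI (or building with APP_ABI := all) and cleaning up pointer-size assumptions like the one above.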

Conclusion

On many points, Google has delivered its “Performance boosting thing” and addressed many of the shortcomings that have plagued Android for years.

ART patches up many of the Achilles’ heels that come with running non-native applications under an automatic memory management system. As a developer, I couldn’t have asked for more: most of the performance issues I used to work around with clever programming no longer pose such a drastic problem.

This also means that Android is finally able to compete with iOS in terms of application fluidity and performance, a big win for the consumer.

Google promises to continue evolving ART: its current state is definitely not what it was six months ago, and definitely not what it will be once the L release reaches devices in its final form. The future looks bright, and I can’t wait to see what Google will do with its new runtime.


137 Comments

  • metayoshi - Wednesday, July 2, 2014 - link

    Nothing you say is wrong, but I think you hit the nail on the head with this sentence when it comes to Android: "It might make development easier for entry-level developers, but it certainly isn't an efficient way to do things when performance and user experience are important."

    I personally think Android didn't care that performance was so bad in the early days. The point of Android, from what I can tell, was to make things Open Source and make it easy for developers. As you said, having the OS manage memory itself is meant to make programming easy. I think that's what made it attractive to the likes of Motorola, Samsung, and HTC in the beginning. I think that's what made it popular with the OEMs, and eventually, that's what users were getting used to.

    Yes, precompiled code in interpreters are nothing new. But ART is changing what Android can do. It's not a new concept, I agree with you. But again, Android has had different priorities from the beginning than, say, writing purely in C and/or assembly for mission critical or safety critical systems where real time better be real time or else that car/plane/space shuttle will crash, or even in other not as critical embedded systems like HDDs and SSDs where performance and power matters more than anything. I think Android has always been about the easiness in its development environment, just like Java, and that's just where they put their priorities first. Now that their development environment has been pretty well founded, I think they're making the right steps with improving performance, first with the JIT compiler in 2.2, "Project Butter" in Jelly Bean, and now making the default environment ART instead of Dalvik in Android "L". They just had different priorities, and well... look at where Android is now.
  • Hyper72 - Friday, July 4, 2014 - link

    I think you're completely right about ease of development being the priority for Android early on; after all, they had to establish a market and needed apps quickly and easily. After Google bought the OS it suddenly got lots of developer attention, and they just ran with the setup as it was. If Google had made lots of changes at that time, they might as well have rolled their own.
  • errorr - Thursday, July 3, 2014 - link

    The answer is in the article: it was really about memory management, and once that was baked in, all the development went toward improving what already existed.

    After Oracle sued them (still pending) over Dalvik and the creation of their own VM, it became abundantly clear that they needed to tear down the whole thing and start over.
  • tacitust - Thursday, July 3, 2014 - link

    Google adopted Java for Android because it was a mature programming language, popular with developers, that they didn't have to create from scratch and had features (i.e. running in a VM) that made it easy to create secure apps that would run on a multitude of different hardware platforms. Java also had an affordable (i.e. free) development environment (Eclipse) that Google could build their development tools around.

    Clearly, with the incredible growth Android has enjoyed over the last six years, the decision to go with Java was anything but a mistake.

    As for compiler technology, the necessity to run the same apps on multiple hardware architectures precluded the use of traditional desktop and server based compilers, and the technology behind JIT compilers certainly hasn't been standing still over the last decade. The performance and battery deficits caused by the current VM environment are certainly not as bad as you think they are, given that modern Android tablets come pretty close to matching iOS, which only has one hardware platform and architecture to worry about and where the software can be tightly integrated with that sole platform. It's not as good, no, but it's good enough for Samsung to sell millions of phones in direct competition with the iPhone.

    Yes, the time has come for Google to move on, but there should be nothing amazing about their use of a Java-based platform that has served them very well over the past six years. It was the right decision at the time.
  • grahaman27 - Saturday, July 5, 2014 - link

    Well said.
  • NetMage - Tuesday, July 8, 2014 - link

    I think they could have produced a much better product if they had used C++ instead - native performance and battery life when it was needed in the early days, and probably faster than iOS performance today.
  • iAPX - Wednesday, July 2, 2014 - link

    So why don't people upgrade, if it works so well on the Android side?
  • zodiacsoulmate - Thursday, July 3, 2014 - link

    Very impressive.
  • mstestzzz000 - Thursday, July 3, 2014 - link

    Inaccuracy in the article:
    "This new allocator, “rosalloc” or Rows-of-Slots-Allocator, ..."

    If you look at the source code for rosalloc (line 39 of https://android.googlesource.com/platform/art/+/ma... they call it "A runs-of-slots memory allocator"
  • Milind - Thursday, July 3, 2014 - link

    I think you are absolutely right there. I doubt that merely doing AOT compiling is going to produce faster results and that's exactly what I experienced when I switched from Dalvik to ART in 4.4. Of course there are going to be more improvements in L since the code itself has improved. I mean who was launching an app on Android and wishing it would *launch* faster? There may have been apps that took their time launching. But not too many. On the other hand, better garbage collection and other improvements will certainly help in run-time performance. AOT is not doing anything much compared to JIT.

    I always wondered why Google didn't buy Sun. Both companies have similar DNA (certainly better than Oracle and Sun) and Android could have used all the expertise Sun had in building JVMs and Real Time Java in Android and the rest of Google. They could have sold off the hardware division to IBM/Oracle and not have had to deal with the heart ache and drama of the lawsuit.