At this year's I/O conference, Google finally announced its plans for Android's new runtime. The Android Runtime (ART) is the successor to and replacement for Dalvik, the virtual machine on which Android Java code is executed. Traces and previews of it have been available on KitKat devices since last fall, but until now there was little information on its technical details or on the direction Google was taking it.

Unlike other mobile platforms such as iOS, Windows, or Tizen, which run software compiled natively for their specific hardware architecture, the majority of Android software is written against a generic bytecode format that is translated into native instructions for the hardware on the device itself.

From the earliest Android versions, Dalvik started out as a simple VM with little complexity. Over time, however, Google needed to address performance concerns and keep pace with the industry's hardware advances. It added a JIT compiler to Dalvik in Android 2.2, added multi-threading capabilities, and generally improved the runtime piece by piece.

Over the last few years, however, the ecosystem has been outpacing Dalvik's development, so Google set out to build something new to serve as a solid foundation for the future: a runtime that can scale with the performance of today's and tomorrow's eight-core devices, large storage capacities, and large working memories.

Thus ART was born.

Architecture

First, ART is designed to be fully compatible with Dalvik's existing bytecode format, "dex" (Dalvik Executable). From a developer's perspective, nothing changes: applications do not need to be written for one runtime or the other, and there are no compatibility concerns.

The big paradigm shift that ART brings is that it compiles application code Ahead-of-Time (AOT) instead of Just-in-Time (JIT). Rather than compiling from bytecode to native code each time an application runs, the runtime does so only once; every subsequent execution from that point forward runs from the existing compiled native code.
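As a rough illustration of the difference, consider the warm-up cost a JIT runtime pays on every launch. The following is a minimal desktop Java sketch (hypothetical names, not Android code or anything from ART itself): the first runs of a hot method typically execute interpreted while the VM profiles them, and only later runs benefit from JIT-compiled native code.

    // Minimal JIT warm-up sketch (plain desktop Java, hypothetical names).
    // Early runs of hotMethod() are typically slower because the VM is still
    // interpreting and profiling it; later runs use JIT-compiled native code.
    public class WarmupDemo {
        static long hotMethod(long n) {
            long sum = 0;
            for (long i = 0; i < n; i++) {
                sum += i * 31 + (i ^ sum);
            }
            return sum;
        }

        public static void main(String[] args) {
            long sink = 0; // consume results so the work isn't optimized away
            for (int run = 0; run < 5; run++) {
                long start = System.nanoTime();
                sink += hotMethod(5_000_000L);
                long micros = (System.nanoTime() - start) / 1_000;
                System.out.println("run " + run + ": " + micros + " µs");
            }
            System.out.println("checksum: " + sink);
        }
    }

An AOT runtime like ART effectively moves that warm-up work to install time, so a method starts at or near its steady-state speed on the very first call.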

Of course, these native translations of applications take up space, and this new methodology has only become practical thanks to the vast increase in storage available on today's devices, a big shift from Android's early days.

This shift opens up a large number of optimizations that were not possible in the past: because code is optimized and compiled only once, it is worth optimizing it really well that one time. Google claims it is now able to apply higher-level optimizations across the whole of an application's code base, as the compiler has an overview of the totality of the code, whereas the current JIT compiler only optimizes in local/method-sized chunks. Overhead such as exception checks in code is largely removed, and method and interface calls are vastly sped up. The process that does this is the new "dex2oat" component, which replaces Dalvik's "dexopt" equivalent. Odex files (optimized dex) also disappear in ART, replaced by ELF files.
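To make the whole-program point concrete, here is a small hypothetical Java sketch (invented names, not taken from ART's sources): a method-local JIT sees only totalArea() and must keep the virtual dispatch, while a compiler that can see the entire code base can prove Shape has a single implementation, devirtualize the call, and inline it.

    // Hypothetical sketch of whole-program devirtualization (invented names).
    interface Shape {
        double area();
    }

    // If the compiler can see that Circle is the only implementation of Shape
    // in the whole application, shape.area() below can be compiled as a direct
    // (and inlinable) call to Circle.area() instead of an interface dispatch.
    final class Circle implements Shape {
        private final double r;
        Circle(double r) { this.r = r; }
        public double area() { return Math.PI * r * r; }
    }

    public class AotDemo {
        static double totalArea(Shape[] shapes) {
            double total = 0;
            for (Shape shape : shapes) {
                total += shape.area(); // candidate for devirtualization + inlining
            }
            return total;
        }

        public static void main(String[] args) {
            Shape[] shapes = { new Circle(1.0), new Circle(2.0) };
            System.out.println(totalArea(shapes)); // 5π ≈ 15.708
        }
    }

Whether dex2oat performs exactly this transformation is an implementation detail; the sketch only illustrates the class of optimization that whole-program visibility enables.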

Because ART compiles to an ELF executable, the kernel can now handle the paging of code pages, which promises much better memory management and lower memory usage. I'm curious what effect KSM (kernel same-page merging) has on ART; it's definitely something to keep an eye on.

The implications for battery life are also significant: since there is no more interpretation or JIT work to be done while an app runs, CPU cycles, and thus power, are saved directly.

The only downside to all of this is that the one-time compilation takes longer to complete. A device's first boot and an application's first start-up will take noticeably longer than on an equivalent Dalvik system. Google claims this is not too dramatic, as it expects the finished shipping runtime to be equivalent to or even faster than Dalvik in these respects.

The performance gains over Dalvik are significant: roughly a 2x improvement in speed for code running on the VM. Google claimed that benchmarks such as Chessbench, which show an almost 3x increase, are a more representative projection of the real-world gains that can be expected once the final release of Android L is made available.

Comments

  • Krysto - Thursday, July 3, 2014 - link

    I thought it was clear that the ART in L is NOT the one in KitKat, and has been revamped quite a bit. The final one, 5 months from now, will probably have big changes, too.
  • Notmyusualid - Thursday, July 3, 2014 - link

    Will keep an eye out for it, but I'm expecting this to be no big deal now.
  • ergo98 - Wednesday, July 2, 2014 - link

    Too much has been made regarding AOT and JIT. Note that Dalvik generally only JITs the DEX once, storing the result in /data/dalvik-cache.

    The big difference between Dalvik and ART is simply that ART was rewritten from the ground up based upon everything they learned from the Dalvik experience.
  • errorr - Thursday, July 3, 2014 - link

    That, and because of the Oracle lawsuit over Dalvik, which is nicely mooted by ART.
  • doubledeej - Wednesday, July 2, 2014 - link

    It never ceases to amaze me how many problems that were solved decades ago in computing are problems on modern computing platforms.

    Real compilation of code has been around forever -- the norm, in fact, for desktop and server computing with a few notable exceptions. Yet somehow taking what effectively amounts to interpreting code (just-in-time compilation is very similar to interpretation) and switching to compiling it ahead of execution is being touted as a new idea.

    The fact that Android has pretty much been completely reliant upon JIT compilation running in a VM has always made me scratch my head. As clearly spelled out in the article, it causes huge performance issues, along with significant hits to battery life. And we're talking about mobile devices with relatively low-power CPUs and GPUs, little memory, and finite battery capacity. But it has been the way Android has worked from the beginning. Crazy that it hasn't really been addressed until now.

    And the idea that operating systems and development languages should be in charge of garbage collection, and people being surprised that it causes performance hits, seems odd to me too. Managing your own memory isn't that hard to do. And it is a hell of a lot more efficient doing it yourself than making the language or OS figure out how to do it. It's a "clean up your own mess and put things back where you want them" vs. "make someone else do it and let them try to figure out where things go" situation. It might make development easier for entry-level developers, but it certainly isn't an efficient way to do things when performance and user experience are important.

    Because the developers that I work with aren't accustomed to managing memory, we're constantly running into issues. We've got scripts that allocate dozens or hundreds of megabytes of RAM and don't free it when they're done. They'll go through 3, 4, or 5 more of these processes within a single script, not freeing memory they're done with along the way, so by the time the script is done running hundreds of megabytes that aren't needed are still tied up. Because the language can't be sure if data is going to be used again it hangs around until the script has finished running.

    Create dozens or hundreds of instances of one of those scripts and you've got a performance nightmare. Relying on a language or OS to do garbage collection will have the same net result.
  • metayoshi - Wednesday, July 2, 2014 - link

    Nothing you say is wrong, but I think you hit the nail on the head with this sentence when it comes to Android: "It might make development easier for entry-level developers, but it certainly isn't an efficient way to do things when performance and user experience are important."

    I personally think Android didn't care that performance was so bad in the early days. The point of Android, from what I can tell, was to make things Open Source and make it easy for developers. As you said, having the OS manage memory is meant to make programming easy. I think that's what made it attractive to the likes of Motorola, Samsung, and HTC in the beginning. I think that's what made it popular with the OEMs, and eventually, that's what users were getting used to.

    Yes, precompiled code in interpreters is nothing new. But ART is changing what Android can do. It's not a new concept, I agree with you. But again, Android has had different priorities from the beginning than, say, writing purely in C and/or assembly for mission-critical or safety-critical systems where real time better be real time or else that car/plane/space shuttle will crash, or even in other less critical embedded systems like HDDs and SSDs where performance and power matter more than anything. I think Android has always been about ease of development, just like Java, and that's just where they put their priorities first. Now that their development environment is pretty well established, I think they're making the right steps in improving performance, first with the JIT compiler in 2.2, "Project Butter" in Jelly Bean, and now making the default environment ART instead of Dalvik in Android "L". They just had different priorities, and well... look at where Android is now.
  • Hyper72 - Friday, July 4, 2014 - link

    I think you're completely right about ease of development being the priority for Android early on, after all they had to establish a market and needed apps quickly and easily. After Google bought the OS it suddenly got lots of developer attention and they just ran with the setup as it was. If Google had made lots of changes at that time they might as well have rolled their own.
  • errorr - Thursday, July 3, 2014 - link

    The answer is in the article; it was about memory management, really, and once it was baked in, all the development went toward improving what already existed.

    After Oracle sued them (pending) over Dalvik and creating their own VM it became abundantly clear that they needed to tear down the whole thing and start over.
  • tacitust - Thursday, July 3, 2014 - link

    Google adopted Java for Android because it was a mature programming language, popular with developers, that they didn't have to create from scratch and had features (i.e. running in a VM) that made it easy to create secure apps that would run on a multitude of different hardware platforms. Java also had an affordable (i.e. free) development environment (Eclipse) that Google could build their development tools around.

    Clearly, with the incredible growth Android has enjoyed over the last six years, the decision to go with Java was anything but a mistake.

    As for compiler technology, the necessity of running the same apps on multiple hardware architectures precluded the use of traditional desktop and server compilers, and the technology behind JIT compilers certainly hasn't been standing still over the last decade. The performance and battery deficits caused by the current VM environment are certainly not as bad as you think they are, given that modern Android tablets come pretty close to matching iOS, which only has one hardware platform and architecture to worry about and where the software can be tightly integrated with that sole platform. It's not as good, no, but it's good enough for Samsung to sell millions of phones in direct competition with the iPhone.

    Yes, the time has come for Google to move on, but there should be nothing amazing about their use of a Java-based platform that has served them very well over the past six years. It was the right decision at the time.
  • grahaman27 - Saturday, July 5, 2014 - link

    Well said.
