What Every Android Developer Needs to Know About Google I/O 2016

Check out the highlights from Google I/O 2016 that every Android developer should know! By Huyen Tue Dao.



From May 18th to 20th, I was super lucky to have the opportunity to attend Google I/O.

This year was the 10th anniversary of Google’s annual conference, and it brought a change of venue: instead of the usual Moscone Center in San Francisco, it was outdoors at the Shoreline Amphitheatre right by the Google campus in Mountain View, CA.

The event certainly had a unique, more festival-like atmosphere at the outdoor venue than last year—as well as higher sunburn per capita. :]

But of course, it’s not all about the venue. A ton of Google I/O 2016 announcements and reveals aimed at us Android developers were at the heart of the conference. These announcements covered all sorts of areas of Android development from Android N to Android Studio to virtual reality to Firebase.

In this article, I will take you on a quick tour of the newest and shiniest of these announced Android goodies and let you know where you can get a closer look.

Here we go!

Android N

Let’s start with the basics: the SDK. During the I/O keynote, Google announced a new release of the newest version of the Android operating system: Android N Developer Preview 3!

A significant portion of the information presented on Android N included previously-announced features. However, DP3 also contains plenty of new features, APIs, and (of course) bug fixes. :]

One of these new APIs is the FrameMetricsListener API, which lets you measure UI rendering performance at per-frame granularity and collect data for a particular user interaction. While you could previously do this with adb shell dumpsys gfxinfo framestats, the FrameMetricsListener API lets you take measurements from within the app itself and is not restricted to the 120-frame window that dumpsys provides.
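As a rough sketch of what this looks like in practice, here is a listener attached from an activity following the Android N preview API. The activity name and log tag are placeholders, and this is illustrative rather than production-ready:

```java
import android.app.Activity;
import android.os.Handler;
import android.util.Log;
import android.view.FrameMetrics;
import android.view.Window;

public class MyActivity extends Activity {

    private final Window.OnFrameMetricsAvailableListener frameListener =
            new Window.OnFrameMetricsAvailableListener() {
                @Override
                public void onFrameMetricsAvailable(Window window,
                        FrameMetrics frameMetrics, int dropCountSinceLastInvocation) {
                    // Copy the metrics object: the one passed in is reused by the system.
                    FrameMetrics metrics = new FrameMetrics(frameMetrics);
                    long totalNanos = metrics.getMetric(FrameMetrics.TOTAL_DURATION);
                    Log.d("FrameStats", "Frame took " + (totalNanos / 1000000) + " ms");
                }
            };

    @Override
    protected void onResume() {
        super.onResume();
        // Deliver frame metric callbacks on the main thread's handler.
        getWindow().addOnFrameMetricsAvailableListener(frameListener, new Handler());
    }

    @Override
    protected void onPause() {
        getWindow().removeOnFrameMetricsAvailableListener(frameListener);
        super.onPause();
    }
}
```

You would typically start listening just before the interaction you want to profile and stop right after, rather than leaving the listener attached for the lifetime of the app.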

For anyone interested in developing virtual reality applications, DP3 introduces a new VR Mode that is specifically for high-quality VR apps and provides access to an exclusive VR-specific CPU, single buffer rendering (which speeds up display drawing), increased sensor performance, and stereo notifications. Overall, apps running in VR Mode should deliver high performance as well as reduced “motion-to-photon” latency (the delay between user head movement and screen updates that can induce motion sickness).

VR Mode is not just a new Android N feature, though. It is part of a new virtual reality platform that was one of the big keynote announcements.

Virtual Reality: Daydream

This new VR platform powered by Android is called Daydream. While the previous Cardboard platform let any phone running Android 4.1 or above deliver VR experiences, Daydream focuses on bringing high-quality, next-level VR to users.

Daydream consists of both new software and new hardware components. On the software side is Android N’s VR Mode that tunes the platform for optimal rendering and interaction. On the hardware side is a VR headset reference design, a new controller, and “Daydream-ready” smartphone specifications.

The controller—which might remind some of you Nintendo fans of a Wiimote—is small but packed with several sensors providing users three degrees of freedom to move and gesture within VR apps. While the controller (and really any of the Daydream hardware) is not yet available, there is a controller emulator app available as part of the Daydream Development Kit.

The headset reference design outlines a headset geared toward long-term use, though no concrete examples are available just yet. The high-end smartphone specifications follow Daydream’s push toward high-quality, high-performance VR experiences, but it is important to note that Google’s VR lead, Clay Bavor, has stated that most, if not all, currently available smartphones are not “Daydream-ready”.

Daydream itself will be out Fall 2016. However, if you are itching to start building those next-gen VR apps, you can get started with a Daydream Development Kit if you have a Nexus 6P, a second phone running KitKat or above, and a Google Cardboard (or other VR viewer).

Fair warning: you will need to install DP3 on the Nexus 6P. Also (as hinted above), the 6P is not actually “Daydream-ready”, so you might experience throttled CPU and GPU performance depending on your app’s workload.

Android Wear 2.0

A big announcement for Android developers of the Wear variety is Android Wear 2.0, a huge update that reworks many of Wear’s foundations from both a developer and a user perspective.

Standalone Apps

The most significant change in Android Wear 2.0 is the introduction of standalone apps. Apps will maintain full functionality even when a user’s phone is far away or turned off, and they will have direct network access to the cloud. Rather than being embedded inside phone apps, Wear apps will be delivered as independent APKs, allowing developers to update a Wear app separately from its matching phone app. This independence also means new ways for users to authenticate without a phone.


Android Wear 2.0 adds keyboard input, which recognizes both normal and gesture typing, as well as Smart Reply (Inbox’s automatic responses generated by machine learning from previously received messages).


I did not know this before hearing about Android Wear 2.0’s new Complications API, but “complications” is in fact a horology term that describes “any feature in a timepiece beyond the simple display of hours and minutes” (Wikipedia). The Complications API allows watch face developers to connect with data provider apps to display extra information.

Material Design

Android Wear gets some more Material Design love with two new navigation components:

  • A navigation drawer which is analogous to the phone navigation drawer, allowing users to jump to different areas within an app.
  • An action drawer which lives at the bottom of the screen and provides context-specific actions.

Check out the Material Design for Wearables documentation for good practices in wearable design. For example, something I learned during one of I/O’s Wear sessions is to use darker color palettes for Wear apps because:

  1. Darker colors make watch screens less obtrusive in social environments.
  2. OLED displays use less power in rendering darker colors.