Android Things enables you to build and maintain IoT devices at scale. We recently released Android Things 1.0 with long-term support for production devices, so you can easily take an IoT device from prototype to commercial product.
We packed Google I/O this year with Android Things content to inspire and empower the developer community, from talks and codelabs to interactive demos and a scavenger hunt. Here’s a closer look at the fun stuff we had on display that you won’t see on the shelves of retail stores.
We introduced a handful of new interactive Android Things demos at I/O this year, showcasing the AI and ML capabilities of the platform. If you didn't get a chance to attend, here are a few of our favorites, perfect for exploring from wherever you are in the world!
Smart Flowers: Flos Mobilis
What do you get when you combine machine learning, Android Things and robotics? Flos Mobilis, a continuum robot where each flower is backed by an i.MX7D development board and a camera to run an embedded neural net model that controls the motion of the flower. This is all done offline with no data stored or transmitted.
Smart Flowers: Flos Affectus
What if a robot could respond to the way you feel? Flos Affectus is a cluster of robotic flowers that "bloom" and "un-bloom" depending on the expression detected on the user's face. The four broad expressions Flos Affectus is trained to detect are happy, sad, angry, and surprised. Using a camera embedded in the head of the alpha flower, the flower cluster detects the user's face and infers the facial emotion. The cluster runs offline with no data stored or transmitted, demonstrating both movement capabilities and on-device machine learning models.
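The last step of a pipeline like this, turning the classifier's output into a physical "bloom", can be sketched in plain Java. This is an illustrative assumption, not the demo's actual code: the four labels match the expressions above, but the scoring layout and bloom positions are invented for the example.

```java
// Hypothetical sketch: map an on-device emotion classifier's four scores
// to how far open a robotic flower's petals should be. The label order
// and bloom fractions are assumptions for illustration only.
public class BloomController {
    static final String[] EMOTIONS = {"happy", "sad", "angry", "surprised"};

    // Pick the most likely emotion from the model's four output scores.
    static String dominantEmotion(float[] scores) {
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            if (scores[i] > scores[best]) best = i;
        }
        return EMOTIONS[best];
    }

    // Translate the emotion into a petal position between 0 (closed) and 1 (open).
    static double bloomFraction(String emotion) {
        switch (emotion) {
            case "happy":     return 1.0;   // fully open
            case "surprised": return 0.75;
            case "sad":       return 0.25;
            case "angry":     return 0.0;   // fully closed
            default:          return 0.5;   // neutral fallback
        }
    }
}
```

In the real demo this mapping would sit between the embedded neural net's output tensor and the motor driver; everything stays on the device, consistent with the no-data-transmitted design.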
Rosie the Android
Initially designed by a team of Google engineers for the annual Grace Hopper conference, Rosie the Android is a 5-foot selfie-taking Android, complete with machine-learning capabilities. Inspired by Rosie the Riveter, she's a fully controllable robot that can take photos, respond to commands, wheel around, and interact with those around her.
Did you take a selfie with Rosie at I/O? Redeem your unique access code at g.co/rosie
Smart Projector
Smart Projector is built on Lantern, an Android Things project exploring the relationship between surfaces and content, augmenting real-world objects and environments with glanceable, meaningful data. It leverages the Google Experiments project Quick Draw, which uses the world's largest publicly shared doodling data set to help with machine learning research.
To learn more about Lantern or to start building your own, start here.
3D Printer
This modified Printrbot Smalls 3D printer uses a real-time subsystem that showcases the flexibility of Android Things: a microcontroller handles the low-latency motor control, while Android Things handles the OpenGL rendering. Keeping most of the logic on a high-level platform like Android makes development and debugging much easier, thanks to Android's great tooling.
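The split described above can be sketched in a few lines of Java: the Android side plans and formats high-level motion commands (G-code here, since most printer firmwares accept it) and hands them to the microcontroller over a serial link, which owns the precise step timing. The `SerialLink` interface below stands in for a real serial connection; all names are illustrative assumptions, not the demo's actual code.

```java
// Sketch of a high-level / real-time split: Android formats motion
// commands, a microcontroller executes them with hard timing guarantees.
// SerialLink is a stand-in for the actual serial transport.
import java.util.Locale;

public class MotionPlanner {
    interface SerialLink { void writeLine(String line); }

    private final SerialLink link;
    MotionPlanner(SerialLink link) { this.link = link; }

    // Format a G1 linear move; step-pulse timing happens on the MCU, not here.
    static String linearMove(double x, double y, double z, int feedRate) {
        return String.format(Locale.US, "G1 X%.2f Y%.2f Z%.2f F%d", x, y, z, feedRate);
    }

    void moveTo(double x, double y, double z, int feedRate) {
        link.writeLine(linearMove(x, y, z, feedRate));
    }
}
```

Because only short text commands cross the boundary, the Android side is free to do heavyweight work like OpenGL rendering without disturbing the motors' real-time loop.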
The future of 3D printing? Making real-time control as easy and portable as the rest of Android Things.
Phew! That was just the tip of the demo iceberg. With so many demos and so many ways to use Android Things, it’s easy to start imagining all the things you can build! At I/O, we helped a lot of developers get started building their first Android Things device using the Android Things Starter Kit. We’re making these codelabs available, so you can get to them whenever you need, or build your own.
Missed the I/O talks? Catch the recordings of each Android Things talk, so you can start, pause, and rewind at your leisure. Or, just lean back and watch them all.
On top of all the resources we just mentioned, we have a wealth of information in our developer documentation, and our new community website where you can see more inspiring projects and even submit your own. So, what are you waiting for? Pick up an Android Things Starter Kit and start building something today!