Google has rolled out the schedule for its upcoming I/O developer conference, and it’s packed with developer-centric sessions. While a few of those sessions promise to shed light on the forthcoming Android XR operating system, Google is otherwise keeping relatively quiet about the platform for now.
Android XR hasn’t been in the spotlight since its announcement in December, alongside Samsung’s ‘Project Moohan’ mixed reality headset. Neither of these has an official release date yet, but they are expected to debut later this year.
Google has confirmed several Android XR features, including the eagerly anticipated passthrough camera access, and has provided developer access to the Android XR SDK. However, there’s still curiosity about how it will measure up to more established XR ecosystems such as Meta’s Horizon OS and Apple’s visionOS.
The Google I/O conference will feature numerous livestreamed keynotes from May 20th to 21st, although only two developer talks specifically about Android XR have been announced. Unfortunately, neither of these will be livestreamed. However, there will be a ‘What’s New in Android’ livestream, which is expected to touch on Android XR.
Barring substantial news in that livestream, the two developer sessions on Android XR suggest Google is making a developer-friendly push into XR, aiming to integrate it further into the larger Android ecosystem, even if the platform isn’t quite ready for a big public reveal during the livestreamed keynotes.
Here’s a glimpse of what the talks entail: Android XR is gearing up for a public release later this year, and Google is preparing a new XR toolchain that bundles Jetpack SceneCore and ARCore into an XR-specific version of Jetpack. Currently in developer preview, Jetpack XR allows developers to create spatial layouts using 3D models and immersive environments, promising a unified platform for AR and VR experiences.
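For those curious, here’s roughly what pulling those preview libraries into a project looks like today, going by Google’s developer documentation. This is only a sketch: the artifact coordinates match the current developer preview as documented, but the version numbers are illustrative and will almost certainly change by the time of the Beta.

```kotlin
// build.gradle.kts (app module) - illustrative dependency block for the Jetpack XR developer preview.
// Version numbers are placeholders; check the official release notes for the current ones.
dependencies {
    // Jetpack Compose for XR: spatial panels, orbiters, and other XR-aware composables
    implementation("androidx.xr.compose:compose:1.0.0-alpha01")
    // Jetpack SceneCore: the 3D scene graph, glTF models, and spatial media entities
    implementation("androidx.xr.scenecore:scenecore:1.0.0-alpha01")
    // ARCore for Jetpack XR: perception features such as hand-tracking and plane detection
    implementation("androidx.xr.arcore:arcore:1.0.0-alpha01")
}
```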
The sessions also emphasize incorporating XR features, such as 3D models, hand-tracking, and stereoscopic video, into existing apps. This highlights Google’s intent to attract developers beyond just the gaming sphere, likely aiming to bring Android XR up to speed with the broader Android ecosystem.
Additionally, Google plans to extend its declarative UI toolkit, Jetpack Compose, to XR. This move indicates the company’s desire to standardize UI design across mobile, tablet, and XR, simplifying the process of porting or adapting UIs for immersive environments.
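For a sense of what that reuse looks like in practice, here’s a minimal sketch, based on the current developer preview documentation, of wrapping an existing Compose screen in a spatial panel. The screen name and panel dimensions are hypothetical, and the exact composables and package paths may shift in the SDK Beta being announced at I/O.

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.width

// Hypothetical existing 2D screen from a phone or tablet app.
@Composable
fun NewsFeedScreen() { /* unchanged Compose UI */ }

// On an XR device, the same screen can be hosted in a panel floating
// in the user's space instead of a flat window.
@Composable
fun SpatialNewsFeed() {
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1280.dp)   // panel size within the 3D space
                .height(800.dp)
        ) {
            NewsFeedScreen()      // reuse the existing UI as-is
        }
    }
}
```

The appeal, if this holds up, is that the spatial layer wraps around existing Compose code rather than replacing it, which is exactly the “expand your existing investments” pitch in the second session description below.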
Interestingly, one of the sessions highlights upcoming AI capabilities within Android XR, hinting at potential features like real-time object recognition, scene understanding, or AI-generated environments in the future.
It’s worth noting that since neither talk is being livestreamed, Google may not be ready to fully unveil Android XR just yet. There’s also anticipation around hearing more about Samsung’s ‘Project Moohan’ headset, which is expected to be the first device running Android XR.
In any case, I’ll be watching the livestreams and covering the technical talks, eager to hear any fresh updates.
Building Differentiated Apps for Android XR with 3D Content
Presented by Dereck Bridié, Developer Relations Engineer, and Patrick Fuentes, Developer Relations Engineer
"Join us to introduce Jetpack SceneCore and ARCore for Jetpack XR. We’ll guide developers through integrating immersive content like 3D models, stereoscopic video, and hand-tracking into existing apps. Explore new features in the Android XR SDK developer preview and gain essential insights for Android XR’s public launch later this year."
The Future is Now, with Compose and AI on Android XR
Presented by Cecilia Abadie, Senior Product Manager, and Jan Kleinert, Developer Relations Engineer
"Discover the future of immersive experiences with Android XR in this session, which unveils the latest updates to the Android XR SDK Beta launching at I/O. Learn about enhancements to Jetpack Compose for XR and cutting-edge AI capabilities. Understand how your existing investments in large screen development can expand into the thrilling world of Android XR."