May 14, 2026

What XR Developers Should Actually Watch for at Google I/O Next Week

By Nina Castillo
Staff Writer, VR.org

Google I/O 2026 starts on May 19, one week after The Android Show laid out the consumer-facing XR story. The Android Show gave us Gemini Intelligence branding, fashion house partnerships, and Galaxy XR headset updates. I/O is where the developer story gets real. If you are building for spatial computing, here is what to watch.

Google I/O 2026 developer keynote branding and event preview
Image: Google / YouTube

Hands-On XR Glasses for the First Time

Google has confirmed that Android XR glasses will be available for hands-on demos at I/O. This is the first time developers and press will physically interact with the hardware outside of controlled private briefings or CES booths. Road to VR reported that Google has been cautious about public demos, preferring to wait until the software experience was polished enough to withstand scrutiny from a technical audience.

The demos will likely showcase both tiers of glasses announced at The Android Show: audio-only AI glasses for screen-free Gemini interaction, and display glasses with transparent in-lens overlays for navigation, translation, and notifications. For developers, the hands-on sessions are where you learn whether the field of view, latency, and interaction model actually work in practice, something no spec sheet can tell you.

Two Sessions That Matter

Google published the I/O session schedule in April, and two XR-specific technical talks stand out. The first is "Building differentiated apps for Android XR with 3D content," which covers Jetpack SceneCore and ARCore for Jetpack XR. It walks through adding immersive content such as 3D models and stereoscopic video, along with hand tracking, to existing Android apps. If you already have a 2D app and want to extend it with spatial features when running on glasses or the Galaxy XR headset, this is the session that explains the APIs.
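For a sense of what that extension looks like today, here is a minimal sketch of loading a 3D model through SceneCore. The names (Session, GltfModel, GltfModelEntity) follow the current Android XR SDK Developer Preview documentation, but the exact signatures are assumptions and could shift before or after the session:

```kotlin
// Sketch only: SceneCore model loading in the spirit of the Android XR
// SDK Developer Preview. API names and signatures may change.
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.lifecycle.lifecycleScope
import androidx.xr.scenecore.GltfModel
import androidx.xr.scenecore.GltfModelEntity
import androidx.xr.scenecore.Session
import kotlinx.coroutines.launch

class ModelActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // The preview ties one SceneCore Session to the Activity (assumption).
        val session = Session.create(this)
        lifecycleScope.launch {
            // Load a bundled .glb asset, then wrap it in an entity the
            // runtime can place and render in the user's space.
            val model = GltfModel.create(session, "models/rocket.glb")
            GltfModelEntity.create(session, model)
        }
    }
}
```

On a device without spatial capabilities, the expectation is that the same app simply runs as a normal 2D Android app, which is the fallback story this session should spell out.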

The second session is "The future is now, with Compose and AI on Android XR," which focuses on building XR-differentiated UI using Jetpack Compose and integrating Gemini AI capabilities directly into spatial experiences. This is where the Jetpack Projected APIs from SDK Developer Preview 3 get practical. Those APIs let phone apps push lightweight XR experiences to connected glasses, and this session should clarify how Compose components translate from a phone screen to an in-lens display.
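To make the Compose side concrete, here is a hedged sketch of a spatialized panel using the androidx.xr.compose artifacts from the current developer preview (Subspace, SpatialPanel, SubspaceModifier). Treat the specific modifiers and panel sizing as assumptions; the session should confirm how this maps onto a constrained in-lens display:

```kotlin
// Sketch only: a floating 2D Compose panel in a 3D subspace, per the
// androidx.xr.compose developer preview. Names may change.
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.width

@Composable
fun GlanceablePanel() {
    // Subspace opens a 3D volume; SpatialPanel hosts ordinary 2D Compose
    // UI inside it as a panel the system places in the user's view.
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier.width(1024.dp).height(640.dp)
        ) {
            Text(
                "Next turn: left on 3rd Ave",
                modifier = Modifier.padding(24.dp)
            )
        }
    }
}
```

The open question the session needs to answer is how much of this translates when the target is a glasses overlay pushed via the Projected APIs rather than a headset's full render surface.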


The Toolchain Consolidation

The bigger story at I/O is likely the consolidation of Google's spatial computing tools. ARCore, Jetpack SceneCore, and the XR-specific Jetpack libraries are being bundled into a unified XR toolchain. This is significant because it means developers will have one set of APIs for building both AR and VR experiences on Android XR rather than juggling separate SDKs for different use cases.

The practical impact is that if you build an AR feature using ARCore for Jetpack XR today, it should work on Samsung Galaxy Glasses when they ship later this year without requiring a separate codebase. The same is true for VR experiences targeting the Galaxy XR headset. Google wants Android XR to be what Android was to phones: one platform that scales across a range of hardware from different manufacturers. The toolchain consolidation is how they make that real for developers.

Android 17, which Google previewed at The Android Show, deepens this story further. The operating system represents an architectural merger of Android, Chrome OS, and Android XR into a single unified system. Apps built for Android 17 are meant to run natively on phones, tablets, Chromebooks, and XR devices without separate builds. For developers, that is either an enormous simplification or an enormous testing matrix, depending on how well Google executes on the promise.
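If the single-build promise holds, adapting to form factor should be a runtime capability check rather than a separate build. A minimal sketch, assuming the LocalSpatialCapabilities composition local from the current preview keeps its shape:

```kotlin
// Sketch only: one composable branching on device capabilities, per the
// androidx.xr.compose developer preview. Names are assumptions.
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.xr.compose.platform.LocalSpatialCapabilities
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel

@Composable
fun AdaptiveHome() {
    // One code path serving phone, tablet, Chromebook, and XR:
    // ask the runtime what this device supports and branch.
    if (LocalSpatialCapabilities.current.isSpatialUiEnabled) {
        Subspace {
            SpatialPanel { Text("Spatial layout for headset or glasses") }
        }
    } else {
        Text("Flat layout for phone, tablet, or Chromebook")
    }
}
```

Whether this branch stays that simple, or becomes the enormous testing matrix the promise implies, is exactly what developers should be probing at I/O.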


What to Watch For

The Android Show covered the consumer messaging. I/O is where Google needs to prove the developer experience is ready. Watch for three things. First, whether the hands-on glasses demos show production-quality software or early prototypes with rough edges. Second, whether the unified toolchain actually reduces complexity or just adds another abstraction layer. Third, whether Google announces a public release timeline for the Android XR SDK beyond "developer preview."

The keynote streams live at 10 AM PT on May 19, followed by the developer keynote at 1 PM PT. Both sessions and the full I/O schedule are available at io.google/2026.
