Software | May 7, 2026

Google Just Opened Android XR Glasses Development to Everyone. Here Is What Developers Get.

By Nina Castillo
Staff Writer, VR.org

Google just released Android XR SDK Developer Preview 3, and for developers watching the smart glasses space, this is the update that matters. For the first time, Android XR officially supports glasses development with dedicated libraries, a purpose-built UI framework, and an emulator that simulates glasses-specific hardware like touchpad input and limited field of view. The headset tools are more stable. But the glasses tools are entirely new.


This lands at a significant moment. Samsung Galaxy Glasses are expected this summer. Warby Parker's Google-backed glasses are targeting 2026. XREAL Project Aura is coming. Gentle Monster and Gucci are building Android XR devices for 2027. Developers now have something concrete to build against, months before any of that hardware reaches consumers.

Android XR SDK Developer Preview 3 announcement for AI glasses development
Image: YouTube

Jetpack Compose Glimmer

The headline library is Jetpack Compose Glimmer, a UI toolkit built specifically for AI glasses with transparent displays. If you have worked with Jetpack Compose on Android, the programming model will feel familiar. The difference is that Glimmer is optimized for the constraints of glasses: small display areas, transparent backgrounds, content that needs to be legible against the real world, and interactions that cannot demand the user's full attention.

Glimmer is designed to produce what Google calls "minimal and comfortable" UI. That means high contrast elements, simplified layouts, and consideration for the fact that glasses are worn all day. The toolkit acknowledges something that headset UI never had to: glasses UI competes with reality for your attention, and reality should usually win. The information shown needs to be glanceable, not immersive.

For developers, the practical upside is that you do not need to learn an entirely new framework. If you know Compose, you know most of Glimmer. The spatial considerations and transparency handling are new, but the component model, state management, and layout system carry over from standard Android development.
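As a rough sketch of that carry-over, here is what a glanceable Glimmer surface might look like. The package and component names below follow the developer preview but may change before a stable release; treat them as illustrative rather than final.

```kotlin
import androidx.compose.runtime.Composable
import androidx.xr.glimmer.Card        // assumed preview package; verify against
import androidx.xr.glimmer.GlimmerTheme // the current androidx.xr.glimmer docs
import androidx.xr.glimmer.Text

// A minimal "next turn" card: one glance, two lines, high contrast.
// Standard Compose rules (state, recomposition, modifiers) apply unchanged;
// only the theme and components are Glimmer-specific.
@Composable
fun NextTurnCard(street: String, distanceMeters: Int) {
    GlimmerTheme { // applies glasses-appropriate contrast and typography
        Card {
            Text("Turn left onto $street")
            Text("in $distanceMeters m")
        }
    }
}
```

Note what is absent: no backgrounds fighting the real world, no dense layout, no interaction that demands sustained attention. That restraint is the design intent Glimmer encodes.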

Jetpack Projected

The second library, Jetpack Projected, solves a different problem. Not every glasses experience needs to run natively on the glasses hardware. Some experiences are better driven by a phone and projected to the glasses as a secondary display. Jetpack Projected enables exactly this: a host device like a phone runs the app logic and pushes the XR experience to connected glasses via audio and video streams.

This matters for first-generation glasses that may have limited onboard processing power. Samsung's Jinju model uses a relatively small battery and a mobile-class chip. Offloading heavy computation to a phone while displaying results on the glasses is a practical architecture that avoids the thermal and battery constraints of running everything locally.

Projected also opens up existing Android apps to glasses without requiring a full rewrite. If your app already runs on a phone, you can use Projected to extend specific experiences to glasses hardware with considerably less engineering effort than building a native glasses app from scratch.
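To make the host/display split concrete, here is a hypothetical sketch of the phone-side role described above. Google has not published the Projected API surface in the material quoted here, so every Projected-prefixed name below is a placeholder invented for illustration; only the architectural shape (logic on the phone, streamed output on the glasses) comes from the preview announcement.

```kotlin
import android.app.Activity
import android.os.Bundle

// Phone-side "host" activity. The glasses act as a secondary display
// receiving audio/video streams; they run none of this logic themselves.
class NavigationHostActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Placeholder API, not the real Jetpack Projected surface:
        // 1. Discover and connect to paired glasses.
        //    val session = ProjectedSession.connect(this)
        // 2. Run routing, rendering, and network work on the phone.
        // 3. Stream the resulting experience to the glasses.
        //    session.project(turnByTurnExperience())
    }
}
```

The practical win is thermal and battery headroom: the phone's chip and battery absorb the heavy computation, while the glasses only decode a stream.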

Jetpack Glimmer and Projected libraries explained for Android XR glasses development
Image: YouTube

The AI Glasses emulator

Testing glasses apps previously required physical hardware that does not exist in consumer hands yet. Developer Preview 3 adds an AI Glasses emulator to Android Studio that simulates glasses-specific interactions including touchpad input, voice commands, and the limited field of view that actual glasses hardware will provide.

The emulator matches real device specifications for field of view, resolution, and DPI. For XREAL Project Aura specifically, the emulator can simulate the 70-degree field of view that the hardware delivers, so developers can see exactly how their spatial content will appear in the user's peripheral vision. This is not a generic simulator. It models the actual constraints of announced hardware.

You need Android Studio Canary (Otter 3, Canary 4 or later) and emulator version 36.4.3 or above to access the glasses emulator. It runs alongside the existing headset emulator, so developers can test across both form factors from a single development environment.

ARCore for glasses

ARCore for Jetpack XR has been expanded to support AI Glasses in this preview. This means the same spatial understanding APIs that power headset AR, including plane detection, depth estimation, and environmental mesh, are now available in a glasses context. The implementation accounts for the fact that glasses have different camera placements and processing budgets than headsets, but the developer-facing API surface is intentionally similar.

This consistency across form factors is the point of Android XR as a platform: write once, adapt minimally, deploy to headsets and glasses. In practice there will always be form-factor-specific tuning, but the shared API layer means a developer who builds spatial features for the Samsung Galaxy XR headset does not start from zero when adapting for Galaxy Glasses.
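A sketch of what that shared surface looks like in code, based on the Jetpack XR preview APIs (the calls below, such as Plane.subscribe, follow the preview documentation but are subject to change; verify against the current androidx.xr reference before relying on them):

```kotlin
import androidx.xr.arcore.Plane
import androidx.xr.runtime.Session

// Plane detection via the Jetpack XR ARCore API. The intent of the shared
// layer is that this same code runs whether the Session is backed by a
// headset or by glasses; only camera placement and processing budget differ
// under the hood.
suspend fun watchPlanes(session: Session) {
    Plane.subscribe(session).collect { planes ->
        planes.forEach { plane ->
            // React to detected surfaces (tables, walls, floors)...
        }
    }
}
```

The form-factor differences surface as tuning (fewer planes tracked, coarser meshes on glasses-class hardware) rather than as a different API to learn.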

Android XR SDK Developer Preview 3 tutorial showing glasses development tools
Image: YouTube

What this means practically

The glasses hardware is coming whether developers are ready or not. Samsung, Warby Parker, XREAL, Gentle Monster, and Gucci have all committed to Android XR devices. Google I/O on May 19 will almost certainly include deeper sessions on glasses development, additional API announcements, and possibly new partner reveals.

Developer Preview 3 gives developers a roughly six-month head start before Samsung Galaxy Glasses ship to consumers. That is enough time to prototype, test, and polish a first-generation glasses app. The tooling is not final (this is a developer preview, not a stable release), but it is functional enough to build against and get familiar with the interaction model.

If you are an Android developer who has been watching the glasses space from the sidelines, this is the signal to start building. The emulator works. The frameworks exist. The documentation is live at developer.android.com/xr. The hardware will follow.
