Google Research just dropped something that caught my attention in a big way. They're calling it Vibe Coding XR, and the idea is straightforward but powerful: you describe an XR experience in plain English, and Gemini builds it for you. Not a mockup. Not a wireframe. A fully interactive, physics-aware WebXR application that runs on Android XR headsets. In under 60 seconds.

What is Vibe Coding XR?

The concept of "vibe coding" has been floating around the dev community for a while now. You describe what you want, an AI writes the code, and you iterate from there. Google took that idea and applied it to one of the hardest development environments out there: spatial computing.

Building XR applications has traditionally been painful. You need to understand 3D engines, sensor integration, spatial mapping, hand tracking, and a dozen other systems just to get a basic prototype running. Most developers don't have the time or expertise to deal with all of that, which means a lot of great XR ideas never get built.

Vibe Coding XR removes that barrier. The workflow connects Gemini Canvas with an open source framework called XR Blocks, which handles all the complex spatial computing plumbing behind the scenes. You type or speak a prompt like "create a beautiful dandelion that blows away when I pick it up," and Gemini designs the scene, configures the physics, sets up hand interactions, and generates a working WebXR app you can immediately test.

How it actually works

The system is built on three layers. First, there's a specialized prompt that teaches Gemini how to think like an XR designer and engineer. It includes guidelines for room-scale environments, spatial layout, interaction distances, and best practices that would take a developer weeks to learn.
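Google hasn't published the prompt itself, but one concrete example of the kind of guideline it encodes is keeping interactable objects within comfortable arm's reach. A minimal sketch of that idea (the function name, API, and the 0.4–1.2 m band are my own illustration of a common ergonomic rule of thumb, not values taken from Google's prompt):

```javascript
// Hypothetical helper illustrating one XR design guideline:
// clamp an object's spawn distance to a comfortable reach band.
// The 0.4-1.2 m defaults are an ergonomic rule of thumb, not
// values from Google's actual prompt or from XR Blocks.
function clampToReach(distanceMeters, min = 0.4, max = 1.2) {
  return Math.min(max, Math.max(min, distanceMeters));
}

console.log(clampToReach(3.0)); // far object pulled into reach: 1.2
console.log(clampToReach(0.1)); // too close, pushed out: 0.4
console.log(clampToReach(0.8)); // already comfortable: 0.8
```

Dozens of small rules like this, encoded once in the prompt, are what let Gemini skip the weeks of trial and error a new XR developer would otherwise spend.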

Second, XR Blocks provides the technical foundation. It's built on WebXR, three.js, and Google's LiteRT.js, and it manages all the subsystems that spatial computing requires: environmental perception, hand tracking, physics simulation, and AI integration. Google open sourced the entire framework on GitHub.
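XR Blocks' own API isn't quoted in the post, but the hand-tracking subsystem it manages sits on the standard WebXR hand-input model, where a pinch gesture is commonly detected by measuring the distance between the thumb-tip and index-finger-tip joints. A hedged sketch of that idea as a pure function (joint positions as plain {x, y, z} objects; the 2.5 cm threshold is my own illustrative choice):

```javascript
// Illustrative pinch detector in the style of WebXR hand tracking.
// thumbTip and indexTip are world-space joint positions {x, y, z}
// in meters. The threshold is a plausible guess, not a value
// taken from XR Blocks.
function isPinching(thumbTip, indexTip, threshold = 0.025) {
  const dx = thumbTip.x - indexTip.x;
  const dy = thumbTip.y - indexTip.y;
  const dz = thumbTip.z - indexTip.z;
  return Math.hypot(dx, dy, dz) < threshold;
}

// Fingertips 1 cm apart: pinch detected
console.log(isPinching({ x: 0, y: 1.4, z: -0.3 },
                       { x: 0.01, y: 1.4, z: -0.3 })); // true
// Fingertips 8 cm apart: no pinch
console.log(isPinching({ x: 0, y: 1.4, z: -0.3 },
                       { x: 0.08, y: 1.4, z: -0.3 })); // false
```

The point of a framework like XR Blocks is that generated code never has to write this kind of logic by hand; it asks for a "pinch" event and the plumbing is already there.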

Third, Gemini's long-context reasoning connects the dots. It reads your prompt, plans the experience across multiple steps, selects the right components from XR Blocks, and generates code that actually runs. The whole process typically completes in under a minute.

You can test results in a simulated environment on desktop Chrome before deploying to a headset, which makes the iteration loop incredibly fast. Advanced features like depth sensing, hand interaction, and physics work best on Android XR devices like the Samsung Galaxy XR.

The demos are impressive

Google showed off several examples that demonstrate the range of what's possible. A math tutor that lets you explore geometric theorems by pinching and rotating 3D shapes. A physics lab where you balance a scale by picking up and dropping weights with your hands. A chemistry simulation where you can ignite different gases and observe the reactions safely in mixed reality.

They even recreated the Chrome Dino game in XR, with voxelized dinosaurs and cacti rushing toward you on a semi-transparent lane in your room. What would normally take hours of development was prototyped in minutes.

The one that stood out to me was the Schrödinger's cat demonstration. You pinch to place a cat in a box; as you approach, the box splits into two versions showing the alive and dead states simultaneously; pinch again, and the superposition collapses into a single outcome. That's a genuinely creative use of spatial computing to explain a concept that's almost impossible to visualize on a flat screen.
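The interaction flow of that demo is simple enough to sketch as a toy state machine. Everything here (names, states, the injectable random source) is my own illustration of the behavior described, not code from the actual demo:

```javascript
// Toy state machine mirroring the described Schrödinger's cat demo.
// rng is injectable so the collapse can be made deterministic in
// tests; the real demo's implementation is not published here.
function makeCatBox(rng = Math.random) {
  let state = 'empty';
  return {
    get state() { return state; },
    // First pinch: cat goes in, box shows both outcomes at once.
    placeCat() { if (state === 'empty') state = 'superposed'; },
    // Second pinch: superposition collapses to a single outcome.
    collapse() {
      if (state === 'superposed') state = rng() < 0.5 ? 'alive' : 'dead';
      return state;
    },
  };
}

const box = makeCatBox(() => 0.1); // deterministic rng for the example
box.placeCat();
console.log(box.state);      // "superposed"
console.log(box.collapse()); // "alive"
```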

Why this matters for the XR industry

The biggest bottleneck in XR right now isn't hardware. Headsets are getting better every year. The bottleneck is content. There aren't enough developers building XR experiences because the development tools are too complex and the audience is still relatively small.

Vibe Coding XR attacks that problem directly. If a teacher can describe an interactive science lesson and have it running on a headset in under a minute, the calculus changes completely. If a game designer can prototype a spatial mechanic by talking to an AI instead of spending days in Unity, more ideas get tested and more good ones survive.

Google's approach is also smart from a platform strategy perspective. They're making it dead simple to build for Android XR at a time when the Samsung Galaxy XR is on the market and more Android XR devices are on the way. Lower the barrier to entry for developers and you get more apps. More apps drive more headset sales. It's the same flywheel that made mobile app stores explode.

Try it yourself

The whole thing is live right now. You can access the XR Blocks Gem demo at xrblocks.github.io/gem from Chrome on desktop or an Android XR headset. The XR Blocks framework is open source on GitHub at github.com/google/xrblocks. Google's team will also be demoing it in person at ACM CHI 2026.

Whether you're a developer looking for a faster way to prototype or just someone who's curious about where spatial computing is headed, this is worth checking out. Vibe Coding XR isn't going to replace professional XR development anytime soon, but as a rapid prototyping tool, it's a glimpse of what building for spatial computing could look like when the tools finally catch up to the vision.