Software
May 7, 2026

Meta's Interaction SDK Now Works in Unreal Engine and on Non-Meta Headsets. That Changes Things.

By Nina Castillo
Staff Writer, VR.org

Meta just made its best VR development tools available to studios that do not use Unity and do not ship exclusively on Quest. Interaction SDK v69 adds official Unreal Engine support for the first time and extends Unity compatibility to non-Meta headsets. The library of interaction primitives that made Quest apps feel polished (grab, poke, raycast, throw, two-handed manipulation) now works regardless of which engine you build in or which hardware you target.


This is a bigger deal than it might seem at first glance. Meta's Interaction SDK has been one of the most complete hand and controller interaction libraries in VR development, but it was Unity-only and Quest-only. That limited its reach to studios already committed to both. Removing those restrictions means the quality bar for VR interactions across the ecosystem is about to rise.

Meta Quest 3 headset, the platform Meta's Interaction SDK was originally built for before the v69 cross-platform expansion
Image: Wikimedia Commons / CC BY-SA 4.0

What Unreal developers get

The ISDK Unreal integration ships as a plugin that works alongside the existing Meta XR Unreal Plugin. It targets Unreal Engine 5.4 and later. The interaction set matches what Unity developers have had access to: Raycast for distant pointing, Poke for close-range touch, Grab for picking up and manipulating objects, Throw with proper release velocity, and one- and two-handed object transformations for resizing and rotating.

Both controller and hand tracking interactions are supported out of the box. Developers do not need to implement their own hand tracking interpretation or build custom grab mechanics from scratch. The SDK provides tested, production-ready implementations that handle edge cases like object handoff between hands, surface detection for poke targets, and velocity estimation for thrown objects.
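Meta has not published the internals of ISDK's throw handling, but the velocity estimation it describes is a well-known problem: a single frame of hand tracking is too noisy to produce a believable throw, so engines typically keep a short history of recent hand samples and derive release velocity from the window rather than from one frame. A minimal, self-contained sketch of that general technique (all names hypothetical, not Meta's API) might look like this:

```cpp
// Hypothetical sketch of release-velocity estimation for thrown objects.
// This is NOT Meta's implementation; it illustrates the common approach:
// buffer a short window of tracked hand positions and, on release, compute
// velocity over the whole window to smooth out per-frame tracking jitter.
#include <array>
#include <cstddef>

struct Vec3 { float x, y, z; };

class ThrowVelocityEstimator {
public:
    // Record one tracking sample per frame (position in meters, time in seconds).
    void AddSample(Vec3 position, float time) {
        samples_[head_] = {position, time};
        head_ = (head_ + 1) % kWindow;
        if (count_ < kWindow) ++count_;
    }

    // Estimate release velocity as displacement over the window's time span.
    Vec3 EstimateReleaseVelocity() const {
        if (count_ < 2) return {0.f, 0.f, 0.f};
        const Sample& oldest = samples_[(head_ + kWindow - count_) % kWindow];
        const Sample& newest = samples_[(head_ + kWindow - 1) % kWindow];
        const float dt = newest.time - oldest.time;
        if (dt <= 0.f) return {0.f, 0.f, 0.f};
        return {(newest.pos.x - oldest.pos.x) / dt,
                (newest.pos.y - oldest.pos.y) / dt,
                (newest.pos.z - oldest.pos.z) / dt};
    }

private:
    struct Sample { Vec3 pos; float time; };
    static constexpr std::size_t kWindow = 8;  // ~8 frames of hand history
    std::array<Sample, kWindow> samples_{};
    std::size_t head_ = 0;   // next write slot in the ring buffer
    std::size_t count_ = 0;  // valid samples accumulated so far
};
```

Production implementations add refinements this sketch omits, such as weighting recent samples more heavily or rejecting outlier frames, which is exactly the edge-case work the SDK saves studios from doing themselves.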

For Unreal studios that have been building their own interaction systems from the ground up, this represents hundreds of engineering hours they can now skip. The quality of grab physics, hand pose detection, and controller mapping in ISDK has been refined across dozens of shipping Quest titles. Getting that for free in an Unreal project is meaningful.

Non-Meta headset support on Unity

The Unity side of this update is just as significant. Starting with v69, the Unity implementation of ISDK supports headsets beyond Quest. The specific list of supported devices is not fully detailed in the announcement, but the positioning is clear: if you build with ISDK on Unity, your interactions should work on whatever XR hardware your players are using.

This matters for studios shipping on multiple platforms. A game releasing on Quest, PSVR2, and PC VR previously needed separate interaction implementations for each platform, or needed to use a third-party abstraction layer. ISDK on Unity now functions as that abstraction layer, with the backing of Meta's engineering team maintaining it.

The practical implication is that Quest-quality interactions can ship on non-Quest hardware without additional development cost. Studios that build against ISDK get a consistent interaction quality across devices, and players on any headset benefit from the polish that Meta has invested in the SDK over multiple hardware generations.

Unreal Engine logo, marking the first official Meta Interaction SDK support for Epic's engine starting with Unreal Engine 5.4
Image: Epic Games / Wikimedia Commons

UI components included

Alongside the cross-platform interaction update, Meta shipped a comprehensive set of UI components for both Unity and Figma. These are production-ready interface elements (buttons, panels, sliders, menus) designed specifically for VR interactions. The Figma integration means designers can prototype VR interfaces in Figma and hand off specifications that map directly to the SDK's component library.

This is the kind of infrastructure work that does not make headlines but quietly improves the quality of VR software across the board. When every studio has access to well-designed, tested UI components rather than building their own from scratch, the baseline quality of VR apps improves. Users benefit even if they never know the technical details behind it.

Why Meta is giving this away

The strategic logic is straightforward. Meta sells Quest hardware. Quest hardware is more valuable when VR software is better. VR software is better when developers have better tools. Even if Meta's Interaction SDK improves games on competing headsets, the developers behind those games are now building on Meta's tools, which makes them more likely to ship on Quest as well. The SDK is a developer acquisition strategy disguised as generosity.

There is also a standards play here. If ISDK becomes the default way developers handle VR interactions across engines and platforms, Meta effectively sets the interaction standard for the industry. That is influence even when developers ship on competing hardware. The grab mechanics, the poke detection, and the raycast behavior all feel like Meta expects them to feel, on every headset.

Sony PlayStation VR2, one of the non-Meta headsets that can now receive Interaction SDK behaviors when developers build with the Unity ISDK on v69 or later
Image: Wikimedia Commons

For studios evaluating their interaction middleware options in 2026, ISDK just became considerably harder to ignore. It is free, well-documented, actively maintained by one of the largest XR engineering teams in the world, and now works on both major game engines and across hardware platforms. The only question is whether developers trust Meta enough to build their interaction layer on Meta's SDK. Given the alternative is building everything yourself, the answer for most studios will be yes.
