May 4, 2026

Apple Is Testing Four Smart Glasses Designs. None of Them Have a Display.

By Sam Whitfield
Contributing Writer, VR.org

Apple is not building AR glasses. At least not yet. What Apple is building, according to Mark Gurman at Bloomberg, is a pair of AI smart glasses that look like premium eyewear, integrate deeply with the iPhone, and rely on cameras and speakers rather than any kind of visual display. The company is currently testing four different frame designs internally, and the product could be unveiled as early as September 2026 at the iPhone 18 event, with retail availability in 2027.


This is Apple entering the smart glasses market on terms that look remarkably similar to what Meta pioneered with Ray-Ban. No heads-up display. No spatial computing layer. Just cameras, microphones, speakers, and AI processing: the glasses see what you see and answer through your ears rather than a screen. The difference is that Apple intends to do it with the design standards, material quality, and ecosystem integration that Apple brings to everything it ships.

Apple Smart Glasses 2026 concept showing four frame design styles
Image: YouTube

Four frames, one philosophy

The four designs currently in testing span a range of face shapes and style preferences. According to Gurman's reporting, the options include a large rectangular frame, a slimmer rectangular frame similar to what Tim Cook wears daily, a larger oval or circular frame, and a smaller oval or circular frame. Apple may ship some or all of these at launch, offering the kind of style choice that eyewear customers expect from a premium brand.

The material is cellulose acetate, not injection-molded plastic. Acetate is what high-end eyewear brands like Oliver Peoples and Moscot use for their frames: it is more durable, takes a deeper polish, and has a tactile warmth that molded plastic does not. Apple reportedly refers to the design language internally as the "icon," which suggests the company wants these glasses to be visually identifiable as Apple products without looking like technology strapped to your face.

Color options being explored include black, ocean blue, and light brown, with Gurman noting that Apple intends to offer "many" colors at launch. The approach mirrors Apple Watch, where color and material variety helps position a tech product as a fashion accessory.

The camera system

Apple's camera implementation diverges from Meta's approach in one visible way. Where Ray-Ban Meta uses small circular camera lenses that blend into the frame, Apple is reportedly testing vertically oriented oval lenses with surrounding indicator lights. That makes the camera harder to miss, but the larger lens area also points to a more capable system, and the indicator lights make it unambiguous to bystanders when the glasses are recording.

Two cameras are included. A high-resolution primary camera handles photo and video capture, producing images meant to match iPhone photo quality and share seamlessly to social media. A second, lower-resolution wide-angle camera provides environmental awareness for hand gesture recognition and feeds visual context to Siri through Apple Intelligence.

The dual-camera system is where Apple's approach starts to differentiate. The wide-angle lens watching for hand gestures means you can potentially control the glasses without touching them or speaking aloud. Raise a hand to dismiss a notification. Pinch to take a photo. The gesture vocabulary has not been detailed, but MacRumors reported in late April that gesture control is a confirmed feature in development.

Smart glasses market competition between Apple Meta Samsung and Google in 2026
Image: YouTube

Why no display matters

The decision to ship without a display is strategic, not a limitation. Apple has the technology to put a micro-LED display in glasses. The company spent years developing micro-LED before reportedly scaling back that program. The choice to omit a display from the first generation is about three things: weight, battery life, and social acceptability.

Displays add weight. Weight determines how long people will wear glasses before taking them off. Battery life on display-equipped glasses is currently measured in hours, not days. And the social perception of someone wearing glasses with a visible screen is still closer to "tech person" than "normal person." Apple clearly wants these glasses to pass the test that Ray-Ban Meta passed: can you wear them all day without anyone noticing they are smart glasses?

The display will come later. Samsung is taking the same two-step approach with Galaxy Glasses: Jinju ships without a display in 2026, and Haean adds micro-LED in 2027. Google's Warby Parker partnership starts with non-display AI glasses before adding visual overlays. The industry has converged on this roadmap: AI glasses first, AR glasses second.

The competitive landscape Apple is entering

By the time Apple ships in 2027, the market will not be empty. Meta will have two or three years of Ray-Ban iteration behind it and an installed base exceeding 2 million units. Samsung's Galaxy Glasses will have been on shelves for months. Warby Parker's Google-backed glasses will be available. Gentle Monster and Gucci will be entering with Android XR designs.

Apple's advantage is ecosystem lock-in. If these glasses integrate with iPhone the way AirPods do, with instant pairing, seamless audio handoff, iMessage notifications, and Siri that actually works because it has full device context, then every iPhone owner becomes a potential customer without needing to switch platforms or learn new software. That is a distribution advantage nobody else in the smart glasses space can match.

Apple's disadvantage is time. Every month between now and 2027 is a month in which Meta, Samsung, and Google are building user habits and developer ecosystems that Apple will need to displace. First-mover advantage in smart glasses belongs to Meta right now, and Apple will need to offer something materially better, not just materially nicer, to pull users away from products they already own and like.

AR and AI smart glasses ecosystem 2026 showing multiple competing platforms
Image: YouTube

The September timeline is not confirmed. Apple could delay to early 2027 for the announcement, or it could tease the product at WWDC in June and formally announce in September. What is confirmed, based on Gurman's reporting and the supply chain activity around acetate frame production, is that Apple is no longer in the research phase. They are testing finished designs. They are choosing colors. They are building the thing. The only question is exactly when we see it.
