Meta shared some big updates for its Ray-Ban Meta Smart Glasses during the Meta Connect keynote, but it also revealed a prototype for a future product: fully holographic AR glasses. After almost 10 years of work, CEO Mark Zuckerberg showed off his team’s first fully functioning prototype, named Orion.
Described as “the most advanced glasses the world has ever seen,” the Orion frames come genuinely close to being a normal pair of glasses, something even many of today’s best smart glasses don’t manage. In fact, for people who like chunky statement frames, they aren’t even too big as they are, though Meta says it plans to refine the design and make it a bit “smaller and more fashionable” before bringing the product to market.
To meet its goals, the team behind Orion had to develop a prototype that was, first and foremost, a pair of glasses. That meant no headset, no wires, a weight under 100 grams, a wide field of view, and holographic displays sharp enough to show fine detail and bright enough to remain visible in different lighting conditions.
On top of all that, the wearer needed to be able to see through the glasses, and people around them needed to be able to see the wearer’s eyes through the lenses.
The result is a display that doesn’t use passthrough — what the wearer sees is the real physical world with holograms overlaid onto it. These holograms might be a cinematic screen, a desktop window for working, a game, a little app window for replying to messages, or even a hologram version of the person you’re on a call with.
The display isn’t a conventional glass screen: the lenses are made of silicon carbide, and tiny projectors in the arms of the glasses (get ready for the complicated stuff) “shoot light into waveguides that have nano-scale 3D structures etched into the lenses that can refract light” and “put holograms of different depths and sizes in the world in front of you.” The frames are made of magnesium, which keeps the glasses light and radiates heat away without the need for a fan.
Zuckerberg said there was a battery in the arms of the glasses, but he also briefly mentioned a “small puck” that would be used to help power the wearable. As for how you interact with the glasses, they will use AI with voice, hand-tracking, eye-tracking, and something called a wrist-based neural interface.
The visuals at this point showed someone wearing a small wrist accessory while making a gesture with their hand, and Zuckerberg mentioned being able to “send a signal from your brain to the device” to interact with it when other input methods would be inconvenient. He didn’t elaborate on how this works or what you would be able to do with it, however, so the feature remains shrouded in mystery for now.
Before the company releases this as a consumer product, Zuckerberg says there are still a few things to refine: tuning the display system to make it sharper, improving the design, and working on the manufacturing process to make the glasses more affordable. Until then, the prototype will serve as a dev kit, used mostly internally but also with a few external partners, to build out the software experiences.
While there’s still work to be done, with any luck, these holographic AR glasses will become a real product we can buy in the next few years.