In addition to the actual glasses, Orion relies on two other pieces of kit: a 182-gram “wireless compute puck,” which needs to stay near the glasses, and an electromyography (EMG) wristband that allows you to control the AR interface with a series of hand gestures. The puck I saw was equipped with its own cameras and sensors, but Meta told me they’ve since simplified the remote-control-shaped device so that it’s mainly used for connectivity and processing.
When I first saw the three-piece Orion setup at Connect, it struck me as an interesting compromise to keep the glasses themselves smaller. But after trying it all together, it really doesn’t feel like a compromise at all.

You control Orion’s interface through a combination of eye tracking and hand gestures. After a quick calibration the first time you put the glasses on, you can navigate the AR apps and menus by glancing around the interface and tapping your thumb and index finger together. Meta has been experimenting with wrist-based neural interfaces for years, and Orion’s EMG wristband is the result of that work. The band, which feels like little more than a fabric watch band, uses sensors to detect the electrical signals that occur with even subtle movements of your wrist and fingers. Meta then uses machine learning to decode those signals into commands that are sent to the glasses.
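For a rough sense of what that decoding step involves, here is a minimal, hypothetical sketch of an EMG gesture classifier: it slices multi-channel signal data into short windows, extracts simple amplitude features, and maps each window to a gesture label. The sampling rate, channel count, features and gesture labels are all assumptions for illustration; Meta hasn’t published the details of Orion’s actual pipeline.

```python
# Illustrative sketch only: a toy EMG gesture classifier, NOT Meta's pipeline.
# Sampling rate, window size, channel count and labels are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

RATE_HZ = 2000      # assumed sensor sampling rate
WINDOW_MS = 100     # assumed analysis window length
CHANNELS = 8        # assumed number of electrodes on the band

def window_features(emg_window: np.ndarray) -> np.ndarray:
    """Collapse one (samples, channels) window into simple per-channel features:
    root-mean-square amplitude and mean absolute first difference."""
    rms = np.sqrt(np.mean(emg_window ** 2, axis=0))
    mav_diff = np.mean(np.abs(np.diff(emg_window, axis=0)), axis=0)
    return np.concatenate([rms, mav_diff])

# Synthetic stand-in data; a real system would train on recorded, labeled gestures.
rng = np.random.default_rng(0)
samples = int(RATE_HZ * WINDOW_MS / 1000)
windows = rng.normal(size=(400, samples, CHANNELS))
labels = rng.integers(0, 2, size=400)   # e.g. 0 = rest, 1 = thumb-index pinch

X = np.array([window_features(w) for w in windows])
clf = LogisticRegression(max_iter=1000).fit(X, labels)

# At runtime, each new window would be featurized the same way and the
# predicted gesture forwarded to the glasses as an input event.
new_window = rng.normal(size=(samples, CHANNELS))
print("predicted gesture:", clf.predict(window_features(new_window)[None, :])[0])
```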
That may sound complicated, but I was surprised by how intuitive the navigation felt. The combination of quick gestures and eye tracking felt much more precise than the hand-tracking controls I’ve used in VR. And while Orion also has hand-tracking abilities, it feels much more natural to quickly tap your fingers together than to extend your hands out in front of your face.
What it’s like to use Orion
Meta walked me through a number of demos meant to show off Orion’s capabilities. I asked Meta AI to generate an image, and to come up with recipes based on a handful of ingredients on a shelf in front of me. The latter is a trick I’ve tried with the Ray-Ban Meta Smart Glasses, except with Orion, Meta AI was also able to project the recipe steps onto the wall in front of me.
I also answered a couple of video calls, including one from a surprisingly lifelike . I watched a YouTube video, scrolled Instagram Reels, and dictated a response to an incoming message. If you’ve used mixed reality…