For years, Silicon Valley and Wall Street have questioned Mark Zuckerberg’s decision to invest tens of billions of dollars into Reality Labs. This week, Meta’s wearables division unveiled a prototype of its Orion smart glasses, a form factor the company believes one day could replace the iPhone. That idea sounds crazy… but maybe a little less crazy than it did a week ago.
Orion is a prototype pair of AR glasses that combines augmented reality, eye and hand tracking, generative AI, and a gesture-detecting wristband. Through micro LED projectors and silicon carbide lenses (which are quite expensive), Meta seems to have cracked a longstanding AR display challenge. The idea is that you can look through Orion — you know, like a pair of glasses — but also see application windows projected on the lenses that appear as if they’re embedded in the world around you. Ideally, you can use your hands, eyes, and voice to navigate the environment.
Though to be clear, Meta’s Orion smart glasses are chunkier than your average readers, reportedly cost $10,000 a pop, and won’t be available for sale anytime soon. We’re talking years from now. All the technology in Orion is relatively young, and all of it needs to get cheaper, better, and smaller to work its way into a pair of smart glasses you can buy at the mall. Zuckerberg says the company has already been working on Orion for 10 years, but there’s still no path to a sellable product.
However, Meta is hardly the only company trying to put a smartphone replacement on your face.
This month, Snap unveiled its latest generation of Spectacles smart glasses, which are larger than Orion and have a more limited field of view. One former Snap engineer called the latest Spectacles “obviously bad” — though you can actually order them. Google hinted during its I/O conference in May that it, too, is working on a pair of smart glasses, perhaps a revamp of its failed Google Glass experiment from last decade. Apple is reportedly working on AR glasses that sound a lot like Orion. And we can’t rule out Jony Ive’s design firm LoveFrom, which he recently confirmed is working on an AI wearable with OpenAI (though we don’t know if it’s glasses, a pin, or something else entirely).
What’s brewing is a race among Big Tech’s richest companies to create a sleek pair of smart glasses that can do everything your smartphone can — and hopefully something more. Meta’s prototype made two things clear: there is something there, but we’re not “there” yet.
These devices are a notable departure from the Quest virtual reality headsets Meta has been pushing for years now, and from Apple’s Vision Pro. There’s a lot of similar technology involved, like eye and hand tracking, but the two feel completely different to use. VR headsets are bulky, uncomfortable to wear, and can make people nauseous from staring at the displays. Sunglasses and eyeglasses, on the other hand, are relatively pleasant to wear, and millions of Americans use them every day.
To Zuckerberg’s credit, he’s been pushing the eyewear form factor for quite a long time, even when it certainly was not popular to do so. It’s long been reported that Meta’s CEO hates that his popular social media apps have to be accessed through Apple’s phones (a frustration that perhaps led to the ill-fated Facebook Phone). Now, Meta’s competitors are also dipping their toes into eyewear computing.
Meta’s early investment here seems to be paying off. Zuckerberg gave a keynote presentation of Orion on Wednesday that we won’t be forgetting anytime soon, filling a room of skeptical journalists with electricity and excitement. TechCrunch has not tried Orion yet, but initial hands-on reviews have been very positive.
What Meta offers today is the Ray-Ban Meta: a pair of glasses with cameras, microphones, speakers, sensors, access to Meta AI, and the ability to connect to your phone and the cloud. The Ray-Ban Meta is far simpler than Orion, but relatively affordable at $299 — actually not much more than a regular pair of Ray-Bans. They’re kind of like the Spectacles 3 that Snap released a few years ago, though the Ray-Ban Meta glasses appear more popular.
Despite the vast differences in price and capabilities, Orion and Ray-Ban Meta are more related than you might think.
“Orion is really the future, and we ultimately want to go for the full holographic experience. You can think about Ray-Ban Meta as our first step there,” said Li-Chen Miller, a VP of product at Meta who leads its wearables team, in an interview with TechCrunch. “We really need to nail the basic things, like making sure it’s comfortable, people want to wear it, and that people find value in it every day.”
One of the things Meta is trying to nail with Ray-Ban Meta is AI. Currently, the smart glasses use Meta’s Llama models to answer questions about what you see in front of you, by taking pictures and running them through the AI system alongside a user’s verbal requests. The Ray-Ban Meta’s AI features today are far from perfect: The latency is worse than OpenAI’s natural-feeling Advanced Voice Mode; Meta AI requires very specific prompts to work right; it hallucinates; and it doesn’t have tight integrations with many apps, making it less useful than just picking up my iPhone (perhaps by Apple’s design). But the updates Meta has coming later this year aim to address these issues.
Meta announced it will soon release live AI video processing for the Ray-Ban Meta glasses, meaning the smart glasses will stream live video and verbal requests into one of Meta’s multimodal Llama models and produce real-time, verbal answers based on that input. The glasses are also getting basic features, like reminders, as well as more app integrations. That should make the whole experience a lot smoother, if it works. Miller says these improvements will filter up to Orion, which runs on the same generative AI systems.
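To make that pipeline a little more concrete, here is a minimal, hypothetical sketch of how a live AI loop like the one described above could be structured: keep a short rolling buffer of camera frames, pair it with a transcribed voice request, and hand both to a multimodal model that answers out loud. None of these function names are Meta’s; they are placeholders standing in for the camera, speech, model, and speaker layers.

```python
# A conceptual sketch, not Meta's actual software: pair recent camera frames
# with a spoken request and hand both to a multimodal model for a real-time,
# spoken answer. Every function below is a hypothetical placeholder.
import time


def capture_frame():
    """Hypothetical: return the latest camera frame from the glasses."""
    ...


def listen_for_request():
    """Hypothetical: return transcribed speech if the wearer asked something, else None."""
    ...


def multimodal_answer(frames, request):
    """Hypothetical: send recent frames plus the request to a multimodal model
    (a Llama-class model, say) and return a text answer."""
    ...


def speak(text):
    """Hypothetical: play the answer through the glasses' speakers."""
    ...


def live_ai_loop(window_seconds=2.0, fps=2):
    """Keep a short rolling buffer of frames so answers reflect what the wearer just saw."""
    frames, max_frames = [], int(window_seconds * fps)
    while True:
        frames.append(capture_frame())
        frames = frames[-max_frames:]  # only the most recent context matters
        request = listen_for_request()
        if request:
            speak(multimodal_answer(frames, request))
        time.sleep(1.0 / fps)
```

The rolling buffer is the design choice worth noticing: instead of a single snapshot, the model sees a short window of recent frames, which is roughly what separates “live” video processing from the take-a-picture-and-ask flow the glasses use today.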
“Some things make more sense for one form factor than the other, but we’re certainly cross-pollinating,” said Miller.
Likewise, she says some of Orion’s features may filter down to Ray-Ban Meta as her team focuses on making the AR glasses more affordable. Orion’s various sensors and eye trackers are not cheap technologies; the challenge is that Orion has to get better and more economical at the same time.
Another challenge is typing. Your smartphone has a keyboard, but your smart glasses won’t. Miller worked on keyboards at Microsoft for nearly 20 years before joining Meta, but she says Orion’s lack of a keyboard is “freeing.” She argues that using smart glasses will be a more natural experience than using a phone: you can simply talk, gesture with your hands, and look at things to navigate Orion, all of which comes naturally to most people.
Another device that was criticized for lacking a keyboard was, ironically, the iPhone. Former Microsoft CEO Steve Ballmer infamously laughed at the iPhone in 2007, saying it wouldn’t appeal to business customers because it didn’t have a physical keyboard. People adapted though, and his comments sound naive more than 15 years later.
I think making Orion feel natural is definitely more of a goal than a reality at this point. The Verge notes in its hands-on review that windows occasionally filled the glasses’ entire lens, completely obstructing the user’s view of the world around them. That’s far from natural. To get there, Meta will have to improve its AI, text input, AR displays, and a long list of other features.
“For Ray-Ban Meta, we kept it very scoped to a few things, and then it does them really well,” said Miller. “Whereas, when you want to build a new, futuristic computing platform [with Orion], we have to do a lot of things, and do them all very well.”