The Biggest Rumors About the Next Generation of Meta Smart Glasses

As a die-hard fan of Ray-Ban’s Meta smart glasses (seriously, I love those things), I squint at every leak and Zuckerberg comment, trying to figure out what’s next, though not all developments are created equal. The Oakley Meta smart glasses, which are up for pre-order now, will have longer battery life and an improved camera, but that’s more of a version 1.5 upgrade than a next-gen leap. So let’s dive into the most intriguing leaks, educated guesses, and outright wishes for Meta’s next-gen smart glasses.
Meta is pushing its smart glasses in two directions: audio-enabled glasses developed in partnership with brands like Ray-Ban and Oakley, and more ambitious augmented reality glasses. I’ve collected rumors about both.
Orion: Meta’s Augmented Reality Smart Glasses Prototype
Let’s start with the big one: Orion. Officially unveiled in September 2024, Orion is Meta’s prototype smart glasses platform, designed to combine augmented reality and artificial intelligence in a pair of wearable glasses. The goal is to “connect the physical and virtual worlds,” and if Meta can deliver on the promises made in its demo videos, Orion (or something like it) will be a serious competitor to the smartphone itself.
But that’s still a big “if.” The current state of consumer AR smart glasses suggests there are significant hurdles to overcome before something like Orion becomes viable, affordable, and available in a store near you. Meta has already demonstrated the glasses to journalists, as you can see in the video below, but there are no plans to release them in their current form:
Orion’s appeal is obvious—think of navigating to an airport gate by following a dotted line, or designing something in 3D and crawling underneath it to see the bottom—but the technology isn’t there yet. And it’s designed to replace eyeglasses, a technology so good it hasn’t changed much since the 13th century. When the cool factor wears off, will Orion’s benefits be worth the technical hassle?
I wouldn’t wear my Meta Ray-Bans if they required any effort to “operate”: charge them straight from the case, put them on, and go. For something like Orion to go mainstream rather than remain a gadget novelty, I think it would have to be that easy to use. (Meta’s current concept for interacting with the glasses involves a smart band you’d wear at all times.) Either way, “real” AR glasses may be years from widespread availability, but Meta’s Hypernova smart glasses are (presumably) just around the corner.
Hypernova Smart Glasses by Meta
Coming soon (probably): Meta is rumored to be releasing smart glasses with a built-in display later this year. Supposedly called “Hypernova,” the glasses will do everything the Ray-Ban Meta does, but will also let you run apps and view photos on a small screen projected onto one of the lenses. They’ll reportedly come with a “neural” wrist controller for gesture control, much like the one shown in Orion demos. Estimated price: $1,000 to $1,500.
While this rumor is unconfirmed, it seems plausible. Hypernova looks like a logical bridge between the pie-in-the-sky Orion concept glasses and the Ray-Ban Meta glasses we already have. In fact, there’s no technical barrier: smart glasses with HUDs and virtual HD screens, like the XReal Pro, have been around for years. While those “monitor replacement”-style augmented reality glasses aren’t meant for everyday wear, the only thing keeping Meta from putting a modest display in an everyday frame is the company’s business strategy.
In most cases, I think a small HUD on a pair of comfortable glasses would be more useful and less of a hassle than something like Orion, just as texting is usually more convenient than a Zoom call. One potential sticking point, however, is battery life. My main complaints about my current Ray-Ban Metas are that they’re too heavy and the battery doesn’t last long, and adding a HUD seems likely to make both of those problems worse. If that can be resolved and the glasses are as comfortable to use as the Ray-Ban Metas, I’ll be first in line for a pair.
What to Expect From the Next Generation of Ray-Ban Meta Smart Glasses
Let’s step away from the lofty, speculative phone-free future and the display glasses that may or may not materialize, and talk about where the existing audio-and-AI Meta smart glasses are most likely headed in the near term.
Last week, renders of supposed next-generation Ray-Ban sunglasses surfaced online. While there’s no good reason to believe the renders are legit—anyone can fake an image and call it a leak—the features accompanying them probably are, but only because they’re obvious. According to the report, the next generation of Meta smart glasses will “have significantly longer battery life and improved AI features, including real-time object recognition and scene understanding,” which is tantamount to predicting that Apple’s next phone will have a better camera. Who could have guessed?
A more detailed and interesting rumor has surfaced on tech site The Information. According to its sources, Meta is adding facial recognition to the next generation of glasses. There’s nothing technological stopping Meta from implementing facial recognition now—in fact, the feature was supposedly planned for the current generation of Meta glasses, but was scrapped over privacy concerns. It’s easy to see why facial recognition would raise alarm bells among privacy advocates. But for those of us who aren’t especially privacy-conscious (myself included) but regularly forget the names of people we meet, the appeal is easy to imagine.
Speaking of features that might seem dystopian, Meta is reportedly planning a feature for its next line of glasses that will monitor and analyze everything the user does in real time. The AI will be constantly on, watching what you see through the glasses, so Meta AI will be able to say things like, “You parked in spot 6G,” or “You forgot to close the garage door.”
As someone with ADHD, I want this very much. I have my doubts about the wisdom of handing literally every cognitive task over to a machine, and I’m not thrilled about letting computers controlled by Mark Zuckerberg judge and exploit everything I do—but the first time my glasses help me find my lost car keys, all will be forgiven.