Meta’s New “Personal Superintelligence” Will Appear in Its Smart Glasses

Meta announced the release of Muse Spark, an updated and improved artificial intelligence model, calling it a step toward “superintelligence.” This “embedded multimodal reasoning model” goes far beyond a chatbot and will soon be integrated into your glasses and social media. It’s already available in the Meta AI app, and is planned to roll out over the next few weeks alongside an update to your smart glasses.
Instead of a one-size-fits-all approach, Muse Spark offers three levels of “thinking,” allowing users to control the depth of intelligence.
- Instant Reply Mode: For quick questions and everyday conversations.
- Thinking Mode: Designed for more complex problems; if you need help with math, science, or logic, this is the mode for you.
- Reflection Mode: At its highest level, Muse Spark employs multiple AI agents working in parallel and collaboratively to complete complex, multi-step tasks.
According to Meta, Muse Spark’s performance is comparable to or better than its Llama 4 Maverick model while consuming more than an order of magnitude less computing power. In theory, that means high-level reasoning without placing excessive load on servers.
While Muse Spark will be available in a variety of places, its comprehensive visual-content integration seems ideal for smart glasses. Here are some ways Ray-Ban Meta and Oakley Meta users can put the new AI to work.
Artificial intelligence is now integrated into various tools.
One of the major improvements of Muse Spark over Meta’s previous models is how the new AI integrates visual information across different tools. In theory, you could point your glasses at a tangle of wires and electronic boxes and ask, “How do I connect this home theater system?” Or get step-by-step instructions for assembling IKEA furniture without even opening the manual: the AI reads the instructions and makes sure you’re not installing anything upside down.
Muse Spark will be able to reason about health.
Meta announced that its Meta Superintelligence Lab has collaborated with over 1,000 physicians to develop AI capabilities for analyzing health data. Users will be able to generate interactive displays that, for example, show nutritional information for foods or which muscles are being used during an exercise.
But how will this work in practice?
All of the above is “in theory.” Artificial intelligence hasn’t always lived up to expectations, even when it’s been touted to huge audiences. It’s one thing to post good results in lab tests; the real challenge is how the technology performs in the real world, where lighting is inconsistent, Wi-Fi is slow, and furniture assembly instructions can be extremely convoluted.
While I didn’t dig deep into the technology, I did run a quick test: I turned on Thinking Mode and sent Meta AI an image of a random collection of audio equipment, shown below:
Not only did it correctly identify everything in the image, it also offered several possible connection options and (correctly) identified the cables I needed. So I’m looking forward to using it through my glasses. If you’d like to try it yourself, Muse Spark is already running on meta.ai and in the Meta AI app, with smart glasses firmware and social media integration expected soon.