I Checked Out These Emotion-Tracking Smart Glasses and Found Them Surprisingly Useful

We know smart glasses can play podcasts and put an AI assistant in your ear, but what if they knew how you were feeling? That’s the idea behind Emteq Lab’s Sense glasses. They’re not on sale yet, but the end goal is a lightweight pair of glasses equipped with sensors that read the smallest changes in the wearer’s facial muscles, detecting changes in mood in real time to gain insight into health, eating habits, and more.
Emteq is one of a growing number of companies in the field of “affective computing,” technology designed to recognize, interpret, process, and/or model human emotions. For better or worse, the future is likely to be filled with such things.
How do emotion recognition glasses work?
The technology behind Emteq’s emotion-tracking glasses is complex, but the concept is simple: sensors facing inward from the frames track the electrical activity of the zygomaticus (smile) muscles, the corrugator supercilii (brow) muscles, and the muscles that control the eyebrows, combine that information with heart rate and head movement data, and turn it all into a real-time record of your emotions that you can access on your smartphone.
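Emteq hasn’t published how its model actually works, but the fusion step described above can be sketched in a few lines. Everything below is a hypothetical illustration: the sensor readings, the resting heart rate, and the scoring formula are my assumptions, not Emteq’s pipeline.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One instant of (hypothetical) sensor readings from the glasses."""
    zygomaticus_emg: float   # smile-muscle activation, arbitrary units
    corrugator_emg: float    # brow-furrowing activation, arbitrary units
    heart_rate_bpm: float

def estimate_valence(frame: SensorFrame, resting_hr: float = 65.0) -> float:
    """Toy mood score in [-1, 1]: smile activity pushes positive, brow
    furrowing pushes negative, and an elevated heart rate amplifies
    whichever signal dominates. The formula is invented for illustration."""
    raw = frame.zygomaticus_emg - frame.corrugator_emg
    arousal = 1.0 + max(0.0, frame.heart_rate_bpm - resting_hr) / 100.0
    return max(-1.0, min(1.0, raw * arousal))

# A strong smile signal with a slightly raised heart rate scores positive.
print(estimate_valence(SensorFrame(0.6, 0.1, 80.0)))
```

The real system would run a trained model over many such frames per second; the point here is only the shape of the problem: a few muscle channels plus physiology in, one continuous mood estimate out.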
That’s the idea, anyway. Whether any machine can accurately interpret each person’s emotions based on facial muscle movement is a tricky question. Research shows that basic emotions like happiness, sadness, surprise, and disgust are expressed on the face in similar ways across cultures, but cultural influences and individual differences affect how we express emotions. Some people have poker faces. Some people laugh when they’re scared. And everyone can smile when they’re sad.
Use cases for emotion-recognizing glasses
I recently spoke with Emteq CEO Steen Strand and saw a demo. The prototype Sense glasses, built into a regular eyeglass frame, appear to work as advertised. The ultimate vision for the technology covers everything from virtual meetings to mental health monitoring and diet tracking.
Make virtual meetings more “natural”
“When we communicate, you want to see my face, I want to see yours. We can react to each other,” Strand said. “If you want to do that virtually, you have to know what my face is doing.” The idea is that expression-recognition glasses could make avatars and virtual interactions more “real” by transferring what’s on your real face to your digital face.
For some types of virtual conversations, this would be great, but what if I don’t want to look bored during a meeting? Existing VR technology can do something similar, but Strand says Emteq’s technology offers a better solution. “A lot of existing technology, especially in VR, is just more power-hungry and computationally intensive,” Strand said. “We use these very lightweight, low-power sensors that just look at a little tiny part of your face, and from that we can infer what your entire face is doing.”
Mental health
According to Strand, constantly monitoring your real emotions could become an additional diagnostic tool for mental health professionals. “The gold standard for diagnosing depression right now is a questionnaire,” he said. “Not only does it have inherent bias, but it’s also time-sensitive. How you feel in one moment may be different from how you feel an hour later.” A continuous record of your emotions, he argued, would be more indicative of your mental state.
For people who have difficulty recognizing what emotions their faces are displaying, whether due to a physical condition like facial paralysis or a neurodevelopmental condition like autism, emotion-recognition glasses could be a window into feelings most of us take for granted.
Healthy eating
Perhaps the most specific use of the Sense glasses is monitoring eating habits. The glasses can track chewing habits, bite frequency, and eating speed, metrics that research has linked to weight management and digestive health. “You can see how many times you chew during a meal, how many bites, the intervals between your chews and bites,” Strand said. Some studies have linked eating speed to calorie intake during a meal, so in theory, micromanaging your chewing could help you reach your weight-loss goals, as long as it doesn’t drive you crazy in the process.
For people who struggle with healthy eating or have medical conditions that require careful dietary control, this could be helpful. But it risks turning every meal into a performance review.
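The metrics Strand lists fall out naturally once chew events have been detected. The sketch below is a toy illustration that assumes detection already happened upstream; the chew timestamps and the three-second “new bite” threshold are invented for the example, not anything Emteq has described.

```python
def chewing_stats(chew_times: list[float], bite_gap: float = 3.0) -> dict:
    """Summarize a meal from chew-event timestamps (in seconds).
    A pause longer than `bite_gap` seconds is treated as the start of a
    new bite -- an assumed heuristic, not Emteq's actual method."""
    if not chew_times:
        return {"chews": 0, "bites": 0, "chews_per_minute": 0.0}
    # Count a new bite whenever consecutive chews are far apart in time.
    bites = 1 + sum(1 for a, b in zip(chew_times, chew_times[1:])
                    if b - a > bite_gap)
    duration_min = max(chew_times[-1] - chew_times[0], 1e-9) / 60.0
    return {"chews": len(chew_times),
            "bites": bites,
            "chews_per_minute": len(chew_times) / duration_min}

# Six chews with one long pause read as two bites.
print(chewing_stats([0.0, 1.0, 2.0, 6.0, 7.0, 8.0]))
```

From counts like these, eating speed and chew-to-bite ratios follow directly, which is presumably what the companion app would chart over time.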
Big questions: privacy and humanity
With any new technology, the logical question is how it could be misused, and there’s no shortage of dystopian hypotheticals here, as with any kind of affective computing. Imagine what advertisers and marketers would do with a record of how consumers feel about everything they see and experience, all day, every day. How much more manipulative would recommendation algorithms be if they knew exactly how you felt about that TikTok? What if employers had a real-time readout of which workers were smiling and which were frowning? And imagine how a repressive government could use this technology against its own citizens.
It’s probably unfair to pin these dystopias on one pair of emotion-tracking glasses, and Strand says Emteq isn’t in the business of collecting and selling emotional data. “Our philosophy right now is that this is medical-grade personal data that shouldn’t be shared,” Strand said. But data promises tend to “evolve” as companies grow and face financial pressure.
When will Sense glasses go on sale?
As for when you might get your own pair of chewing-and-feeling-monitoring glasses, the short answer is: in the future, maybe. “Sometime next year, you could expect us to release something,” Strand said. “We’re still debating whether we’ll go direct-to-consumer. There are a lot of different ways to go to market with this technology. And so we’re still balancing some of those.”