Oura's AI Chatbot Really Makes You Think About Yourself

These days, many apps are getting built-in AI features, and they often disappoint: they summarize (sometimes incorrectly) the same data that is already available in charts or graphs elsewhere in the app. But the AI Advisor recently added to the Oura Ring app takes a different strategy, one I've come to appreciate in the weeks since its launch. Instead of simply reporting data, it asks questions. It asks you to do a little analysis, a little introspection. And I think Oura is really onto something here.
Some questions Oura's Advisor has asked me
I admit that at first I was mainly interested in what the Advisor could tell me. Every time I asked it a question, it would give an answer, then turn the question back to me. How did I feel? What had I tried lately? These felt like tricks rather than insights.
The Advisor also asks follow-up questions from time to time via a notification on your phone. "Your sedentary time has dropped to 6 hours and 11 minutes," it once told me. "How do you feel about your movement?" Tapping the notification opens a conversation on that topic.
Here are some questions I’ve been asked lately:
- (After noticing some poor HRV numbers recently) "How do you feel about your recovery methods, and is there anything you would like to change?"
- (After I said I was sick) "How do you rate your overall recovery and balance in your daily life?"
- (After reporting my recent stress assessments) "How are you feeling about stress management this week?"
- (After suggesting relaxation techniques) "Does any of this resonate with you?"
One day the Advisor even explained its strategy to me: "Looking back on the last few days, how do you rate the quality of your sleep? Self-reflection can provide insight into your priorities and help you adjust your daily routine. If you're willing, sharing your thoughts can open the door to valuable information that can further improve your rest."
Great. I answered the question in good faith, telling the bot something I knew was affecting my sleep: that I like to unwind in the evening, and that lately this has turned into revenge bedtime procrastination, where I try to reclaim some relaxation or pleasure even when I know it's eating into my sleep time.
"It's understandable that after a busy day you want extra time to relax," it replied. It then congratulated me on some small improvements I had made and offered incredibly obvious advice: start my wind-down routine a little earlier. Then it asked, "How does this sound to you?"
I know it didn't tell me anything I couldn't have said myself. The Advisor simply repeated my own concerns back to me in a gentle, curious manner. But damn, I think it helps.
Why asking questions is so important
When we turn to someone else to solve our problems—whether it’s an app or a person like a therapist—we usually already have the information we need. We just need to go through the process of getting our thoughts in order. What’s most important? What should we do next? What tools do we already have that can help us?
Since this process requires no new information, just reflection on what we already have, it doesn't really matter whether the thing we're talking to is a dumb robot that knows nothing about us. One of the best demonstrations of this is the famous chatbot ELIZA, written in the 1960s.
Inspired by Rogerian psychotherapy, all ELIZA did was turn your own statements back into questions, occasionally recalling something from earlier in the conversation and occasionally asking whether it had anything to do with your mother. ELIZA was not artificial intelligence in any modern sense of the word, just a piece of code simple enough to fit in a web page or hide as an Easter egg in a word processor. You can try out a simple version of ELIZA here.
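The trick is simpler than it sounds. Here is a toy sketch of the idea — not Weizenbaum's actual script, just a minimal illustration of how pattern matching plus pronoun reflection turns a statement into a question (the rules and wording here are my own invention):

```python
import re

# Pronoun swaps so a statement is mirrored back at the speaker,
# e.g. "my evenings" becomes "your evenings".
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "yours": "mine",
}

def reflect(text: str) -> str:
    """Swap first- and second-person words in a captured phrase."""
    return " ".join(REFLECTIONS.get(w, w) for w in text.split())

# (pattern, response template) pairs, tried in order; {0} is the
# reflected capture group. The last rule is a catch-all fallback.
RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i (?:want|need) (.*)", "What would it mean to you to have {0}?"),
    (r"because (.*)", "Is that the real reason?"),
    (r"(.*)", "Can you tell me more about that?"),
]

def eliza_respond(statement: str) -> str:
    """Return an ELIZA-style question mirroring the user's statement."""
    cleaned = statement.lower().rstrip(".!")
    for pattern, template in RULES:
        match = re.match(pattern, cleaned)
        if match:
            groups = [reflect(g) for g in match.groups()]
            return template.format(*groups)
```

Feed it "I want my evenings back." and it answers "What would it mean to you to have your evenings back?" — no understanding involved, just string surgery, which is exactly why it's striking that the effect still works.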
When I was studying for my personal training certification, I had to learn a lot about motivational interviewing, which is thought to have evolved from Rogerian, person-centered techniques. The idea is to help a person change a behavior (eat better, exercise more, etc.) by getting them to talk about their own motivation for making the change. You don't tell them what to do; you let them tell you.
As long as you play along with Oura's AI by actually answering its questions, you can have this experience whenever you want, without talking to an actual therapist or trainer. The Advisor is more capable than ELIZA: it remembers what you told it a few days ago and has access to your data from the ring's sensors. But it uses that summary data as a starting point rather than expecting you to be impressed that a bot can read your data at all. Oura recognizes that its Advisor's value lies not in having all the answers, but in having lots of good questions.