Your AI-Powered Fitness Trainer May Be Doing More Harm Than Good.


Need personalized training plans, real-time feedback, and 24/7 motivation, all without the expense of a human trainer? AI-powered personal trainers seem like the perfect solution. Download an app, answer a few questions about your goals and fitness level, and receive a customized training program. I’ve tested some of these apps myself and can definitely see their appeal. But more often than not, I see companies implementing AI in apps where it doesn’t belong.

Strava’s “Athlete Intelligence,” Garmin’s underwhelming Connect+ subscription, and Whoop’s recovery recommendations are just a few examples. And, as Lifehacker’s senior health editor Beth Skwarecki notes, people are increasingly turning to ChatGPT for training advice, which is frustrating given how many high-quality free apps already exist.


Healthcare professionals and coaches are increasingly noticing that clients are anxious about optimization and performance, and that they become discouraged when artificial intelligence deems their efforts unsatisfactory. Think of how something like “close your rings” for activity goals on the Apple Watch has taken over the country. Or Fitbit’s step count, even though step goals themselves are complete nonsense. When their metrics fall short, people feel like failures—not because they’ve failed to make progress, but because the algorithm has told them so.

Blind trust and obsession with data

Certified personal trainer Kara D’Orazio describes what she calls “digital guilt”—the anxiety that arises when you miss a workout notification or fail to keep up with an app’s demands. She recalls clients who arrived at her gym exhausted and demoralized, including one woman whose AI trainer prescribed six consecutive days of training without rest. The woman felt “lazy” due to muscle soreness—a natural physiological response that her digital trainer failed to recognize or acknowledge.

“People rely so much on algorithms that they lose touch with the real sensations of their bodies,” says D’Orazio. “A real trainer can tell when you’re stressed, when you haven’t had enough sleep, or when you just need to talk for five minutes before starting a workout. AI doesn’t do that. It only sees numbers—calories, steps, heart rate—not emotions, hormones, or mood.” Movement should improve your relationship with your body, not create anxiety about it.

This disconnect is especially dangerous given how closely your physical fitness is tied to your mental health. Marshall Weber , a certified personal trainer and owner of Jack City Fitness, has experienced the psychological consequences firsthand. “I’ve seen people become discouraged and even anxious by relying too heavily on AI-powered fitness tools,” he explains. “While it’s great that these apps can track everything, they lack balance and self-compassion in their fitness journey.”

I know that when I’m in a vulnerable psychological state, this lack of empathy can be devastating. As D’Orazio warns, “If we’re not careful, we’ll see a wave of people who are ‘in shape’ on paper but emotionally exhausted and disconnected from their bodies.” Constant feedback on performance is a surefire path to an unhealthy obsession with fitness goals.

Artificial intelligence simply cannot replace the human touch.

In fitness as in other domains, one of the most significant limitations of AI is its inability to recognize context. Adrian Kelly , a business and sports coach, highlights the risks involved: “Exercise can be a highly emotional experience, with ups and downs depending on whether we meet or fail to meet our own expectations.” He notes that traditional coach-client relationships offer something AI cannot: empathy, accountability, and trust based on a genuine human connection. An experienced coach recognizes the early signs of eating disorders, overtraining, or emotional distress. They celebrate small victories, adjust plans when life gets tough, and remind you that rest is productive.

“The healthiest results come from developing trust, flexibility, and self-awareness—things a machine simply can’t measure,” says D’Orazio. “Movement should make you feel more human, not less.”


Dr. Ayesha Bryant , clinical consultant at Alpas Wellness, warns of an unhealthy obsession with health data encouraged by AI systems. “This over-quantification of fitness can lead to perfectionism or body dysmorphic disorder, especially in vulnerable individuals,” says Bryant. The problem is compounded by blind trust in the algorithm, where users continue to follow AI recommendations even when experiencing pain, burnout, or clear signs of a need for rest or medical attention.

Even someone knowledgeable enough to override an AI’s recommendations can still find themselves craving its validation. It’s all too easy to slide from intrinsic to extrinsic motivation, forgetting that the whole point of moving your body is to feel good.

Bottom line: find balance.

This doesn’t mean AI-powered fitness tools have no place in a healthy lifestyle. They can be useful for tracking data, setting reminders, or recording workouts. But they should complement, not replace, human guidance and body awareness.

Weber recommends that anyone who exercises regularly “consider making an appointment with a physical therapist to ensure you’re still being kind to yourself.” Bryant agrees, emphasizing that “long-term well-being and quality of life depend on empathy, adaptability, and human connection.”

If the AI revolution in the fitness industry is here to stay, we need to approach it with clear eyes. Your body is not a machine to be optimized. It is a complex, intelligent system deserving of compassion, flexibility, and human understanding—something no algorithm can provide.

