Do Therapy Apps Really Protect Your Privacy?
Unsurprisingly, after a year of unprecedented (remember that word?) isolation, the number of people looking for remote therapy and mental health apps has skyrocketed. In terms of raising awareness about mental health and reducing stigma, the increase in the number of people seeking help is a positive development. However, like anything you download to your phone, therapy apps have data and privacy issues – perhaps even more so than other apps, given the confidential nature of what you disclose to virtual therapists.
Text-based therapy and mental health apps like Talkspace, MindDoc, and (perhaps best known) BetterHelp offer important benefits. These apps tend to be more accessible and certainly more convenient than face-to-face sessions. But in 2020, Jezebel reporters explored the “loosely regulated world of online therapy,” with a focus on BetterHelp. Likewise, earlier this year, researchers at Consumer Reports’ Digital Lab evaluated seven popular therapy apps, including BetterHelp, to figure out what happens to your personal information after you share it with the app.
Read on to find out what these reports found so you can make an informed decision about whether text therapy apps are right for you.
What information does BetterHelp claim to protect?
BetterHelp says it takes privacy seriously. Here’s what the company lists in its Privacy Shield FAQ:
- Anything you say to your therapist is confidential.
- We do not partner or work with insurance companies, so nothing needs to be shared, reported, or filed with them.
- You can always click the “Shred” button next to every message you send so that it no longer appears in your account.
- All communications between you and your therapist are secured and encrypted with bank-grade 256-bit encryption.
- Our servers are spread across multiple Tier 4 AWS data centers for optimal security and protection.
- Our encrypted browsing (SSL) system follows current best practices.
- Our databases are encrypted and scrambled, so they are effectively rendered useless in the unlikely event that they are stolen or misused.
In addition, BetterHelp’s web page encryption (SSL) system is provided by Comodo, “the world leader in data security,” according to Eric Silver of eCounseling.com.
So what does all this mean for you? Silver claims that the measures described above keep your information as secure as possible. He even goes on to say that files stored on an in-person therapist’s computer are “just as [if not more] vulnerable to attacks” as BetterHelp’s systems. However, I would say that many users are worried not so much about hackers accessing their encrypted information as about how BetterHelp knowingly collects and shares that information with third parties.
What does the privacy policy mean?
Privacy policies are not exactly known for their clarity, and BetterHelp’s latest update is no exception. Jeff Gunther, a professional therapist who posts as @therapyden on TikTok, recently went viral for dissecting what he sees as problems with BetterHelp’s practices. In a series of videos, Gunther delves into the company’s statement that “the data that [BetterHelp] collects is not used for marketing or any other purpose, except as set out in this Privacy Policy.” Gunther argues this could be paraphrased as BetterHelp saying that “the data we collect is not used for marketing, except when we use it for marketing.”
In addition to standard demographic information (such as your age, gender, and location), some of the most significant data that BetterHelp collects relates to “visitor activity.” This includes how frequently users open the app, how much time they spend online, and how many messages they exchange with their therapist. The Jezebel report points out that even in an era when we share our every thought on social media, it still seems “rather creepy” that BetterHelp shares how often you talk to your therapist with third-party companies like Snapchat and Pinterest, “even if it is written in small print.”
And even if it sounds like “the usual shenanigans of tech companies,” Gunther adds, “this is not normal in the mental health industry.” (It should be noted that Gunther closes his videos by stressing that no one should feel “guilty” for using BetterHelp, and that there are “amazing” therapists on the app; his issue is with the company’s privacy policy, not with the people who use these kinds of apps.)
Where does your information go?
When Jezebel reporters signed up for BetterHelp to track what information the company was collecting and transmitting, they found that – in line with the “purported goal of better tracking user behavior” – sensitive user information was indeed being passed on to advertisers. BetterHelp responded to Jezebel, stating that its methods are standard and that they “generally far exceed all applicable regulatory, ethical and legal requirements.” And they’re not wrong: Jezebel’s reporting shows BetterHelp is within its legal rights to share your therapy habits with Facebook. This means that until the laws change, the problem is the same one we face every time we “read” the terms of service for any app: we share personal information with tech companies that protect themselves in the fine print.
What about HIPAA?
HIPAA is the key federal health data law that, in some circumstances, protects your information. Unfortunately, “tech practices have gone beyond what laws like HIPAA were designed to do, and until the rules evolve, these companies have a duty to serve consumers better,” says Bill Fitzgerald, a privacy researcher at Consumer Reports’ Digital Lab who led the mental health app research.
BetterHelp specifically notes that all of its therapists are HIPAA-compliant. This is notable, because HIPAA does not automatically apply to mental health apps more broadly, as Consumer Reports explains:
The law does not protect data simply because it is related to your health. It applies only to information collected and stored by “covered entities” such as insurance companies, health care providers, and the “business associates” that provide services such as billing to them. Those companies sign business associate agreements that prohibit them from doing anything with the data other than helping the providers run their business, unless they get explicit permission from the patient.
However, a Consumer Reports investigation found that BetterHelp’s HIPAA protections get blurry, especially when the app sends data to social media platforms like Facebook. BetterHelp President Alon Matas told Consumer Reports that Facebook “can use data from other apps at an aggregated level rather than on an individual basis.” The main takeaway here is that companies like Facebook can easily infer that you are using a mental health app and then combine that knowledge with other data points the app collects. Taken together, it’s safe to assume that the ads shown to you are based on your online therapy habits. So while Facebook’s policy is that sensitive data such as your medical symptoms are not used for targeted advertising, mental health apps still provide a number of data points that can be considered fair game.
Making an informed decision
In all fairness, the privacy concerns discussed above are by no means exclusive to BetterHelp or any other mental health app. Consumer Reports notes similarities between mental health services and virtually every downloadable app (in particular, its report highlights the fact that all of these apps assign unique identifiers to individual smartphones, which can be tracked and combined with other data for targeted advertising). The question remains whether it is acceptable for mental health services to operate the same way as all the other apps we have grown used to collecting our data.
Mental health is a deeply personal matter, and in an ideal world we would not need to worry about how our information might be shared or misused by services such as BetterHelp. Alas, until laws and regulations catch up, your decision to use text therapy apps will come down to informed consent. If you’re already comfortable posting your mental health information directly to Facebook, then the privacy risks of mental health apps may not bother you. Remote therapy options are critical tools for many people, and the privacy concerns discussed here don’t necessarily mean you should ditch them entirely.
Find what works for you
Earlier, we talked about how to tell whether teletherapy can work for you, as well as the pros and cons of text therapy (especially for those who want to try therapy but aren’t ready for face-to-face sessions). Finally, here’s our guide to finding the right therapist.