Privacy cracks in mental health apps are a major concern for consumers

The data: 28 of the top 32 mental health apps fall short on consumer privacy, according to new research by global nonprofit Mozilla.

  • BetterHelp, Woebot, and Talkspace have the worst privacy and security measures.
  • BetterHelp’s privacy policies are vague, Woebot reportedly shares personal information with third parties, and Talkspace collects chat transcripts, per Mozilla.
  • The chatbot Wysa and the PTSD Coach app appear to have tighter security measures than their competitors.

What’s driving the data: There isn’t a ton of regulatory oversight for mental health apps. That means telemental health companies don’t have to be completely transparent about their privacy policies.

For example, the FDA says it uses “enforcement discretion” for a lot of healthcare apps. This includes mental health apps that are considered “low risk” to patients.

  • That means the FDA doesn’t check or regulate every mental health app on the market.
  • This is likely because the agency can’t keep up with the sheer number of mental health apps available to US consumers: there are as many as 20,000 for smartphones.

Why this matters: Many consumers are concerned about their data privacy on healthcare apps.

  • 32% of adults are “very concerned” or “somewhat concerned” about the privacy of their healthcare information on healthcare apps, according to a September 2021 Morning Consult poll of over 2,000 US adults.
  • That doesn’t mean people are completely avoiding healthcare apps, though. Some consumers are more willing to use apps to track exercise, sleep, and weight.
  • About 25% of consumers say they currently use an app to track their sleep, and 34% of adults indicate they would, per Morning Consult.

The big takeaway: Mental health platforms will need to be ultra-transparent about patient privacy to maintain long-term consumer adoption.

Over 48% of US individuals said they’d be unlikely to use virtual care again if their health data were compromised in a security breach, per CynergisTek.

  • Some mental health platforms have already recognized the importance of data practice transparency.
  • Digital mental health company Lyra breaks down the range of ways it collects patients’ information on its website, for instance.
  • Lyra’s emphasis on privacy likely played some role in its rapid expansion. Its corporate client list includes Uber, eBay, Morgan Stanley, and Zoom.

"Behind the Numbers" Podcast