How Mental Health and Prayer Apps Fail Spectacularly at Privacy

Your smartphone can instantly connect you to safe spaces via therapists, guided meditation, or scripture. But how safe—and how private—is the data that comes from these interactions?

In partnership with
Mozilla Foundation

Turns out, researching the privacy and security of mental health apps is bad for your mental health. This is the first thing I learned while digging into the growing world of mental health apps in my role as creator and lead of Mozilla’s Privacy Not Included buyer’s guide.

May is Mental Health Awareness Month and it sure seems the world’s mental health could use some help these days. Enter mental health apps, designed to help people connect with therapists online, meditate, track moods and symptoms, pray, and even play games designed to make us happier. Thanks to the current mental health crisis, these apps are a rapidly growing industry with a lot of growing pains. Primary among those growing pains are frightening concerns about privacy and security. We’re watching health privacy issues run smack into our current out-of-control data economy and it’s scary.

For Privacy Not Included, we researched the privacy of 32 popular mental health and prayer apps, including Better Help, Talkspace, Calm, Headspace, Happify, Wysa, and Pray.com. And we learned that too many of these apps can share very personal information with advertisers for interest-based ad targeting, sell your information, and gather even more information about you from places such as data brokers. Nor do they always have strong security practices to protect all the very personal information you share with them. Yikes! As I was researching these products, I came across a number of articles that helped me understand the ongoing problems with mental health apps and privacy. I want to share them here, with hopes they’ll help contextualize our ratings and recommendations.

From our partners

Privacy Not Included: Mental Health Apps

Mozilla Foundation

Jen Caltrider: “Is it wrong of me to say a good place to start learning about the privacy concerns of mental health apps is the buyer’s guide we created at Privacy Not Included? Because I’m rather proud of this important work and would love for you to check it out.”

The Therapy-App Fantasy

Molly Fischer
The Cut

JC: “Want to go deep into the problems surrounding mental health apps and online therapy? Make yourself a cup of tea (or something stronger) and settle in with this great article from The Cut.”

The Spooky, Loosely Regulated World of Online Therapy

Molly Osberg
Jezebel

JC: “Strict health privacy laws like HIPAA cover the contents of the conversations you’re having with your online therapist. But they don’t cover the fact that you’re talking to a therapist. That’s the sort of information Jezebel points out companies like Better Help can share with Facebook and others. Yup, Facebook can know if you’re seeing an online therapist, when, and how often. And it’s all completely legal and normal in our data economy. This article is well researched and eye-opening.”

Mental Wellness Apps Are Basically the Wild West of Therapy

Isobel Whitcomb
Popular Science

JC: “There are somewhere between 10,000 and 20,000 mental health apps available today. Researchers have only been able to look into a handful of them as they explode around the world. I enjoyed reading about what researchers know about them so far (spoiler: it’s not much).”

Your AI Chatbot Therapist Isn’t Sure What It’s Doing

Camille Sojit
Gizmodo

JC: “AI chatbots for mental health are a thing. And why not? For some, myself included, chatting with a bot is easier than chatting with a real, live human being. But whoa nelly! They raise a host of questions in the process. I love how this Gizmodo piece lays it all out.”

Researchers Spotlight the Lie of ‘Anonymous’ Data

Natasha Lomas
TechCrunch

JC: “Privacy policies (and I read a lot of them) love to crow about how they can anonymize your data and then do just about anything they want with it because it’s no longer linked to you. NOT SO FAST! Researchers found it’s actually pretty easy to re-identify de-identified data. Sorry for adding to your nightmares.”
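To see why “anonymized” data is weaker than it sounds, here is a minimal, hypothetical sketch of a classic linkage attack: a health dataset with names stripped is matched against a public record set (think voter rolls) on quasi-identifiers like ZIP code, birth date, and gender. All names and records below are invented for illustration; this is a toy demonstration of the re-identification technique the TechCrunch piece describes, not any company’s actual data.

```python
# Hypothetical linkage attack: names were removed from the health data,
# but the combination (ZIP, date of birth, gender) still narrows a person
# down. All data below is fabricated for illustration.

anonymized_health = [
    {"zip": "30301", "dob": "1985-04-12", "gender": "F", "note": "therapy, weekly"},
    {"zip": "30302", "dob": "1990-07-01", "gender": "M", "note": "meditation app"},
]

public_records = [  # e.g., a voter roll, where names ARE present
    {"name": "Alice Example", "zip": "30301", "dob": "1985-04-12", "gender": "F"},
    {"name": "Bob Example",   "zip": "30305", "dob": "1962-11-30", "gender": "M"},
]

def reidentify(health_rows, public_rows):
    """Match rows on the quasi-identifier triple (zip, dob, gender)."""
    index = {(r["zip"], r["dob"], r["gender"]): r["name"] for r in public_rows}
    matches = []
    for row in health_rows:
        key = (row["zip"], row["dob"], row["gender"])
        if key in index:
            # A unique match links a name back to an "anonymous" record.
            matches.append((index[key], row["note"]))
    return matches

print(reidentify(anonymized_health, public_records))
```

With only three innocuous-looking fields, the first “anonymous” record is linked straight back to a named person. This is why researchers keep finding that de-identified datasets can be re-identified once they are joined with other data floating around the data economy.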

Jen Caltrider

During a rather unplanned stint working on her Master's degree in Artificial Intelligence, Jen quickly discovered she’s much better at telling stories than writing code. This epiphany led to an interesting career as a journalist covering technology for CNN. Continuing down her random life path, Jen moved from CNN to digital media activism, where she helped pioneer the creative use of digital storytelling to try to leave the world a little better than she found it. That eventually brought her to Mozilla, where she created and now leads Privacy Not Included, a guide to help consumers shop smart for consumer tech that won’t invade their privacy.

Jen spends her days as a consumer privacy advocate helping people better understand issues around privacy, security, and artificial intelligence in their technology. Just exactly what she thought she’d be growing up in rural West Virginia. (Not really. Life is random… and wonderful.)