Therapy apps are still failing their privacy checkups

Some of the 27 mental health apps tested by Mozilla last year have improved. Most, however, have not. | Image: Mozilla / *Privacy Not Included.

An investigation into mental health apps has revealed that many of the most popular services are failing to protect the privacy and security of their users. Following up on a report from last year’s Privacy Not Included guide, researchers at Mozilla found that apps designed for sensitive issues like therapy and mental health conditions are still collecting large amounts of personal data under questionable or deceptive privacy policies.

The team re-reviewed 27 of the mental health, meditation, and prayer apps featured in the previous year’s study, including Calm, Youper, and Headspace, in addition to five new apps requested by the public. Of those 32 total apps, 22 were slapped with a “privacy not included” warning label, something Mozilla assigns to products that have the most privacy and personal data concerns. That’s a minor improvement on the 23 that earned the label last year, though Mozilla said that around 17 of the 27 apps it was revisiting still scored just as poorly for privacy and security this time around, if not worse.

Replika: My AI Friend, a “virtual friendship” chatbot, was one of the new apps analyzed in the study this year and received the most scrutiny. Mozilla researchers referred to it as “perhaps the worst app we’ve ever reviewed,” highlighting widespread privacy issues and that it had failed to meet the foundation’s minimum security standards. Regulators in Italy effectively banned the chatbot earlier this year over similar concerns, claiming that the app violated European data privacy regulations and failed to safeguard children.

BetterHelp was also highlighted for improperly sharing its customers’ sensitive data with advertisers like Facebook and Snapchat after it had promised to keep such information private. In March, the online counseling company agreed to pay the Federal Trade Commission $7.8 million to settle charges against it for such behavior. Other mental health apps listed as having terrible privacy and security practices include Pride Counseling (owned by BetterHelp), Talkspace, Headspace, and Shine. Mozilla also noted that Better Stop Suicide, Liberate, and RAINN are no longer supported, and therefore unlikely to be receiving any critical security updates to protect users.

Meanwhile, some of the apps featured on last year’s list did see some improvements. Youper is highlighted as the most improved of the bunch, having overhauled its data collection practices and updated its password policy requirements to push for stronger, more secure passwords. Moodfit, Calm, Modern Health, and Woebot also made notable improvements by clarifying their privacy policies, while researchers praised Wysa and PTSD Coach for being “head and shoulders above the other apps in terms of privacy and security.”

Mozilla says that the results of this latest study don’t necessarily mean you have to stop using an app that scored poorly. The team has left custom tips on each of the apps reviewed in the report to provide guidance on how to preserve your privacy when using them.

Many of the issues outlined in Mozilla’s report play into wider concerns about the privacy of mental health apps. The increased demand for these services during the COVID-19 pandemic prompted lawmakers like Sen. Elizabeth Warren to investigate the relationships between therapy apps and online advertisers last year, believing that they could be unjustly profiting off of customers’ sensitive data. Mozilla claims that the market for mental health apps has grown by around $1 billion since 2022 alone.
