According to a new study by Mozilla researchers, mental health applications offer weaker privacy protections for users than most other types of apps, and prayer apps fare poorly as well. The Firefox browser maker examined the privacy and security practices of several mental health and prayer apps in its latest Privacy Not Included report, revealing some concerning issues. Some of the apps under investigation, such as Headspace, Pray.com, Calm, and Talkspace, have millions of users and downloads, placing many consumers at risk.

According to Jen Caltrider, the Mozilla Privacy Not Included guide lead, “the vast majority of mental health and prayer apps are very intrusive.” Users’ most personal information, including moods, mental state, and biometric data, is tracked, shared, and profited from.

The team looked at 32 mental health and prayer applications in the most recent version of the guide; 29 of them received a “privacy not included” warning label, indicating that the team had reservations about how the app manages user data. According to the researchers, the apps are designed for sensitive concerns such as mental health conditions, yet they capture vast quantities of personal data under vague privacy policies. Most apps also had inadequate security standards, allowing users to register accounts with weak passwords despite holding highly personal data.
According to Mozilla, the apps with the worst practices are BetterHelp, Youper, Woebot, Better Stop Suicide, Pray.com, and Talkspace. For example, the AI chatbot Woebot collects information about users from third parties and shares it for advertising purposes, while Talkspace, a therapy company, collects user chat transcripts. Extensive data collection was one of the most common issues, with several apps gathering large quantities of personal information about users. Although several of the apps analyzed said they would not sell or share personal information with other parties, Mozilla found that many were doing exactly the opposite, raising serious privacy concerns. The Mozilla team said it contacted the companies behind these apps many times to ask about their policies, but only three responded.

Talkspace said in a statement to TechRadar Pro that Mozilla’s article “lacks context from Talkspace and contains fundamental mistakes,” which the company says it is working to correct. The company said it has one of the most stringent privacy policies in the industry, and that claiming it collects user data or chat transcripts for purposes other than treatment delivery is false. Its privacy policy was recently changed to make it more visible to consumers and to clarify data-sharing practices, and Talkspace added that its Notice of Privacy Practices, published alongside the new privacy policy, gives more information on how data is used.
Calm, one of the most popular apps with millions of users on iOS and Android, not only collects large amounts of personal information, but Mozilla found that it also gathers data from outside sources and uses multiple tracking and data-collection tools to target ads and share information with a variety of third parties.

Traditional in-person mental health care can be difficult to come by for many individuals: most therapists have long waiting lists, and navigating insurance and costs can be a major roadblock to treatment. The problem worsened during the COVID-19 pandemic as more people needed care. Mental health apps aimed to fill that gap by making services more affordable and accessible. According to the research, however, this access may come at a cost in terms of privacy. In a statement, Mozilla researcher Misha Rykov described the apps as “data-sucking devices with a mental health app veneer.” In other words, a wolf in sheep’s clothing.