An investigation by Mozilla researchers into mental health and prayer applications found that many of these apps treat their users' security and privacy as a low priority, or disregard it entirely.
Mozilla recently published the results of a new analysis of these apps, which frequently deal with sensitive topics such as depression, mental health awareness, anxiety, domestic abuse, and post-traumatic stress disorder (PTSD), as well as religion-themed services.
According to Mozilla’s latest *Privacy Not Included guide, despite the sensitive data these apps handle, they routinely share user information, allow weak passwords, target vulnerable users with personalized ads, and feature vague, poorly written privacy policies.
Mozilla found that 25 of the 32 mental wellness and prayer applications included in the study failed to meet its Minimum Security Standards.
Mozilla has created a set of Minimum Security Standards that any manufacturer developing connected products must adhere to, and each product is evaluated against five criteria:
- Encryption
- Security updates
- Strong passwords
- Vulnerability management
- Privacy Policy
Mozilla applies its *Privacy Not Included warning label to products it has determined to have the most problems protecting users’ privacy and security. A product receives the warning label if it earns two or more warnings from Mozilla on the following criteria:
- How the organization uses the data it gathers on customers;
- How users can manage their information;
- The company’s track record on data security.
When it comes to protecting people’s privacy and security, mental health and prayer apps are worse than any other product category Mozilla researchers have reviewed over the past six years.
Talkspace, Better Help, Calm, Glorify, 7 Cups, Wysa, Headspace, and Better Stop Suicide were among the applications the organization investigated, and each app now has its own entry in the guide where users can review its privacy and security ratings.
Better Stop Suicide is one of many suicide prevention apps available. The app is free and includes tools like feel-better tasks, emotional needs evaluations, and the ability to record your own life-saving message. The app, unfortunately, failed Mozilla’s test.
As Mozilla’s reviewers put it: “Holy vague and messy privacy policy, Batman! Better Stop Suicide’s privacy policy is bad. Like, get a failing grade from your high school English teacher bad.”
PTSD Coach and the AI chatbot Wysa were the only apps on the list that seemed to prioritize data management and user privacy.
Jen Caltrider, Mozilla’s *Privacy Not Included lead, said:
“The vast majority of mental health and prayer apps are exceptionally creepy. They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data. Turns out, researching mental health apps is not good for your mental health, as it reveals how negligent and craven these companies can be with our most intimate personal information.”