Are mental well-being apps ethical and fair to users?

Anton P. | March 18, 2021

The fight against the stigma surrounding mental health has motivated people to address their inner struggles head-on. Sometimes, the chosen support system goes beyond traditional face-to-face appointments. One coping mechanism making impressive strides worldwide is the mental health app. These tools might help people recognize toxic behavioral patterns or combat depression, anxiety, substance use, and other mental health issues. However, while mental health trackers sound promising on paper, their use can backfire severely. Be it through inaccurate self-diagnosis or over-reliance on numbers, users can obsess over gathered data without receiving proper help. Furthermore, mental health apps operate outside the regulations that govern traditional health devices and tools. Thus, they might lack empirical evidence proving their efficacy, or fail to take proper care of users’ personal data.
What are mental health trackers?

The selection of mental well-being apps is tremendous. Some facilitate self-tracking, meaning that users can write down their symptoms and monitor their severity. Others present specific treatment regimens, such as breathing techniques for people suffering from anxiety or panic attacks. Mood trackers help recognize the trigger points of emotional imbalance.

Thus, these tools have a lot in common with fitness apps, but instead of dealing with physical health, they coach self-help skills for emotional well-being. Mental health trackers are useful for familiarizing yourself with the inner workings of your mind. They can also offer recommendations for defusing highly stressful situations and overwhelming emotions.

During the COVID-19 pandemic, experts emphasized the drastic deterioration of people’s mental health worldwide. People felt depressed, anxious, and in overall low spirits due to the sudden uncertainties and transitions. While the pandemic took a toll on individuals’ emotional states, many had no access to face-to-face therapy sessions. This sparked a huge upswing in users trying out mental health apps. The FDA even relaxed some of its guidelines for psychiatry apps to speed up the delivery of new apps. While the deregulation was meant to help, experts noted that it could aggravate problems already present in the industry.

Main issues with apps allegedly nurturing mental health

Mental health apps can indeed help people overcome the traditional roadblocks they face when seeking therapy. However, while receiving emotional support is typically a positive development in one’s life, it can also be counterproductive. There are two major concerns when it comes to the use of mental health trackers.

  • Whether apps actually help. Specialists note the lack of scientific evidence behind mental health apps. Many providers claim to soothe users’ emotional states or diagnose certain disorders based on the symptoms supplied. However, they typically present no supporting evidence for their accuracy or overall effectiveness. In other words, many apps profess their efficacy without proving their claims, so there are serious doubts about whether they actually work. In the worst-case scenario, some app providers could take advantage of people’s genuine attempts to find help.
  • Whether apps create secure and private environments. Experts have highlighted the potential loss of privacy following the use of mental health apps. After all, the data collected by such applications is precious and confidential: it reveals users’ inner struggles, mental disorders, and other health data. There are two questions to discuss here. The first relates to the protection measures app providers take to create a safe space for healing, such as the apps’ technical setup and resistance to data breaches or leaks. The second question, possibly even more disturbing, is that app owners themselves can share users’ data with third parties. Although mental health trackers collect health data, HIPAA typically does not apply to them, since app providers are not covered entities. Thus, providers have few legal obligations to keep users’ information private.

How can apps for emotional well-being compromise privacy?

Users expect upfront explanations of how companies use their data, and these expectations are even higher for mental health apps. In 2020, Jezebel published an extensive report discussing its findings on the BetterHelp and Talkspace apps. Here are the key takeaways from that article:

  • BetterHelp initially collected information through a survey. It asked for users’ age, gender, sexual orientation, and details related to their mental health. This data, although anonymized, was shared with Mixpanel (an analytics company).
  • BetterHelp shared metadata from online therapy sessions with Facebook. In fact, many companies, including Google, Snapchat, and Pinterest, received information about new users considering BetterHelp. In the case of Talkspace, the company shared metadata about users’ sessions with Mixpanel. Both BetterHelp and Talkspace let Facebook know when users open their apps (see the sketch after this list).
  • BetterHelp claimed to treat its users’ privacy and confidentiality as a priority. It defended itself against the data-sharing accusations by stating that it did not share any personally identifiable information. However, experts were skeptical that such practices were ethical and fair to users. For one, information about the apps’ use reached Facebook, linking people’s mental struggles to their online identities. It is unclear how exactly social media companies can exploit information about users’ mental states, but it likely feeds into ad delivery within these platforms.
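
To make the metadata concern concrete, here is a minimal, hypothetical sketch of what an analytics "app opened" event could look like. The endpoint, field names, and identifiers are illustrative assumptions, not taken from any of the apps or companies mentioned above; the point is that even an "anonymized" event carries a stable device identifier, a timestamp, and the fact that a mental health app was just opened.

```python
import json
import urllib.request

# Hypothetical "app opened" event, roughly the shape of data an analytics
# SDK might transmit to a third party. All names and values are illustrative.
event = {
    "event": "app_opened",
    "properties": {
        "distinct_id": "a1b2c3d4-device-uuid",   # "anonymized", but stable over time
        "app": "mental-health-tracker",          # placeholder app name
        "os": "iOS 14.4",
        "timestamp": "2021-03-18T09:15:00Z",
    },
}

request = urllib.request.Request(
    "https://analytics.example.com/track",       # placeholder endpoint
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request)  # left commented out; this is only an illustration
```

Nothing in such a payload reveals what was said during a session, yet combined with an advertising or device ID, it is enough to tie a person’s identity to their use of a mental health service.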

Such occurrences are not one-off affairs. A 2019 study analyzed 36 popular applications for treating depression and helping people quit smoking. Of those, 29 shared users’ data with Google and Facebook, yet only 12 app providers admitted to these practices in their privacy policies. The other applications engaged in this commercial data-sharing without informing their users. Thus, in addition to violating trust, they also breached transparency requirements.

Sudden changes to privacy policies

Apps might feature a full-disclosure policy with no mention of questionable data-sharing at the time of purchase or installation. However, providers have the power to change the terms users initially agreed to. They might inform their client base of the alterations via their website or an email message, yet offer no option to opt out of the new terms. If the rules governing the app no longer satisfy clients, their only option is to drop it entirely. For people who might have relied on a specific app for months or even years, losing all their data can be devastating. Thus, experts note that stricter regulations should govern sudden policy changes, especially for health apps.

Recommendations for choosing the least intrusive apps

Mental health apps should meet three requirements: utmost respect and care, transparency, and an evidence-based approach. These applications deal with some of the most vulnerable information about us. Thus, you need to be fully aware of how providers will handle your emotional journey.

The number one recommendation is never to skip privacy policies or terms of use documents. Many app owners might not fully disclose their data-sharing practices. For instance, they might mention that some bits and pieces of information will be shared under special circumstances, with no further explanation of where that data ends up. Such a red flag should be enough to make you look elsewhere. If possible, opt out of having your data shared with third-party entities.

Additionally, make sure that the app encrypts communications between you and the therapist. Even then, metadata may be fairly easy to extract: while app owners might not disclose the contents of your conversations, they might still share details surrounding them. If you want to know how a particular app handles your privacy, do not hesitate to contact its providers. Do not be shy to ask hard or uncomfortable questions; if the app has nothing to hide, these concerns won’t be an issue to resolve. Lastly, pay attention to extensive reviews and reports on mental health apps.
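
As a rough illustration of why encryption alone is not enough, the sketch below encrypts a message body but leaves the surrounding envelope readable. It is a hypothetical example using the Python cryptography library; the field names and identifiers are made up, and no real app’s data format is implied.

```python
from datetime import datetime, timezone
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# Encrypt the message body so its contents are unreadable without the key.
key = Fernet.generate_key()
cipher = Fernet(key)
body = cipher.encrypt(b"I have been feeling anxious all week.")

# The envelope around the ciphertext stays in plain view for anyone
# handling the message: servers, backups, or bundled analytics SDKs.
envelope = {
    "sender_id": "user-48213",          # hypothetical identifiers
    "recipient_id": "therapist-907",
    "sent_at": datetime.now(timezone.utc).isoformat(),
    "body_length": len(body),
    "body": body,                       # opaque ciphertext
}

# Everything except "body" is metadata: enough to infer who talks to whom,
# when, and how often, even though the conversation itself stays private.
print({k: v for k, v in envelope.items() if k != "body"})
```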

All these recommendations should help you choose an app that serves you well without compromising your privacy. However, please note that the regulations governing health apps are not extensive, leaving many gaps that providers can exploit. Hopefully, these gaps will receive proper attention in the form of future regulations.

Anton P.

Former chef and head of the Atlas VPN blog team. He's an experienced cybersecurity expert with a background in technical content writing.

Tags:

apps, metadata