Mental Health Apps are Likely Collecting and Sharing Your Data

/ data privacy, metadata

May is Mental Health Awareness Month! In pursuing help or advice for mental health struggles (beyond just this month, of course), users may download and use mental health apps. Mental health apps are convenient and may be cost-effective for many people.

However, while these apps may provide mental health resources and benefits, they may be harvesting considerable amounts of information and sharing health-related data with third parties for advertising and tracking purposes.

Disclaimer: This post is not meant to serve as legal or medical advice. This post is for informational purposes only. If you are experiencing an emergency, please contact emergency services in your jurisdiction.

Understanding HIPAA

Many people have misconceptions about the Health Insurance Portability and Accountability Act (HIPAA) and disclosure/privacy.


According to the US Department of Health and Human Services (HHS), HIPAA is a "federal law that required the creation of national standards to protect sensitive patient health information from being disclosed without the patient's consent or knowledge." There is a HIPAA Privacy Rule and a HIPAA Security Rule.

The Centers for Disease Control and Prevention (CDC) states the Privacy Rule standards "address the use and disclosure of individuals' health information by entities subject to the Privacy Rule." It's important to understand that the Privacy Rule covers entities subject to it.

Covered entities include healthcare providers, health plans, and healthcare clearinghouses, along with their business associates (such as billing specialists or data analysts). Many mental health apps aren't classified as any of these; and though a few are subject to HIPAA, some have been documented as not actually being compliant with HIPAA rules.


What does this mean? Many mental health apps are not considered covered entities and are therefore "exempt" (for lack of a better word) from HIPAA. As such, these apps appear to operate in a legal "gray area" - but that doesn't mean their data practices are ethical, or that they even follow basic information security principles for safeguarding data.

Even apps collecting PHI protected by HIPAA may still share or use information about you that doesn't fall under HIPAA protections.

Mental health apps collect a wealth of personal information

Naturally, the data collected by apps falling under the "mental health" umbrella varies widely (as do the apps themselves).

However, most have users create accounts and fill out some version of an "intake" questionnaire prior to using/enrolling in services. These questionnaires vary by service, but may collect information such as:

  • name
  • address
  • email
  • phone number
  • employer information

Account creation generally requires, at minimum, an email address and a password, which is indeed routine.


It's important to note that your email address can serve as a particularly strong unique identifier. If you use the same email address everywhere in your digital life, it becomes much easier to track and connect your accounts and activities across the web.
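To illustrate why, here's a minimal sketch of the hashed-email matching technique ad platforms commonly use for "customer list" uploads (the addresses below are hypothetical):

```python
import hashlib

def email_match_key(email: str) -> str:
    # Ad platforms commonly match uploaded customer lists by hashing a
    # normalized (trimmed, lowercased) email; the same address produces
    # the same key on every platform it's uploaded to.
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# The same person registering on two unrelated services with the same email
# ends up with an identical match key on both:
print(email_match_key("Jane.Doe@example.com") == email_match_key(" jane.doe@example.com "))  # True
```

Because the hash is deterministic, two services that each upload their user lists let the platform link the same person across both - no plaintext email ever needs to change hands.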

Account creation may also request alternative contact information, such as a phone number, or supplemental personal information such as your legal name. These can and often do serve as additional data points and identifiers.

It's also important to note that on the backend (usually in a database), your account may be assigned identifiers as well. In some cases, your account may also be assigned external identifiers - especially if information is shared with third parties.

Intake questionnaires can collect particularly sensitive information, such as (but not necessarily limited to):

  • past mental health experiences
  • age (potentially exact date of birth)
  • gender identity information
  • sexual orientation information
  • other demographic information
  • health insurance information (if relevant)
  • relationship status

betterhelp intake questionnaire asking if user takes medication currently

Question from BetterHelp intake questionnaire found in FTC complaint against BetterHelp

These points of sensitive information are rather intimate and can easily be used to uniquely identify users - and their disclosure in a data breach or to third-party platforms could be disastrous.

These unique and rather intimate data points can be used to exploit users in highly targeted marketing and advertising campaigns - or even to facilitate scams and malware via the advertising tools that the third parties receiving such information offer to advertisers.

Note: If providing health insurance information, many services require an image of the card. Images can contain EXIF data that could expose a user's location and device information if not scrubbed prior to upload.
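As a rough illustration of where that metadata lives, here's a minimal sketch (not a full JPEG parser) showing how a scrubbing tool might check whether a JPEG even contains an EXIF segment:

```python
def has_exif(jpeg_bytes: bytes) -> bool:
    """Check whether a JPEG contains an EXIF (APP1) metadata segment."""
    # JPEG files start with the SOI marker (FF D8); metadata lives in marker
    # segments that follow. EXIF sits in an APP1 (FF E1) segment whose
    # payload begins with b"Exif\x00\x00".
    i = 2  # skip the SOI marker
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            break  # not a valid marker; stop scanning
        marker = jpeg_bytes[i + 1]
        if marker in (0xD9, 0xDA):  # end-of-image or start-of-scan
            break
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        i += 2 + length  # marker (2 bytes) + segment length
    return False
```

Scrubbing tools typically work on the same principle in reverse: they re-encode or strip these metadata segments so the uploaded file carries only pixel data.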

Information collection extends past user disclosure


More often than not, the information collected by mental health apps extends past what a user discloses in processes such as account creation or intake forms - these apps often harvest device information, frequently sending it off the device to their own servers.

For example, here is a screenshot of the BetterHelp app's listing on the Apple App Store in MAY 2024:

betterhelp app privacy in the apple app store

The screenshot indicates BetterHelp uses your location and app usage data to "track you across apps and websites owned by other companies." We can infer from this statement that BetterHelp shares your location information and how you use the app with third parties, likely for targeted advertising and tracking purposes.

The screenshot also indicates your contact information, location information, usage data, and other identifiers are linked to your identity.

Note: Apple Privacy Labels in the App Store are self-reported by the developers of the app.

This is all reinforced in their updated privacy policy (25 APR 2024), where BetterHelp indicates they use external and internal identifiers, collect app and platform errors, and collect usage data of the app and platform:

betterhelp privacy policy excerpt

In February 2020, an investigation revealed BetterHelp also harvested the metadata of messages exchanged between clients and therapists, sharing it with platforms like Facebook for advertising purposes. This was despite BetterHelp "encrypting communications between client and therapist" - they may have encrypted the actual message contents, but it appears information such as when a message was sent, who sent and received it, and location information was available to their servers... and actively used/shared.

While this may not seem like a big deal at first glance - primarily because BetterHelp is not directly accessing/reading message contents - users should be aware that message metadata can give away a lot of information.
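To make this concrete, consider a few hypothetical metadata records (the names, IDs, and timestamps are invented for illustration) - no message content is stored, yet the pattern alone is revealing:

```python
from datetime import datetime

# Hypothetical metadata records a messaging backend might retain:
# sender, recipient, and timestamp only - no message bodies.
events = [
    {"sender": "user_123", "recipient": "therapist_9", "sent_at": "2024-05-01T23:14:03"},
    {"sender": "user_123", "recipient": "therapist_9", "sent_at": "2024-05-02T22:58:41"},
    {"sender": "user_123", "recipient": "therapist_9", "sent_at": "2024-05-03T23:31:19"},
]

hours = sorted({datetime.fromisoformat(e["sent_at"]).hour for e in events})
# Without reading a single message, an observer learns this user is talking
# to a therapist, how often, and that it happens late at night.
print(hours)  # [22, 23]
```

That inference - "this person messages a therapist nightly" - is exactly the kind of signal an advertising platform can fold into a user profile.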

Cerebral, a mental health app that does fall under the HIPAA rules, also collects device-related data and location data, associating them with your identity:

cerebral app privacy in the apple app store

According to this screenshot, Cerebral shares/uses app usage data with third parties, likely for marketing and advertising purposes. Specifically, they associate things like location data, user content, usage data, sensitive information, and diagnostic data with your identity. They also collect data about the device(s) you use to access the service - routine in some cases, but potentially rather invasive, depending on what is collected and what it is associated with.

Cerebral also collects information from third-party sources, such as social media sites and public sources (data brokers and people search sites), and aggregates the collected information. So, Cerebral may collect data from your social media profiles or use data sourced from data brokers to "serve you better" - whatever that means.

Even in cases where the app indicates on its App Store page not many pieces of data are collected... they may still harvest a considerable amount of information. Remember, the labels on the App Store are self-reported by developers.

Talkspace is such a case. In the Apple App Store, its listing shows they "only" collect and link purchase data and identifiers to your identity:

talkspace app privacy in the apple app store

However, their privacy policy states they collect device data, such as the DeviceID (an identifier an app can retrieve from the device to uniquely identify it when interacting with the app's servers).

They also collect the referrer URL, your internet service provider (ISP) (likely derived from your IP address), browser type, and date/time stamps of resources accessed - via rather invasive tracking technologies such as tracking pixels (like the Meta Pixel) and Google Analytics:

talkspace privacy policy disclosing their collection of device
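For a sense of how a tracking pixel works, here's an illustrative sketch - the endpoint and parameter names below are hypothetical, but the mechanism is the same: a tiny image or script request whose query string carries identifiers and event details to the third party's server.

```python
from urllib.parse import urlencode

# Hypothetical pixel parameters - real pixels use their own naming schemes,
# but carry the same kinds of data.
params = {
    "id": "PIXEL-1234",                        # which advertiser's pixel fired
    "ev": "PageView",                          # the event being reported
    "dl": "https://app.example/intake/step3",  # the page the user was on
    "uid": "4f9c2e",                           # a user/browser identifier
}
pixel_url = "https://tracker.example/tr?" + urlencode(params)
print(pixel_url)
```

Note that the page URL itself rides along in the request - so if the page is an intake questionnaire step, the third party learns that too, without any "health data" being explicitly sent.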

They also collect telemetry and analytics data, such as app metrics or usage statistics. This data is likely connected to your identity, given the other information Talkspace collects. In other words, they know a lot of information about you, your health, your device, and your use of the service:

talkspace privacy policy disclosing their collection of telemetry and device data

This is all in addition to data a user may disclose, such as legal name, email address, address, employer information, and payment information.

All combined, this is quite an extensive well of information - without ever adding PHI into the mix. In fact, in 2022, Talkspace received letters from three US senators inquiring about its information collection practices (similar to BetterHelp). This suggests the data collection could be just as excessive, and naturally raises questions about the security practices around such data as well.

Mental health apps may share your information with third parties

Many of the "mental health" and "therapy" apps share your data - which may include PHI in some cases - with third parties, despite their claims not to sell your information (well, technically, they aren't selling it...)


BetterHelp is a prime example of a mental health app/platform taking advantage of its users, engaging in deceptive practices itself and sharing sensitive user information with third parties. In March 2023, BetterHelp was slapped with a $7.8 million fine and banned by the US Federal Trade Commission (FTC) from sharing user health data.

BetterHelp's deceptive practices were outlined by the FTC. These included promising not to share, sell, or disclose user email addresses to anyone; displaying a "HIPAA certified" logo despite never having its information practices reviewed by any government agency or third party; and claiming to use user information only for "non-advertising purposes."

The FTC also called out BetterHelp's information sharing with third parties, saying that BetterHelp should have advertised using language such as "Rest assured – we plan to share your information with major advertising platforms, including Facebook, Snapchat, Criteo, and Pinterest."

This is because they were doing that, despite promising users they wouldn't. In the official FTC complaint, numerous instances were cited where BetterHelp disclosed personal information of its users to various third parties:

  • In 2017, BetterHelp uploaded the email addresses of all (almost 2 million) clients to Facebook to target them with ads.
  • BetterHelp disclosed users' IP addresses and email addresses to Snapchat to target them with ads.
  • For a six-month period, BetterHelp disclosed the email addresses of over 70,000 visitors to Criteo.

As mentioned previously, BetterHelp also harvested the metadata of messages exchanged between client and therapist; this metadata included when a message was sent, location information, and who sent and received it. BetterHelp also shared this metadata with third parties, such as Facebook (this was before the company rebranded as Meta).


The metadata shared allowed Facebook insight into the time of day users were using the BetterHelp app, for how long the app was used, and from where in the real world (location data) the app was being accessed. This information became connected to user Facebook profiles.

In April 2024, Cerebral was fined $7 million for disclosing clients' health information to third parties - such as Google, TikTok, and Meta (Facebook and Instagram) - for advertising purposes. Cerebral's apps and websites contained embedded third-party trackers, which allowed third parties access to data such as:

  • contact information
  • medical histories
  • insurance information
  • medicine prescriptions

This data was used in numerous advertisement campaigns, likely including targeted and retargeted ads, profile building, and overall user tracking across other platforms, websites, and apps.

Following this incident, the FTC banned (pending a court-approved order) Cerebral from sharing health-related data.

In this case, similar to BetterHelp, Cerebral had also assured clients that their data was "safe" and not shared with third parties (outside lawful disclosure).


This information sharing happens outside of these couple of specific cases, too; just because there isn't a big "breach" or a fine issued by a regulatory body doesn't mean other mental health apps aren't sharing information about their users with third parties. Given the above, they likely are.

For example, Talkspace's privacy policy indicates they do use user information for advertising purposes, though once a user becomes a client, they apparently take stronger measures to protect that user's information. Even then, it appears they may still share some data with clinical and academic "approved research partners."

talkspace privacy policy where they say they use user data of and use of the services to share with researchers

MindDoc, a mental health app developed by psychologists with over 3 million users, may also share certain user information with third parties.

Even though MindDoc is based in Germany, which falls under GDPR privacy laws, and says in its privacy policy that all users - regardless of jurisdiction - enjoy the protections GDPR provides... the privacy policy still indicates they share user information with third parties. According to this screenshot of MindDoc's privacy policy, it appears this is due to the website's/platform's use of third-party cookies from Google and Google Analytics:

minddoc privacy policy stating use of google analytics and cookies for marketing purposes

Headspace falls under the "mental health app" umbrella, but is ultimately a wellness app with a goal to "improve the health and happiness of the world" - and it collects and shares information with third parties such as Google. While Headspace doesn't provide counseling or therapy sessions on its platform - and doesn't request that sort of information from users - it does collect personal data, such as:

  • name
  • email address
  • phone number

When combined, these data points can identify users enough to track them across different platforms.

Headspace also collects app usage data and users' Facebook IDs and explicitly shares that information with Google and Facebook (Meta). Naturally, if you are using Headspace regularly or at all, advertisers will know - and you could be targeted when you are "vulnerable" or "more profitable" however you may define that.

Can users protect their privacy while using mental health apps?

The scales are tipped against users in this context, due to the intimate nature of disclosing health information (which may be required to receive effective treatment or help). There are still some actions users can take to minimize the effect on their privacy while using these apps, though they should do so understanding the limits of such actions.

Use a different (or masked) email address

Users may want to consider registering for mental health apps with a masked email or email alias. Masked/alias emails can forward emails to your actual email address without disclosing your real email address.

Common masked/alias email services are SimpleLogin and addy.io - the software driving these services is also open source, giving self-hosters the option to run their own instance.
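Conceptually, aliasing works something like this minimal sketch (the addresses are hypothetical): the alias provider holds the only mapping from alias to real address, so the service you register with never sees your real email.

```python
# Hypothetical alias table held only by the alias provider.
alias_table = {
    "quiet.heron42@aliasprovider.example": "me@realmail.example",
}

def forward(alias: str, message: str):
    """Deliver a message sent to an alias to the real mailbox behind it."""
    real_address = alias_table.get(alias)
    if real_address is None:
        return None  # unknown alias: nothing to deliver
    return {"deliver_to": real_address, "body": message}

print(forward("quiet.heron42@aliasprovider.example", "Your appointment is confirmed"))
```

A nice side effect: since each service can get its own alias, a breach or spam from one service only exposes that one alias, which you can then disable without touching your real mailbox.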

Some encrypted/secure email providers may offer email aliasing as well.

Disable mobile advertising IDs

Users can opt out of some level of tracking by disabling the mobile advertising IDs on their smartphones. This reduces some of the information companies can pull from your phone for advertising purposes.

Steps for iPhone users.

Steps for Android users.

Read disclosures/authorizations carefully

Some mental health apps (particularly those that offer therapy/counseling services) may request your consent to disclose notes/metadata from your online therapy sessions. This is different from simply accepting a privacy policy or terms of use.

Be on the lookout for forms or notices with the term "Authorization" in their titles - often, these forms request your "authorization" to share certain aspects of your online therapy sessions... mostly for marketing purposes.

If you have already (or accidentally) given consent to such disclosure, you can generally revoke it. For example, for the mental health app Talkspace, users have to send their revocation to a designated email address.

Final thoughts

Mental health apps allow users to access mental health help and resources quickly and conveniently. However, users may find that data shared in confidence with mental health apps may be "used against them" for marketing, advertising, and other purposes.

With that said, stay safe out there! Remember, #mentalhealthmatters and #privacymatters.

Disclaimer: This post is not meant to serve as legal or medical advice. This post is for informational purposes only. If you are experiencing an emergency, please contact emergency services in your jurisdiction.
