
How your mental health data and information is sold to advertisers


More of our personal information is online than ever before. While there have been great steps forward in giving people more control over their personal data, and in restricting what organisations can do with it, many practices remain that people find concerning. One example is the sharing of our mental health data, for many of us the most personal information of all, when we visit certain websites.

What is mental health data?

We use the term ‘mental health data’ when describing personal data that concerns your mental health. By personal data, we mean any information that can be used to identify you. An example of mental health data would be your name and email address being linked to searches about depression.

What use is mental health data to advertisers?

Mental health data is valuable to advertisers because understanding exactly when a person is in a vulnerable state of mind means advertisers can target them more strategically. A person who is feeling low may invest more in products that alter their appearance, for example.

How your mental health data can inform targeted advertisements

If you were to visit a number of websites dedicated to mental health in order to read about depression or take a depression test, third parties may receive this information, including your answers, and bid money to show you a targeted ad.

Platforms like Google, Facebook, and Amazon allow advertisers to target people who exhibit online behaviour that matches certain demographics.

As Hamish McRae in the Independent put it, 'instead of being a Facebook customer, people have become a Facebook product', sold to whatever advertiser pays to target you because of your online behaviour.

N.B. On average, the mental health web pages analysed placed 44.49 cookies on a visitor in France, 7.82 in Germany and 12.24 in the UK.

Why the mental health data market is so valuable

The health data now being collected and shared, and the astounding volume of it, is extremely valuable. This is partly because of its deeply personal nature: the data you share with a menstruation app isn't information you'd be likely to share with others.

If an advertiser understands exactly what you feel and when, they can target you much more accurately. For example, the data of expecting mothers, who could be using menstrual apps to track fertility, mood and symptoms, is particularly valuable to advertisers; expecting parents are consumers who are likely to change their purchasing habits. In the US, an average person’s data is worth $0.10, while a pregnant woman’s is worth $1.50.

The astounding value of the health data market can perhaps best be seen in Washington’s statement that ‘as part of any post-Brexit trade agreement’ it wants unrestricted access to Britain's 55 million health records, ‘estimated to have a total value of £10bn a year’.

How this market exists despite data rules

You might find this surprising given the GDPR’s legal protections for ‘sensitive’ health data. But because of the vast number of data points that can be collected on a person, and the constantly evolving ways in which companies collect and use them, the laws surrounding data protection are not always clear-cut, and they leave room for some companies to act in, well, pretty questionable ways.

One loophole is that although it’s illegal to collect and share personally identifiable health data for advertising or other non-essential purposes, so the data must be anonymised before it is shared, it has proven fairly easy for companies to de-anonymise it. Phil Booth, coordinator of medConfidential, says that ‘removing or obscuring a few obvious identifiers, like someone’s name or NHS number from the data, doesn’t make their medical history anonymous’; an individual can easily be identified by their ‘unique combination of medical events’.
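Phil Booth's point is easy to demonstrate with a toy example. The records, field names and events below are entirely hypothetical, but they show the principle: strip the names and two people can still look identical on demographics alone, yet every one of them is singled out by their combination of medical events.

```python
from collections import Counter

# Entirely hypothetical "anonymised" records: names and NHS numbers
# removed, but birth year, postcode area and medical events kept.
records = [
    {"birth_year": 1984, "postcode_area": "SW1", "events": ("flu-2018", "fracture-2019")},
    {"birth_year": 1984, "postcode_area": "SW1", "events": ("flu-2018", "asthma-2017")},
    {"birth_year": 1991, "postcode_area": "M1",  "events": ("fracture-2019",)},
]

# Grouping only by demographics, the two SW1 records are indistinguishable...
coarse = Counter((r["birth_year"], r["postcode_area"]) for r in records)

# ...but once the medical event history is included, every record becomes
# a unique combination, so anyone who knows those events about a person
# can pick their record out of the "anonymous" dataset.
full = Counter(
    (r["birth_year"], r["postcode_area"], r["events"]) for r in records
)
unique = [combo for combo, n in full.items() if n == 1]
print(f"{len(unique)} of {len(records)} records uniquely identifiable")
```

Here all three records are uniquely identifiable once event history is counted, even though only one is on demographics alone; real re-identification attacks work the same way, just with far more data points.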

Illustrating this, an article in the Guardian by Toby Helm revealed that Health Secretary Matt Hancock had been selling the medical data of millions of NHS patients ‘to America and other international drug companies’, having ‘misled the public into believing the information would be “anonymous”’. The article revealed that in 2018 the government raised £10m by ‘granting licenses to commercial and academic organisations across the world’ to access 'anonymous' data; if patients didn't want their data to be used for research, they had to actively ‘opt out’ of the system at their GP surgery.

Another loophole is that although it's illegal to collect and share health data for advertising purposes, if a social media platform catches you in a ‘sad mood’ through your online behaviour (perhaps you are liking tweets or Facebook posts about feeling low), this is not necessarily classed as ‘health data’. Just because you Google ‘what is depression’ or ‘am I depressed’ doesn't necessarily mean that you're depressed; you could just be curious about depression. So legally, a company could collect and sell the data of you visiting these websites, target you with ads for feel-good vitamins or make-up, and use the defence that what it's trading is not health data.

Which mental health sites are sharing user data?

In September 2019, Privacy International (PI) published a report exposing how the majority of the top websites related to mental health share your data for advertising purposes.

Of the 136 popular web pages about depression that PI analysed in the UK, France and Germany (41 listed on Google France, 44 on Google Germany and 51 on Google UK), 74.48% contained third-party trackers for marketing purposes. In the UK, 86.27% of all pages analysed did. These trackers enable ‘programmatic advertising’: the trading of your personal data to companies who eventually serve you targeted ads.
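PI's tracker counts rest on a simple idea: load a page, record every resource it requests, and count the requests that go to domains other than the site's own. The sketch below shows only that counting step, with made-up URLs for illustration; it is not PI's actual methodology, which uses an instrumented browser.

```python
from urllib.parse import urlparse

def third_party_domains(page_url, resource_urls):
    """Return the set of contacted hostnames that differ from the page's own."""
    site = urlparse(page_url).hostname
    return {
        urlparse(u).hostname
        for u in resource_urls
        if urlparse(u).hostname and urlparse(u).hostname != site
    }

# Hypothetical resources requested when loading a mental health page.
resources = [
    "https://example-mentalhealth.org/styles.css",      # first party
    "https://ads.example-exchange.com/bid?uid=123",     # third party: ad exchange
    "https://tracker.example-analytics.net/pixel.gif",  # third party: tracking pixel
]
domains = third_party_domains(
    "https://example-mentalhealth.org/depression-test", resources
)
print(len(domains), "third-party domains contacted")
```

A real measurement would compare registrable domains rather than raw hostnames (so a site's own CDN subdomain counts as first party) and would capture requests triggered dynamically by scripts; this sketch only classifies a ready-made list of URLs.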

A number of UK mental health websites also contain Google ads.

Google’s statement about this, on its Google Safety page, reads:

'We use data to serve you relevant ads in Google products, on partner websites, and in mobile apps. While these ads help fund our services and make them free for everyone, your personal information is not for sale'.

Your personal data is ‘not for sale’, but it's used by advertisers and Google to effectively and strategically target you for monetary benefit.

See PI’s full report for more detail on the websites analysed and the methodology used.

Is mental health data also collected from apps?

Unfortunately, yes. The sharing of your mental health data is not limited to the websites you visit; it extends to the apps you use.

Research by Privacy International exposed how some of the top menstruation apps, such as MIA and My Period Tracker, trade details of your mood, your physical symptoms, and your sex life for advertising purposes.

These apps all inform Facebook when you open the app and, depending on your privacy settings, share your activity within the app:

  • MIA by Mobapp Development Limited (2 million users)
  • My Period Tracker by Linchpin Health (over 1 million downloads on Google Play)
  • Mi Calendario by Grupo Familia (over 1 million downloads on Google Play)
  • Ovulation Calculator by Pinkbird (over 500,000 downloads on Google Play)

Glow (3 million users) is the most popular menstrual app used for tracking fertility. Glow's privacy policy states that although your personal data isn't directly sold to advertisers, companies that have contracted Glow for targeted advertising can learn, through embedded cookies, who individuals are and whether or not they are part of their target demographic, with only some data made anonymous.

The data that Glow, and the other menstrual apps listed above, can hold on you includes:

  • Period tracker (the duration and frequency of your period).
  • Mood (for example: angry, stressed, anxious, calm, depressed, emotional).
  • Physical symptoms (for example: acne, appetite, backache, bloating, cramps, fatigue, sex drive, indigestion, insomnia).
  • Sexual activity (details on frequency, protected or unprotected, and orgasms).
  • Lifestyle (whether you went out, consumed coffee, alcohol or drugs, or smoked).
  • Body temperature.
  • Sleep duration.
  • Weight.
  • “Notes” (for any additional information, or as a diary).

Have these sites and apps changed their practices?

The original Privacy International report exposing this was widely shared and caused public outrage, which led some websites to reconsider their practices and limit the amount of data they shared. For example, the menstruation app Clue stopped sharing personal data with Facebook.

Most websites, however, still share your mental health data with third parties. In fact, the number of third parties contacted by mental health websites has increased in all three countries covered by the report. For example, the page dedicated to treatments for depression on the French website Eurekasante now contacts 71 third parties, compared with 36 in September 2019.

Why most people don’t realise this is happening

While there have been leaps forward, such as the GDPR, which requires that users be given clear and comprehensive information before giving explicit consent, in practice this doesn’t always happen.

Complete transparency and consent can be difficult to enforce when most people lead very busy lives, and consent boxes and privacy policies are opaque or overly complicated. As Frederike Kaltheuner, who leads PI's Corporate Exploitation programme, points out, most people 'don't have the time to navigate complicated consent boxes which nudge them towards consent'.

Beyond the time it takes to navigate complex policies, many of these websites don’t display any ads, giving the impression that they have no commercial relationships with advertisers. Privacy International’s research found that a number of mental health websites include marketing trackers without displaying any ads, simply allowing data collection on their pages. Worse still, e-counselling.com, a UK website, did contain a Google ad but displayed no cookie banner, and was found to contact eight third parties.


Image credit: Privacy International

The ethics of collecting mental health data

There's a strong ethical case to make against a situation in which it's now almost impossible to seek mental health information or take a depression quiz without countless third parties watching, ready to use that information to target you. And given that most people don’t realise this happens, it is done on false terms and without clear consent.

Some do make the case for sharing mental health data. It could provide a new revenue stream for healthcare providers, with some estimates suggesting the NHS could make almost £10bn a year. Similarly, mental health data could power artificial intelligence that is far more efficient than humans at tasks such as early diagnosis.

However, as the Information Commissioner, Elizabeth Denham, pointed out:

‘the price of innovation does not need to be the erosion of fundamental privacy rights’.

Perhaps some of the biggest questions that arise from mental health data mining are:

  • Do the benefits of mental health data mining, such as health tech innovation and revenue for the NHS, outweigh the possible loss of our most basic privacy rights?
  • If laws and regulations can't keep up with the problems that arise from data mining, can we rely on serious breaches of our privacy rights, media exposure, and public outrage to push companies to change?
  • Is the only real problem a lack of transparency for users? Or is it morally wrong to ever trade the mental health data of a person for advertisement purposes?

At Rightly, we think you should have control over your own data. To see which websites and apps have your data, sign up below.

SIGN UP

To let us know what you think about mental health data, tweet us @rightlydata; we'd love to hear from you. Or, to learn more about how you can protect yourself online, see our blog.


By Klara Lee

References

BBC News. (2020). Mental-health information 'sold to advertisers'. [online] Available at: https://www.bbc.co.uk/news/technology-49578500 [Accessed 26 Feb. 2020].

Harris, J. (2020). Will having longer, healthier lives be worth losing the most basic kinds of privacy? | John Harris. [online] the Guardian. Available at: https://www.theguardian.com/commentisfree/2020/feb/03/longer-healthier-lives-privacy-technology-healthcare [Accessed 25 Feb. 2020].

McGoogan, C. (2020). NHS illegally handed Google firm 1.6m patient records, UK data watchdog finds. [online] The Telegraph. Available at: https://www.telegraph.co.uk/technology/2017/07/03/googles-deepmind-nhs-misused-patient-data-trial-watchdog-says/ [Accessed 25 Feb. 2020].

Privacy International. (2020). Mental health websites don't have to sell your data. Most still do. [online] Available at: https://privacyinternational.org/report/3351/mental-health-websites-dont-have-sell-your-data-most-still-do [Accessed 25 Feb. 2020].

Privacy International. (2020). No Body's Business But Mine: How Menstruation Apps Are Sharing Your Data. [online] Available at: https://privacyinternational.org/long-read/3196/no-bodys-business-mine-how-menstruation-apps-are-sharing-your-data [Accessed 25 Feb. 2020].