
Editorials

Commercial health apps: in the user’s interest?

BMJ 2019; 364 doi: https://doi.org/10.1136/bmj.l1280 (Published 21 March 2019) Cite this as: BMJ 2019;364:l1280

Linked Research

Data sharing practices of medicines related apps and the mobile ecosystem: traffic, content, and network analysis

Claudia Pagliari, senior lecturer in primary care
eHealth Research Group, Usher Institute of Population Health Sciences and Informatics, University of Edinburgh, Edinburgh, UK
Claudia.Pagliari@ed.ac.uk

Study shows how sensitive data from health apps are finding their way to corporations

Excitement about digital health is at an all time high, with innovations in mobile personal computing, robotics, genomics, artificial intelligence, cloud based infrastructure, and more, promising to revolutionise the organisation, quality, cost effectiveness, inclusivity, and personalisation of patient care.1 2 Amid this celebration, the shadow of privacy risks continues to lurk, like an unwelcome guest at a party.3 4 5

In a linked paper, Grundy and colleagues (doi:10.1136/bmj.l920) examine the surreptitious tracking and profiling of people using medicines related apps, which can generate sensitive health data.6

Grundy and colleagues used an “app store crawling program” to identify the top 100 medicines related apps available to Android mobile users in the UK, USA, Australia, and Canada, combined with a search for endorsed apps on a medicines related agency website, a health app library, a systematic review, and their personal networks. Of the 821 apps screened, 24 met all the inclusion criteria: managing drugs (for example, information, decision support, adherence, engagement), requesting at least one “dangerous” permission, claiming to collect or share user data, and requiring user interaction. These apps were tested multiple times using dummy user profiles representing professionals and patients, first to create a baseline; the process was then repeated, each time changing one type of user information. Comparing network traffic before and after each profile change revealed how the new data were transmitted from the app.

Next, the authors used IP lookup tools to identify the data recipients and analysed their company information, privacy terms, data sharing agreements, and business models. Recipients were classified as first parties (app developers), third parties (external entities receiving data from the app), and fourth parties (organisations that receive, and might aggregate, data from multiple third parties). Network analysis was then used to map and visualise the pathways through which data are potentially being shared.
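To make the traffic comparison step concrete, the sketch below shows one way such a differential analysis could work. It is a minimal illustration rather than the authors' actual tooling: the capture format (one JSON object per outbound request, with "host" and "body" fields) and the file names are assumptions.

```python
# Minimal sketch of traffic differencing: capture outbound app requests
# before and after changing a single dummy profile field, then flag the
# hosts whose payloads now contain the changed value.
# Assumed capture format: one JSON object per line with "host" and "body".
import json

def load_requests(path):
    """Load captured requests, one JSON object per line."""
    with open(path) as f:
        return [json.loads(line) for line in f]

def recipients_of(requests, value):
    """Return the hosts whose request bodies contain the given value."""
    return {r["host"] for r in requests if value in r.get("body", "")}

baseline = load_requests("baseline_capture.jsonl")   # hypothetical file
modified = load_requests("modified_capture.jsonl")   # hypothetical file

# Suppose the dummy profile's birth year was changed to "1972" between runs.
changed_value = "1972"

# Hosts that transmit the value only after the change are candidate
# recipients of that type of user data.
new_recipients = (recipients_of(modified, changed_value)
                  - recipients_of(baseline, changed_value))
print(sorted(new_recipients))
```

Repeating this pass for each changed profile field builds up a map of which recipients receive which types of user data.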

“Dangerous” permissions

Apps typically requested four or more “dangerous” permissions for private information held on the user’s phone, or for functions that could affect the operation of other apps. Most transmitted encrypted data, but several used clear text. Nineteen of the 24 apps shared user data, which were received by 55 unique third parties. Third parties typically reserved the right to hold user information for their own commercial purposes. Some collected data from other apps, along with communications and behavioural information, building detailed user profiles across devices that could be shared with business affiliates or sold on. Although most third parties were developers, 33% were advertising companies and 8% were investor owned. The fourth party network included 237 entities, including “families” of companies with the same owner. Of these, Alphabet (Google) and Facebook were able to receive the most types of data, either directly from the apps or through third parties, whereas Alphabet, Amazon, and Microsoft received the highest volume of app user data overall.
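The multiparty flows described above lend themselves to representation as a directed graph. The sketch below, using the networkx library with invented entity names, shows how first, third, and fourth parties might be mapped and queried; in the study itself the network comprised 55 unique third parties and 237 fourth party entities.

```python
# Illustrative sketch of mapping user data flows as a directed graph.
# Entity names are invented; edges point in the direction data travels.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("ExampleMedsApp", "analytics.example.com"),     # first party -> third party
    ("ExampleMedsApp", "ads.example.net"),
    ("analytics.example.com", "ExampleAdExchange"),  # third party -> fourth party
    ("ads.example.net", "ExampleAdExchange"),
])

# Fourth parties reachable from the app: every entity that could receive
# (and potentially aggregate) its users' data, directly or indirectly.
print(sorted(nx.descendants(G, "ExampleMedsApp")))
```

The same reachability query extends naturally to “families” of companies by collapsing entities that share an owner into a single node.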

Although others have convincingly shown vast networks of data leakage by Android apps,7 8 this study is unique in focusing on apps that can yield highly sensitive information about people’s use of, or need for, medicines. As the authors note, although many of the data fields collected by these entities can seem innocuous, in combination they can be used to uniquely identify and profile users, effectively bypassing existing data protection and privacy laws.
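A toy example illustrates the re-identification point: fields that are individually common can, in combination, single out individual users. The records and field names below are invented for illustration.

```python
# Toy illustration of quasi-identifier uniqueness: count how many records
# share each combination of fields; combinations held by a single record
# effectively identify that user. All data here are invented.
from collections import Counter

records = [
    {"device": "Pixel 3", "postcode": "EH8", "birth_year": 1972},
    {"device": "Pixel 3", "postcode": "EH8", "birth_year": 1985},
    {"device": "Galaxy S9", "postcode": "EH8", "birth_year": 1972},
]

def fraction_unique(records, fields):
    """Fraction of records whose field combination is unique in the set."""
    combos = Counter(tuple(r[f] for f in fields) for r in records)
    return sum(1 for n in combos.values() if n == 1) / len(records)

print(fraction_unique(records, ("postcode",)))                         # 0.0
print(fraction_unique(records, ("device", "postcode", "birth_year")))  # 1.0
```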

Such tracking practices differ for apps and online search engines,7 but the dominant role of global corporations such as Google is evident in both—as was also shown in another study published this week on “ad tech surveillance on the public sector web.”9 With these digital apex predators voraciously consuming other companies and the data they generate, as well as the global talent pool of data scientists able to make best use of them, concepts such as “free market” and “democracy” are beginning to look decidedly 20th century.

Exploitative practices

National Health Service patients may be cushioned from the worst impacts of exploitative health data practices, unlike our US cousins, but we are not immune. A shadow economy of commercial data brokers is silently gleaning, linking, and commoditising behavioural information about our health, spending, political attitudes, movements, time spent online, social networks, and so on, which is already influencing our mortgages, employment, travel, and more. With corporate data brokers and public sector data centres now collaborating to “understand society,” it is not overly fanciful to predict future policy scenarios in which these insights affect our access to drugs or place on a surgical waiting list. For now, the threats to our privacy and self determination are arguably the most important.

An EU ePrivacy regulation10 that extends the General Data Protection Regulation (GDPR)11 to web trackers and profiling is under development, although it has been described as “sitting in the sidings, being mobbed by lobbyists.”12 As the study reported by Grundy and colleagues illustrates, issues of consent and legitimate interest are muddied in the multiparty data ecosystem of digital health apps. Meanwhile, the capacity of regulators such as the Information Commissioner’s Office (https://ico.org.uk) to enforce the rules on privacy is severely constrained by lack of manpower.13 Penalties for exploitative data practices are typically applied only after incidents have occurred, been spotted, and been reported, and it is likely that the majority slip under the radar.

Nevertheless, there is a good news story hidden in this work. For one thing, Grundy and colleagues showed that companies were more likely to declare their data sharing partnerships after the GDPR had come into force, albeit with an eye on the back door. More importantly, all the studies mentioned here show the value of digital forensic research methods for uncovering the illicit practices and business relations that underlie fine words about regulatory compliance. With advances in technologies such as bots9 and AI,13 regulators could soon have an effective dashboard of suspect apps and websites. How they choose to respond to it is another matter, but without more effort to tackle the problem, eroding public trust will continue to hold back the future of digital health.

Acknowledgments

CP leads the Interdisciplinary Research Group in eHealth, the MSc in Global eHealth, and the consumer informatics theme of the NHS Digital Academy.
