
The struggle to make health apps truly private

Photo illustration of a phone screen with a picture of a spilled pill bottle on it. Christina Animashaun/Vox

Why privacy and patient advocates are worried that substance use disorder apps aren’t keeping data private.

Jonathan J.K. Stoltman already knew how hard it can be for people with addiction to find the right treatment. As director of the Opioid Policy Institute, he also knew how much worse the pandemic made it: A family member had died of an opioid overdose last November after what Stoltman describes as an “enormous effort” to find them care. So Stoltman was hopeful that technology could improve patient access to treatment programs through things like addiction treatment and recovery apps.

But then, last year, he consulted with a company that makes an app for people with substance use disorders, where he says he was told that such apps commonly collect data on and track their users. He worried that they weren’t protecting privacy as well as they should, considering who they were built to help.

“I left after expressing concerns about patient privacy and quality care,” Stoltman told Recode. “I’m a tech optimist at heart, but I also know that with that widespread reach they can have widespread harms. People with an addiction already face substantial discrimination and stigma.”

So Stoltman reached out to Sean O’Brien, principal researcher at ExpressVPN’s Digital Security Lab, last March, asking if his team could analyze some apps and see if Stoltman’s concerns were founded. O’Brien, who has extensively studied app trackers, was happy to help.

“I had a responsibility to find out what data [the apps] collected and who they might be sharing it with,” O’Brien told Recode.

The results are in a new report that examined the data collection practices in a number of apps for opioid addiction and recovery. The research, which was conducted by ExpressVPN’s Digital Security Lab in partnership with the Opioid Policy Institute and the Defensive Lab Agency, found that nearly all of the apps gave third parties, including Facebook and Google, access to user data. O’Brien said he didn’t think anyone on his team “expected to find so much sloppy handling of sensitive data.”

Researchers couldn’t tell if that data was actually going to those third parties, nor could they tell what those third parties were doing with that data when and if they got it. But the fact that they could get it and that the apps were built to give them that access was enough to alarm privacy researchers and patient advocates. The report illustrates just how bad apps can be at privacy — even when they’re bound by the highest legal and ethical requirements and serve the most vulnerable population. And that developers can’t get privacy right for these kinds of apps doesn’t bode well for user privacy in all the apps we give sensitive data to.

“Smartphone users are simply not aware of the extent that they can be identified in a crowd,” O’Brien said. “If a user of a leaky app becomes a patient and is prescribed medication, the sharing of that info could create rippling effects far into the future.”

Adding to the problem is the rise of telehealth during the pandemic, which also came with a few loosened privacy restrictions to enable health care providers to see patients remotely after abruptly being cut off from in-person visits. Getting people the health care they need is, of course, a good thing. But the sudden move to telehealth, medical apps, and other online health services for everything from therapy to vaccine registrations also made more apparent some of the shortcomings of health privacy laws when it comes to protecting patient data.

There are a lot of gray areas surrounding what those laws are supposed to cover. And in general, apps are built to constantly (and, often, furtively) exchange user data with several other parties and services, some of which use that data for their own purposes.

How apps give away your data …

The ExpressVPN report looked at 10 Android apps, many of which provide medication-assisted treatments, or drugs that reduce cravings and ease withdrawal symptoms, via telehealth.

Those apps have become more widely used in the past year and a half, as they’ve expanded their coverage areas and raised millions in venture capital funding. They’ve also benefited from a temporary waiver of a rule that requires first-time patients to have an in-person evaluation before a doctor can prescribe Suboxone, which alleviates opioid withdrawal symptoms. Unless and until that rule is restored, an entire treatment program can be done through an app. That might lower barriers to access for some people, especially those who don’t live close to a treatment provider, but the report found that it may also expose their data to the third parties the apps rely on for certain services, often by way of software development kits, or SDKs.

SDKs are tools made by third parties that app developers can use to add functions to their apps that they can’t or don’t want to build themselves. A telehealth app might use Zoom to provide videoconferencing, for example. But these SDKs must communicate with their provider to work, which means apps are sending some data about their users to a third party. How much and what type of data is exchanged depends on what the SDK needs and whatever restrictions the developer has placed, or is able to place, on it.
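To make that concrete, here is a rough Kotlin sketch of what a typical SDK integration looks like from the developer’s side. The SDK name and its options are invented for illustration (none of the apps in the report are quoted here), but the pattern is common: a single initialization call at startup opens a channel back to the vendor, and any privacy restrictions are settings the developer has to know about and choose.

```kotlin
// "ExampleAnalytics" is a made-up stand-in for a third-party analytics or marketing SDK;
// real SDKs ship this as a compiled library rather than source.
import android.app.Application
import android.content.Context

object ExampleAnalytics {
    fun initialize(
        context: Context,
        apiKey: String,
        collectAdvertisingId: Boolean = true, // privacy-relevant defaults are often opt-out
        sendUsageEvents: Boolean = true
    ) {
        // In a real SDK, this is where device details and app-usage events
        // would start flowing to the vendor's servers.
    }
}

class RecoveryApp : Application() {
    override fun onCreate() {
        super.onCreate()
        // One call at startup is usually all an SDK needs. What it sends from here on
        // depends on the vendor's defaults and on which switches the developer
        // knows about and turns off.
        ExampleAnalytics.initialize(
            context = this,
            apiKey = "EXAMPLE_KEY",
            collectAdvertisingId = false,
            sendUsageEvents = true
        )
    }
}
```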

Some of the apps named in the report — Bicycle Health, Confidant Health, and Workit Health — told Recode that they have all the legally required agreements with their SDK vendors to protect any data exchanged, and that patient confidentiality is important to them.

“Using external tools to identify SDKs that are inside of apps and their function is difficult and typically problematic,” Jon Read, founder of Confidant, told Recode. He said his app used the Facebook SDK to let users voluntarily and easily share updates on their progress with their Facebook or Instagram friends. “No protected data was being shared with those services,” he added.

But some of the types of data those SDKs can access — like advertising IDs, which are unique to devices and can be used to track users across apps — indicated to researchers that data was being collected beyond what the app or the SDK needs to function. And patients might not be comfortable with the vendors that can access their data, often without their knowledge. Facebook, Google, and Zoom, for instance, have all had their share of very public privacy issues, while most people probably have no idea what AppsFlyer, Branch, or OneSignal are or what they do (analytics and marketing, basically).
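As for advertising IDs, the sketch below (again, illustrative rather than drawn from any of these apps) shows how little it takes for an app, or any SDK compiled into it, to read the device’s advertising ID on Android through Google Play Services. The call returns an identifier that stays the same across every app on the device unless the user resets it.

```kotlin
// Illustrative only: reading the Android advertising ID via Google Play Services
// (artifact: com.google.android.gms:play-services-ads-identifier).
import android.content.Context
import com.google.android.gms.ads.identifier.AdvertisingIdClient

fun readAdvertisingId(context: Context): String? {
    // Blocking IPC call to Play Services; must not run on the main thread.
    return try {
        val info = AdvertisingIdClient.getAdvertisingIdInfo(context)
        // Respect the user's "limit ad tracking" setting; historically, many trackers did not.
        if (info.isLimitAdTrackingEnabled) null else info.id
    } catch (e: Exception) {
        null // Play Services unavailable, or the ID could not be read.
    }
}
```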

ExpressVPN also found that Kaden Health, which provides medication-assisted therapy and counseling services, gave the payment processor Stripe access to several identifiers and information, including a list of installed apps on a user’s device and their location, IP address, unique device and SIM card IDs, phone number, and mobile carrier name. Kaden also gave Facebook location access and gave Google access to the device’s advertising ID, according to the report. Kaden did not respond to a request for comment, but its privacy policy says “we also work with third parties to serve ads to you as part of customized campaigns on third-party platforms (such as Facebook and Instagram).”
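For context on how reachable those identifiers are, below is a Kotlin sketch (not code taken from Kaden Health or any other app in the report) of the standard Android APIs behind several of the data types listed above: the carrier name, a device identifier, the phone number, and the list of installed apps. Some of these calls require permissions or are restricted on newer Android versions, but an app that obtains them can pass the results along to any SDK it bundles.

```kotlin
// Illustrative only: standard Android APIs behind the kinds of identifiers the report
// flags. Permission requirements are noted inline; several calls are restricted on
// recent Android versions.
import android.annotation.SuppressLint
import android.content.Context
import android.content.pm.PackageManager
import android.provider.Settings
import android.telephony.TelephonyManager

@SuppressLint("MissingPermission", "HardwareIds")
fun collectDeviceProfile(context: Context): Map<String, Any?> {
    val telephony = context.getSystemService(Context.TELEPHONY_SERVICE) as TelephonyManager
    return mapOf(
        // No permission required: the mobile carrier's name.
        "carrier" to telephony.networkOperatorName,
        // A per-device (and, since Android 8, per-signing-key) identifier.
        "android_id" to Settings.Secure.getString(
            context.contentResolver, Settings.Secure.ANDROID_ID
        ),
        // Requires READ_PHONE_STATE or READ_PHONE_NUMBERS.
        "phone_number" to telephony.line1Number,
        // Requires QUERY_ALL_PACKAGES or a declared <queries> element on Android 11+.
        "installed_apps" to context.packageManager
            .getInstalledApplications(PackageManager.GET_META_DATA)
            .map { it.packageName }
    )
}
```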

Data flows like these worry patient advocates, who see the potential of these apps to remove barriers to access for some patients but are concerned about the cost to patient privacy if these practices continue.

“Many people agree that addiction treatment needs to advance with the science,” Stoltman said. “I think you’d be hard-pressed to find people that think the problem is ‘we don’t give enough patient data to Facebook and Google.’ … Patients shouldn’t have to trade over their privacy to benefit corporate interests for access to lifesaving treatment.”

Yet many people do just that, and not just when it comes to opioid addiction and recovery apps. The report also speaks to a larger issue with the health app industry. Apps are built on technology that is designed to collect and share as much information about their users as possible. The app economy is based on tracking app users and making inferences about their behavior to target ads to them. The fact that we often take our devices with us everywhere and do so many things on them means we give a lot of information away. We usually don’t know how we’re being tracked, who our information is being shared with, or how it’s being used. Even the app developers themselves don’t always know where the information their apps collect is going.

That means health apps collect data that we consider to be our most sensitive and personal but may not protect it as well as they should. In the case of substance use disorder apps, patients are entrusting apps with intimate information about their stigmatized and, in some cases, criminalized health condition. But there are also apps that provide mental health services, measure heart rates, monitor symptoms of chronic illnesses, check for discounts on prescription drugs, and track menstrual cycles. Their users may expect a level of privacy that they aren’t getting.

… And why most of them are allowed to do it

Those users number in the millions: A 2015 survey found that nearly 60 percent of respondents had at least one health app on their mobile devices. That was six years ago, before the pandemic, when health and wellness app use ballooned.

Silicon Valley clearly sees the potential of health apps. Big tech companies like Amazon and Google are continuing to invest in health care as more services move online, which leads to more questions about how these companies, some of which aren’t known for having great privacy protections, will handle the sensitive data they get access to. Recognizing their growth and how and why consumers use these apps, the Federal Trade Commission (FTC) even released a mobile health app-specific guide to privacy and security best practices in April 2016.

Five years later, it doesn’t appear that many health apps are following them. A recent study of more than 20,000 Android health and medical apps published in the British Medical Journal found that the vast majority of them could access and share personal data, and they often weren’t transparent with users about their privacy practices or simply didn’t follow them — if they had privacy policies at all. There have been reports that mental health apps share user data with third parties, including Facebook and Google. GoodRx, an app that helps people find cheaper prices for prescription drugs, was caught sending user data to Facebook, Google, and marketing companies in 2019. The menstrual tracker Flo has become a case study in health privacy violations for telling users that their health data wouldn’t be shared and then sending that data to Facebook, Google, and other marketing services. Flo reached a settlement with the FTC over those allegations last month and has admitted no wrongdoing.

Meanwhile, the Department of Health and Human Services waived certain privacy rules for telehealth for the duration of the pandemic to make more services available quickly when people were suddenly cut off from in-person care. That doesn’t apply to most of these apps, which, while classified as “health” apps, aren’t covered by medical privacy laws at all. Flo, for instance, got in trouble with the FTC over the deceptive wording of its privacy policy, which amounts to a consumer protection matter, not a health privacy one. But many of the opioid addiction recovery and treatment apps ExpressVPN looked at should be covered by the strictest medical records privacy laws in the country — both the Health Insurance Portability and Accountability Act (HIPAA) and 42 CFR Part 2, which specifically regulates substance use disorder patient records.

Part 2 was created to ensure the confidentiality of patient records in substance use disorder programs that receive federal assistance (which all but one of the apps ExpressVPN looked at do, though Part 2 doesn’t apply to all of the services they offer). The rule was written to ensure that patients wouldn’t be discouraged from seeking treatment. Accordingly, Part 2 is more restrictive than HIPAA in terms of who has access to a patient’s records and why, and says that any identifying information about a patient (or de-identified data that can be combined with other sources to re-identify a patient) can only be shared with that patient’s written consent. There may also be state laws that further restrict or regulate patient record confidentiality.

But legal experts point out that those decades-old laws haven’t kept up with rapidly advancing technology, creating a legal gray area when it comes to apps and the data they may share with third parties. A spokesperson for the Substance Abuse and Mental Health Services Administration (SAMHSA), which regulates Part 2, told Recode that “data collected by mobile health apps is not squarely addressed by existing law, regulations, and guidance.”

“Patients should receive the same standard of confidentiality whether they’re meeting a provider face-to-face or seeking support through an app,” Jacqueline Seitz, senior staff attorney for Health Privacy at the Legal Action Center, told Recode. The report, she said, showed that they may not be.

Private health apps are possible, but they’re not easy to make

It doesn’t have to be this way. Experts say it is possible to build an app that fulfills both the privacy and security expectations and the legal requirements of a substance use disorder app, or of any health app. It’s just much more difficult, and requires more expertise, than building an app with no privacy considerations at all.

“I would never say something is 100 percent secure, and probably nothing is 100 percent private,” Andrés Arrieta, director of consumer privacy engineering at the Electronic Frontier Foundation, told Recode. “But that’s not to say that you can’t do something that is very private or very secure. I think it’s technically possible. It’s just a willingness, or whether the company organization has the actual skills to do so.”

O’Brien agreed, saying app developers — albeit relatively few of them — have demonstrated that private and secure apps are possible. He said he saw no reason telehealth apps couldn’t do the same.

In fact, one of the apps ExpressVPN looked at didn’t have any tracking SDKs at all: PursueCare. The company told Recode that wasn’t easy to accomplish, and may not be permanent.

“I felt strongly about making sure we protect our patients as we grow,” PursueCare founder and CEO Nicholas Mercadante said. “But we also want to bring them best-in-class resources. So it is a balance.”

Mercadante added that PursueCare would likely, at some point, add a feature with a marketing SDK. “There’s almost no way to protect against all disclosures,” he said. The company would have to balance the privacy risks with health rewards when the time came.

If a health app isn’t necessary to provide patient care and consumers are properly informed about potential privacy violations, they can make their own decisions about what works best for them. But that’s not the case for every app, or every patient. If the only way you can get the help you need — whether it’s for opioid addiction recovery or any other mental or physical condition — is through an app, the privacy trade-off might be worth it to you. But it shouldn’t be one that you have to make, and you should at least be able to know you’re making it.

“Telehealth can provide us with the services we need while still preserving our privacy and, really, our dignity,” O’Brien said. “That won’t happen without honesty, transparency, and patients who call for serious change.”

If you or someone you know needs addiction treatment, you can seek help online through SAMHSA’s treatment locator or by phone at 1-800-662-4357.
