Mental Health Apps Often Share Data Without Users' Knowledge

Megan Brooks

April 23, 2019

Popular smartphone apps for depression and smoking cessation fall short when it comes to informing users of their data sharing and privacy practices, new research shows.

In a cross-sectional study of 36 top-ranked apps for depression and smoking cessation available in app stores, 29 (81%) transmitted data to services provided by Facebook or Google. But only 12 (43%) of 28 apps sharing data with Google and 6 (50%) of 12 apps sharing data with Facebook accurately disclosed this in a privacy policy.

"Informed choice is a fundamental principle in healthcare," Kit Huckvale, MBChB, postdoctoral fellow at the Black Dog Institute, UNSW Sydney, Australia, told Medscape Medical News.

"What is concerning about our findings is that users of popular health apps often have no way to tell if Facebook or Google will be involved in handling their data in some way. Many people may not take issue with this. But everyone has the right to know and reach their own decision," said Huckvale.

The study was published online April 19 in JAMA Network Open.

Privacy Policies Not Enough

Twenty-five (69%) of the 36 apps evaluated did incorporate a privacy policy. Of these 25 apps, 22 (88%) told users how they primarily use the data collected, but only 16 (64%) disclosed other secondary uses for the data.

Huckvale believes providers who are recommending mental health apps to patients "should highlight — without overstating the risks — the high likelihood of data sharing with these companies. This is also a timely opportunity for providers to ensure that their own products accurately disclose any third-party data handling."

Huckvale said calls for further regulation of the health app marketplace are "understandable but run into problems of feasibility given the sheer numbers of apps. Nor is more legislation necessarily needed. Consumer privacy law is generally clear in this area, and companies such as Facebook and Google already require developers to clearly state the use of their services in privacy policies," he said.

Nevertheless, there are some opportunities for action, said Huckvale.

"Focusing on the quality of the smaller numbers of apps that are most likely to be used in clinical practice is a promising area. Curated app collections, such as those offered by PsyberGuide for mental health and the UK NHS, make it feasible to scrutinize apps in depth.

"But, as our study shows, this only works if you can reliably explore the detail of what apps are doing with data — just looking at privacy policies isn't enough," Huckvale said.

Huckvale also believes the clinical and research community needs to "get better at communicating to developers these kinds of safety and privacy-related priorities — and why we care about them in the first place."

Cause for Concern

Commenting on the study for Medscape Medical News, Adam C. Powell, PhD, president of Payer+Provider Syndicate, a Boston, Massachusetts–based consulting firm specializing in operational challenges faced by health insurance companies and hospitals, said the study's finding that app privacy policies often misrepresent the underlying privacy practices of apps is "quite concerning."

To address the problem of privacy policy inaccuracy, there needs to be some degree of third-party oversight, said Powell. He noted that between 2002 and 2018, the Federal Trade Commission (FTC) brought 65 cases against companies that engaged in unfair or deceptive practices involving the use and protection of consumers' personal data.

However, the FTC probably can't be the sole enforcer, said Powell, adding that there is likely a need for other third parties to assist in the app review process. "Such organizations could be non-profit third parties, such as PsyberGuide, government-run entities, such as the NHS Apps Library (run by the UK), or even for-profit entities," he noted. 

"Rather than conducting enforcement actions against problematic apps, these third parties can and do play a role in curating lists of high-quality apps. Privacy policies need to be one element considered within the curation process," he added.

Powell also noted that inadequate disclosure about data sharing is a small part of a larger problem surrounding privacy policies. In a recent study, he and his colleagues found that most privacy policies for mental health apps are written at a college reading level.

"Thus, even if it is possible for regulators to enforce truth in privacy policies, they may not play an important role in shifting behavior if they are too complex for people to understand, or if they take too long to read for most people to consider them," Powell said.

The study had no commercial funding. The study authors and Powell have disclosed no relevant financial relationships.

JAMA Netw Open. Published online April 19, 2019.

