Health apps and at-home genetic tests: Can a law fix the bugs?
Have you ever used a gadget or app that tracks your steps, blood pressure or pregnancy?
Have you ordered a direct-to-consumer genetic test? Or have you taken a quiz to discover your “real” or “biological” age (perhaps seeking confirmation that your ultra-marathoning and paleolithic diet have stopped the aging clock, or that your complaint to your children that their behavior has taken years off your life is scientific fact)?
If yes, you may have encountered paragraphs of impenetrable legalese addressing how your data can be used and shared. Or you may have chosen to bypass all of that, thinking that surely protections exist to keep you safe from harm.
The sorry truth is that the major U.S. federal law protecting the privacy of personal health information — HIPAA, enacted in 1996 — dates from before the first iPhone, and has never caught up with a world in which much health information is generated outside a doctor’s office or hospital. Commentators have been drawing attention to this problem for years.
Evidence has been accumulating that the companies behind these gadgets, apps, tests and quizzes often offer incomplete explanations of their practices, or use impenetrable legalese as cover for claiming nearly complete control over consumer information. That said, some consumers seem less interested in protecting their data than in convenience or cool features (like an app telling you the kind of pastry your fetus resembles at each stage of gestation, or a test identifying the diet “matched” to your DNA).
Simply banning these products, or subjecting them to onerous regulation, would run contrary to recent legislative trends at the federal level. For example, a provision of the 21st Century Cures Act excludes from FDA regulation software that encourages or provides support for maintaining a healthy lifestyle.
In a new bill, Senators Amy Klobuchar and Lisa Murkowski have crafted a common-sense approach to taming the wild, wild west of health-focused, consumer-directed devices, services, applications, and software, including direct-to-consumer genetic testing services. Their proposed “Protecting Personal Health Data Act” would task the Secretary of Health and Human Services with considering:
- Uniform standards for consent
- Minimum standards for security
- Standards for de-identification
- Limitations on collection, use, and disclosure of data beyond specified purposes (including sharing with third parties and use of data for marketing)
Of note, one goal is to make it as easy to withdraw as to give consent. Further, the Act would give consumers a right to access the data collected about them by a device, service, application or software “operator,” a list of others who have accessed data, and a right to delete or amend data “to the extent practicable.” (There’s also a recent flurry of activity about this at the state level.)
Every new regulatory regime carries with it the possibility of unintended consequences. One casualty could be citizen science and open science initiatives. For example, the non-profit Open Humans platform allows citizen scientists to post data and set their own rules for access. Although those who select open access rules could end up facing some additional risks to privacy, individuals participating may choose to make that trade-off in order to advance research and promote the citizen science value of openness.
I don’t think “citizen science” or “open science” warrants laissez-faire. Openness needs to be an informed choice, with clarity about any trade-offs. At the same time, the resources of such initiatives are typically not equivalent to those of commercial actors, nor are the risks they pose typically of the same order. For example, there have been no reports of Open Humans tracking user activities offsite or converting data into a resource for third-party product marketing.
So, along with a directive that any regulations account for differences in the nature and sensitivity of the data collected or stored, it might be helpful to include a directive to account for operator differences.
As this space continues to evolve, it will be interesting to observe whether supporters of bills like this one are able to build an effective bipartisan coalition (Klobuchar and Murkowski represent both sides of the aisle), and whether tech companies will come on board in order to preempt more sweeping privacy and consumer protection laws along the lines of the European Union’s General Data Protection Regulation.
-By Mary Anderlik Majumder, J.D., Ph.D., associate professor in the Center for Medical Ethics and Health Policy at Baylor College of Medicine