After several much-ballyhooed analyses projected an impending boom in the mobile health industry, it seems this once-geeky topic is being talked about everywhere lately. For those of us who have been long-time advocates (and supporters) of the disruptive potential of mHealth technologies, we are thankful that this conversation is taking place. We’re in an era where the technology enablers are moving into place and getting better every day, the business models are starting to develop, and the policy environment is – well – unsettled at best. At issue are the questions of whether and how much the FDA will regulate medical apps, and how rules and regulations around privacy and security will circumscribe the kinds of products and services developers can offer. This uncertainty in the policy environment raises the risks for developer/entrepreneurs and the investors who would back them.
FDA regulation is a thorny issue, and it is especially challenging when a “medical device” consists of a phone carried around by tens of millions of Americans and software that could have been written in a weekend. The sheer volume of review that would have to take place is overwhelming in an era where agency budgets are going down, not up. An account of the recent FDA approval of the mobile radiology app Mobile MIM gives one a sense of the difficulty here. And the lack of clarity has a chilling effect.
But aside from the question of how the technologies will be regulated, another fundamental concern needs to be addressed. Namely: how will the data contained in an individual’s personal health record (PHR) – the “fuel” that powers the mHealth machine – be protected and secured? And what regulations will be put in place to do so? Until policy makers step in and provide guidance, the chilling effect on innovation will continue, furthering our reliance on an antiquated model of care that does little to empower patients to be active participants in their own health decisions.
But before providing the legal and regulatory framework app developers need to move forward, policy makers must recognize that PHRs are rapidly evolving. This evolution is something I recently discussed as part of the Office of the National Coordinator for Health Information Technology’s public roundtable “Personal Health Records: Understanding the Evolving Landscape,” where I laid out three themes that have emerged from several Robert Wood Johnson Foundation Pioneer Portfolio-funded projects I’ve helped coordinate. These themes include: separating the apps from the data, expansion of the definition of health information, and the increasingly social nature of health care, all of which I explain in more detail below. It is my hope that policy makers take all three into consideration when crafting policies that will inevitably shape the industry.
Separating the Apps from the Data
When examining the first theme, we must take a look back in history and recognize that PHRs were designed to house – and give an individual access to – medical records, along with a few features that helped display and interpret the information. Whoever maintained the data also provided the services that displayed, interpreted or otherwise helped you use the data. Increasingly, however, we’re seeing a movement towards separating these functions: the function of storing, maintaining, and providing access to the data serves as a platform, while additional features and functions can be offered by third parties as apps. By now we all recognize this as the way the computer industry works – operating systems, like Windows and the MacOS (and now iOS and Android), serve as platforms, and third-party developers offer hundreds of thousands of programs or apps that run on those platforms. When it comes to PHRs, this model enables a patient to choose features and functions based upon their own individual preferences and circumstances, instead of being limited to the options a single PHR vendor builds into its service. The implication of this trend toward separating the apps from the data is that it makes it hard to regulate a “PHR service” or “PHR vendor,” unless one adopts a very narrow definition that focuses only on the storage, maintenance and control of access to the data.
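To make the platform-versus-apps distinction concrete, here is a minimal sketch of the architecture described above: a PHR “platform” that only stores data and mediates access, and an independent third-party “app” that interprets data it does not hold. All class, method, and field names here are invented for illustration; no real PHR product or API is implied.

```python
class PHRPlatform:
    """The platform role: store records and control access.
    It offers no display or interpretation features of its own."""

    def __init__(self):
        self._records = {}   # patient_id -> list of record dicts
        self._grants = {}    # patient_id -> set of app names the patient authorized

    def add_record(self, patient_id, record):
        self._records.setdefault(patient_id, []).append(record)

    def grant_access(self, patient_id, app_name):
        # Access control lives in the platform, not in any one app.
        self._grants.setdefault(patient_id, set()).add(app_name)

    def read_records(self, patient_id, app_name):
        if app_name not in self._grants.get(patient_id, set()):
            raise PermissionError(f"{app_name} is not authorized for {patient_id}")
        return list(self._records.get(patient_id, []))


class MedicationListApp:
    """The app role: a third-party feature that uses, but never stores, the data."""

    def __init__(self, platform):
        self.platform = platform

    def current_medications(self, patient_id):
        records = self.platform.read_records(patient_id, "MedicationListApp")
        return [r["name"] for r in records if r["type"] == "medication"]


phr = PHRPlatform()
phr.add_record("p1", {"type": "medication", "name": "lisinopril"})
phr.add_record("p1", {"type": "lab", "name": "A1c"})
phr.grant_access("p1", "MedicationListApp")

app = MedicationListApp(phr)
print(app.current_medications("p1"))  # -> ['lisinopril']
```

The regulatory point follows from the shape of the code: a rule aimed at a “PHR vendor” cleanly covers `PHRPlatform`, but `MedicationListApp` – and the thousands of apps like it – stores nothing, which is why only a narrow storage-and-access definition stays tractable.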
Expansion of the Definition of Health Information
Individuals are increasingly becoming aware of the fact that health is not just what happens when you go to the doctor. Health happens where you live, work, learn and play. It happens 24 hours a day, 365 days a year. It is based on the behavioral decisions you make every day, and it is based on the circumstances in which you live. Apps that help people manage their health often draw from data one finds in a medical record (e.g. medications) and also data from people’s day-to-day experience that contribute to (or detract from) their health. In our national program Project HealthDesign, which is supporting research on this topic, we refer to this latter set of data as Observations of Daily Living (ODLs), which can include how much exercise you got on a given day, what your text messages say about your mood, or the duration and intensity of headaches, to name a few. You can even include in this category relevant environmental data on ambient temperature or the level of particulates in the air. Regulations around the privacy of health information tend to focus on health information as information retained by health care service providers, but when you take a broader view of health and when app developers mix ODLs with traditional health data, it becomes very hard to draw a line between “health” information and other forms of personal information. Any future regulations will have to wrestle with the meaning of an expanded definition of health information. Perhaps the distinction between health information and any other information someone would like to keep private is no longer tenable.
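A small sketch shows how quickly the line blurs once an app mixes the two kinds of data. The record below is entirely hypothetical – the field names are invented, and no standard format is implied – but it captures the pattern: traditional medical-record data, ODLs, and environmental context sitting side by side and feeding one combined health signal.

```python
# One day of app data for one (hypothetical) patient.
patient_day = {
    "date": "2011-06-01",
    # Traditional health data, drawn from the medical record:
    "medications": ["albuterol"],
    # ODLs, drawn from day-to-day experience:
    "steps_walked": 6200,
    "mood_from_texts": "stressed",     # e.g., inferred from message sentiment
    "headache_hours": 1.5,
    # Environmental context:
    "air_particulates_ugm3": 38.0,
}

def flag_for_review(day):
    """Combine an ODL with environmental data into one health signal.
    Once combined like this, which fields count as 'health information'?"""
    return day["headache_hours"] > 1 and day["air_particulates_ugm3"] > 35

print(flag_for_review(patient_day))  # -> True
```

Note that the fields doing the diagnostic work here – step counts, text-message mood, air quality – are exactly the ones a provider-centric privacy regulation would likely miss.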
The Increasingly Social Nature of Healthcare
Finally, policy makers must realize that individuals are increasingly sharing their data and health-related decisions with friends, family members, and an increasingly sophisticated care network. Health care is becoming more social in nature. And it’s not just social media or human-to-human interaction I’m talking about – many emergent mHealth technologies need to share data freely with software or other devices. A wireless blood pressure cuff talks to an app that then requests to deposit the data in a physician’s electronic health record (EHR). A smartphone app for medication reminders sends a query to an EHR to get up-to-date prescription information. Whether it’s person-to-person, device-to-device or app-to-app, sharing can add real value, and regulations need to find a balance between helping people guard their privacy and enabling the secure sharing of information.
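The cuff-to-app-to-EHR flow above can be sketched in a few lines. This is a toy model under stated assumptions – the class names and the consent mechanism are invented for illustration, and a real deployment would use health-data standards and encrypted transport rather than in-memory objects – but it shows the balance the text calls for: data flows freely device-to-app-to-EHR, yet only once the patient has authorized sharing.

```python
class EHR:
    """A (hypothetical) electronic health record that accepts deposited vitals."""

    def __init__(self):
        self._vitals = []
        self._sharing_authorized = set()  # patient ids who opted in to sharing

    def authorize_sharing(self, patient_id):
        self._sharing_authorized.add(patient_id)

    def deposit_vital(self, patient_id, reading):
        if patient_id not in self._sharing_authorized:
            return False  # privacy side of the balance: no consent, no deposit
        self._vitals.append((patient_id, reading))
        return True       # sharing side: authorized data flows through


class BloodPressureApp:
    """Relays wireless-cuff readings to the EHR on the patient's behalf."""

    def __init__(self, ehr):
        self.ehr = ehr

    def on_cuff_reading(self, patient_id, systolic, diastolic):
        reading = {"systolic": systolic, "diastolic": diastolic}
        return self.ehr.deposit_vital(patient_id, reading)


ehr = EHR()
app = BloodPressureApp(ehr)
print(app.on_cuff_reading("p1", 128, 82))  # -> False (patient has not consented)
ehr.authorize_sharing("p1")
print(app.on_cuff_reading("p1", 128, 82))  # -> True (deposited in the EHR)
```

The design point is that consent is checked at the EHR boundary, not inside the app – which is one plausible way regulation could enable app-to-app sharing without asking every third-party developer to reimplement privacy controls.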
Separating the apps from the data, expansion of the definition of health information, and the increasingly social nature of health care – all can potentially change how we view PHRs, and how we perceive health care at its very core. However, while mHealth shows tremendous promise, regulations could have a significant effect on how much of that promise is realized. Policymakers will need to walk a fine line between providing enough clarity to encourage developers to act while offering enough flexibility to accommodate a rapidly evolving shift in how health information is understood and used.
Based upon their extensive experience in the field, several team members from Project HealthDesign submitted public comments to the ONC following the PHR Roundtable at which I spoke. These comments do an excellent job of laying out Project HealthDesign’s vision for the future of PHRs and related privacy and security concerns, providing relevant examples from their work where innovation could be impeded by impractical policy. For example, if data-encryption requirements are so stringent that a device cannot share data with a software program or another device, then we’d lose out on the benefits of technology that uses trackers to automatically monitor a whole slew of ODLs for patients (read Anind K. Dey’s public comment for a wonderful explanation). Or, if privacy/reporting requirements for mHealth devices are created with yesterday’s definition of health information in mind, a wide range of health apps could be adversely affected (see Katherine Kim’s and Gillian Hayes’ comments). These very concerns are currently being echoed by developers throughout the mHealth industry.
Wading through all of these thorny issues promises to be difficult for policy makers and regulators, but the ultimate solution may be as simple as providing individuals with the ability to control and share their own data as they see fit. The new reality of health care rests on that very notion: that patients should be at the center of their care. Tomorrow’s mHealth policies need to be created with that philosophy in mind.