What happens when renowned Beth Israel CIO and Harvard professor John Halamka explores what it will take to achieve real healthcare interoperability with a panel of fellow health IT leaders? An engaging, standing-room-only discussion at HIMSS’16 in Las Vegas last month.
From identifying today’s interoperability enablers and barriers, to offering recommendations for how we can best address our “interoperability problem,” the Surescripts-hosted panel dove into the key issues that clinicians and technologists face today and assessed how we must move forward in order to enact real change. Some of the key themes discussed were:
- Interoperability barriers: Trust and regulations
- Learning from e-prescribing and health information exchange in action
- Incentives for technology adoption
- Metrics for measuring interoperability
Interoperability barriers: Trust and regulations
In terms of the most challenging barriers, the panelists agreed that trust and regulations were two significant factors. “Largely, today the barriers are not technical,” noted Peter DeVault, director of interoperability for Epic. “What we’re seeing as barriers are in governance and trust.” DeVault added that Carequality is a much-needed step in the right direction, providing a “trust fabric” that many different networks can adopt.
Erica Neher, clinical interoperability lead for SSM Health System, echoed DeVault’s point that Carequality can foster trust between technologies, but added that as a nation, we also need to focus on educating patients and improving their trust in the process of health information exchange. While considerable time is spent training staff to inform patients about consent uniformly and consistently, Neher noted, we need to do a much better job of informing and educating patients themselves about the process.
In looking ahead to what’s on the regulatory horizon, Halamka shared that “all of [the panelists] are slightly fatigued by the growing number of regulations we have to comply with,” noting that we’re “maybe there with frameworks and guidance, but more meaningful use stages? Maybe not,” he said.
Jim Murray of CVS Minute Clinic discussed how common regulatory-driven conversations are, and how this only heightens the need for scalability in how we approach technology. “The technology is there, the standards are there, the regulations certainly are there – we spend a lot of time talking about them. It’s one of the time sinks,” he said, noting that once a regulation is in place, it is by no means finalized. Instead, it is constantly evolving, and technology must be able to evolve with it.
Learning from e-Prescribing: Incentives and Data Exchange
The evolution of healthcare interoperability can also be seen in the creation and adoption of e-prescribing mandates, mentioned Matt Koehler, vice president of product innovation for Surescripts. “Look back 10 years ago, and not all states accepted e-prescriptions. We had to work with each state individually, and it’s something we’re going to have to do with national information exchange as well,” he said, adding that industry partnerships are critical for changing legislation.
In examining incentives, DeVault commented that “there used to be a notion that if we incentivized interoperability, and we created a standard--let’s call it Direct--all of a sudden, interoperability would happen. What people didn’t realize is you need an entire ecosystem for interoperability to happen.” But when the ecosystem begins to partner and make a marked impact through initiatives like Carequality or through the use of a National Record Locator Service (NRLS), the possibilities become great.
In speaking about NRLS, DeVault discussed the impact of data availability on patient safety and care quality. NRLS “enables providers to see, definitively, where a patient’s records are located. Wherever the data naturally lives, if we can expose that information, it’s a huge benefit,” he said.
Metrics for Interoperability
Metrics were also a point of discussion. Bob Barker, vice president of community connectivity solutions for NextGen, believes that “what’s been lacking over the years is metrics that show the value of interoperability to the provider and the patient base.” “If you’re going to share data, there should be volumes of metrics that show the benefits to the patient’s health,” he said, adding that the environment will—and must—continue to shift toward patients demanding this information or choosing to seek care elsewhere.
In terms of measures of interoperability success, Halamka agreed that we need to look forward and carefully consider these metrics. “Interoperability has to be measured by the utility of the results,” he said—by caregivers’ ability to deliver the safest, highest-quality care possible based on the data that was required and present. What can’t it be? “Here’s a sack of garbage. I delivered it. Check!”
Standards and their interpretation, patient consent and preferred consent models, and network security and patient data privacy were just some of the panel’s additional topics of discussion. Despite the challenges and barriers, Koehler reminded the group, “it’s important that we don’t stop innovating.” Halamka agreed. “It’s like working on security – you’re never going to be done,” he said, noting the need to focus on moving forward rather than chasing perfection. “Having provider directories and having record locator services, these are at the core of what we need to move data,” believes Halamka, and we must continue to improve data liquidity if we want to shift interoperability from a work in progress to an industry-wide reality.
For a full replay of the HIMSS’16 panel discussion, please see below.