Archives
All posts for the month of March, 2017
Officials with the Department of Veterans Affairs repeatedly told lawmakers that the agency is moving on from its failed approach of building in-house IT systems, and is instead seeking commercial, off-the-shelf options to improve scheduling, EHR interoperability, and billing and claims processing.
In a hearing before the House Committee on Veterans’ Affairs on Tuesday, Rob Thomas, acting assistant secretary for information technology and CIO for the Office of Information and Technology at the VA, acknowledged previous failures in the agency’s attempt to modernize IT systems, but noted that it is no longer trying to build systems from the ground up.
“I’m confident we’re going to go commercial,” Thomas said, noting that the VA is considering commercial products and moving data to the cloud instead of spending money building customized in-house systems. However, he declined to offer a completion date.
RELATED: VA’s slow but steady push to modernize its technology
Lawmakers continued to criticize VA officials for using decades-old legacy systems, drawing comparisons to the overwhelming costs required to maintain an old car. As the Government Accountability Office (GAO) has previously pointed out, a large portion of the VA’s $4 billion IT budget goes to managing outdated systems that are more than 50 years old.
Since 2015, the GAO has included the VA’s IT systems and the lack of interoperability between the Department of Defense and the VA on its list of “high risk” areas, and an official at the hearing said they would remain on the GAO’s 2017 list. The GAO has been critical of the VA’s efforts to modernize its systems, highlighting an estimated $127 million in wasted funding used to rebuild the VA’s outpatient scheduling system.
In a report released during the hearing, the GAO again criticized the VA for ongoing failures tied to its IT modernization efforts and recommended the agency develop concrete goals and metrics moving forward.
“The problem with federal government is that they are so reluctant to buy commercial products and change antiquated business practices,” David Powner, director of IT management issues at the GAO, said during the hearing. He added that the VA could save “hundreds of millions of dollars” by improving data consolidation efforts.
“Buying instead of building is the way to go,” he added.
Thomas agreed that the VA needs to aggressively shrink its footprint and spend less on maintaining legacy systems. He added that he is hoping for a speedy confirmation of David Shulkin, President Donald Trump’s selection to lead the VA, so that the agency can roll out a timeline for purchasing commercial systems.
RELATED: Donald Trump picks David Shulkin to lead VA
Despite the agency’s history of modernization failures, Jennifer Lee, M.D., deputy under secretary for health for Policy and Services at the VA, emphasized the success of the agency’s enterprise Health Management Platform (eHMP), which allows physicians to access patient information from participating hospitals outside of the VA system.
Following failed attempts by the VA to modernize its VistA EHR system, the previous VA CIO, LaVerne Council, initiated a request for information to replace the system with a commercial product. Council previously testified that she cringes when she thinks about how old some of the VA systems are.
Thomas said he was chosen to replace Council to continue that transition toward an off-the-shelf product, noting that prior administrations did not have a coherent strategy for IT modernization or cybersecurity. But given the VA’s history of missing deadlines, Powner urged Congress to hold quarterly meetings and manage the process “with a heavy hand to make sure deadlines are met.”
The processing required to prepare unstructured data for analysis can be cumbersome and prone to error. That’s why companies should do more to organize their data before it is ever collected.
Unstructured data — data that is not organized in a predefined way, such as text — is now widely available. But structure must be added to the data to make it usable for analysis, which means significant processing. That processing can be a problem.
In a form of modern alchemy, analytics processes now transmute “base” unstructured data into “noble” business value. Systems everywhere greedily salt away every imaginable kind of data. Technologies such as Hadoop and NoSQL store this hoard easily in its native unstructured form. Natural language processing, feature extraction (distilling nonredundant measures from larger data), and speech recognition now routinely alchemize vast quantities of unstructured text, images, audio, and video, preparing it for analysis. These processes are nothing short of amazing, working against entropy to create order from disorder.
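As a concrete (and deliberately tiny) illustration of what “adding structure” means in practice, the sketch below turns a few free-text notes into a term-count table. The notes, tokenizer, and resulting vocabulary are invented for illustration, not drawn from any particular system.

```python
from collections import Counter
import re

# Toy "unstructured" inputs: free-text notes with no predefined schema.
raw_notes = [
    "Customer reported a billing error after the March invoice.",
    "Billing portal was down; customer could not view the invoice.",
    "Positive feedback on the new portal design.",
]

def extract_features(text):
    """Distill a simple bag-of-words feature vector from raw text."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(tokens)

# Impose structure after the fact: one row per note, one column per term.
vectors = [extract_features(note) for note in raw_notes]
vocabulary = sorted(set().union(*vectors))
table = [[vec.get(term, 0) for term in vocabulary] for vec in vectors]

print(vocabulary)   # the inferred "schema"
for row in table:   # structured rows ready for analysis
    print(row)
```

Every choice in even this trivial pipeline (how to tokenize, what to discard, what counts as a feature) is a place where information can be lost or distorted, which is exactly the processing cost described above.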
Unfortunately, while these processing steps are impressive, they are far from free or free from error. I can’t help but think that a better alternative in many cases would be to avoid the need for processing altogether.
We all know how each step in a process mangles information. In the telephone game, as each person whispers to the next player what they think was said to them, words can morph into an unexpected or misleading final message. In a supply chain, layers exacerbate distortion as small mistakes and uncertainty quickly compound.
By analogy, organizations are playing a giant game of telephone with data, and unstructured data makes the game far more difficult. In a context where data janitorial activities consume 50% to 80% of scarce data scientist resources, each round of data telephone costs organizations in accuracy, effort, and time — and few organizations have a surplus of any of these three.
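To see how quickly rounds of data telephone compound, here is an illustrative back-of-the-envelope calculation; the 95 percent per-step fidelity is an assumed number chosen only to show the shape of the decay, not a measured figure.

```python
# Assumed, illustrative figure: each processing handoff preserves 95% of the
# information faithfully, so fidelity decays geometrically with handoffs.
per_step_fidelity = 0.95

for handoffs in (1, 3, 5, 10):
    print(handoffs, round(per_step_fidelity ** handoffs, 3))
# -> 1 0.95, 3 0.857, 5 0.774, 10 0.599
```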
Within organizations, each processing step can be expensive to develop and maintain. But the growth in importance of data sharing between organizations magnifies these concerns. Our recently published report, “Analytics Drives Success with IoT,” associates business value with sharing data between organizations in the context of the internet of things. And, to foreshadow our report to be released in January, we observe similar results in the broader analytics context. But with every transfer of data, more processes need to be developed and maintained.
If this processing were unavoidable, then it would just be a cost of data sharing within or between organizations. A disconcerting point, however, is that there is (or could be) structure in the ancestry of much of the data that is currently unstructured. For example, for every organization that generates a web page based on data in a database, there are likely multiple organizations scraping that data (either sanctioned or unsanctioned) and then processing it to try to regain that structure. In the best case, that’s a lot of thrashing just to end up with data in its original form. In the worst case, it’s a lot of effort to put toward obtaining data with many errors.
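The round trip described above, in which structured records are rendered into a web page and then scraped to regain the structure they started with, can be sketched in a few lines of Python. The records, page format, and parser here are invented purely for illustration.

```python
from html.parser import HTMLParser

# Structured source data (what the publishing organization actually has).
records = [("ACME-001", 42.50), ("ACME-002", 17.25)]

# Step 1: the publisher renders the structured data as an HTML page.
page = "<table>" + "".join(
    f"<tr><td>{sku}</td><td>{price}</td></tr>" for sku, price in records
) + "</table>"

# Step 2: a consumer scrapes the page to regain the structure it started with.
class TableScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_cell = False
        elif tag == "tr":
            self.rows.append(tuple(self._row))

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data)

scraper = TableScraper()
scraper.feed(page)
recovered = [(sku, float(price)) for sku, price in scraper.rows]
print(recovered == records)  # True -- a lot of work to get back what already existed
```

Sharing the records in their original structured form (for example, as a file export or a documented API) would make the entire second half of that sketch unnecessary, which is precisely the point.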
By Mohana Ravindranath
January 6, 2017
The 2017 National Defense Authorization Act provides for the reorganization of the Pentagon’s buying office, and it could be a key part of outpacing adversaries’ tech development.
By February 2018, the undersecretary of defense for acquisition, technology and logistics, known as AT&L, will be split into two separate roles: one undersecretary of defense for research and engineering, and another for acquisition and sustainment.
“We’re not doing this for a business management reason,” Bill Greenwalt, a professional staff member to the Senate Armed Services Committee, said during a meeting hosted by the Professional Services Council on Friday. “There is a fear … that some of our potential adversaries are really boosting up their research and development functions [and] copying what we have.”
“We need to figure out a way to bring those technologies and those operational concepts into the Department of Defense,” he added.
AT&L needed to be split into smaller roles because it has “grown too big, tries to do too much and is too focused on compliance at the expense of innovation,” Senate Armed Services Committee Chairman Sen. John McCain, R-Ariz., said in a statement in November.
President Donald Trump and his new Pentagon leadership team come to power at a time of major change in the multibillion-dollar world of military acquisition. A measured approach from the administration, as well as from Congress, could be key to how well the Pentagon navigates this complex, ongoing system-wide reform.
For the Department of Defense (DoD), the bulk of acquisition regulations derive from procurement and acquisition laws enacted by Congress. From the Packard Commission and the Goldwater-Nichols Act in the 1980s, through the acquisition reform initiatives of the 1990s, to the Weapon Systems Acquisition Reform Act of 2009, new institutions, bureaucracies, and regulations were instituted.
In 2016, after the military service chiefs testified that they were not a part of the acquisition system, the Congress passed a National Defense Authorization Act (NDAA) that gave the service chiefs a limited but powerful voice in the process. While this was a significant change to the mandates of the Goldwater-Nichols Act, its implementation will take time and require cultural change in the Pentagon. This has slowly begun to evolve and should continue during the Trump administration.
The Congress has now passed the 2017 NDAA, which includes roughly 250 pages in Title VIII, Acquisition Policies, Management and Related Matters. The effect on the institutions charged with implementing these mandates is yet unknown but is expected to be significant in terms of both implementation as well as manpower.
In addition, the 2017 NDAA realigned the Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, creating an undersecretary for research and engineering and one for acquisition and sustainment. At the same time, it created the position of chief management officer to oversee DoD business systems, which, of course, affect the entire acquisition chain.
While many have complained that the current single position of undersecretary for acquisition, technology and logistics is too broad, it provided management continuity and a singular direction from the inception of an idea in basic research, through program development and production, and finally into support of the weapon system. Now, these elements will be split between at least two different undersecretaries, with different authorities, priorities, staffs and concepts. This could lead to bureaucratic conflicts between the two offices and their military department counterparts. The position of chief management officer could complicate things further.
All this occurs as a new administration moves into the Pentagon facing mandated reductions in personnel, including military flag and general officers and members of the Senior Executive Service, and finally a 20–25 percent cut in overall manpower. The nomination of a new deputy secretary, possibly someone from the business community, could be a significant moment for the process moving forward.
Fortunately, the roughly 50-page section 900 of the 2017 NDAA, related to the Office of the Secretary of Defense, gives the new secretary some breathing room by delaying the creation of the two undersecretary positions until February of 2018.
The acquisition system in the DoD is complex and ever-changing. It requires a trained and active workforce, one that has the support of the Congress. Allowing some of the reforms to take place in a measured fashion would probably be a wise choice for the administration and the Congress.
Irv Blickstein is a senior engineer at the nonprofit, nonpartisan RAND Corporation. He has held leadership positions in both the Department of the Navy and the Department of Defense.
This commentary originally appeared on RealClearDefense on February 21, 2017.
Article · February 22, 2017
As a researcher who studies electronic health records (EHRs), I’ve lost count of the times I’ve been asked “Why can’t the systems talk to each other?” or, in more technical terms, “Why don’t we have interoperability?” The substantial increase in electronic health record adoption across the nation has not led to health data that can easily follow a patient across care settings. Still today, essential pieces of information are often missing or cumbersome to access. Patients are frustrated, and clinicians can’t make informed decisions. When our banks talk to each other seamlessly and online ads show us things we’ve already been shopping for, it is hard to understand why hospitals and doctors’ offices still depend on their fax machines.
A big part of the reason is that interoperability of health information is hard. If it were easy, we would have it, or at least have more of it, by now. Though it’s a technological issue, it’s not just a technological issue. As we have seen in other industries, interoperability requires all parties to adopt certain governance and trust principles, and to create business agreements and highly detailed guides for implementing standards. The unique confidentiality issues surrounding health data also require the involvement of lawmakers and regulators. Tackling these issues requires multi-stakeholder coordinated action, and that action will only occur if strong incentives promote it.
Though billions in monetary incentives fueled EHR adoption itself, they only weakly targeted interoperability. I have come to believe that we would be substantially farther along if several key stakeholders had publicly acknowledged this reality and had made a few critical decisions differently. While it’s too late for “do-overs,” understanding initial missteps can guide us to a better path. Here is how those key stakeholders, intentionally or not, have slowed interoperability.
Policymakers
The 2009 Health Information Technology for Economic and Clinical Health (HITECH) legislation contained two basic components: a certification program to make sure that EHRs had certain common capabilities, and the “Meaningful Use” program, divided into three progressively more complex stages, that gave providers incentive payments for using EHRs. The legislation specified “health information exchange” (HIE) as one of the required capabilities of certified EHR systems. However, the Centers for Medicare and Medicaid Services (CMS) and the Office of the National Coordinator for Health IT (ONC) had substantial latitude to decide how to define health information exchange as well as how to include it in the Meaningful Use program and the accompanying EHR certification criteria.
CMS and ONC decided to defer the initial HIE criterion — electronically transmitting a summary-of-care record following a patient transition — to Stage 2 of the Meaningful Use program. While many of the capabilities required for providers to meet this criterion would have been challenging to develop on the Stage 1 Meaningful Use timeline, deferring HIE to Stage 2 allowed EHR systems to be designed and adopted in ways that did not take HIE into account, and there were no market forces to fill the void. When providers and vendors started working toward Stage 2, which included the HIE criterion, they had to change established systems and workflows, creating a heavier lift than if HIE had been included from the start.
While the stimulus program needed to get money out the door quickly, it might have been worth prioritizing HIE over other capabilities in Stage 1 or even delaying Stage 1 Meaningful Use to include HIE from the start. At a minimum, this strategy would have revealed interoperability challenges earlier and given us more time to address them.
A larger problem is that HITECH’s overall design pushed interoperability in a fairly limited way, rather than creating market demand for robust interoperability. If interoperability were a “stay-in-business” issue for either vendors or their customers, we would already have it, but overall, the opposite is true. Vendors can keep clients they might otherwise lose if they make it difficult to move data to another vendor’s system. Providers can keep patients they might otherwise lose if they make it cumbersome and expensive to transfer a medical record to a new provider.
As a result, the weak regulatory incentives pushing interoperability (in the form of a single, fairly limited HIE Meaningful Use criterion), even in combination with additional federal and state policy efforts supporting HIE progress, could not offset market incentives slowing it. Without strong incentives that would have created market demand for robust interoperability from the start, we now must retrofit interoperability, rather than having it be a core attribute of our health IT ecosystem. And, if there had been stronger incentives from the start, we would not now need to address information blocking: the knowing and intentional interference with interoperability by vendors or providers.
Another criticism leveled at policymakers is that they should have selected or certified only a small number of EHR systems, because narrowing the field of certified systems would have at least limited the scope of interoperability problems. A few even advocated that Congress mandate a single system such as the VA’s VistA. While such a mandate would have helped solve the interoperability issue, it would have violated the traditional U.S. commitment to market-based approaches. Moreover, the United Kingdom failed very visibly with a heavily centralized approach, and in a health care system the size of the United States, an attempt to legislate IT choices in a similar manner could backfire catastrophically.
EHR Vendors
Most observers assign EHR vendors the majority of blame for the lack of interoperability, but I believe this share is overstated. As noted above, by avoiding or simply not prioritizing interoperability, they are acting exactly in line with their incentives and maximizing profit.
Normally, the United States glorifies companies that behave this way. When neither policymakers nor providers were demanding interoperability, vendors risked harming their bottom lines by prioritizing it, and they cannot be blamed for acting in their economic best interest in wholly legal ways.
Nevertheless, senior leaders at EHR vendors should have been more willing to come forward and explain why their economic best interest was at odds with existing regulations. Instead, they often claim to have robust interoperability solutions when they do not, and similarly claim that interoperability is a top priority when it is not. This gap between rhetoric and reality makes it harder for providers and policymakers to demand greater interoperability.
Providers
As noted above, providers may not have a strong business case to prioritize interoperability. However, providers have professional norms and mission statements that should motivate them to pursue interoperability (or at least not actively interfere with it) to benefit their patients. As a result of these conflicting motivations, some provider organizations let competitive pressures drive interoperability decisions (including not demanding robust interoperability from their vendors), while others have chosen to pursue interoperability because it is best for their patients, even if this decision incurs competitive disadvantage. More providers may tip toward the former simply because today’s interoperability solutions are complex and costly. It is hard to justify investing in a complicated, expensive capability that also poses a strategic risk — a double whammy.
The emergence and rapid growth of Epic’s Care Everywhere platform (which connects providers using Epic) suggests that even in highly competitive markets, providers may easily tip the other way when the cost and complexity of interoperability are reduced. Therefore, any efforts that successfully reduce cost and complexity are highly valuable, though not a substitute for stronger incentives for providers (and vendors) to engage in interoperability.
As with vendors, we cannot fault providers for behaving in ways that are aligned with their incentives, but we can argue that their patient care mission requires, at a minimum, more public disclosure about the business rationales behind their interoperability decisions.
The point of the blame game is not to punish the players. It is to understand the dynamics at play and plot a path forward. Of the stakeholders, only policymakers have a clear, strong interest in promoting interoperability. Therefore, it is up to them to ensure that robust, cross-vendor interoperability is a stay-in-business issue for EHR vendors and providers. Once the business case for interoperability unambiguously outweighs the business case against it, both vendors and providers can pursue it without undermining their best interests.
March 16, 2017
@chrisnerney
Attaining full interoperability of electronic healthcare systems is a challenge that transcends technology, argues Dr. Julia Adler-Milstein, associate professor at the University of Michigan’s School of Information and School of Public Health.
Writing in NEJM Catalyst, an online publication of the New England Journal of Medicine, Adler-Milstein acknowledges that “interoperability of health information is hard. If it were easy, we would have it, or at least have more of it, by now.”
Compounding the inevitable technology issues, she says, is the need to unite stakeholders and create an organizational framework that supports interoperability goals.
“As we have seen in other industries, interoperability requires all parties to adopt certain governance and trust principles, and to create business agreements and highly detailed guides for implementing standards,” Adler-Milstein writes. “The unique confidentiality issues surrounding health data also require the involvement of lawmakers and regulators.”
Though her article is titled “Moving Past the EHR Interoperability Blame Game,” Adler-Milstein cites in some detail how policymakers, EHR vendors, and providers each have impeded progress toward widespread healthcare interoperability.
The root of the problem is that policymakers failed to create regulatory incentives to promote interoperability that were strong enough to offset market incentives, such as vendors wanting to lock in customers by engaging in information blocking. EHR vendors thus lacked any incentives to prioritize interoperability, according to Adler-Milstein.
And while many providers have opted to pursue interoperability because they believe it will help them deliver better care to patients, Adler-Milstein writes that “some provider organizations let competitive pressures drive interoperability decisions (including not demanding robust interoperability from their vendors).”
“It is hard to justify investing in a complicated, expensive capability that also poses a strategic risk — a double whammy,” she says.
Adler-Milstein concludes that the onus is on policymakers to move the interoperability needle.
“Only policymakers have a clear, strong interest in promoting interoperability,” she says. “Therefore, it is up to them to ensure that robust, cross-vendor interoperability is a stay-in-business issue for EHR vendors and providers.”
You can read the entire article at the link below.
Julia Adler-Milstein, PhD
Associate Professor, School of Information, School of Public Health, University of Michigan
Article link: http://catalyst.nejm.org/ehr-interoperability-blame-game/