healthcarereimagined

Envisioning healthcare for the 21st century


Epic Systems, Leading Defense EHR Bidder, Slammed for Lack of Interoperability – Nextgov

Posted by timmreardon on 10/06/2014
Posted in: Blue Button, Data Science, DoD, Global Standards, Health Care Costs, Health Care Economics, Health IT adoption, Health Outcomes, Healthcare Delivery, Healthcare Informatics, Healthcare Security, Innovation, Integrated Electronic Health Records, Military Health System Reform, Mobile Healthcare, National Health IT System, Open Data, Patient Centered Medical Home, Patient Portals, PCMH, Quadruple Aim, Quality Measures, U.S. Air Force Medicine, U.S. Army Medicine, U.S. Navy Medicine, Veterans Affairs. Leave a comment

By Bob Brewin

October 3, 2014
Epic Systems, considered the front-runner for the Defense Department’s $11 billion electronic health record contract, has come under sustained criticism for lack of interoperability with other EHRs, including most recently a front-page story in The New York Times last Sunday.

The Times story reported the privately held Epic, which partnered with IBM for the defense EHR contract, “and its enigmatic founder, Judith R. Faulkner, are being denounced by those who say its empire has been built with towering walls, deliberately built not to share patient information with competing systems.”

Interoperability between Epic and other EHRs is possible, but only after hospitals pay high fees, the Times reported.

Modern Healthcare, in a recent article on Epic, said, “While interface fees are common across the EHR industry, some observers say Epic’s leading role in the EHR market means it has a disproportionate negative effect on interoperability.”

This March, in a report on a variety of medical technologies, the Rand Corporation described the Epic EHR as a “closed platform,” which “can make it challenging and costly for hospitals to interface their EHR with the clinical or billing software of other companies.”

Large Hospital Systems Embrace Epic

Large hospital systems, such as Kaiser Permanente, the second largest health care system in the country with 8.6 million patients – compared to 9.6 million in the military health system – have embraced Epic.

Rand said Epic has “established itself as the enterprisewide solution of choice for large private health care systems and academic medical centers, irrespective of ongoing concerns about its limited interoperability and less-than-ideal usability.”

As the biggest player in the market, Epic has derived huge benefits from the $24 billion in incentive payments paid out by the Centers for Medicare and Medicaid Services to clinicians and hospitals that adopt EHRs, Rand said.

At a House Energy and Commerce Committee hearing July 17, Rep. Phil Gingrey, R-Ga., who is also a doctor, noted that the incentive payments were made to encourage interoperability.

He asked: “Is the government getting its money’s worth?… It may be time for the committee to take a closer look at the practices of vendor companies in this space, given the possibility that fraud may be perpetrated on the American taxpayer.”

Modern Healthcare reported Epic officials blame negative perceptions about the company and its software on misinformation spread by rivals.

Interoperability Key in Pentagon’s Planned EHR System

Interoperability is a key requirement for the new defense EHR system, which will replace the existing Armed Forces Health Longitudinal Technology Application outpatient EHR and the inpatient Essentris system. The new EHR needs to exchange data with Department of Veterans Affairs systems as well as with civilian clinicians and hospitals covered by the TRICARE insurance plan.

Epic did not respond to a query from Nextgov on the interoperability of its system.

Andrew Maner, the leader of IBM’s federal services division, said in an email: “Epic is the most open solution by any measure. They have the highest-rated electronic health record suite with the most large-scale successful implementations in the U.S. and are securely transmitting billions of pieces of patient data through interfaces and open APIs.”

He said Epic’s health record exchange platform “also leads the nation, transmitting 5 million patient records a month between 295 Epic organizations and over 7,500 non-Epic organizations.”

(Image via kozirsky/Shutterstock.com)

http://www.nextgov.com/health/health-it/2014/10/epic-systems-leading-defense-ehr-bidder-slammed-lack-interoperability/95792/

VA Awards Contract for Independent Assessment of Health Care to Non-Profit Firm – VA Press Release

Posted by timmreardon on 10/04/2014
Posted in: Health Care Costs, Health Care Economics, Health IT adoption, Health Outcomes, Healthcare Delivery, Healthcare Informatics, Healthcare Security, Innovation, Integrated Electronic Health Records, National Health IT System, Quadruple Aim, Quality Measures, Veterans Affairs. Leave a comment

VA press release

FOR IMMEDIATE RELEASE
October 1, 2014

VA Awards Contract for Independent Assessment of Health Care to Non-Profit Firm

Choice Act Requires Third Party Assessment of Processes; Firm Will Serve as Program Integrator

Washington – The Department of Veterans Affairs (VA) today announced that the MITRE Corporation, a not-for-profit company that operates multiple federally funded research and development centers, has been awarded a contract to support the Independent Assessment of VA health care processes, as required by the Veterans Access, Choice and Accountability Act of 2014 (“Choice Act”). MITRE Corporation will serve as program integrator.

Section 201 of the Choice Act directs VA to enter into one or more independent, third-party contracts for an assessment of the hospital care, medical services and other health care processes in VA medical facilities. The program integrator will be responsible for coordinating the outcomes of the assessments conducted by the third-party entities according to the scope of the contracts. The program integrator is required to report the independent assessment results to Congress within 60 days of the assessment’s conclusion.

“This independent assessment is a key element in our effort to rebuild trust with Veterans and our other stakeholders,” said Secretary of Veterans Affairs Robert A. McDonald. “It will provide the Department a way to transparently review our vital programs, organizations, and business practices to make us a better and more accountable VA for the Veterans we serve.”

Working with Congress, Veterans Service Organizations, and other stakeholders, VA has taken steps to implement Choice Act legislation, including:
•Establishing a Program Management office to oversee planning and implementation of the legislation across the Department.
•Putting in place the mechanisms, under the authorization provided, to carry out major medical facility leases.
•Working through the contracting process to extend the Project ARCH pilot program, ensuring the continued expanded access it provides to Veterans in rural areas.
•Holding an Industry Day to seek input on how best to provide administrative support, including issuing Veteran Choice Cards.

Laying the groundwork for data-driven science – National Science Foundation

Posted by timmreardon on 10/04/2014
Posted in: Big Data, Data Science, Genetic Data, Global Standards, Health Care Economics, Health IT adoption, Health Outcomes, Healthcare Delivery, Healthcare Informatics, Healthcare Security, Innovation, Integrated Electronic Health Records, NSF. Leave a comment

October 1, 2014

The ability to collect and analyze massive amounts of data is rapidly transforming science, industry and everyday life, but what we have seen so far is likely just the tip of the iceberg. Many of the benefits of “Big Data” have yet to surface because of a lack of interoperability, missing tools and hardware that is still evolving to meet the diverse needs of scientific communities.


This image and related research data–one of numerous projects being shared and stored using SeedMe–shows a simple model of a geodynamo used for benchmark codes. The view is from the center towards one of the poles, and the cones show convective flow towards higher temperature (light green to dark green with increasing velocity) in a spiraling form caused by rotation. The shells of various colors depict temperature, increasing from the outer boundary towards the interior.

Credit: Amit Chourasia, UC San Diego; Ashley Willis, University of Sheffield; Maggie Avery, UC San Diego; Chris Davies, UC San Diego/University of Leeds; Catherine Constable, UC San Diego; David Gubbins, University of Leeds.

 

One of the National Science Foundation’s (NSF) priority goals is to improve the nation’s capacity in data science by investing in the development of infrastructure, building multi-institutional partnerships to increase the number of U.S. data scientists and augmenting the usefulness and ease of using data.

As part of that effort, NSF today announced $31 million in new funding to support 17 innovative projects under the Data Infrastructure Building Blocks (DIBBs) program. Now in its second year, the 2014 DIBBs awards support research in 22 states and touch on research topics in computer science, information technology and nearly every field of science supported by NSF.

“Developed through extensive community input and vetting, NSF has an ambitious vision and strategy for advancing scientific discovery through data,” said Irene Qualters, division director for Advanced Cyberinfrastructure at NSF. “This vision requires a collaborative national data infrastructure that is aligned to research priorities and that is efficient, highly interoperable and anticipates emerging data policies.”

This year’s data cyberinfrastructure awards build capacity and capability across the nation and across research communities and complement previous awards.

“Each project tests a critical component in a future data ecosystem in conjunction with a research community of users,” Qualters said. “This assures that solutions will be applied and use-inspired.”

NSF sees these building blocks as digital components that can be joined together to develop the foundations for a robust data infrastructure. The building blocks encompass hardware, software and networking tools, as well as the communities and people who manage data and who are the practitioners of data science.

Of the 17 awards, two support early implementations of research projects that are more mature; the others support pilot demonstrations. Each is a partnership between researchers in computer science and other science domains.

One of the two early implementation grants will support a research team led by Geoffrey Fox, a professor of computer science and informatics at Indiana University. Fox’s team plans to create middleware and analytics libraries to allow data science to work at large scale on high-performance computing systems (also known as supercomputers).

Fox and his interdisciplinary team plan to test their platform with several different applications, including those used in geospatial information systems (GIS), biomedicine, epidemiology and remote sensing.

“Our innovative architecture integrates key features of open source cloud computing software with supercomputing technology,” Fox said. “And our outreach involves ‘data analytics as a service’ with training and curricula set up in a Massive Open Online Course or MOOC.”

Other institutions collaborating on the project include: Arizona State University, Emory University, Rutgers University, University of Kansas, University of Utah and Virginia Tech.

The other early implementation project is led by Ken Koedinger, professor of human computer interaction and psychology at Carnegie Mellon University. Whereas Fox’s team focuses on problems in sensing and the life sciences, Koedinger’s team concentrates on developing infrastructure that will drive innovation in education.

The team will develop a distributed data infrastructure called LearnSphere that will make more educational data accessible to course developers, while also motivating more researchers and companies to share their data with the greater learning sciences community. LearnSphere will include a graphical user interface, a library of analytical methods and a wide variety of educational data gathered from such sources as interactive tutoring systems, educational games and MOOCs.

“We’ve seen the power that data has to improve performance in many fields, from medicine to movie recommendations,” Koedinger said. “Educational data holds the same potential to guide the development of courses that enhance learning while also generating even more data to give us a deeper understanding of the learning process.”

Other institutions collaborating on this project include: MIT, Stanford University and the University of Memphis.

The DIBBs program awarded each early implementation project $5 million over 5 years.

The second group of awards supports pilot demonstrations that build upon the advanced cyberinfrastructure capabilities of existing research communities to address specific challenges in science and engineering research and extend those data capabilities to meet broad community needs. The awards provide $1.5 million over 3 years.

Among the projects supported by DIBBs awards are efforts to develop cyberinfrastructure to visualize geo-chronological data, like uranium dating of corals (College of Charleston); data capture and curation for materials science research (University of Illinois Urbana-Champaign); and efforts to manage data emerging from the Laser Interferometer Gravitational-wave Observatory or LIGO (Syracuse University).

The DIBBs program is part of a coordinated strategy within NSF to advance data-driven cyberinfrastructure. It complements other major efforts including the DataOne project, the Research Data Alliance and Wrangler, a groundbreaking data analysis and management system for the national open science community.

2014 NSF DIBBs Awards

Geoffrey Fox, Indiana University: Middleware and High Performance Analytics Libraries for Scalable Data Science

Ken Koedinger, Carnegie Mellon University: Building a Scalable Infrastructure for Data-Driven Discovery and Innovation in Education

Victor Pankratius, MIT: An Infrastructure for Computer Aided Discovery in Geoscience

Klara Nahrstedt, University of Illinois at Urbana-Champaign: Timely and Trusted Curator and Coordinator Data Building Blocks

Jerome Reiter, Duke University: An Integrated System for Public/Private Access to Large-scale, Confidential Social Science Data

Hsinchun Chen, University of Arizona: DIBBs for Intelligence and Security Informatics Research and Community

Santiago Pujol, Purdue University: Building a Modular Cyber-Platform for Systematic Collection, Curation, and Preservation of Large Engineering and Science Data–A Pilot Demonstration Project

James Bowring, College of Charleston: Collaborative Research: Cyberinfrastructure for Interpreting and Archiving U-series Geochronologic Data

Stephen Ficklin, Washington State University: Tripal Gateway, a platform for next-generation data analysis and sharing

Feifei Li, University of Utah: STORM: Spatio-Temporal Online Reasoning and Management of Large Data

Duncan Brown, Syracuse University: Domain-aware management of heterogeneous workflows: Active data management for gravitational-wave science workflows

Rafal Angryk, Georgia State University Research Foundation, Inc.: Systematic Data-Driven Analysis and Tools for Spatiotemporal Solar Astronomy Data

Jia Zhang, Carnegie Mellon University: An Infrastructure Supporting Collaborative Data Analytics Workflow Design and Management

Giridhar Manepalli, Corporation for National Research Initiatives (CNRI): User Driven Architecture for Data Discovery

Shaowen Wang, University of Illinois at Urbana-Champaign: Scalable Capabilities for Spatial Data Synthesis

Amit Chourasia, University of California, San Diego: Ubiquitous Access to Transient Data and Preliminary Results via the SeedMe Platform

Christopher Jenkins, University of Colorado at Boulder: Porting Practical NLP and ML Semantics from Biomedicine to the Earth, Ice and Life Sciences

-NSF-

Media Contacts
Aaron Dubrow, NSF, (703) 292-4489, adubrow@nsf.gov

Program Contacts
Irene Qualters, NSF, (703) 292-2339, iqualter@nsf.gov

Related Websites
Data Infrastructure Building Blocks: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=504776

The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2014, its budget is $7.2 billion. NSF funds reach all 50 states through grants to nearly 2,000 colleges, universities and other institutions. Each year, NSF receives about 50,000 competitive requests for funding, and makes about 11,500 new funding awards. NSF also awards about $593 million in professional and service contracts yearly.

Article link: http://www.nsf.gov/news/news_summ.jsp?org=NSF&cntn_id=132880&preview=false


Useful NSF Web Sites:
NSF Home Page: http://www.nsf.gov
NSF News: http://www.nsf.gov/news/
For the News Media: http://www.nsf.gov/news/newsroom.jsp
Science and Engineering Statistics: http://www.nsf.gov/statistics/
Awards Searches: http://www.nsf.gov/awardsearch/

Concerns persist over Phila. VA benefits office – Delaware Online

Posted by timmreardon on 10/04/2014
Posted in: Health Care Costs, Health Care Economics, Health IT adoption, Health Outcomes, Healthcare Delivery, Healthcare Informatics, Healthcare Security, Innovation, Integrated Electronic Health Records, National Health IT System, Quality Measures, Rehab Medicine, Uncategorized, Veterans Affairs. Leave a comment

William H. McMichael, The News Journal 8:50 p.m. EDT October 3, 2014

PEMBERTON, NEW JERSEY – The problems continue at the Philadelphia Veterans Affairs benefits office despite scathing congressional testimony more than two months ago about mail and records manipulation, the woman who blew the whistle on the problems told members of the House VA committee Friday.

I “regret to tell you that things have not changed, and that accountability is greatly lacking for the management officials involved,” said Kristin Ruell, a quality services representative in the Pension Management Center at the Philadelphia Regional Office, which manages the Wilmington benefits office near Elsmere. “The practices of data manipulation have continued at the Philadelphia RO.”

“We do understand the … seriousness of the concerns about the operation in Philadelphia that have been raised,” responded Diana Rubens, regional office director since Aug. 26. “And I want to assure you, we share those concerns, and we’re quickly taking action to address those issues.”

The Veterans Benefits Administration, separate from VA’s medical side, processes disability compensation and pensions claims and provides other services.

The Wilmington office, dwarfed in size by the Philadelphia office it falls under, wasn’t mentioned during the hearing. Wilmington, however, has sent disability compensation claims to Philadelphia since at least fiscal year 2011 – the same year in which “boxes” of claims sent to Philadelphia were found unprocessed and piled up, said Ruell, who first described the problem on Capitol Hill on July 14.

That means some of the Wilmington cases – at least 10 disability compensation claims from fiscal years 2011 and 2012 and about 300 pending appeals in fiscal 2013 – could be in that same sort of limbo, Ruell said.

“Any case that comes in our building, I notice the same issues, regardless of where it’s from,” Ruell said in an interview following the field hearing, held at the campus of Burlington County College in Pemberton, New Jersey, an area rich with vets served by the Philadelphia office.

“I see problems across the board,” said Ruell, who began working for VA in August 2007. “These issues happen because employees are rushed, and they’re forced to meet a production standard at the end of the day. So sometimes, it’s not about going the extra mile for the veteran, because they won’t have a job if they fail their standards.”

The claims traveled in the opposite direction as well; 512 cases were “brokered” from Philadelphia to Wilmington in fiscal year 2013. The transfers to and fro, which became a major issue in 2013 when VA started getting roundly criticized for its large claims backlog, did not go unnoticed by disability compensation claims workers.

“It seems like it was kind of like a shell game, where they’re just shifting these cases from Philly to Delaware – and then saying, look, we’re making progress,” said Christian DeJohn, a claims handler in Philadelphia’s Veterans Service Center. “We think that a lot of people in the Philly office are aware that was going on. Of course, we were very disappointed.”

“It’s shuffling,” said Ryan Cease, like DeJohn an Army vet and a veterans service representative in the service center’s appeals department who, along with DeJohn, has cooperated with congressional investigators. “It’s basically shuffling.”

The hearing, before Reps. Jon Runyan, R-New Jersey, and Dina Titus, D-Nevada, was called to hear “additional concerns” beyond those raised last summer, when Ruell told the full committee she had been made aware of improper shredding of military mail, data manipulation and beneficiaries receiving improper benefits payments – and that she had been subjected to four years of retaliatory harassment as a result.

The data manipulation issue stemmed from a directive, since rescinded, that, when misapplied, allowed staffers to give unadjudicated claims a more current date – a “discovered date.”

“A memo was used to minimize the average dates pending of the claim to make the regional office’s number look better,” Ruell told the committee in July.

VA’s inspector general substantiated those concerns during an unannounced visit in June. Its investigation continues, said Linda Halliday, the IG’s assistant inspector general for audits and evaluations.

Runyan expressed particular concern over the IG’s identification of several instances of duplicative pension payments – the result of duplicate records in the center’s electronic system. “If neither workload management nor fiscal stewardship are priorities, what do you see as the priority there?” he asked Halliday.

“I believe what is driving this is to meet production metrics at the expense of making the right decisions and processing the veteran’s claim according to how it should be processed,” Halliday replied.

In other words, as DeJohn and Cease said and as Ruell indicated, the driver is the production goals processors are expected to meet.

“The point system is a real problem,” DeJohn said. “The VA point system.”

“The point system basically evaluates your productivity,” Cease said. “It also covers your accuracy. So for a person to say you have a productive day based on how many points you did per day, a lot of people would cherry-pick and say, well, I’m going to pick the easy work, put aside the hard work, and just gain points.”

An “easy” claim, he said, would be one with fewer individual medical conditions.

DeJohn said he was fired in 2012 for “alleged low numbers,” winning his job back after 1½ years.

The system, the two claims workers said, remains in place – as do the repercussions felt by those speaking out. This, in spite of new VA Secretary Robert McDonald’s promise to protect them.

DeJohn said he’s received death threats. Ruell has felt more subtle retaliation.

“They’re very creative in the things that they do to employees,” she said after the hearing. “They make it look like it’s a legitimate, legal thing, but … I never feel like I’m wanted in that building. I’ve never felt appreciated for anything I’ve brought forward. I basically show up because people rely on me to do the right thing and help report things.

“That’s why I come back,” she said. “I would never choose this job again, if it wasn’t for helping veterans.”

IG spokeswoman Cathy Gromek said to look for the IG’s final report on the Philadelphia regional office in late November or early December.

Contact William H. McMichael at (302) 324-2812 or bmcmichael@delawareonline.com. On Twitter: @billmcmichael

Military Health Systems Review – Final Report

Posted by timmreardon on 10/03/2014
Posted in: DoD, Health Care Costs, Health Care Economics, Health IT adoption, Health Outcomes, Healthcare Delivery, Healthcare Informatics, Healthcare Security, Innovation, Integrated Electronic Health Records, Military Health System Reform, Mobile Healthcare, National Health IT System, Open Data, Patient Centered Medical Home, Patient Portals, PCMH, Quadruple Aim, Quality Measures, Rehab Medicine, U.S. Air Force Medicine, U.S. Army Medicine, U.S. Navy Medicine. Leave a comment

MHS Review - Main

Military Health System Review – Final Report Appendices

Posted by timmreardon on 10/03/2014
Posted in: DoD, Emergency Medicine, Health Care Costs, Health Care Economics, Health IT adoption, Health Outcomes, Healthcare Delivery, Healthcare Informatics, Healthcare Security, Innovation, Integrated Electronic Health Records, Military Health System Reform, National Health IT System, Patient Centered Medical Home, PCMH, Quadruple Aim, Quality Measures, U.S. Air Force Medicine, U.S. Army Medicine, U.S. Navy Medicine. Leave a comment

MHS Review - Appdx

Hagel Orders Improvements in Military Health Care – DoD News

Posted by timmreardon on 10/03/2014
Posted in: DoD, Health Care Costs, Health Care Economics, Health IT adoption, Health Outcomes, Healthcare Delivery, Healthcare Informatics, Healthcare Security, Innovation, Integrated Electronic Health Records, Military Health System Reform, Mobile Healthcare, National Health IT System, Patient Centered Medical Home, Patient Portals, PCMH, Quadruple Aim, Quality Measures, U.S. Air Force Medicine, U.S. Army Medicine, U.S. Navy Medicine, Warrior Transition Units. Leave a comment

By Cheryl Pellerin
DoD News, Defense Media Activity

WASHINGTON, Oct. 1, 2014 – Defense Secretary Chuck Hagel has ordered improvements in the Military Health System, saying a 90-day review of the system that found it comparable in access, quality and safety to care offered on average in the private sector is not good enough for service personnel and their families.

Defense Secretary Chuck Hagel and Deputy Defense Secretary Bob Work, right, and Dr. Laura Junor discuss the Military Health System during a briefing for reporters at the Pentagon, Oct. 1, 2014. DoD photo by Glenn Fawcett

“We have the finest military in the world,” Hagel said during a briefing today on results of the review. “Our men and women in uniform and their families deserve the finest health care in the world.”

In May, the defense secretary ordered a comprehensive review of the Military Health System, or MHS, to be led by Deputy Defense Secretary Bob Work.

It sought to assess whether access to medical care in the MHS met defined standards, whether the quality of health care in the MHS met or exceeded defined benchmarks, and whether the MHS had created a culture of safety with effective processes for ensuring safe and reliable care of beneficiaries.

Pockets of excellence

“The review found pockets of excellence, significant excellence which we’re very proud of,” Hagel said, “and extraordinary doctors, nurses and staff who are deeply dedicated to the patients they serve.”

But he said, “It also found gaps, however, and facilities that must improve.”

The bottom line, the secretary said, “is that the military health care system provides health care that is comparable in access, quality and safety to average private-sector health care. But we cannot accept average when it comes to caring for our men and women in uniform and their families. We can do better; we all agree that we can do better.”

Hagel said he’s directing the department to take steps to ensure that the entire military health care system is not just an average system but a leading one.

First steps

“These are first steps but they will help our hospitals and clinics foster a stronger culture of safety, quality and accountability,” he added, “a culture that must become second nature to all who execute DoD’s critical health care system and our mission.”

Hagel has also directed all health care facilities identified as outliers in categories of access, quality and safety to provide action plans for improvement within 45 days.

The secretary has also directed Dr. Jonathan Woodson, assistant secretary of defense for health affairs, and the military services’ surgeons general to ensure that the department has unified standards for purchased and direct care.

Hagel also ordered them to establish a mechanism by which patients and concerned stakeholders can provide ongoing input.

System-wide performance management

“I’m also directing the department’s health care leadership to establish a system-wide performance management system that will help scrutinize lapses and monitor progress,” the secretary added. “And to enhance transparency I’m requiring that all … data on our health care system be made publicly available.”

By the end of the year, Hagel said, DoD will have a detailed implementation plan to ensure that MHS becomes the top-performing system those in the department expect it to be and want it to be.

Work said the Defense Department has no higher priority than its men and women.

Sacred compact

“They are the true secret weapon that the United States has … and they deserve the finest health care that we can possibly provide. It’s a critical part of the sacred compact that we have made … and when the secretary asked me to do this I was actually quite excited.”

Work said he was born into a Marine family and experienced the MHS in the continental United States and overseas: as a Marine, through NROTC, and later as a Marine with a family, with a wife who is a former Army nurse and a daughter.

“I feel that I have a lot of firsthand experience on what this system provides. I know it pretty well and I share the secretary’s commitment on getting it right,” he said.

Work said the department was happy to hear that its health care system is comparable on average with the national civilian health care system.

A leading organization

“But as the secretary said, he does not expect us to be average. He wants us to be a leading organization and he has tasked us to do so,” Work added. After meeting with veterans’ service organizations and other interested groups, he said, the department now has a good idea of the areas where improvements are needed.

“This will be the start of a process in which we all commit ourselves to becoming a leading organization.”

(Follow Cheryl Pellerin on Twitter @PellerinDoDNews)

Article link: http://www.defense.gov/news/newsarticle.aspx?id=123314

Cloud Computing Accounts for Just 1% of HHS’ IT Budget, GAO Finds – iHealthBeat

Posted by timmreardon on 09/27/2014
Posted in: GAO Reports, Uncategorized. Leave a comment

Friday, September 26, 2014

Although cloud computing could result in significant cost savings for HHS, the department has allocated just 1% of its overall budget to such services, according to a new Government Accountability Office report, Health Data Management reports.

Report Details

For the report, GAO examined the cloud computing services of seven federal agencies, including HHS. The report compared the status of such services in fiscal year 2014 budgets with its previous report in 2012, about three years after the Office of Management and Budget issued the federal Cloud First policy, which requires agencies and departments to implement cloud-based services whenever there is a cost-effective, reliable and secure opportunity to do so.

Report Findings

The GAO report found that HHS:
•Increased its number of cloud computing services from three in July 2012 to 36 in July 2014 (Slabodkin, Health Data Management, 9/26); and
•Increased its spending on cloud computing services from $26 million in FY 2012 to $64 million in FY 2014, which corresponded to an increase from 0% of its IT budget to 1% of its IT budget.

However, the report found that only four of HHS’ 36 current cloud computing services reduced costs, for a total savings of $1,190,000.
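To put the GAO figures in perspective, the reported savings can be compared with HHS' cloud spending. The short sketch below is purely illustrative; the dollar amounts come from the report as summarized above, and the variable names are my own.

```python
# Illustrative arithmetic only; figures from the GAO report summarized above.
hhs_cloud_spend_fy2012 = 26_000_000   # HHS cloud spending, FY 2012
hhs_cloud_spend_fy2014 = 64_000_000   # HHS cloud spending, FY 2014
reported_savings = 1_190_000          # savings from 4 of HHS' 36 cloud services

growth = hhs_cloud_spend_fy2014 - hhs_cloud_spend_fy2012
savings_share = reported_savings / hhs_cloud_spend_fy2014

print(f"Spending growth FY12->FY14: ${growth:,}")
print(f"Savings as a share of FY14 cloud spend: {savings_share:.1%}")
```

In other words, cloud spending grew by $38 million over two years while documented savings amounted to under 2% of FY 2014 cloud spending, which is the gap the report highlights.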

GAO cited two major reasons why cloud computing services did not result in savings:
•The change to such services was intended to improve service, and not to reduce costs; and
•In some cases, such services created an additional service or improved quality of service, which created added costs that offset any savings.

In the latest report, GAO found that departments and agencies cited several challenges to implementing cloud computing services, including:
•Funding for implementation;
•Having appropriate expertise for acquisition processes;
•Meeting new network infrastructure requirements;
•Meeting federal security requirements; and
•Overcoming cultural barriers.

Recommendations for HHS

GAO recommended that HHS:
•Assess all IT investments for potential migration to cloud computing services; and
•Establish timelines for IT investments that have not been evaluated for migration to cloud computing services.
HHS said it agreed with the recommendations (GAO report, September 2014).

Article link: http://www.ihealthbeat.org/articles/2014/9/26/cloud-computing-accounts-for-just-1-of-hhs-it-budget-gao-finds

Report link: http://www.gao.gov/assets/670/666133.pdf

The Implications of Overuse Medical Care – AHRQ Interview with Rosemary Gibson

Posted by timmreardon on 09/27/2014
Posted in: Health Care Costs, Health Care Economics, Health Outcomes, Healthcare Delivery, Healthcare Security, Quadruple Aim, Quality Measures. Leave a comment

Editor’s note: Rosemary Gibson, MSc, is Senior Advisor to The Hastings Center and an editor for JAMA Internal Medicine. She led national initiatives to improve health care quality and safety at the Robert Wood Johnson Foundation for 16 years. She is the principal author of The Treatment Trap, which explores reasons behind overuse of unnecessary medical treatment. We spoke with her about overuse of medical care and its effect on patient safety.

Interview

Dr. Robert Wachter, Editor, AHRQ WebM&M: What is the magnitude of overuse in medicine?

Rosemary Gibson: We don’t measure the extent of overuse so we don’t know its magnitude. I believe it is pervasive.

RW: As you look at the pathophysiology of it, are you convinced that financial drivers—which I guess I interpret as being the more you do the more you get paid—are the dominant reason for overuse? What evidence supports that?

RG: Yes, financial incentives are the dominant cause of overuse. But other factors contribute to overuse. Uncertainty in medicine drives overuse. A natural proclivity may be to do something in a context of uncertainty. Fear of malpractice suits drives overuse. Beliefs are a factor. Physicians and patients have their beliefs about medicine and its possibilities and limitations.

The enthusiasm factor is widespread, as Dr. Mark Chassin, CEO of The Joint Commission, reminds us. Even when evidence proves that an intervention is not effective, enthusiasm for it overwhelms objectivity. A case in Maryland of unnecessary stents involving nearly 600 people illustrates this phenomenon. A physician who assumed responsibility for the care of some of these patients, after the doctor involved stopped practicing, said that a small subset still believed that the physician who performed the unnecessary procedures saved their lives. The enthusiasm factor is alive and well because of what I call “the marinated mind.” Our minds have been marinated by the media and marketing to believe that more is better. We need to dilute the marinade.

Finally, publicly-traded companies that sell drugs, devices, and equipment are a powerful driver of overuse. Their primary fiduciary duty is to shareholders and they are obligated to maximize shareholder value. This goal can be achieved only by selling more products and increasing the price. Selling more requires doctors to use more of their products in surgeries, tests, and prescribing practices.

RW: How should we tackle this problem?

RG: Different solutions are needed to address the varied causes of overuse and its harm. There are two kinds of harm from overuse. The first occurs when a patient has an unnecessary procedure, such as back surgery or cardiac procedure. The patient is exposed to risks of the procedure.

Second, as productivity demands increase, stress on the delivery system to do more in the same amount of time heightens the risk of mistakes and the potential for harm. I use the I Love Lucy chocolate factory video to illustrate this point. In the video, Lucy and Ethel are wrapping chocolates on an assembly line. The conveyor belt speeds up and they do workarounds to handle the faster pace. They demonstrate to their supervisor that they can manage the faster pace, when in fact they cannot. The supervisor ramps up the pace even more, and everything goes downhill.

That’s exactly what’s happening in health care. When a system with defects is forced to operate at a faster pace, performance degrades and mistakes increase, creating more opportunity for preventable harm.

RW: So the reasons for overuse include financial incentives, cultural beliefs, marketing, and production pressures. Take us through what you think some of the solutions are.

RG: Identify outliers where overuse is apparent and publicly report hospital-specific data. Public reporting of hospitals’ early elective deliveries without medical necessity is an example of the utility of transparency. It raised awareness and motivated improvement.

Public reporting of double chest CT scan data from Hospital Compare is another good example of how transparency generated improvement. The New York Times and the Washington Post published interactive maps using Medicare data that identified hospitals around the country that had high rates of unnecessary double chest CT scans. Transparency raised public awareness of overuse and motivated reductions in inappropriate use. As we begin to develop a critical mass of early adopters of appropriate use, the next step is to cut reimbursement for double scans, thereby aligning incentives to provide appropriate care.

RW: Should we dial down reimbursement for everything or pare back specific areas of overuse?

RG: We need to do both. We need to target specific areas of overuse and develop knowledge and skills to reduce it. We need to be honest about the harm from the overuse. Not all overuse results in harm but it can. When a child gets an adult dose of radiation, this is not just overuse. The child is exposed to unnecessary radiation. An early elective delivery without medical necessity is not just overuse. A newborn can be harmed by it. Overuse is a patient safety concern, not just a financial burden.

Our country also needs to set limits on the rate of growth in health care spending. Without limits, we will be successful only in reducing unnecessary use of particular tests or procedures. Meanwhile, overuse of other interventions will pop up. The “whack-a-mole” strategy does not work. The most vociferous opponents of limits on spending growth will be those who are dependent on revenue growth for their survival. Let’s be clear. Health care spending levels are not sustainable. A graph from a 2007 CBO report shows that if historical spending trends persist, by 2082 we’ll be spending 99% of our GDP on health care. The recent slowdown in spending growth will moderate these projections somewhat, to, say, 80% of GDP. Still, we cannot keep doing what we are doing.

RW: Is your recommendation that we focus on the clinical harm rather than the finances in terms of trying to get the message out?

RG: We need to do both: highlight the harm to patients and set limits on spending growth. Health care journalists have contributed to the public debate on overuse with front-page articles in the New York Times, Washington Post, and Wall Street Journal on overuse of back surgeries and stents, among other interventions. Public comments on these articles are enlightening. A subset of the public is deeply attuned to the harms of overuse and dismayed by them. The drumbeat should continue. We need hospital-level reporting of overuse measures, of which we have too few, to shine a light on outliers and galvanize improvement.

RW: Your sense is that transparency has a huge bang for the buck. That good things flow from putting out appropriately adjusted data and demonstrating patterns of variation, particularly when the media gets ahold of it. An alternative approach would be to say we must be even more proactive. Regulators need to get involved, boards need to be looking at this, and payers need to be looking at these variations and stop paying for these things. Is your feeling that transparency comes first, or should all of this be happening at the same time?

RG: Transparency comes first to build awareness and understanding of the human and financial cost of overuse. Next, promote learning to reduce overuse. Finally, cut reimbursement as South Carolina Medicaid and Blue Cross have done for inappropriate early deliveries.

RW: When you talk to practicing physicians about overuse they have almost an inkblot response. One is malpractice and the second is patient-driven demand. What do you think and what can be done about those two areas?

RG: Fear of medical malpractice drives overuse. The dreaded thought of missing a cancer diagnosis leads to overtesting. At the same time, the fear of being sued is probably not the motivation for cardiac procedures performed on patients when there are no blockages, or for most complex back surgeries that result in increased morbidity and mortality. Overuse is causing medical malpractice suits.

RW: Obviously physician groups often advocate for some sort of legislative changes to the malpractice system that might help here. Are you sympathetic to those calls at all?

RG: Early evidence suggests that health care organizations that are honest with patients in the aftermath of preventable harm can see dramatic reductions in medical malpractice claims and premiums. This is where I would start to bring down the cost of medical malpractice. Second, examination of the root causes of preventable harm and repeated patterns of harm offers valuable information about systemic risks and provides insight to make dramatic improvements in safety, as anesthesiologists have discovered, resulting in dramatically lower malpractice premiums. Third, a small subset of physicians accounts for a disproportionate number of malpractice claims. It is not clear that medical malpractice insurers have handled this disproportionate-share issue satisfactorily. Finally, some lawyers on both sides are more keen on generating income than on resolving cases. For this reason, the best strategy is prevention.

RW: How about patient-driven demand?

RG: Our minds have been marinated to believe more is better, and we’ve bought into it. So how do we turn our brains back on? It’s not easy to do, but the good news is that some people clearly understand now that they might be getting medical care that’s not warranted. When I give talks, I ask two questions. First: have you or someone you know ever had medical care that you or they thought was unnecessary? The share of hands raised ranges from 30% up to 80%. At a National Patient Safety Foundation meeting, 80% of people raised their hands, indicating that they knew somebody. The second question is: have you or someone you know ever declined a treatment option recommended by a physician, because you or they thought it was over the top, and then sought a less intensive or invasive alternative that met your needs? A slightly smaller percentage raised their hands.

That tells me many people are thinking about this and becoming more involved in their medical decision-making. Then again, some people have no clue, and that’s deeply worrisome. So at least we’re starting up this curve, because we do so much in health care. People are finally getting the picture: this doesn’t make any sense, and I’m worse off because of it.

RW: In some ways, these are extraordinarily complex issues, and the simple cases are when we know this thing doesn’t work. But the harder cases, which may be even more prevalent, are when it might work but the probability that it’s going to work is very, very low and it’s usually expensive. How do we deal with that circumstance?

RG: It’s very hard for people to make these complex decisions weighing the potential for benefit vs. the potential for harm. I think we can begin by helping people develop the skills to make good decisions about relatively simple, less consequential tests or procedures so they can gradually develop competence in medical decision-making.

An example might be dental x-rays. Are they really necessary every 6 months or every year? If not, we can empower people to politely, but firmly, decline repeated x-rays or other unnecessary tests or procedures. I remember a state legislator who approached me after a talk I gave on overuse of diagnostic imaging. He goes to his doctor every 3 months and gets a chest x-ray. I asked him if he had any underlying condition, and he said no. Then I asked him what he thought about these repeated tests after hearing the talk I had given, and he said that he needed to talk to his doctor about whether they were really needed. This conversation taught me that we can help people examine their own patterns of health care use and begin to ask questions about whether it is appropriate.

The public also needs examples of good medical decision-making. What do you ask? How do you ask? And these examples should come from real people who have made those same types of decisions and can describe how they went through them.

RW: Do you feel that having patients have some financial skin in the game is a force for good in terms of patients being more sensitive to trying to find that information, asking that question, getting educated? Or are the negatives so strong that it’s not a favorable trend?

RG: Yes, people should have some financial stake, so they have an incentive to choose wisely. It addresses the so-called moral hazard that occurs when people use more health care services because insurance insulates them from the full cost. But another moral hazard is even more problematic: Health care organizations that build new cancer centers or cardiac facilities to grow market share and rival a competitor’s facilities across town are using the public’s money to create duplicative capacity. Excess capacity in a community is overtreatment waiting to happen. These market dynamics won’t stop until the money stops.

RW: Let me ask you about the future. What do you think this all looks like 5 or 10 years from now?

RG: The health care cost burden will continue to weigh heavily on many Americans, reducing both necessary and unnecessary use. More health delivery systems will be for-profit. Mergers and consolidations will continue, resulting in fewer, larger health care systems. This trend will parallel the consolidation in the banking industry. Countervailing forces to balance excesses will be few and far between.

RW: You’re working on the JAMA Internal Medicine series, “Less Is More,” and there is Choosing Wisely—so at least elements of the profession have begun talking about these issues in ways that they weren’t before. Are these meaningful changes?

RG: Yes, meaningful change to reduce overuse is taking place but only in small pockets where dedicated physicians are endeavoring to ensure patients get the care they need, not the care they don’t. Far more powerful forces, however, continue to drive overuse. I’m an optimistic person but the challenges are daunting.

Article link: http://webmm.ahrq.gov/perspective.aspx?perspectiveId=163#.VCYzXvJjF2N.twitter

Docs frustrated with transition to electronic medical records

Posted by timmreardon on 09/23/2014
Posted in: Health Care Costs, Health Care Economics, Health IT adoption, Health Outcomes, Healthcare Delivery, Healthcare Informatics, Innovation, Integrated Electronic Health Records, Military Health System Reform, National Health IT System, Patient Centered Medical Home, Patient Portals, PCMH. Leave a comment

By Molly Rosbach / Yakima Herald-Republic
mrosbach@yakimaherald.com

September 12, 2014
Scribe AJ Lakey enters patient information that will become part of an electronic medical record as Dr. Joan Knight, not pictured, examines a patient in the emergency room of Memorial Hospital in Yakima, Wash., on Sept. 9, 2014. (KAITLYN BERNAUER/Yakima Herald-Republic)

Technology has enabled major breakthroughs in medicine in recent years. Scientists can grow body parts in petri dishes, surgeons can operate via robot, and an individual genome can be mapped for just a few thousand dollars.

But with electronic medical records, a crucial piece of health care reform going forward, doctors are still waiting for the technology to deliver on its promises. Meanwhile, they face penalties if they do not adequately implement electronic systems.

Electronic medical records, called EMR for short, were hyped as a solution to inefficiencies in processing patients through medical situations as well as safety concerns. No more worrying about interpreting doctors’ chicken-scratch handwriting with electronic prescriptions. Doctors would be able to predict which patients were at risk for conditions like diabetes by analyzing medical history amid vast new stores of data. Patients and their providers would have access to the appropriate records in an emergency room, a doctor’s office or a hospital.

Instead, at least in Yakima, many doctors are frustrated by interfaces that don’t allow them to input or receive information the way they want, which slows them down.

“Time is very valuable, and right now, when you talk to patients who are being seen by providers, they feel that … the provider’s spending more than half of the visit with the computer, not the patient,” said Dr. Michael Schaffrinna, chief medical officer at Community Health of Central Washington. “And that’s not right. It’s not fair to our patients.”

Part of the problem is the transition. Most doctors today are still used to paper charts, or at least dictating notes to be transcribed later. Many are poor typists. They’re used to writing patient notes in sentences, not scrolling through endless lists to click boxes on a screen for each complaint and symptom.

“It reads like a translated Russian novel,” Schaffrinna said of the stilted, computer-generated notes. “It doesn’t flow. And that means it takes a lot longer for people to find the information they’re looking for to care for the patient.”

Schaffrinna has been involved with electronic medical records in some form for the past 24 years with different organizations.

The real push toward electronic medical records came in 2009 with the federal HITECH Act, which includes incentives for providers to employ EMR systems that meet certain “meaningful use” requirements. In other words, it’s not enough to simply install an EMR system; it has to improve results for patients and collect relevant data.

“There is both a carrot and a stick incentive structure,” says Ian Corbridge, the new policy director for clinical issues at the Washington State Hospital Association. He comes to the organization after working on quality metrics and meaningful use in the Health Resources and Services Administration, part of the federal Department of Health and Human Services.

The carrot side, Corbridge said, is federal financial support to providers who implement EMR. But that support is by no means sufficient to cover the exorbitant cost of implementation, nor was it meant to be. For a small organization, Schaffrinna says, the patient portal alone costs $100,000; the cost of the entire EMR, including hardware and software, can be in the millions or tens of millions depending on the size of the organization.

Starting in January, however, the carrot becomes more of a stick because financial penalties kick in for providers who have not met specific “meaningful use” benchmarks, Corbridge said. The penalty is a 1 percent reduction in Medicare reimbursement rates each year.
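The penalty arithmetic the article describes is straightforward to sketch. The snippet below is illustrative only, not official CMS methodology; the 1 percent rate comes from the article, and the $50 million revenue figure is a hypothetical example.

```python
# Illustrative sketch of the penalty described in the article; the revenue
# figure is hypothetical, and this is not official CMS methodology.
PENALTY_RATE = 0.01  # 1% reduction in Medicare reimbursement rates

def penalized_reimbursement(medicare_revenue: float) -> float:
    """Annual Medicare reimbursement after the 1% meaningful-use penalty."""
    return medicare_revenue * (1 - PENALTY_RATE)

# A hospital billing $50 million a year to Medicare would forgo $500,000.
print(penalized_reimbursement(50_000_000))
```

At that scale, the penalty alone can rival the annual cost of the extra IT staff that providers describe hiring, which is why the January deadline carries weight.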

Meaningful use requirements — think improved patient results and data collection — are set in stages, so once providers are up to snuff on certain measures, they are then required to address the next tier. Stage 1 includes providers ordering medications electronically, logging certain patient information such as allergies and vital signs electronically, and using EMR to check for drug interactions, among many other measures.

Stage 2 includes many of the same measures, then adds more: generating lists of patients with specific conditions to compare for quality improvement; using the EMR to identify patients who need reminders for follow-up treatment; and communicating with patients electronically about health issues, to name a few.

At Yakima Valley Memorial Hospital, physicians are meeting “pretty much all” of the 22 Stage 2 benchmarks set by the Centers for Medicare & Medicaid Services, said Jeff Yamada, vice president and chief information officer. The list includes using EMR to order medications and lab studies, patient access to their medical record prior to discharge and the availability of images to physicians, among many others.

“Currently, we look very good to be able to attest for meaningful use Stage 2, which we’ll do Oct. 1,” Yamada said.

If Memorial doesn’t make it, he said, the hospital could lose the financial incentive payments, and after that, face penalties.

The incentive payments and the financial penalties are a big deal, providers say, because the cost of EMR is so high. Some smaller hospitals and providers around the country are affiliating just to share in the cost of implementation.

Schaffrinna sees the financial penalty as a double-whammy, considering how problematic Community Health’s EMR system has been so far.

Experts acknowledge that doctors see a significant drop in productivity during implementation of an EMR system and for the next six or eight months, at least, as they become accustomed to the digital format.

“They say you’ll recover. Well, not really,” Schaffrinna said. “You have to hire new people to do the work, change the workflow so the system can work for you.”

The unfortunate reality, he said, is that “these systems force people to bend to the system rather than having the system bend to the workflow or the people,” in part because the EMR systems available today are not designed by physicians currently working with patients.

That hurts the bottom line, both because of lost patient time as providers struggle to complete the electronic records quickly, and because organizations often have to hire extra information technology staff members to keep the system running.

Ease of EMR implementation depends a lot on the system, too, and almost every organization in the Yakima Valley is on a different system.

That wasn’t always the case. Yakima was far ahead of the curve when local physician Dr. Victor Sharpe and technology developer Greg Jewell created ChartConnect in 2000. Many providers used the system, so everyone could share patient information. But with the requirements of the HITECH Act, providers found that ChartConnect did not suit their evolving needs and chose new systems more specifically tailored to their practice.

“It was kind of like the Tower of Babel — everyone went to different systems,” Schaffrinna said. “So now we’re trying to achieve what we had in the past.”

One technology still promised for the future: “health information exchanges,” not to be confused with the state insurance exchanges. These new electronic exchanges will allow different systems to talk to each other, and with that hopefully comes sharing of patient information when medically necessary, eliminating redundant procedures (and forms for patients) and lowering the risk that crucial data slips through the cracks between multiple providers.

At Yakima Regional Medical and Cardiac Center, hospitalist program director Dr. Mark Silverstein says the system implemented under previous owner Health Management Associates works well. (HMA was bought earlier this year by Community Health Systems.) But he doesn’t bring a computer into the exam room; he enters notes at computer stations after he talks with patients. Doing the data entry with the patient present would detract from the personal interaction, he says. And medical staff still take some paper notes, though that will eventually be phased out.

As a hospitalist responsible for multiple patients each day, Silverstein said, paper records were more cumbersome and time-consuming. And riskier:

“If I’m going to order something new for a patient, when I input it, the warning prompts come up — oh, there’s an interaction, an allergy — it all works well to keep me from doing harm.”

Dr. Carl Olden, a practitioner at Memorial’s Pacific Crest Family Medicine, sees a lot of good in EMR, too. The efficiencies inherent in the technology — reviewing lab work in real time, prescriptions sent to the pharmacy automatically — can enhance patient relationships and improve care, he said — if they work as intended.

“But it takes more work to do that,” Olden said.

“People laugh at you when you type with two fingers. But if you don’t want to change, you won’t, and any change is going to be painful. If you’re willing to embrace change, you figure out how to make the system work.”

• Molly Rosbach can be reached at 509-577-7728 or mrosbach@yakimaherald.com.

Article link: http://www.yakimaherald.com/news/2484063-8/docs-frustrated-with-transition-to-electronic-medical-records
