Background: Adoption and implementation of electronic health records (EHRs) have not been without challenges, as they infuse technology into what has historically been a manual process of recording patient information. In an effort to identify these challenges, the Office of the National Coordinator for Health Information Technology leveraged the Regional Extension Center population of over 140,000 providers to develop a structured way to track challenges to EHR adoption and Meaningful Use (MU).
Objectives: This report summarizes challenges to EHR adoption and MU based on nationwide data supplied by 55 Regional Extension Centers reporting over 19,000 issues representing over 43,000 unique health care providers. Practices were grouped on the basis of their place in the lifecycle of EHR adoption and MU achievement.
Video link: http://www.thedailyshow.com/watch/wed-january-15-2014/exclusive—robert-gates-extended-interview-pt–3
Robert Gates failed to overcome the turf wars that prevented development of an integrated electronic health record to serve both the Defense and Veterans Affairs departments, the former Defense secretary told comedian Jon Stewart.
Gates said he and VA Secretary Eric Shinseki agreed on the need for a joint record, but faced bureaucratic pushback. “At the end of the day, Shinseki and I simply could not get the technical people to abandon their turf consciousness and their insistence on owning their own system and not combining the two,” he said Wednesday on the Daily Show.
Gates told Stewart he wanted to outsource development of the joint health record to the private sector because “when the government tries to build something really big and really complicated, especially in the technical world, it almost always fails.”
The Pentagon now plans to have its next generation health record developed by contractors while VA plans to upgrade its Veterans Health Information Systems and Technology (VistA) health record.
Gates also blasted what he called the “artificially precise” nature of the disability evaluation system which begins payments at 30 percent disability – instead of 28 percent or 32 percent. The system, Gates said, needs to err on the side of the soldier.
Paul Bergl, MD | Education | December 3, 2013
Article link: http://www.kevinmd.com/blog/2013/12/technology-revolutionize-medical-training.html
Recently, I attended the annual AAMC meeting where the question, “What will medical education look like in 2033?” was asked in a session called “Lightyears Beyond Flexner.”
After this thought-provoking session, I too pondered academic medicine’s fate. I would like to share my reflections in this forum.
Without question, technology stood out as a major theme in this conference. And for good reason: clearly it is already permeating every corner of our academic medical lives. But as technology outpaces our clinical and educational methods, how exactly will it affect our practices in providing care and in training physicians?
Our educational systems will evolve in ways we cannot predict. But in reality, the future is already here as transformations are already afoot. MOOCs — massive open online courses, for the uninitiated — like Coursera are already providing higher education to the masses and undoubtedly will supplant lectures in med schools and residencies. In a “flipped classroom” era, MOOCs will empower world-renowned faculty to teach large audiences. Meanwhile, local faculty can mentor trainees and model behaviors and skills for learners.
Dr. Shannon Martin, a junior faculty member at my institution, has proposed the notion of “flipped rounds” in the clinical training environment, too. In this model, rounds include clinical work and informed discussions; reading articles as a group or having a “chalk talk” are left out of the mix. In addition, medical education will entail sophisticated computer animations, interactive computer games for the basic sciences, and highly intelligent simulations.
Finally, the undergraduate and graduate curricula will have more intense training in the social sciences and human interaction. In a globalized and technologized world, these skills will be at a premium.
But why stop at flipped classrooms or even flipped rounds? Flipped clinical experiences are coming soon too.
Yes, technology will revolutionize the clinical experience as well. Nowadays, we are using computers mainly to document clinical encounters and to retrieve electronic resources. In the future, patients will enter the exam room with a highly individualized health plan generated by a computer. A computer algorithm will review the patient’s major history, habits, risk factors, family history, biometrics, previous lab data, genomics, and pharmacogenomic data and will synthesize a prioritized agenda of health needs and recommended interventions.
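To make the idea concrete, here is a minimal Python sketch of what such an agenda-generating algorithm could look like. Every rule, threshold, and field name below is invented for illustration; none comes from an actual guideline or EHR.

```python
# Illustrative sketch of a rule-based "visit agenda" generator.
# All rules, thresholds, and field names are hypothetical.

def build_agenda(patient: dict) -> list[str]:
    """Return a prioritized list of suggested items for a visit."""
    agenda = []  # (priority, recommendation) pairs; lower number = more urgent
    if patient.get("last_a1c", 0) > 9.0:
        agenda.append((0, "Review diabetes control (A1c elevated)"))
    if patient.get("smoker"):
        agenda.append((1, "Smoking cessation counseling"))
    if patient.get("age", 0) >= 50 and not patient.get("crc_screen_current"):
        agenda.append((1, "Discuss colorectal cancer screening"))
    if "statin" in patient.get("meds", []) and "SLCO1B1" in patient.get("pgx_flags", []):
        agenda.append((2, "Pharmacogenomic flag: review statin dosing"))
    agenda.sort()
    return [item for _, item in agenda]

print(build_agenda({"age": 62, "smoker": True, "last_a1c": 9.4,
                    "meds": ["statin"], "pgx_flags": []}))
```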
Providers will feel liberated from automated alerts and checklists and will have more time to simply talk to their patients. After the patient leaves the clinic, physicians will then stay connected with patients through social networking and e-visits. Physicians will even receive feedback on their patient’s lives through algorithms that will process each patient’s data trail: how often they are picking up prescriptions, how frequently they are taking a walk, how many times they buy cigarettes in a month. And of course, computers will probably even make diagnoses some day, as IBM’s Watson or the Isabel app aspire to do.
Yet even if Watson or Isabel succeeds in skilled clinical diagnosis, these technologies will not render physicians obsolete. No matter how much we digitize our clinical and educational experiences, humans will still crave the contact of other humans. We might someday completely trust a computer to diagnose breast cancer for us, but would anyone want a computer to break the bad news to our families? Surgical robots might someday drive themselves, but will experienced surgeons and patients cede ultimate surgical authority to a machine? A computer program might automatically track our caloric intake and physical activities, but nothing will replace a motivating human coach.
With all of these changes, faculty will presumably find time for our oft-neglected values. Bedside teaching will experience a renaissance and will focus on skilled communication. Because the Google generation of residents and students will hold all of the world’s knowledge in the palm of their hands, they will look to faculty to be expert role models. Our medical educators will be able to create a truly streamlined, ultra-efficient learning experience that allows more face-to-face experiences with patients and trainees alike.
So where is academic medicine headed beyond Flexner? Academic physicians will remain master artists, compassionate advisers, and a human face for the increasingly digitized medical experience.
Paul Bergl is an internal medicine physician who blogs at Insights on Residency Training, a part of Journal Watch.
Nov 26, 2013, 10:14 AM, Posted by Susan Dentzer
Actually, with the exception of the new health insurance exchanges, all of the phenomena described above have a long history. Similar concerns were voiced loudly in the late 1980s and 1990s, when “managed care” in health insurance became a dominant force on the health care and health insurance landscape.
What’s amazing to people who lived through both of these eras—then and now—is how little has changed.
Consumers, meanwhile, are confused. Although some are embracing the lower-cost “narrow networks” offered by health plans, others worry that lower-cost providers won’t provide the highest quality. The same consumers who might buy designer duds at outlet malls somehow feel jittery about discount health care.
Helping all of us understand what’s driving high health care costs–and making the best choices about health insurance and health care as a result–is why we need more transparency in the prices, costs and quality of health care. Here are four things to know about the issue.
- “Transparency” means clear and accurate information, and having it is the key to an effectively functioning market. If there is information “asymmetry” in a given market–for example, if the seller knows more about the quality of a good or service than the buyer–the buyer may end up paying more for a product than is optimal (think about buying a used car as an example). The situation would be even worse if the buyer didn’t know what the price was and only found out later, when the bill arrived. Yet that’s the way much of the U.S. health care market works now–and as a result, patients or consumers and purchasers, such as health plans and large employers, often end up paying very high prices for health care of uncertain quality.
- U.S. prices for health care are far higher than prices in all other countries. “Prices are–when you look at the real numbers–the overwhelming difference between us and them,” writes George Halvorson, the chairman and former CEO of Kaiser Permanente, in a forthcoming book, Don’t Let Healthcare Bankrupt America: Strategies for Financial Survival. Consider a simple appendectomy, for which many Europeans pay about $3,000, versus more than double that in the U.S.; even more startling, some U.S. hospitals and health systems charge nine times as much. What’s more, it can be next to impossible for consumers to find out about prices ahead of time, with a few notable exceptions. In California, for example, hospitals by law now have to disclose prices for the 25 most common outpatient services or procedures.
- Despite what many U.S. consumers appear to believe–that higher cost institutions offer higher quality–there is no automatic correlation between the two. One study showed that hospitals that charged the most for sepsis care actually had the highest sepsis death rates (T. Lagu et al, Archives of Internal Medicine, 2011). Higher-cost institutions that imply that they must charge more than others often lack the quality data to back up their claims. A recent Washington Post story noted that Seattle Children’s Hospital was omitted from health plans competing in the Washington state health insurance exchange in large part because of its higher costs for providing care that could be purchased less expensively elsewhere. The story pointed out that a pediatric appendectomy at Children’s costs $23,000, versus $14,100 at a nearby community hospital.
- When consumers are given clear information about costs and quality, they can make better choices about their health care. Judith Hibbard of the University of Oregon and colleagues have shown that presenting cost data alongside easy-to-interpret quality information–and highlighting high-value options–improved the likelihood that consumers would choose those options. And thanks to advances in quality reporting spurred in part by the Robert Wood Johnson Foundation’s Aligning Forces for Quality initiative, consumers no longer need to make many health care purchasing decisions in a quality vacuum. For example, the Foundation’s Comparing Health Care Quality: A National Directory is an interactive tool listing 208 national, state and local public reports that can help consumers find reliable health care in their communities.
As Sy Syms, a pioneering New York discount retailer who died in 2009, used to say in television commercials for his chain of stores, “An educated consumer is our best customer.” Transparency can help make all of us the kind of savvy health care shoppers of whom Syms–and maybe his doctor–would have been proud.
The exit interview: Farzad Mostashari on imagination, building healthcare bridges and his biggest “aha” moments
Farzad Mostashari, MD, stepped down from his post as National Coordinator for Health Information Technology at the U.S. Department of Health and Human Services (HHS) during the first week of October, which was also the first week of the partial federal government shutdown. During his tenure, Dr. Mostashari, who spoke at TEDMED 2011 with Aneesh Chopra, led the creation and definition of the meaningful use incentives and tenaciously challenged health care leaders and patients to leverage data in ways that encourage partnerships with patients within the clinical health care team.
Whitney Zatzkin and Stacy Lu had the opportunity to speak with Dr. Mostashari during his last week in office.
WZ: Sometimes, a person will experience an “aha!” moment – a snapshot or event that reveals a new opportunity and challenges him/her to pursue something nontraditional. Was there a critical turning point when you figured out, ‘I’m the guy who should be doing this?’
Yeah, I’ve been fortunate to have a couple of those ‘aha’ moments in my life. One of them was when I was an epidemic intelligence service officer back in 1998, working for the CDC in New York City. I’ve always been interested in edge issues, border issues; things that are on the boundaries between different fields. I was there in public health, but I was interested in what was happening in the rest of the world around electronic transactions and using data in a more agile way.
In disease surveillance we often look back – the way we do claims data now – months or years later you get the reports and you look for the outbreak, and oftentimes the outbreak has already come and gone by the time you pick it up. But I started thinking and imagining: What if the second something happens, you can start monitoring it? In New York City the fire department was monitoring ambulance calls. I said, ‘Wow, if we could just categorize those by the type of call, maybe we’ll see some sort of signal in the noise there.’
When I was first able to visualize the trends in the proportion of ambulance dispatches in NYC that were due to respiratory distress, what I saw was flu. What jumped out at me was the sinusoidal curve. Wham! At different times of year, it could be a stutter process – it would go up and you would see this huge increase, followed two weeks later by an increase in deaths. It was like the sky opening up. The evidence was there all along, but I am the first human being on earth to see this. That was validation, for me, of the idea that electronic data opens up worlds. To bring that data to life, to be able to extract meaning from those zeros and ones — that’s life and death. That was my first ‘aha’ moment.
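The analysis Mostashari describes boils down to a simple time series: the daily proportion of dispatches coded as respiratory distress, watched for departures from a recent baseline. Here is a rough Python sketch of that idea; the CSV layout, call-type label, and the 50%-over-baseline threshold are all illustrative assumptions, not details from the interview.

```python
# Sketch of the syndromic-surveillance idea: track the daily share of
# ambulance dispatches coded as respiratory distress and flag days
# that sit well above the trailing baseline.
import csv
from collections import defaultdict

counts = defaultdict(lambda: [0, 0])   # date -> [respiratory, total]
with open("dispatches.csv") as f:      # columns assumed: date,call_type
    for row in csv.DictReader(f):
        counts[row["date"]][1] += 1
        if row["call_type"] == "respiratory_distress":
            counts[row["date"]][0] += 1

dates = sorted(counts)
props = {d: counts[d][0] / counts[d][1] for d in dates}

# Flag a date whose proportion exceeds the trailing 28-day mean by 50%
# -- a crude stand-in for a proper aberration-detection model.
for i, d in enumerate(dates[28:], start=28):
    baseline = sum(props[x] for x in dates[i - 28:i]) / 28
    if props[d] > 1.5 * baseline:
        print(f"{d}: {props[d]:.1%} vs baseline {baseline:.1%} -- possible signal")
```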
The second aha was after I joined New York City Department of Health, and I started a data shop to build our policy around smoking and tracking chronic diseases. What we realized was that healthcare was leaving lives on the table. There were a lot of lives we could save by doing basic stuff a third-year medical student should do, but we’re not doing it. Related to that – Tom Frieden had a great TEDMED talk about everybody counts.
I said, ‘I want to take six months off and do a sabbatical, and see if there’s anything to using electronic health records to provide those insights, not to save lives at the city level, but on the 10-to-the-3 level – the 1,000-patient practice.’ That started the whole journey. None of the vendors at the time had the vision we had, but we finally got someone to work with us and rolled this system out. We called some doctors some 23 times, and did all the work to get to the starting line. Finally, I took Tom on a field visit to see one of the first docs to get the program.
It was a very normal storefront in Harlem, and a nice physician, very caring, very typical. I asked her what she thought of the program. She said, ‘It’s ok. I’m still getting used to it.’ I said, ‘Did you ever look at the registry tab on the right, where you can make a list of your patients?’ She said no. I said, ok – how many of your elderly patients did you vaccinate for flu this year? She said, ‘I don’t know, about 80 to 85 percent. I’m pretty good at that.’ I said, ‘o.k., let’s run a query.’ And it was actually something like 22 percent. And she said – this was the aha moment – ‘That’s not right.’
That’s generally the feeling the docs have when they get a quality measure report from the health plan. But that’s population health management — the ability to see for the first time ever that everybody counts. And being able to then think about decision support and care protocols to reduce your defect rate. That was the validation that we’re on to something. Without the tools to do this, all the payment changes in the world can’t make healthcare accountable for cost and quality if you can’t see it.
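The registry query in the anecdote is, at bottom, a numerator over a denominator. A minimal Python sketch of that computation follows; the field names and the 65-and-over, current-season definitions are assumptions chosen for illustration, not an actual measure specification.

```python
# The "registry query" reduces to a numerator over a denominator.
# Field names and definitions below are illustrative, not from any EHR.
from datetime import date

def flu_vaccination_rate(patients, season_start=date(2013, 9, 1)):
    """Share of patients 65+ with a flu shot recorded this season."""
    elderly = [p for p in patients if p["age"] >= 65]
    vaccinated = [p for p in elderly
                  if p.get("last_flu_shot") and p["last_flu_shot"] >= season_start]
    return len(vaccinated) / len(elderly) if elderly else 0.0

panel = [
    {"age": 70, "last_flu_shot": date(2013, 10, 2)},
    {"age": 81, "last_flu_shot": None},
    {"age": 67, "last_flu_shot": date(2012, 11, 5)},  # last season, doesn't count
    {"age": 45, "last_flu_shot": None},               # not in denominator
]
print(f"{flu_vaccination_rate(panel):.0%}")  # -> 33%
```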
WZ: Everyone has that moment in life when they’re considering all of their career options. As you were considering medical school, what else was on the table?
I actually didn’t think I was going to go to medical school. I was at the Harvard School of Public Health. I was interested in making an impact in public health. I grew up in Iran, and thought I would do international public health work. And then my dad got sick; he had a cardiac issue. The contrast between the immediacy of the laying on of hands of healthcare, and the somewhat abstractness of international public health — the distance, the remove — tipped me into saying, ‘You know, maybe I should go to medical school.’ I’ve been on that edge between healthcare and public health ever since, and always trying to drag the two closer to each other.
SL: Fast forward 20 years. You’re giving another talk at TEDMED. What’s the topic?
TEDMED and Jay Walker’s vision is more powerful in the futurescope, rather than in the retroscope. It’s more powerful to be where we are today and imagine a different future rather than look back and say, ‘Oh, yeah, we’ve done this.’ So what’s the future I would love to imagine?
The most exciting thing – as Jay Walker once mentioned in a talk comparing “medspeed” to “techspeed” – is to fully imagine what will happen if techspeed is brought to healthcare. Right now, there’s all this unrealized value that’s being given away for free that doesn’t show up in any GDP statistics – what Tim O’Reilly called “the clothesline paradox.” Imagine that kind of possibility brought to medicine, but where software is free instead of costing $100,000, and it evolves daily and is more powerful and quicker every day, and it’s beautiful and usable and intuitive, and that’s what people compete on.
And all of that is toward the goal of empowering people. Someone said, maybe it was Jay at TEDMED, that a 14-year-old kid in Africa with a smart phone has more access to information than Bill Clinton did as President. Information is power, and it has changed everything but healthcare. For me the vision is breaking down that wall, so that patients can be empowered and can bind themselves to the mast to use what we’ve learned about how behavior changes.
It’s not as simple as you give people information and they change their behavior. It’s information tools that build on that data and build on communities and a much more sophisticated understanding about how behavior changes. What TEDMED is also great at, is understanding the power of marketing. People think of marketing of being about advertising, but marketing is the best knowledge we have about how to change behavior and all those intangibles, those predictably irrational insights, of how and why we do what we do.
It’s harnessing those, instead of having them lead to worse health – like present value discounting that leads to people wanting to procrastinate and eat that doughnut now instead of going to the gym. Or the power of anchoring, where we fixate on the first thing we see and won’t think objectively about the true risks of things. Or the herd effect, our friend is overweight and so we are more likely to be overweight.
All those nudges that are possible can be delivered to us ubiquitously and continuously, and we can choose to have them. It’s not some big brother dystopic vision. It’s me saying, ‘I want to be healthier, so I will do something now that will help me overcome and use my irrationality to help me stay healthy.’ To me, that’s the neat new edge between mobile cloud computing, personal healthcare, behavioral economics, healthcare IT, data science and visualization, design, and marketing. It’s that sphere that has so many possibilities to get us to better health.
The thing about health is, we have a Persian saying: Health is a crown on the head of the healthy that only the sick can see. When you have it, you don’t appreciate it, but when you’re sick or someone you love is sick, there’s nothing better. You would do anything to get that. We need to bring that vision of the crown to everyone and help each of us grab it when we can.
WZ: I noticed you closing your eyes while preparing to answer a question. How do you pursue being able to exercise your imagination, in particular while you’re sitting in a building that’s been marked for being the least imaginative?
Because the world, as it is, is too immediate and real and limiting, sometimes you have to close your eyes to see a different world.
What has been amazing has been to see that, contrary to what people expect, this building is filled with people with untapped, unbound, unfettered imaginations who are slogging through. They’re just trapped. You give them the opening, the smallest bit of daylight to exercise that, and they’re off and running.
I give a lot of credit to Todd Park as our “innovation fellow zero.” He saw the possibility that there are more than two kinds of people in the world, innovators and everybody else. For him, it was about creating a space where outside innovators can be the catalyst or spark that elevates and permissions the innovation of the career civil servant at CMS in Baltimore. That’s been cool.
SL: What’s your bowtie going to do after you leave HHS? Will we see it lounging on the beach in Boca?
I like the bowtie. I think I’m going to keep it. Perhaps the @FarzadsBowtie Twitter handle is going to go into hibernation, I don’t know. I don’t control it. One of the things the bowtie does for me is help me remember not to get too comfortable.
I once said at the Consumer Health IT Summit – ‘You’re a bunch of misfits – glorious misfits. And I feel like I’m very well suited to be your leader. You know, I always felt American in Iran, and felt Iranian in America when I came here. I felt like a jock among my geeky friends, and like a geek among jocks. For crying out loud, I wear a bowtie! I don’t have to tell you I’m a misfit.’
It’s that sense of not fitting into the world as it is. The world doesn’t fit me. So instead of saying, ‘I need to change,’ this group of people said, ‘The world needs to change.’ That’s the difference between a misfit and a glorious misfit.
The person who doesn’t fit into our healthcare system is the patient. The patient’s preferences don’t fit into the need to maximize revenue and do more procedures. The patient’s family doesn’t fit into the, ‘I want to do an eight-minute visit and get you out the door’ agenda. The patient asking questions doesn’t fit. That’s the change we need to make. It’s not that we need to change. Healthcare needs to change to fit the patient.
Shortly following this interview, Dr. Mostashari left HHS and is now a visiting fellow at the Engelberg Center for Health Care Reform at the Brookings Institution, where he aims to help clinicians improve care and patient health through health IT, focusing on small practices.
This interview was edited for length and readability.
You May Not Need Big Data After All
Companies are investing like crazy in data scientists, data warehouses, and data analytics software. But many of them don’t have much to show for their efforts. It’s possible they never will.
What’s the problem? To begin with, big data has been hyped so heavily that companies are expecting it to deliver more value than it actually can. In addition, analytics-generated insights can be easy to replicate: A financial services company we studied built a model based on an analysis of big data that identified the best place to locate an ATM, only to learn that consultants had already built similar models for several other banks. Moreover, turning insights from data analytics into competitive advantage requires changes that businesses may be incapable of making. One retailer, for example, learned that it could increase profits substantially by extending the time items were on the floor before and after discounting. But implementing that change would have required a complete redesign of the supply chain, which the retailer was reluctant to undertake.
The biggest reason that investments in big data fail to pay off, though, is that most companies don’t do a good job with the information they already have. They don’t know how to manage it, analyze it in ways that enhance their understanding, and then make changes in response to new insights. Companies don’t magically develop those competencies just because they’ve invested in high-end analytics tools. They first need to learn how to use the data already embedded in their core operating systems, much the way people must master arithmetic before they tackle algebra. Until a company learns how to use data and analysis to support its operating decisions, it will not be in a position to benefit from big data. (See the sidebar “Who Benefits from Big Data?”)
Big data is big business. The IT research firm Gartner estimates that total software, social media, and IT services spending related to big data and analytics topped $28 billion worldwide in 2012. All estimates predict rapid growth. In addition to vendors, at least three types of organizations are harvesting value from big data.
Companies with a tradition of fact-based decision making. Procter & Gamble and UPS are exemplars. In the 1920s P&G became the first company to make significant product and advertising decisions on the basis of detailed market research data laboriously gathered during door-to-door conversations with consumers. Today P&G uses computer modeling and simulation to analyze multiple data sources—comments collected from social media, consumer sales data, RFID data, and information from the company’s highly digitized processes—and makes fact-based decisions on a daily basis.
UPS started tracking the movements of its vehicles and packages in the 1980s. More recently, the company began using big data from telematics sensors installed in its vehicles together with mapping data and other real-time reports of drop-offs and pickups from its drivers. Using these data, UPS designs routes that, for example, minimize the number of left turns a driver must make to deliver a load. Such changes can generate big payoffs, because they are deployed with more than 100,000 drivers around the world. In 2011, guided by analysis of big data, UPS avoided adding more than 11,000 metric tons of CO2 to the atmosphere and saved $30 million in fuel costs.
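As a toy illustration of the idea, one can fold a left-turn penalty directly into a route's cost and search for the cheapest ordering of stops. The Python sketch below is not UPS's actual system; the coordinates, Manhattan-distance metric, and penalty weight are all invented.

```python
# Toy sketch of routing with a left-turn penalty, in the spirit of the
# UPS example. Stops, costs, and the penalty weight are all invented.
from itertools import permutations

STOPS = {"A": (0, 0), "B": (2, 0), "C": (2, 2), "D": (0, 2)}
DEPOT = (1, -1)

def turn_is_left(p, q, r):
    # Cross product of (q - p) and (r - q); positive means a left turn.
    return ((q[0]-p[0])*(r[1]-q[1]) - (q[1]-p[1])*(r[0]-q[0])) > 0

def route_cost(points, left_penalty=0.5):
    dist = sum(abs(a[0]-b[0]) + abs(a[1]-b[1])        # Manhattan distance
               for a, b in zip(points, points[1:]))
    lefts = sum(turn_is_left(p, q, r)
                for p, q, r in zip(points, points[1:], points[2:]))
    return dist + left_penalty * lefts

best = min(permutations(STOPS), key=lambda order:
           route_cost([DEPOT] + [STOPS[s] for s in order] + [DEPOT]))
print("Best stop order:", best)
```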
Engineering and research functions. Many engineering-based companies rely on analysis of big data to make critical operating decisions. For example, as long ago as the 1960s ExxonMobil invented 3-D seismic technology, which revolutionized how the oil and gas industry decided where to drill. Collecting and processing 3-D images of geologic formations beneath the earth’s surface provided more and better data for those decisions. Today the company’s scientists and engineers use 4-D analysis (which shows changes in a field over time) to further reduce the costs and risks of exploration. Researchers at pharmaceutical and biotech companies are also using big data and powerful processing to help drive business decisions.
The best web-native companies. Companies that connect with customers solely via the internet can capture enormous amounts of data about customer behavior. This is the perfect big-data opportunity for making fact-based decisions. One technique, which has become almost a governing ethos for Google, Amazon, Netflix, and eBay, is A/B testing, in which some users are diverted to a slightly different version of a web page that presents a new idea or product. The behavior of those users (B) is then compared with that of users on the existing page (A), and the results are often subjected to sophisticated statistical analysis. This technique transforms much product-development decision making from a subjective to an objective exercise. Product designers are often shocked to learn how bad their instincts and rules of thumb are. In a neat twist, Google and Amazon are now providing tools that help other companies follow the same approach.
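For readers curious what the “sophisticated statistical analysis” typically starts with, here is a minimal Python example: a two-proportion z-test comparing conversion rates on page A and variant B. The visitor and conversion counts are invented for illustration.

```python
# Minimal statistics behind an A/B test: compare conversion rates on
# page A vs. variant B with a two-proportion z-test.
from math import sqrt, erf

conv_a, n_a = 412, 10_000   # conversions and visitors on page A
conv_b, n_b = 480, 10_000   # conversions and visitors on variant B

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided

print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}  p = {p_value:.4f}")
```

With these made-up counts the variant wins at roughly p = 0.02, which is why large sites run such tests continuously rather than trusting designers' instincts.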
Over the past three years, we’ve conducted seven case studies and interviewed executives at 51 companies to understand how companies generate business value from data. We have found that those that consistently use data to guide their decision making are few and far between. The exceptions, companies that have what we call a culture of evidence-based decision making, have all seen improvements in their business performance—and they tend to be more profitable than companies that don’t have that kind of culture.
The digital economy is all about capturing, analyzing, and using information to serve customers. Most companies can significantly improve their business performance simply by focusing on how operating data can inform day-to-day decision making. So why don’t more companies make better use of data and analysis? One reason may be that their management practices haven’t caught up with their technology platforms. Companies that installed digital platforms—ERP and CRM systems, real-time data warehouses, and homegrown core information systems—over the past 10 to 15 years have not yet cashed in on the information those platforms make available. In addition, adopting evidence-based decision making is a difficult cultural shift: Work processes must be redefined, data must be scrubbed, and business rules must be established to guide people in their work. The good news is that once companies have made the cultural change, they usually don’t go back, and their operating improvements are not easily replicated by competitors.
Our research suggests that companies with a culture of evidence-based decision making ensure that all decision makers have performance data at their fingertips every day. They also follow four practices: They establish one undisputed source of performance data; they give decision makers at all levels near-real-time feedback; they consciously articulate their business rules and regularly update them in response to facts; and they provide high-quality coaching to employees who make decisions on a regular basis.
Before we explore those practices, let’s look at a company that has had a culture of evidence-based decision making since its founding.
Empowering Employees to Make Good Decisions
In the 1970s Southland Corporation, known for pioneering the concept of the convenience store chain with its 7-Eleven shops, divested its Japanese stores, and Seven-Eleven Japan was born. Toshifumi Suzuki, the first CEO, decided early on that the key to profitability for the company’s tiny stores would be rapid inventory turnover. So he placed responsibility for ordering—the single most important decision in the business—in the hands of the stores’ 200,000 mostly part-time salesclerks. Those employees, Suzuki believed, understood their customers and, with good information, could make the best decisions about what would sell quickly.
Article link: http://blogs.hbr.org/2013/10/indias-secret-to-low-cost-health-care/
A renowned Indian heart surgeon is currently building a 2,000-bed, internationally accredited “health city” in the Cayman Islands, a short flight from the U.S. Its services will include tertiary care procedures, such as open-heart surgery, angioplasty, knee or hip replacement, and neurosurgery for about 40% of U.S. prices. Patients will have the option of recuperating for a week or two in the Caymans before returning to the U.S.
At a time when health care costs in the United States threaten to bankrupt the federal government, U.S. hospitals would do well to take a leaf or two from the book of Indian doctors and hospitals that are treating conditions ranging from eye, heart, and kidney problems to maternity care, orthopedics, and cancer for 5% to 10% of U.S. costs by using practices commonly associated with mass production and lean production.
The nine Indian hospitals we studied are not cheap because their care is shoddy; in fact, most of them are accredited by the U.S.-based Joint Commission International or its Indian equivalent, the National Accreditation Board for Hospitals. Where available, data show that their medical outcomes are as good as or better than the average U.S. hospital.
The ultra-low-cost position of Indian hospitals may not seem surprising — after all, wages in India are significantly lower than in the U.S. However, the health care available in Indian hospitals is cheaper even when you adjust for wages: For example, even if Indian heart hospitals paid their doctors and staff U.S.-level salaries, their costs of open-heart surgery would still be one-fifth of those in the U.S.
When it comes to innovations in health care delivery, these Indian hospitals have surpassed the efforts of other top institutions around the world, as we discussed in our recent HBR article. Today, the U.S. spends $8,000 per capita on health care; if it adopted the practices of the Indian hospitals, the same results might be achievable for a whole lot less, saving the country hundreds of billions of dollars.
A key to this is that, faced with the constraints of extreme poverty and a severe shortage of resources, these Indian hospitals have had to operate more nimbly and creatively to serve the vast number of poor people in need of medical care in the subcontinent. And because Indians on average bear 60% to 70% of health care costs out of pocket, they must deliver value. Consequently, value-based competition is not a pipe dream but a reality in India.
Three major practices have allowed these Indian hospitals to cut costs while still improving their quality of care.
A Hub-and-Spoke Design
In order to reach the masses of people in need of care, Indian hospitals create hubs in major metro areas and open smaller clinics in more rural areas which feed patients to the main hospital, similar to the way that regional air routes feed passengers into major airline hubs.
This tightly coordinated web cuts costs by concentrating the most expensive equipment and expertise in the hub, rather than duplicating it in every village. It also creates specialists at the hubs who, while performing high volumes of focused procedures, develop the skills that will improve quality. By contrast, hospitals in the U.S. are spread out and uncoordinated, duplicating care in many places without high enough volume in any of them to provide the critical mass to make the procedures affordable. Similarly, an MRI machine might be used four to five times a day in the U.S. but 15 to 20 times a day in the Indian hospitals. As one CEO told us, “We have to make the equipment sweat!”
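The equipment economics here are simple amortization, as the sketch below shows. Only the scans-per-day figures come from the article; the daily fixed cost is a made-up placeholder.

```python
# Why "making the equipment sweat" matters: amortizing a fixed daily
# machine cost over more scans. The dollar figure is invented; the
# scans-per-day midpoints come from the article's 4-5 vs. 15-20 ranges.
fixed_cost_per_day = 3_000          # hypothetical ownership + staffing cost
for label, scans_per_day in [("U.S. hospital", 4.5), ("Indian hospital", 17.5)]:
    print(f"{label}: ${fixed_cost_per_day / scans_per_day:,.0f} per scan")
# Same fixed cost, roughly 4x the volume -> roughly one-quarter the
# per-scan equipment cost.
```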
U.S. hospitals have been developing similar structures, but there are still too many hubs and not enough spokes. Moreover, when hospitals consolidate, the motive often is to increase market power vis-à-vis insurance companies, rather than to lower costs by creating a hub-and-spoke structure.
Task Shifting
The Indian hospitals transfer responsibility for routine tasks to lower-skilled workers, leaving expert doctors to handle only the most complicated procedures. Again, necessity is the mother of invention; since India is dealing with a chronic shortage of highly skilled doctors, hospitals have had to make the most of the doctors they have. By focusing only on the most technical part of an operation, doctors at these hospitals have become incredibly productive — for example, performing up to five or six surgeries per hour instead of the one to two surgeries common in the U.S.
This innovation has also reduced costs. After shifting tasks from doctors to nurse practitioners and nurses, several hospitals have even created a lower tier of paramedic workers with two years’ training after high school to perform the most routine medical jobs. In one hospital, these workers comprise more than half of the workforce. Compare that to the U.S. system, where the first cost-cutting move is often to lay off support staff, shifting more mundane tasks such as billing and transcription onto doctors overqualified for those duties — precisely the wrong kind of task shifting.
Good, Old-Fashioned Frugality
There is a lot of waste in U.S. hospitals. You walk into a hospital in the U.S., and it looks like a five-star resort; half of the building has no relation to medical outcomes, and doctors are blissfully unaware of costs. By contrast, Indian hospitals are fanatical about wisely shepherding resources — for example, sterilizing and safely reusing many surgical products that are routinely discarded in the states after a single use. They have also developed local devices such as stents or intraocular lenses that cost one-tenth the price of imported devices.
These hospitals have also been innovative in compensating doctors. Instead of the fee-for-service model, which creates an incentive to perform unnecessary procedures and tests, doctors at some Indian hospitals are paid fixed salaries, regardless of how many tests they order. Other hospitals employ team-based compensation, which generates peer pressure to avoid unnecessary tests and procedures.
Innovation has flourished in the U.S. in the development of new pills, clinical procedures, devices, and medical equipment, but in the field of health care delivery, it appears to have been frozen in time. In too much of the U.S. system, health care is viewed as a craft and each patient as unique. But by applying principles of mass production and lean production to health care delivery, Indian doctors and hospitals may have discovered the best way to cut costs while still delivering high-quality care.
Follow the Leading Health Care Innovation insight center on Twitter @HBRhealth. E-mail us at healtheditors@hbr.org, and sign up to receive updates here.
The genetic-testing company’s real goal is to hoard your personal data
SA Forum is an invited essay from experts on topical issues in science and technology.
If there’s a gene for hubris, the 23andMe crew has certainly got it. Last Friday the U.S. Food and Drug Administration (FDA) ordered the genetic-testing company immediately to stop selling its flagship product, its $99 “Personal Genome Service” kit. In response, the company cooed that its “relationship with the FDA is extremely important to us” and continued hawking its wares as if nothing had happened. Although the agency is right to sound a warning about 23andMe, it’s doing so for the wrong reasons.
Since late 2007, 23andMe has been known for offering cut-rate genetic testing. Spit in a vial, send it in, and the company will look at thousands of regions in your DNA that are known to vary from human to human—and which are responsible for some of our traits. For example a site in your genome named rs4481887 can come in three varieties. If you happen to have what is known as the GG variant, there is a good probability that you are unable to smell asparagus in your urine; those blessed with the GA or AG varieties are much more likely to be repulsed by their own pee after having a few spears at Spargelfest.
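For readers who want to check their own variants, a lookup like the asparagus example can be run against a raw genotype export. The Python sketch below assumes the commonly described tab-separated layout (rsid, chromosome, position, genotype, with “#” comment lines); treat the file name and layout as assumptions rather than a documented format.

```python
# Sketch of looking up a single marker in a raw genotype export.
# Assumed layout (an assumption, not a documented spec):
#   rsid <tab> chromosome <tab> position <tab> genotype
# with '#' comment lines at the top of the file.
def lookup_genotype(path: str, rsid: str) -> str | None:
    with open(path) as f:
        for line in f:
            if line.startswith("#"):
                continue
            fields = line.rstrip("\n").split("\t")
            if fields[0] == rsid:
                return fields[3]
    return None

genotype = lookup_genotype("genome_export.txt", "rs4481887")
if genotype == "GG":
    print("Likely unable to smell asparagus metabolites in urine.")
elif genotype in ("GA", "AG"):
    print("Likely able to smell them.")
```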
At first, 23andMe seemed to angle its kit as a fun way to learn a little genetics using yourself as a test subject. (“Our goal is to connect you to the 23 paired volumes of your own genetic blueprint… bringing you personal insight into ancestry, genealogy, and inherited traits,” read the company’s website.) The FDA had little problem with the company telling you why you had dry ear wax (rs17822931) or whether you’re likely to sneeze when you look at a bright light (rs10427255).
That phase didn’t last for long, because there is much more interesting stuff in your genome than novelty items. Certain regions signal an increased risk of breast cancer, the impending onset of metabolic diseases, and sensitivity to medications. 23andMe—as well as a number of other companies—edged closer and closer to marketing their services as a way of predicting and even preventing health problems. And any kit intended to cure, mitigate, treat, prevent, or diagnose a disease is, according to federal law, a “medical device” that needs to be deemed safe and effective by the FDA. Since mid-2009, 23andMe has been negotiating with the agency, and in July 2012, the company finally began the process of getting clearance from the FDA to sell the kit that it had already been selling for five years.
Everything seemed rosy until, in what a veteran Forbes reporter calls “the single dumbest regulatory strategy [he had] seen in 13 years of covering the Food and Drug Administration,” 23andMe changed its strategy. It apparently blew through its FDA deadlines, effectively annulling the clearance process, and abruptly cut off contact with the agency in May. Adding insult to injury, the company started an aggressive advertising campaign (“Know more about your health!”), leaving little doubt about the underlying medical purpose of 23andMe’s Personal Genome Service. This left the agency with little alternative but to take action. “As part of our interactions with you, including more than 14 face-to-face and teleconference meetings, hundreds of email exchanges, and dozens of written communications,” the agency complained, “we provided you with… statistical advice, and discussed potential risk mitigation strategies.” It is the tone of a spurned spouse, exasperated and angry that 23andMe is putting no effort into salvaging their relationship.
But as the FDA frets about the accuracy of 23andMe’s tests, it is missing their true function, and consequently the agency has no clue about the real dangers they pose. The Personal Genome Service isn’t primarily intended to be a medical device. It is a mechanism meant to be a front end for a massive information-gathering operation against an unwitting public.
Sound paranoid? Consider the case of Google. (One of the founders of 23andMe, Anne Wojcicki, is presently married to Sergey Brin, a co-founder of Google.) When it first launched, Google billed itself as a faithful servant of the consumer, a company devoted only to building the best tool to help us satisfy our cravings for information on the web. And Google’s search engine did just that. But as we now know, the fundamental purpose of the company wasn’t to help us search, but to hoard information. Every search query entered into its computers is stored indefinitely. Joined with information gleaned from cookies that Google plants in our browsers, along with personally identifiable data that dribbles from our computer hardware and from our networks, and with the amazing volumes of information that we always seem willing to share with perfect strangers—even corporate ones—that data store has become Google’s real asset. By parceling out that information to help advertisers target you, with or without your consent, Google makes more than $10 billion every quarter.
What the search engine is to Google, the Personal Genome Service is to 23andMe. The company is not exactly hiding its ambitions. “The long game here is not to make money selling kits, although the kits are essential to get the base level data,” Patrick Chung, a 23andMe board member, told FastCompany last month. “Once you have the data, [the company] does actually become the Google of personalized health care.” The company has lowered the price of the kit again and again, most recently from $299 to a mere $99, practically making it a stocking-stuffer. All the better to induce volunteers to give 23andMe the data it so desperately wants. (Currently, the database contains the genetic information of some half a million people, a number Wojcicki reportedly wants to double by year end.)
What does 23andMe want to do with all that data? Right now the talk is all about medical research—and, in fact, the company is doing some interesting work. It has been sifting through its genomic database, which is combined with information that volunteers submit about themselves, to find possible genetic links to people’s traits. (The bright-light/sneeze genetic tag is a 23andMe discovery.) More promising are 23andMe’s attempts to recruit people who suffer from certain diseases, such as Parkinson’s and a few types of cancer. Simply through brute-force pattern matching, the company has a chance of finding genetic causes of these ailments, which could lead to a way to combat them. (And perhaps a blockbuster patent or three.)
That’s just the beginning, though. 23andMe reserves the right to use your personal information—including your genome—to inform you about events and to try to sell you products and services. There is a much more lucrative market waiting in the wings, too. One could easily imagine how insurance companies and pharmaceutical firms might be interested in getting their hands on your genetic information, the better to sell you products (or deny them to you). According to 23andMe’s privacy policy, that wouldn’t be an acceptable use of the database. Although 23andMe admits that it will share aggregate information about users’ genomes with third parties, it adamantly insists that it will not sell your personal genetic information without your explicit consent.
Patient portal technology enables patients to use a web-based connection to communicate with their healthcare provider. Forty percent of U.S. office-based physicians currently have a portal through their electronic health record or practice management system, according to research released by Frost & Sullivan in September 2013.
Although the technology can be utilized in a variety of ways, practices generally set up a portal to give patients the following capabilities:
- Schedule appointments
- Complete registration forms
- Send secure email to providers
- Access lab results
- Request prescription refills
- Create and maintain a personal health record (PHR)
- View educational materials
A potent driver of portal growth is that Stage 2 of meaningful use under the federal EHR Incentive Program requires “patients to electronically view, download and transmit electronic copies of their own medical records.” Physicians must engage at least 5 percent of their patients via an online portal under Stage 2.
Medical Practice Insider identified patient portal products listed on the Office of the National Coordinator for Health IT (ONC)’s Certified Health IT Product List (CHPL) that have been certified to 2014 Edition criteria under the EHR Incentive Program. We further filtered to obtain products categorized by ONC in the CHPL as being applicable to ambulatory practices and modular in nature (as opposed to complete EHR systems with an integrated portal component).
Details on those products appear in the comparison table below and on hyperlinked individual pages compiled by Medical Practice Insider. Use the available hyperlinks to navigate among the products you wish to analyze.
| Vendor | Product Name | Product Version | Certification Body | Notable Feature |
|---|---|---|---|---|
| eClinicalWorks | Healow Enterprise Patient Portal | 1.0 | CCHIT | PHR and patient profile can be securely transported and shared. [More] |
| InteliChart | InteliChart Patient Portal | 2.5 | ICSA Labs | Delivers connectivity for patients through one mobile app. [More] |
| Medfusion | Medfusion Patient Portal | 13.5 | CCHIT | Anytime, anywhere access to health information. [More] |
| Medical Office Technologies | ezAccess Patient Portal | 3.0 | CCHIT | Practice can do more than 100 workflow processes in the software. [More] |
| Medical Web Experts | Bridge Patient Portal | 1.1 | ICSA Labs | Increases patient engagement. [More] |
| NoMoreClipboard | NoMoreClipBoard Patient Portal | 2.8 | Drummond Group | Portal is branded to the practice, but also provides a PHR to the patient. [More] |
| Sophrona Solutions | Sophrona MU2 Portal Technology | 5.0 | Drummond Group | Patients view a calendar on the practice website and choose their appointment. [More] |
The solution to bad [iEHR] data quality isn’t more advanced technology; it’s better management – HBR
Data’s Credibility Problem
As a rising product-management executive prepares for an important presentation to her firm’s senior team, she notices that something looks off in the market share numbers. She immediately asks her assistant to verify the figures. He digs in and finds an error in the data supplied by the market research department, and the executive makes the necessary corrections. Disaster averted! The presentation goes very well, and the executive is so delighted that she makes an on-the-spot award to her assistant. She concludes, “You know, we should make it a policy to double-check these numbers every time.” No one thinks to inform the people in Market Research of the error, much less work with the group to make sure that the proper data is supplied the next time.
I’ve seen such vignettes play out in dozens of companies in my career as a data doctor. In telecommunications, the maintenance department might have to correct bad addresses inputted by Customer Service; in financial services, Risk Management might have to accommodate incorrect loan-origination details; in health care, physicians must work to improve patient outcomes in the face of incomplete clinical data. Indeed, data quality problems plague every department, in every industry, at every level, and for every type of information.
Much like our rising executive, employees routinely work around or correct the vast majority of these errors as they go about their daily work. But the costs are enormous. Studies show that knowledge workers waste up to 50% of their time hunting for data, identifying and correcting errors, and seeking confirmatory sources for data they do not trust.
And consider the impact of the many errors that do leak through: An incorrect laboratory measurement in a hospital can kill a patient. An unclear product spec can add millions of dollars in manufacturing costs. An inaccurate financial report can turn even the best investment sour. The reputational consequences of such errors can be severe—witness the firestorm that erupted over problems with Apple Maps in the fall of 2012.
When crude oil is thick, one of the major costs of working an oil field is steam-heating the crude in the ground to make the oil easier to pump. To figure out how much steam is needed, field technicians point an infrared gun at the flow line, take a reading, and send the data to the reservoir engineer. On the basis of those data, the engineer determines the right amount of steam and instructs field technicians to make any adjustments.
But the flow line can get dirty, which insulates the line and causes readings to be as much as 20°C lower than the true level. A dirty flow line means dirty data. This was a big problem at one oil company, whose field technicians had no idea how inaccurate their readings were—or that bad readings routinely caused reservoir engineers to use more steam than necessary, jacking up operational expenses by tens of millions of dollars.
This story is all too typical of data quality problems that plague every industry. Yet the solution is usually quite simple: Make sure that the employees involved in creating the data understand the problem. Once managers at the oil company specified that employees had to clean the flow lines, the errors stopped.
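To see how a 20°C bias turns into wasted money, consider this toy Python model. The linear steam-versus-temperature rule and every number except the 20°C bias are invented; the point is only how the measurement error propagates into operating cost.

```python
# Toy model of how a biased reading propagates into excess steam.
# The linear controller and all numbers except the 20 C bias are invented.
def steam_rate(temp_c, target_c=90.0, kg_per_degree=40.0):
    """Hypothetical controller: steam scales with the shortfall to target."""
    return max(0.0, (target_c - temp_c) * kg_per_degree)

true_temp = 80.0
measured = true_temp - 20.0          # dirty flow line reads 20 C low

needed = steam_rate(true_temp)       # what the field actually requires
injected = steam_rate(measured)      # what the engineer orders
print(f"Needed {needed:.0f} kg/h, injected {injected:.0f} kg/h "
      f"-- {injected - needed:.0f} kg/h of pure waste")
```

Once the flow lines are cleaned, the measured and true temperatures coincide and the excess steam disappears, which is exactly the fix the oil company made.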
When data are unreliable, managers quickly lose faith in them and fall back on their intuition to make decisions, steer their companies, and implement strategy. They are, for example, much more apt to reject important, counterintuitive implications that emerge from big data analyses.
Fifty years after the expression “garbage in, garbage out” was coined, we still struggle with data quality. But I believe that fixing the problem is not as hard as many might think. The solution is not better technology: It’s better communication between the creators of data and the data users; a focus on looking forward; and, above all, a shift in responsibility for data quality away from IT folks, who don’t own the business processes that create the data, and into the hands of managers, who are highly invested in getting the data right.
Connect Data Creators with Data Customers
From a quality perspective, only two moments matter in a piece of data’s lifetime: the moment it is created and the moment it is used. The quality of data is fixed at the moment of creation. But we don’t actually judge that quality until the moment of use. If the quality is deemed to be poor, people typically react by working around the data or correcting errors themselves.
But improving data quality isn’t about heroically fixing someone else’s bad data. It is about getting the creators of data to partner with the users—their “customers”—so that they can identify the root causes of errors and come up with ways to improve quality going forward. Recall our rising executive. By correcting the error herself rather than informing Market Research, she left others to be victimized by the same bad data coming from the department. She also took it upon herself to adjust the numbers even though she was far less qualified to do so than the creators of the data.