On Monday, the U.S. Department of Health and Human Services (HHS) released its final rule for the 2019 plan year. This annual rule, called the Notice of Benefit and Payment Parameters, typically includes a broad range of changes that HHS intends to make for the next plan year for the Affordable Care Act’s (ACA) marketplaces and insurance market reforms. This is the first such rule of the Trump administration; the current 2018 plan year is governed by rules issued at the end of the Obama administration.
The rule makes a number of changes to the marketplaces that reflect President Trump’s Inauguration Day executive order that instructed federal agencies to relieve fiscal and regulatory burdens on states, individuals, and industry. HHS also notes the changes in the rule are intended to significantly expand the role of the states in the administration of the ACA, including providing states more flexibility in several areas such as benefits covered by plans, certification of plans, and the operation of “navigator” programs that help people enroll.
Although the stated goal is simplification, in at least one key way the rule is likely to add more complexity to consumers’ health plan choices in 2019. At the same time, it leaves people with less help to sort through that complexity. Not only does the rule eliminate “simple choice” plans offered on the marketplaces, it also reduces the number of navigators available to help people select plans.
What Are “Simple Choice” Plans?
Health insurers selling plans in the individual market are required to sell plans at four distinct levels that signal the share of costs a plan covers on average, known as its actuarial value: bronze, silver, gold, and platinum. Along with the requirement that all plans in a given state must offer the same package of essential health benefits, this stratification greatly improved the ability of consumers to make apples-to-apples comparisons among plans. By contrast, on the pre-ACA individual market, it was often difficult for consumers to even know what a plan covered before they purchased it.
Still, competing insurance companies use a wide variety of combinations of deductibles, out-of-pocket limits, copayments and coinsurance, and deductible exclusions (services that the plan covers before the deductible is met) to arrive at the same average actuarial value for all enrollees in a plan. In a study of marketplace health plans, the Commonwealth Fund found that these different combinations can ultimately add up to very different costs for people enrolled in the plans, depending on how much and the types of health care services they end up needing in a given year.
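A toy sketch makes the point concrete. All of the plan parameters below are hypothetical, not actual marketplace designs; the sketch only illustrates how two plans with the same out-of-pocket ceiling can charge the same enrollee very different amounts depending on how much care is used in a year.

```python
def out_of_pocket(spending, deductible, coinsurance, oop_limit):
    """Patient cost under a simple deductible-plus-coinsurance design."""
    if spending <= deductible:
        cost = spending
    else:
        cost = deductible + coinsurance * (spending - deductible)
    return min(cost, oop_limit)

# Hypothetical designs: Plan A pairs a low deductible with high coinsurance;
# Plan B pairs a high deductible with low coinsurance.
plan_a = dict(deductible=1000, coinsurance=0.40, oop_limit=7000)
plan_b = dict(deductible=4000, coinsurance=0.10, oop_limit=7000)

for spending in (500, 3000, 20000):
    a = out_of_pocket(spending, **plan_a)
    b = out_of_pocket(spending, **plan_b)
    print(f"${spending:>6} in care -> Plan A: ${a:,.0f}, Plan B: ${b:,.0f}")
```

A light user of care does better under Plan A, a moderate user does better under Plan A too, but a heavy user does better under Plan B, even though both plans look similar at a glance.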
Consumers understandably are often confused. While the federal marketplace website HealthCare.gov and state-run marketplaces feature a cost comparison tool that allows consumers to compare plans based on their potential out-of-pocket costs, having to compare a large number of plans can be overwhelming. Commonwealth Fund survey data have shown that many consumers continue to experience difficulty comparing plans on the basis of out-of-pocket costs.
In order to make choosing a plan less daunting, HHS created a set of “simple choice” plans for plan years 2017 and 2018. These are standardized health plans that insurers may elect to sell and that HHS displays prominently on HealthCare.gov so consumers can identify them easily. The plans, which are offered at the bronze, silver, and gold levels, have fixed deductibles, out-of-pocket limits, and copayments or coinsurance for health care services. In addition, they provide pre-deductible coverage for seven services and prescription drugs in most states, and for as many as 10 in states with state-specific cost-sharing limits.
The rationale behind the plans was twofold. The first was to simplify the shopping experience for consumers. Comparing two plans offered by two different insurance companies is far simpler if the cost-sharing is the same: consumers can focus instead on differences in premiums and provider networks, for example. Second, the standardized plans also include beneficial features (such as pre-deductible services) that might not otherwise be available in some markets. A broader goal of standardization is to encourage insurers to compete for enrollees on the basis of the value of the health care their plans provide.
In its rule, HHS claims that dropping simple choice plans is consistent with promoting free market principles in the individual market and that the standardized plans stifle innovation. But insurers always had the option to decide whether to offer the plans, and could therefore offer plans with innovative designs.
Changes for Consumers in 2019, but Less Help
HHS’s decision not to feature simple choice plans in 2019 may be confusing to consumers who had enrolled in the plans in the past two years. In addition, the rule eliminates the requirement that insurers offer plans that are meaningfully different from one another, which will likely lead to more plans and choices in some markets. Moreover, consumers in some states may have fewer people to help them sort through their plan choices. In the rule, HHS reduces the number of navigators states are required to make available to consumers from two to one. In addition, the rule no longer requires that navigators maintain a physical presence in the state in order to provide face-to-face assistance to consumers. These changes come on the heels of a deep cut in federal funding for advertising and the navigator program during last year’s open enrollment period.
State-Based Marketplaces Are Simplifying Plan Choices
Several state-based marketplaces offer standardized health plans and won’t be affected by the new rule. California’s marketplace, Covered California, in particular has been a leader in simplifying plan options for consumers and driving its individual market toward higher-value health plans. It only allows the sale of plans that meet standardized designs set by the marketplace, which it refers to as patient-centered benefit designs.
Simple choice plans were an innovative attempt to improve the ability of consumers to make decisions about health plans that have consequences for their ability to maintain their health. While HHS has retreated from its experiment with standard plans, states that run their own marketplaces can continue to innovate.
Article link: http://www.commonwealthfund.org/publications/blog/2018/apr/the-trump-administrations-new-marketplace-rules
How a learning and research center uses simulation games to help caregivers work better together


“The tension in the army tent is palpable. The medical officer assesses victims swiftly while the military nurse and the military aide are taking care of the many wounded, stopping bleeding, intubating victims and stabilizing vital signs. Their helmets and bulletproof vests are heavy and keep getting in the way of their movements. Flashes and vibrations indicate yet another wave of shells has hit the ground a few meters away. The smell of burnt flesh is almost unbearable but the team’s focus on the wounded is unwavering.”

Quite dramatic, but fortunately it is not real. This is just one of the many scenarios developed at the CESIM healthcare simulation center in Brest, France, to support the training of healthcare professionals. “Managing complex care decisions is not something you learn in books,” says Prof. Erwan L’Her, the center’s director.
“Medical students first learn procedures such as suturing a wound or putting on a plaster cast,” Prof. L’Her continues. “After six years at medical school, the training shifts from the practical to the more relational. We help students learn how to handle the patient-caregiver relationship.”
The center employs professional actors to recreate care situations. A medical student sits in an examination room with a “patient,” leading a consultation the way they normally would. From examining the patient and asking the right questions to writing a prescription, every step of the process is enacted and analyzed. It is also an opportunity for students to be confronted with difficult situations.
“There is a session in which the student has to inform the ‘patient’ that he has cancer,” explains Prof. L’Her. “Delivering bad news is part of the life of any healthcare professional. It is not easy to do. Learning how to handle such a situation is essential.”
The hands-on experience isn’t restricted to future caregivers alone; the center also provides training for more experienced healthcare professionals.
“We recently trained the entire anesthesia team of a hospital. The nurses, residents and doctors came to perform a procedure in our staged operating room on a dummy. Our focus was not so much on the procedure, but on managing the various incidents that can happen and on improving the team’s communication.”

Prof. L’Her and his team create immersive environments that are as realistic as possible by leveraging visual, light, and sound effects.
Healthcare professionals will experience an environment that mimics their day-to-day work.
“For an immersive sequence in an operating room or intensive care unit, we use real respiratory equipment and complex procedures,” Prof. L’Her notes. “The challenge will be on team dynamics. It is really situational, about relationships, communications, crisis management and team management.” The format and scenario of every simulation session are developed for a specific objective and a predetermined audience.
Groups of about 20 people usually spend two days at the center and role-play six to eight scenarios. A couple of participants are immersed in a lifelike scenario with either a dummy or an actor portraying a patient, while the others watch what’s happening on a screen in the room next door.
“The debriefing is very important and lasts about 45 minutes. We review the entire session from the perspectives of both the trainees and the observers,” Prof. L’Her adds. “The real value comes from bringing the two groups together. The impressions of those in action are always very different from those of the people who were comfortably watching from the other room.”
“These simulation games have a real impact on trainees’ learning curve,” says Prof. L’Her. A recent study¹ has shown that competency in cardiac echography can be achieved with fewer than half the number of supervised studies (from an average of 32.5 to 13.6) when computerized echocardiographic simulators are introduced into students’ training.
“A few countries have introduced simulation programs for students to learn medical procedures, but simulation for relational training and continuing education is not widely used, even though it’s a real opportunity for caregivers to improve their practice,” he concludes.
1 “The use of computerized echocardiographic simulation improves the learning curve for transesophageal hemodynamic assessment in critically ill patients.”
Gwénaël Prat, Cyril Charron, Xavier Repesse, Pierre Coriat, Pierre Bailly, Erwan L’Her, Antoine Vieillard-Baron. Ann Intensive Care. 2016; 6: 27. Published online 2016 Apr 7. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4824699/
Article link: http://newsroom.gehealthcare.com/roleplaying-for-healthcare-training-game-on-and-let-the-learning-begin/
By Darius Tahir | 05/03/2018 10:00 AM EDT
With help from Arthur Allen (@arthurallen202), Paul Demko (@pauldemko) and Mohana Ravindranath (@ravindranize)
CERNER SLOGGING: The first quarter of 2018 was not a kind one for Cerner. The company’s net earnings declined by $13 million compared with the same period in 2017, the firm announced Wednesday. The reasons are not exactly mysterious: Cerner’s president, Zane Burke, acknowledged during the earnings call that the delay in the VA contract contributed to the decline. What’s more, the company doesn’t expect the deal to get signed until the latter half of the year.
Cerner has become snarled by politics. As our colleague Arthur Allen reported Sunday, a West Palm Beach doctor using an outdated version of Cerner’s software has gotten a bee in his bonnet – or perhaps a wasp down his white coat – about the software system. Because he’s part of the Mar-a-Lago social crowd, he’s gotten the ear of Marvel executive Ike Perlmutter, who has in turn pulled political strings with his friends in the Trump administration. That’s delayed the contract.
The delay in the deal has now riled up Congressional Democrats. On Wednesday, leading Democrats on the House Veterans Affairs Committee sent a letter based on POLITICO’s reporting to acting VA secretary Robert Wilkie and the department’s inspector general asking them to explain the reasons for the delays. The letter does not state a deadline for their request.
– Meanwhile…: Former Rep. Jeff Miller is one of the leaders in the clubhouse for the Veterans Affairs secretary nomination, the Washington Post reported, citing a Wednesday meeting between the former congressman and the White House’s vetting office. It’s by no means a certainty, but something to keep an eye on.
As a congressman, Miller was a proponent of expanding private sector options for veterans. While we haven’t been able to find Miller taking a firm position on the VistA switchover, he was highly critical of the Obama-era VA’s IT record in this FCW article.
Overview
Life is moving faster and faster. Just about everything—transportation, weapons, the flow of information—is accelerating. How will decisionmakers preserve our personal and national security in the face of hyperspeed?
Topline
An American consulate is surrounded. At 1:06 p.m. local time, three trucks arrive and a dozen men gripping matte black assault rifles stream out of the vehicles. There might have been an explosion in the compound.
Thousands of miles away, a member of the National Security Council is checking email between meetings. Just ten minutes after the incident, updates and questions start pouring in.
How should the United States respond?
News media has the story. It’s trending on Twitter. How should the United States respond?
This kind of pressure-cooker scenario is not uncommon. As the velocity of information—and just about everything else—accelerates, leaders face immense pressure to act or respond quickly. To help them adapt, researchers at the RAND Corporation are studying the phenomenon of speed as part of a special project, known as Security 2040, which looks over the horizon to evaluate future threats.
In his former roles as Deputy Secretary of State and Deputy National Security Adviser, Antony Blinken was one of those leaders responding to speed-driven crises.
U.S. Deputy National Security Advisor Tony Blinken speaks on Syria at the White House in Washington, DC, September 9, 2013. Photo by Kevin Lamarque/Reuters.
“The single biggest change that I experienced in almost 25 years . . . was in the area of speed,” said Blinken, a RAND adjunct researcher who shared his expertise on the project. “Nothing had a more profound effect on government and the challenges of government.”
Blinken is concerned about things moving faster physically and virtually and the resulting pressure placed on government leaders. “It’s going to increase the probability of bad decisions.”
With Speed Comes New Threats
Almost every aspect of our lives is accelerating—communication, travel, financial transactions, cultural change, even evolution itself through new gene-editing technologies. These changes have benefits. But in many cases, there is a paradox: with progress comes risk.
Three-dimensional (3D) printing, for example, could drastically reduce the time needed to deliver lifesaving materials to communities after a disaster. But the same technology could also increase the risk of weapons proliferation.
“With speed, we’re anticipating risks that come from tech like 3D printing and virtual threats from cyberattacks,” said Seifu Chonde, an associate operations researcher at RAND and one of the lead authors of a new paper, Speed and Security: Promises, Perils, and Paradoxes of Accelerating Everything. “But speed can also lead to new, diverse threats seemingly overnight.”
Chonde—along with research partner and RAND associate social scientist Kathryn “Casey” Bouskill—predicts that this acceleration will continue into the year 2040 and beyond. It can’t be stopped, they say, but we can prepare for the changes it might bring. If governments, private-sector leaders, and the public start talking about speed now, there could be ways to mitigate its negative effects.

Stress Testing Security Norms
When it comes to national security, we feel the effects of speed in many ways. It compresses the time that leaders have to respond to threats. It disrupts traditional, more-predictable patterns of conflict escalation. It enables bad actors to attack democracies in novel ways—such as using bots to spread propaganda quickly. Speed is also transforming the weapons that adversaries will have in their arsenals.
During his time in government, Blinken regularly received notice of urgent security situations. News and updates poured in via email, phone call, and text. Often, as soon as a threat was reported, another one would emerge. Separating fact from fiction proved challenging.
He recalls a handful of violent attacks that were immediately labeled as acts of terrorism but turned out to be something else. In such circumstances, decisive action can be necessary—but pausing before acting is essential to getting the facts straight and understanding the second- and third-order consequences of potential actions.
“There is incredible pressure to be reactive and responsive to information that’s coming in at any given moment,” he said. “But the biggest risk is that, in your haste to react or to show strength, you get it wrong.”
In the past, measured diplomatic discussions helped decisionmakers understand and manage conflict. Now, tweets can fan the flames of conflict in an instant.
Weapons are accelerating, too. Hypersonic missiles and planes move at five times the speed of sound. Other threats, such as cyberweapons, can be deployed with the click of a mouse. Blinken likens these weapons to being “a little bit like Dr. Strangelove on steroids.” Even if such an attack is detected, officials have just minutes—or seconds—to respond.
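Some back-of-the-envelope arithmetic (round numbers assumed here, not actual weapon specifications or flight profiles) shows why the decision window shrinks to minutes at Mach 5:

```python
# Rough sketch: minutes of warning a hypersonic weapon would allow.
# Assumes Mach 1 is roughly 343 m/s (the sea-level speed of sound);
# real hypersonic trajectories and speeds vary considerably.

SPEED_OF_SOUND_M_S = 343

def warning_minutes(distance_km, mach):
    """Flight time, in minutes, to cover distance_km at the given Mach number."""
    speed_m_s = mach * SPEED_OF_SOUND_M_S
    return (distance_km * 1000) / speed_m_s / 60

for distance_km in (500, 1000, 3000):
    t = warning_minutes(distance_km, mach=5)
    print(f"{distance_km:>5} km at Mach 5 -> ~{t:.0f} minutes of warning")
```

Even at a distance of several thousand kilometers, the entire detect-decide-respond cycle has to fit inside roughly half an hour; at regional ranges it compresses to single-digit minutes.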
Leaders must prepare for these emerging threats, but speed has already landed a few blows on American democracy. The rapid growth of social media and online news platforms has accelerated the spread of online propaganda and misinformation. That plays a role in what RAND calls “Truth Decay,” a phenomenon marked by the diminishing role of facts and analysis in public life.
Adapting to Life in the “Infosphere”

Woman interacting with the infosphere. Photo by PeopleImages/Getty Images.
Everyone—not just government leaders or members of the National Security Council—is influenced by a ramped-up pace and an increasing volume of information. Luciano Floridi is a professor of philosophy and ethics of information, and director of the digital ethics lab at the University of Oxford. He views people and their relationship with information as akin to organisms in a biosphere: humans interact constantly with data as part of an “infosphere,” a world that is increasingly digital.
Floridi, who examines many of the questions that RAND is asking in Speed and Security, sees humans as slowly adapting to life in the infosphere. Often, we fail, because the digital world is absorbing so much of our energy and mental space. “It’s getting harder for people to resist the temptation of speed first, direction next,” he added. Put another way, speed is forcing humans to put too much emphasis on tactics, rather than strategy.
Countering Speed with Critical Thinking
As we go faster and faster, the opportunities to take control are dwindling. But they’re not all lost. “It’s not about hitting the brakes,” Bouskill said. “It’s about striking the right tempo.”
The researchers say it’s time to start working toward that tempo. Bouskill added: “For now, it may be enough to say that democracies need to think very critically about speed and how emerging technologies could fundamentally change the structure of governance.” In the future, the speed born of new technologies will create unprecedented social and ethical dilemmas. How will speed affect international power relationships? Who will get access to new technologies? Who is likely to be left behind? How can we prevent or manage the disenchantment of those people?
To help answer these questions, the research team recommends a comprehensive approach—one that includes discussions among consumers, employers, technology developers, and policymakers. They also emphasize the importance of creativity and social-mindedness when it comes to devising solutions.

Harnessing speed requires a long-term strategy. But what about the security decisions that have to be made right now—and every day over the next 10 or 20 years?
For now, it’s up to us. When a consulate is surrounded by armed men and information gushes into the Situation Room, humans will have to play the role of circuit breakers on speed.
“It comes down to people, their knowledge, judgment, and reflexes,” Blinken said. “There’s no substitute for that.”
Project Credits
- Story Deanna Lee
- Design and Production Chara Williams and Matthew Clews
Article link: https://www.rand.org/blog/articles/2018/05/can-humans-survive-a-faster-future.html
Health Care and High Reliability: A Cautionary Tale from The Joint Commission on Vimeo.
See also (“The Best Medical Care in the World” – NEJM https://www.nejm.org/doi/full/10.1056/NEJMms1802026 )
Health care and high reliability: a cautionary tale
Tuesday, August 21, 2012
Joint Commission President Mark R. Chassin, M.D., FACP, M.P.P., M.P.H., discusses how far away health care is from high reliability and The Joint Commission’s efforts to accelerate high reliability.
5th International High Reliability Organizing Conference
Oakbrook Terrace, Illinois
May 21, 2012
Article link: https://www.jointcommission.org/multimedia/dr-chassin-keynote-hro-conference/
Article link: https://www.nejm.org/doi/full/10.1056/NEJMms1802026
High prescription drug costs are a persistent problem in the United States. The U.S. spends nearly double per person what other high-income countries do — a big reason we spend so much more on health care overall than the rest of the world.
In three new Health Affairs papers supported by the Commonwealth Fund and The Drug Pricing Lab of Memorial Sloan Kettering, policy experts explore ways to get pharmaceutical prices under control. By increasing competition in drug markets, applying value-based purchasing, and protecting patients from high out-of-pocket costs, they believe policymakers can begin to make affordable prescription drugs a reality.
One leading driver of high drug costs is inadequate competition in the pharmaceutical industry. Potential competitors face high start-up costs and extensive government approval requirements, while entrenched manufacturers take advantage of regulatory loopholes to extend patent protections. These market dynamics keep potential competitors out, leading to higher prices. And this price inflation is exacerbated by a regulatory environment that encourages more generous employer-provided health insurance plans and publicly sponsored coverage for high-priced drugs.
The first two papers — one by Jonathan J. Darrow and Aaron S. Kesselheim, M.D., and the other by Steven D. Pearson, M.D., Len Nichols, and Amitabh Chandra — propose five pathways to increased competition, both between brand-name manufacturers and through lower-cost generic drugs:
- Government agencies could make more information available about a drug’s value to encourage patients and physicians to choose drugs that provide the most clinical benefit for the money.
- Pharmacists could have expanded ability to substitute prescribed medications with less expensive drugs that are chemically similar and just as effective.
- The Food and Drug Administration could expedite the approval process for generic drugs and other potential competitors to existing therapies.
- Policymakers could allow the importation of generic drugs from other countries, particularly during periods of drug shortages in the U.S.
- Regulators could crack down on frivolous patents that protect brand-name drugs from generic competition.
Beyond increasing competition, Darrow and Kesselheim argue for empowering government insurance programs like Medicare and Medicaid to exclude coverage of low-value drugs altogether. Pearson, Nichols, and Chandra further propose allowing the government to make patent protection and other forms of market exclusivity for drugs contingent on manufacturers fairly pricing their products. Under such a system, higher-value drugs would be rewarded with longer periods of protection from competitors.
Pearson and colleagues propose four additional ways to pay for drugs based on their value. First, the federal government could make greater use of comparative effectiveness research to negotiate the prices it pays to drug manufacturers — a strategy all other industrialized countries follow. Federal and state governments also could assess penalties on drug manufacturers that fail to provide good value. For instance, when New York exceeds a predetermined cap for Medicaid drug spending, state law allows it to assign value-based price targets for key drugs. New York can then demand supplemental rebates from manufacturers to reach these targets and can assess penalties against manufacturers that don’t negotiate a satisfactory rebate.
Insurers can be part of the solution as well. For instance, they could pay different rates for drugs used to treat multiple conditions, based on the relative benefits of each treatment. And insurers could negotiate outcomes-based agreements with manufacturers, which would be required to provide a larger rebate or a full refund when a drug doesn’t work as advertised.
Promoting competition and value-based purchasing would help reduce the prices we pay for prescription drugs. But high prescription drug costs aren’t just a financial burden: they are also detrimental to patients’ health. In the third of the Health Affairs papers, Stacie Dusetzina, Juliette Cubanski, Diane Rowland, and Scott Ramsey, M.D., demonstrate how high prescription drug prices, coupled with a shift toward higher cost-sharing by health insurers, interfere with the care of patients who need expensive specialty drugs. Both commercial insurers and Medicare Part D have gravitated away from copayments (where the patient pays a flat dollar amount per prescription) and toward both coinsurance (where the patient pays a percentage of the drug’s total price) and higher deductibles. This leaves patients more exposed to rising drug prices. When out-of-pocket costs become unaffordable, patients may stop taking the medicine they need.
The authors propose several ways to avoid cost-related disruptions in patient care. For one, insurers could charge copayments for specialty drugs, rather than coinsurance. Coinsurance is intended to make patients more cost sensitive, so they’ll shop around for better-value drugs. But patients can’t shop for better deals when only one manufacturer sells each specialty drug, as is typically the case. Coinsurance, therefore, serves little economic purpose in this market.
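A toy calculation (with made-up prices and cost-sharing terms, not figures from the papers) illustrates the difference: a flat copayment shields the patient from list-price growth, while coinsurance passes every price increase straight through.

```python
def patient_cost(list_price, copay=None, coinsurance=None):
    """Patient's out-of-pocket cost for one fill under either design."""
    if copay is not None:
        return copay                   # flat dollar amount per prescription
    return coinsurance * list_price    # percentage of the drug's list price

# Hypothetical specialty-drug list prices per fill as the price rises over time
for price in (3000, 6000, 12000):
    flat = patient_cost(price, copay=150)
    pct = patient_cost(price, coinsurance=0.30)
    print(f"list price ${price:>6}: $150 copay -> ${flat}, 30% coinsurance -> ${pct:,.0f}")
```

Under the copay the patient pays the same $150 no matter what the manufacturer charges; under 30% coinsurance, a doubling of the list price doubles the patient’s bill.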
Insurers also could give patients incentives to choose high-value drugs. For example, the most cost-effective drugs could be exempted from deductibles or subjected to lower cost-sharing. Policymakers, meanwhile, could impose a limit on how much Medicare beneficiaries with Part D plans must spend out of pocket on prescription drugs.
These strategies could help make prescription drugs more affordable for patients, while also holding down costs for the health care system as a whole. And that would go a long way toward bringing U.S. prescription drug costs — and health care costs overall — closer to those paid by the rest of the world.
Written with editorial assistance from Joel Dodge.
EDWARD CARVALHO-MONAGHAN
In some ways, Synthego looks like any other Silicon Valley startup. Inside its beige business park facilities, a five-minute drive from Facebook HQ, rows of nondescript black server racks whir and blink and vent. But inside the metal shelving, the company isn’t pushing around ones and zeros to keep the internet running. It’s making molecules to rewrite the code of life.
Crispr, the powerful gene-editing tool, is revolutionizing the speed and scope with which scientists can modify the DNA of organisms, including human cells. So many people want to use it—from academic researchers to agtech companies to biopharma firms—that new companies are popping up to meet the demand. Companies like Synthego, which is using a combination of software engineering and hardware automation to become the Amazon of genome engineering. And Inscripta, which wants to be the Apple. And Twist Bioscience, which could be the Intel.
All these analogies to the computing industry are more than just wordplay. Crispr is making biology more programmable than ever before. And the biotech execs staking their claims in Crispr’s backend systems have read their Silicon Valley history. They’re betting biology will be the next great computing platform, DNA will be the code that runs it, and Crispr will be the programming language.
Synthego was founded by brothers Paul and Michael Dabrowski, software engineers fresh off a tour working for Elon Musk’s SpaceX. They aren’t biologists. But in Crispr they saw a unique opportunity to take the principles of agile design they learned building rockets and apply them to making gene-editing tools. Their first order of business was using miniaturization and automation to drastically speed up research and product development. They started by packing an airplane hangar’s worth of intelligent instrumentation into machines stacked in those server racks. Each one orchestrates a biochemical ballet, transforming a string of in silico instructions into the company’s first product: a custom Crispr kit.
To order a kit, scientists log on to Synthego’s design portal and pick out one of the roughly 5,000 organisms they might want to edit from Synthego’s genome library—everything from E. coli to Homo sapiens—and the gene they want to knock out. The company’s predictive software then kicks out a couple of optimized options for synthetic guide RNAs—the genetic guides that get Crispr’s DNA-cutting proteins where they need to go. After the order is completed, software directs compressors and pumps to push chemical reagents into the rows of instruments, mixing the fluids and catalyzing the 100,000 reactions needed to create a single batch of kits. Within a week, a delivery shows up at the lab’s doorstep: everything a lab tech needs to begin manipulating the genome of a lab rat or zebrafish or dish of HeLa cells. They simply add their Crispr protein and start injecting.
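As a toy illustration of one step such design software has to perform (the DNA sequence below is made up, and real tools also score candidates for off-target risk, GC content, and more): a Cas9 guide RNA typically targets a 20-nucleotide stretch that must sit immediately upstream of an “NGG” PAM motif on the target DNA, so a first pass is simply scanning the sequence for those motifs.

```python
import re

def guide_candidates(dna):
    """Return (guide, position) pairs for every 20-nt stretch
    immediately followed by an NGG PAM (N = any nucleotide)."""
    guides = []
    # Lookahead keeps overlapping candidates; group(1) is the 20-nt guide.
    for m in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", dna):
        guides.append((m.group(1), m.start()))
    return guides

# Made-up target sequence for illustration only
target = "ATGCGTACCGTTAGCTAGGCTTACGATCGGATCCTAGCTAGGCGTACGT"
for guide, pos in guide_candidates(target):
    print(f"guide {guide} at position {pos}")
```

Real design software then ranks the surviving candidates, which is why the portal can hand back “a couple of optimized options” rather than every match.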
“Being able to do that in a parallel way is the novel part,” says Paul Dabrowski, who estimates that Synthego cuts down the time it takes a scientist to perform gene edits from several months to just one. Those services are in high demand. While coy about exact numbers, Dabrowski says Synthego is putting out hundreds to thousands of kits a day. But there will soon be more, and Synthego is ramping up accordingly. They doubled their space last year, and this year they’re doubling it again.
But they’re not just adding more rows of server rack-mounted machines. They’re also adding services. Starting later this month scientists will be able to order custom-Crispr’d human cell lines, an important tool for people making potentially life-saving medicines. “We’ve got hundreds of thousands of brilliant PhDs and bench researchers who have to spend up to 50 percent of their time just taking care of these cells,” says Dabrowski. Instead, he wants them to be able to go from designing an experiment to doing it in just a few clicks.
This is all well and good for academic researchers using Crispr for basic science. But companies that want to use Synthego’s tools to speed up product development still have to pay steep licensing fees to the Crispr companies that own the foundational Crispr/Cas9 IP—patents that remain in legal limbo. According to Inscripta CEO Kevin Ness, that technology bottleneck is creating financial barriers to innovation. That’s why his company’s first move was to release a different gene-editing enzyme called MAD7—think of it as a Crispr/Cas9 knockoff, but legal—free for R&D uses. Inscripta will charge a single-digit royalty, far below market standards, to use MAD7 in manufacturing products or therapeutics.
Right now it’s the company’s only revenue stream—well, let’s call it a trickle. But Ness has his reasons. “It’s not a trick or a ploy. We’re trying to get more people into the game now, by democratizing access to this family of enzymes,” he says. It’s a page from the Steve Jobs playbook: get them hooked on the MADzyme platform, then, down the line, sell them personal hardware. Inscripta is working on a benchtop instrument where you design your gene editor and it kicks out your constructs on the spot. “We’re trying to build the tools to standardize the process of reading, writing, and testing genomes in living cells so that you don’t have to be a wizard to get it to work,” says Ness. “Everybody can push a button.”
If Inscripta is working on a biological equivalent of the personal computer, Twist Bioscience is working a level down, where the processors are. The San Francisco-based startup manufactures custom strands of synthetic DNA on semiconductor chips, to crank out the As, Ts, Cs, and Gs that are the building blocks of biology. From the Crispr guides Twist produces on a single chip, researchers can make up to a million edits. Just as the exponential miniaturization of silicon wafers propelled the computing industry forward, so too will the massive parallelization of gene editing push the boundaries of biology into the future.
Article link: https://www.wired.com/story/biology-will-be-the-next-great-computing-platform

After a blizzard of hype surrounding the electronic health record (EHR), health professionals are now in full backlash mode against this complex new tool. EHRs are rightly seen as a major cause of professional burnout among physicians and nurses: Clinicians are spending almost half their professional time typing, clicking, and checking boxes on electronic records. Yet EHRs can and must be made into useful, easy-to-use tools that liberate, rather than oppress, clinicians.
Performing several tasks, badly. The EHR is a lot more than merely an electronic version of the patient’s chart. Through clinician order entry, it has also become the control panel for managing the clinical encounter. And through billing and regulatory compliance, it has become a focal point of quality-improvement efforts. While some of these efforts actually have improved quality and patient safety, many others serve merely to “buff up the note”—making the clinician look good on “process” measures—and to maximize billing.
Mashing up all these functions — charting, clinical ordering, billing/compliance, and quality improvement — inside the EHR has been a disaster for the clinical user, in large part because the billing/compliance function has dominated. The pressure from angry physician users has produced a medieval solution: Hospitals and clinics have hired tens of thousands of scribes literally to follow clinicians around and record their notes and orders into the EHR. Only in health care, it seems, could we find a way to “automate” that ended up adding staff and costs!
As bad as the regulatory and documentation requirements are, they are not the largest problem. The electronic systems hospitals have adopted at huge expense are fronted by user interfaces out of the mid-1990s: Windows 95-style screens and dropdown menus, data input by typing and navigation by point and click. These antiquated user interfaces are astonishingly difficult to navigate. Clinical information vital for care decisions is sometimes entombed dozens of clicks beneath the user-facing pages of the patient’s chart.
Paint a picture of the patient. For EHRs to become truly useful tools and liberate clinicians from busywork, a revolution in usability is required. Care of the patient must become the EHR’s central function. At its center should be a portrait of the patient’s medical situation at the moment: the diagnosis, major clinical risks and trajectory, and the specific problems the clinical team must resolve. This “uber-assessment” should be written in plain English and have a strict character limit, like the one Twitter imposes, forcing clinicians to tighten their assessment.
The patient portrait should be updated frequently, such as at a change in clinical shifts. Decision rules determining precisely who has responsibility for painting this portrait will be essential. In the inpatient setting, the main author may be a hospitalist, primary surgeon, or senior resident. In the outpatient setting, it’s likely to be the primary care physician or non-physician provider. While one individual should take the lead, this assessment should be curated collaboratively, a la Wikipedia.
This clinical portrait must become the rallying point of the team caring for the patient. To accomplish this, the EHR needs to become “groupware” for the clinical team, enabling continuous communication among team members. The patient portrait should function as the “wall” on which team members add their own observations of changes in the patient’s condition, actions they have taken, and questions they are trying to address. This group effort should convey an accurate picture (portrait plus updates) for new clinicians starting their shifts or joining the team as consultants.
The tests, medications, and procedures ordered, along with test results and monitoring-system readings, should all be added (automatically) to the patient’s chart. But here, too, major redesign is needed. In reimagining the patient’s chart, we need to modify today’s importing function, which encourages users to indiscriminately overwhelm the clinical narrative with mountains of extraneous data. The minute-by-minute team comments on the wall should disappear within a day or two, like images in Snapchat, rather than enter and complicate the permanent record.
Typing and point and click must go. Voice and gesture-based interfaces must replace the unsanitary and clunky keyboard and mouse as the method of building and interacting with the record. Both documenting the clinical encounter and ordering should be done by voice command, confirmed by screen touch. Orders should display both the major risks and cost of the tests or procedures ordered before the order can be confirmed. Several companies, including Google and Microsoft, are already piloting “digital” scribes that convert the core conversation between doctor and patient into a digital clinical note.
Moreover, interactive data visualization must replace the time-wasting click storm presently required to unearth patient data. Results of voice searches of the patient’s record should be available for display in the nursing station and the physicians’ ready room. It should also be presentable to patients on interactive white boards in patient rooms. Physicians should be able to say things like: “Show me Jeff’s glucose and creatinine values graphed back to the beginning of this hospital stay” or “Show me all of Bob’s abdominal CT scans performed pre- and postoperatively.” The physician should also be able to prescribe by voice command everything from a new medication to a programmed reminder to be delivered to the patient’s iPhone at regular intervals.
Population health data and research findings should also be available by voice command. For example, a doctor should be able to say: “Show me all the published data on the side-effect risks associated with use of pembrolizumab in lung cancer patients, ranked from highest to lowest,” or “Show me the prevalence of postoperative complications by type of complication in the past thousand patients who have had knee replacements in our health system, stratified by patient age.”
AI must make the clinical system smarter. EHRs already have rudimentary artificial intelligence (AI) systems to help with billing, coding, and regulatory compliance. But the primitive state of AI in EHRs is a major barrier to efficient care. Clinical record systems must become a lot smarter if clinical care is to predominate, in particular by reducing needless and duplicative documentation requirements. Revisiting Medicare payment policy, beginning with the absurdly detailed data requirements for Evaluation and Management visits (E&M), would be a great place to start.
The patient’s role should also be enhanced by the EHR and associated tools. Patients should be able to enter their medical history, medications, and family history remotely, reducing demands on the care team and its supporting cast. Data should also flow automatically into the record from clinical laboratories and from instrumentation attached to the patient, without the need for human data entry.
Of course, a new clinical workflow will be needed to curate all of this patient-generated data and respond accordingly. It cannot be permitted to clutter the wall or be “mainlined” to the primary clinical team; rather, it must be prioritized according to patient risk/benefit and delivered via a workflow designed expressly for this purpose. AI algorithms must also be used to scrape from the EHR the information needed to assign acuity scores and suggest diagnoses that accurately reflect the patient’s current state.
Given how today’s clinical alert systems inundate frontline caregivers, it is unsurprising that most alerts are ignored. It is crucial that the EHR deliver, in real time, only those alerts that address immediate threats to the patient’s health. Health care can learn a lot from the sensible rigor and discipline of the alert process in the airline cockpit. Clinical alerts should be presented in an easy-to-read, hard-to-ignore, color-coded format. Similarly, hard stops — system-driven halts in medication or other therapies — must be intelligent; that is, they must be tied to the present reality of the patient’s condition and limited to clinical actions that truly threaten the patient’s health or life.
From prisoners to advocates. The failure of EHRs thus far to achieve the goals of improving health care productivity, outcomes, and clinician satisfaction is the result both of immature technology and the failure of their architects to fully respect the complexity of converting the massive health care system from one way of doing work to another. Today, one can see a path to turning the EHR into a well-designed and useful partner to clinicians and patients in the care process. To do this, we must use AI, vastly improved data visualization, and modern interface design to improve usability. When this has been accomplished, we believe that clinicians will be converted from surly prisoners of poorly realized technology to advocates of the systems themselves and enthusiastic leaders of efforts to further improve them.
Article link: https://hbr.org/2018/03/to-combat-physician-burnout-and-improve-care-fix-the-electronic-health-record
Robert Wachter, MD, is chair of the Department of Medicine at the University of California, San Francisco and author of The Digital Doctor: Hope and Hype at the Dawn of Medicine’s Computer Age.
Jeff Goldsmith is a national adviser to Navigant Consulting and an associate professor of public health sciences at the University of Virginia.
