healthcarereimagined

Envisioning healthcare for the 21st century


How to Promote Racial Equity in the Workplace – HBR

Posted by timmreardon on 09/08/2020
Posted in: Uncategorized.
By Robert Livingston

From the September–October 2020 Issue

Executive Summary

Many White people deny the existence of racism against people of color because they assume that racism is defined by deliberate actions motivated by malice and hatred. However, racism can occur without conscious awareness or intent. When defined simply as differential evaluation or treatment based solely on race, regardless of intent, racism occurs far more frequently than most White people suspect.

As intractable as it seems, racism in the workplace can be effectively addressed. Because organizations are small, autonomous entities that afford leaders a high level of control over norms and policies, they are ideal sites for promoting racial equity.

Companies should move through the five stages of a process called PRESS: (1) Problem awareness, (2) Root-cause analysis, (3) Empathy, or level of concern about the problem and the people it afflicts, (4) Strategies for addressing the problem, and (5) Sacrifice, or willingness to invest the time, energy, and resources necessary for strategy implementation.

Idea in Brief

The Problem

Racial discrimination—defined as differential evaluation or treatment based solely on race, regardless of intent—remains prevalent in organizations and occurs far more frequently than most White people suspect.

The Opportunity

Intractable as it seems, racism in the workplace can be effectively addressed. Because organizations are autonomous entities that afford leaders a high level of control over norms and policies, they are ideal places to promote racial equity.

The Way Forward

Effective interventions move through stages, from understanding the underlying condition, to developing genuine concern, to focusing on correction.

Intractable as it seems, the problem of racism in the workplace can be effectively addressed with the right information, incentives, and investment. Corporate leaders may not be able to change the world, but they can certainly change their world. Organizations are relatively small, autonomous entities that afford leaders a high level of control over cultural norms and procedural rules, making them ideal places to develop policies and practices that promote racial equity. In this article, I’ll offer a practical road map for making profound and sustainable progress toward that goal.

I’ve devoted much of my academic career to the study of diversity, leadership, and social justice, and over the years I’ve consulted on these topics with scores of Fortune 500 companies, federal agencies, nonprofits, and municipalities. Often, these organizations have called me in because they are in crisis and suffering—they just want a quick fix to stop the pain. But that’s akin to asking a physician to write a prescription without first understanding the patient’s underlying health condition. Enduring, long-term solutions usually require more than just a pill. Organizations and societies alike must resist the impulse to seek immediate relief for the symptoms, and instead focus on the disease. Otherwise they run the risk of a recurring ailment.

To effectively address racism in your organization, it’s important to first build consensus around whether there is a problem (most likely, there is) and, if so, what it is and where it comes from. If many of your employees do not believe that racism against people of color exists in the organization, or if feedback is rising through various communication channels showing that Whites feel that they are the real victims of discrimination, then diversity initiatives will be perceived as the problem, not the solution. This is one of the reasons such initiatives are frequently met with resentment and resistance, often by mid-level managers. Beliefs, not reality, are what determine how employees respond to efforts taken to increase equity. So, the first step is getting everyone on the same page as to what the reality is and why it is a problem for the organization.

But there’s much more to the job than just raising awareness. Effective interventions involve many stages, which I’ve incorporated into a model I call PRESS. The stages, which organizations must move through sequentially, are: (1) Problem awareness, (2) Root-cause analysis, (3) Empathy, or level of concern about the problem and the people it afflicts, (4) Strategies for addressing the problem, and (5) Sacrifice, or willingness to invest the time, energy, and resources necessary for strategy implementation. Organizations going through these stages move from understanding the underlying condition, to developing genuine concern, to focusing on correction.

Let’s now have a closer look at these stages and examine how each informs, at a practical level, the process of working toward racial equity.

Problem Awareness

To a lot of people, it may seem obvious that racism continues to oppress people of color. Yet research consistently reveals that many Whites don’t see it that way. For example, a 2011 study by Michael Norton and Sam Sommers found that on the whole, Whites in the United States believe that systemic anti-Black racism has steadily decreased over the past 50 years—and that systemic anti-White racism (an implausibility in the United States) has steadily increased over the same time frame. The result: As a group, Whites believe that there is more racism against them than against Blacks. Other recent surveys echo Sommers and Norton’s findings, one revealing, for example, that 57% of all Whites and 66% of working-class Whites consider discrimination against Whites to be as big a problem as discrimination against Blacks and other people of color. These beliefs are important, because they can undermine an organization’s efforts to address racism by weakening support for diversity policies. (Interestingly, surveys taken since the George Floyd murder indicate an increase in perceptions of systemic racism among Whites. But it’s too soon to tell whether those surveys reflect a permanent shift or a temporary uptick in awareness.)

Even managers who recognize racism in society often fail to see it in their own organizations. For example, one senior executive told me, “We don’t have any discriminatory policies in our company.” However, it is important to recognize that even seemingly “race neutral” policies can enable discrimination. Other executives point to their organizations’ commitment to diversity as evidence for the absence of racial discrimination. “Our firm really values diversity and making this a welcoming and inclusive place for everybody to work,” another leader remarked.

The real challenge for organizations is not figuring out “What can we do?” but rather “Are we willing to do it?”

Despite these beliefs, many studies in the 21st century have documented that racial discrimination is prevalent in the workplace, and that organizations with strong commitments to diversity are no less likely to discriminate. In fact, research by Cheryl Kaiser and colleagues has demonstrated that the presence of diversity values and structures can actually make matters worse, by lulling an organization into complacency and making Blacks and ethnic minorities more likely to be ignored or harshly treated when they raise valid concerns about racism.

Many White people deny the existence of racism against people of color because they assume that racism is defined by deliberate actions motivated by malice and hatred. However, racism can occur without conscious awareness or intent. When defined simply as differential evaluation or treatment based solely on race, regardless of intent, racism occurs far more frequently than most White people suspect. Let’s look at a few examples.

In a well-publicized résumé study by the economists Marianne Bertrand and Sendhil Mullainathan, applicants with White-sounding names (such as Emily Walsh) received, on average, 50% more callbacks for interviews than equally qualified applicants with Black-sounding names (such as Lakisha Washington). The researchers estimated that just being White conferred the same benefit as an additional eight years of work experience—a dramatic head start over equally qualified Black candidates.

Research shows that people of color are well-aware of these discriminatory tendencies and sometimes try to counteract them by masking their race. A 2016 study by Sonia Kang and colleagues found that 31% of the Black professionals and 40% of the Asian professionals they interviewed admitted to “Whitening” their résumés, either by adopting a less “ethnic” name or omitting extracurricular experiences (a college club membership, for instance) that might reveal their racial identities.

These findings raise another question: Does Whitening a résumé actually benefit Black and Asian applicants, or does it disadvantage them when applying to organizations seeking to increase diversity? In a follow-up experiment, Kang and her colleagues sent Whitened and non-Whitened résumés of Black or Asian applicants to 1,600 real-world job postings across various industries and geographical areas in the United States. Half of these job postings were from companies that expressed a strong desire to seek diverse candidates. They found that Whitening résumés by altering names and extracurricular experiences increased the callback rate from 10% to nearly 26% for Blacks, and from about 12% to 21% for Asians. What’s particularly unsettling is that a company’s stated commitment to diversity failed to diminish this preference for Whitened résumés.

A Road Map for Racial Equity

Organizations move through these stages sequentially, first establishing an understanding of the underlying condition, then developing genuine concern, and finally focusing on correcting the problem.

This is a very small sample of the many studies that have confirmed the prevalence of racism in the workplace, all of which underscore the fact that people’s beliefs and biases must be recognized and addressed as the first step toward progress. Although some leaders acknowledge systemic racism in their organizations and can skip step one, many may need to be convinced that racism persists, despite their “race neutral” policies or pro-diversity statements.

Root-Cause Analysis

Understanding an ailment’s roots is critical to choosing the best remedy. Racism can have many psychological sources—cognitive biases, personality characteristics, ideological worldviews, psychological insecurity, perceived threat, or a need for power and ego enhancement. But most racism is the result of structural factors—established laws, institutional practices, and cultural norms. Many of these causes do not involve malicious intent. Nonetheless, managers often misattribute workplace discrimination to the character of individual actors—the so-called bad apples—rather than to broader structural factors. As a result, they roll out trainings to “fix” employees while dedicating relatively little attention to what may be a toxic organizational culture, for example. It is much easier to pinpoint and blame individuals when problems arise. When police departments face crises related to racism, the knee-jerk response is to fire the officers involved or replace the police chief, rather than examining how the culture licenses, or even encourages, discriminatory behavior.

Appealing to circumstances beyond one’s control is another way to exonerate deeply embedded cultural or institutional practices that are responsible for racial disparities. For example, an oceanographic organization I worked with attributed its lack of racial diversity to an insurmountable pipeline problem. “There just aren’t any Black people out there studying the migration patterns of the humpback whale,” one leader commented. Most leaders were unaware of the National Association of Black Scuba Divers, an organization boasting thousands of members, or of Hampton University, a historically Black college on the Chesapeake Bay, which awards bachelor’s degrees in marine and environmental science. Both were entities that could source Black candidates for the job, especially given that the organization only needed to fill dozens, not thousands, of openings.

Illustration by Diana Ejaita

A Fortune 500 company I worked with cited similar pipeline problems. Closer examination revealed, however, that the real culprit was the culture-based practice of promoting leaders from within the organization—which already had low diversity—rather than conducting a broader industry-wide search when leadership positions became available. The larger lesson here is that an organization’s lack of diversity is often tied to inadequate recruitment efforts rather than an empty pipeline. Progress requires a deeper diagnosis of the routine practices that drive the outcomes leaders wish to change.

To help managers and employees understand how being embedded within a biased system can unwittingly influence outcomes and behaviors, I like to ask them to imagine being fish in a stream. In that stream, a current exerts force on everything in the water, moving it downstream. That current is analogous to systemic racism. If you do nothing—just float—the current will carry you along with it, whether you’re aware of it or not. If you actively discriminate by swimming with the current, you will be propelled faster. In both cases, the current takes you in the same direction. From this perspective, racism has less to do with what’s in your heart or mind and more to do with how your actions or inactions amplify or enable the systemic dynamics already in place.

Workplace discrimination often comes from well-educated, well-intentioned, open-minded, kindhearted people who are just floating along, severely underestimating the tug of the prevailing current on their actions, positions, and outcomes. Anti-racism requires swimming against that current, like a salmon making its way upstream. It demands much more effort, courage, and determination than simply going with the flow.

In short, organizations must be mindful of the “current,” or the structural dynamics that permeate the system, not just the “fish,” or individual actors that operate within it.

Empathy

Once people are aware of the problem and its underlying causes, the next question is whether they care enough to do something about it. There is a difference between sympathy and empathy. Many White people experience sympathy, or pity, when they witness racism. But what’s more likely to lead to action in confronting the problem is empathy—experiencing the same hurt and anger that people of color are feeling. People of color want solidarity—and social justice—not sympathy, which simply quiets the symptoms while perpetuating the disease.

If your employees don’t believe that racism exists in the company, then diversity initiatives will be perceived as the problem, not the solution.

One way to increase empathy is through exposure and education. The video of George Floyd’s murder exposed people to the ugly reality of racism in a visceral, protracted, and undeniable way. Similarly, in the 1960s, northern Whites witnessed innocent Black protesters being beaten with batons and blasted with fire hoses on television. What best prompts people in an organization to register concern about racism in their midst, I’ve found, are the moments when their non-White coworkers share vivid, detailed accounts of the negative impact that racism has on their lives. Managers can raise awareness and empathy through psychologically safe listening sessions—for employees who want to share their experiences, without feeling obligated to do so—supplemented by education and experiences that provide historical and scientific evidence of the persistence of racism.

For example, I spoke with Mike Kaufmann, CEO of Cardinal Health—the 16th largest corporation in America—who credited a visit to the Equal Justice Initiative’s National Memorial for Peace and Justice, in Montgomery, Alabama, as a pivotal moment for the company. While diversity and inclusion initiatives have been a priority for Kaufmann and his leadership team for well over a decade, their focus and conversations related to racial inclusion increased significantly during 2019. As he expressed to me, “Some Americans think when slavery ended in the 1860s that African Americans have had an equal opportunity ever since. That’s just not true. Institutional systemic racism is still very much alive today; it’s never gone away.” Kaufmann is planning a comprehensive education program, which will include a trip for executives and other employees to visit the museum, because he is convinced that the experience will change hearts, open eyes, and drive action and behavioral change.

Empathy is critical for making progress toward racial equity because it affects whether individuals or organizations take any action and if so, what kind of action they take. There are at least four ways to respond to racism: join in and add to the injury, ignore it and mind your own business, experience sympathy and bake cookies for the victim, or experience empathic outrage and take measures to promote equal justice. The personal values of individual employees and the core values of the organization are two factors that affect which actions are undertaken.

Strategy

After the foundation has been laid, it’s finally time for the “what do we do about it” stage. Most actionable strategies for change address three distinct but interconnected categories: personal attitudes, informal cultural norms, and formal institutional policies.

To most effectively combat discrimination in the workplace, leaders should consider how they can run interventions on all three of these fronts simultaneously. Focusing only on one is likely to be ineffective and could even backfire. For example, implementing institutional diversity policies without any attempt to create buy-in from employees is likely to produce a backlash. Likewise, focusing just on changing attitudes without also establishing institutional policies that hold people accountable for their decisions and actions may generate little behavioral change among those who don’t agree with the policies. Establishing an anti-racist organizational culture, tied to core values and modeled by behavior from the CEO and other top leaders at the company, can influence both individual attitudes and institutional policies.

Just as there is no shortage of effective strategies for losing weight or promoting environmental sustainability, there are ample strategies for reducing racial bias at the individual, cultural, and institutional levels. The hard part is getting people to actually adopt them. Even the best strategies are worthless without implementation.

Fairness requires treating people equitably—which may entail treating people differently, but in a way that makes sense.

I’ll discuss how to increase commitment to execution in the final section. But before I do, I want to give a specific example of an institutional strategy that works. It comes from Massport, a public organization that owns Boston Logan International Airport and commercial lots worth billions of dollars. When its leaders decided they wanted to increase diversity and inclusion in real estate development in Boston’s booming Seaport District, they decided to leverage their land to do it. Massport’s leaders made formal changes to the selection criteria determining who is awarded lucrative contracts to build and operate hotels and other large commercial buildings on their parcels. In addition to evaluating three traditional criteria—the developer’s experience and financial capital, Massport’s revenue potential, and the project’s architectural design—they added a fourth criterion called “comprehensive diversity and inclusion,” which accounted for 25% of the proposal’s overall score, the same as the other three. This forced developers not only to think more deeply about how to create diversity but also to go out and do it. Similarly, organizations can integrate diversity and inclusion into managers’ scorecards for raises and promotions—if they think it’s important enough. I’ve found that the real barrier to diversity is not figuring out “What can we do?” but rather “Are we willing to do it?”
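Massport’s change of selection criteria can be pictured with a short scoring sketch. The criterion names, weights, and numbers below are illustrative, not Massport’s actual rubric; the point is simply that when diversity and inclusion carries the same weight as each traditional criterion, a weak plan on that front caps a proposal’s total no matter how strong the other dimensions are.

```python
# Toy weighted-scoring sketch of an RFP evaluation in which
# "comprehensive diversity and inclusion" carries the same 25% weight
# as each traditional criterion. All names and scores are invented.

WEIGHTS = {
    "experience_and_capital": 0.25,
    "revenue_potential": 0.25,
    "architectural_design": 0.25,
    "diversity_and_inclusion": 0.25,
}

def overall_score(scores: dict) -> float:
    """Weighted sum of per-criterion scores (each on a 0-100 scale)."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# A proposal that is strong everywhere except diversity...
proposal_a = {"experience_and_capital": 95, "revenue_potential": 90,
              "architectural_design": 92, "diversity_and_inclusion": 40}
# ...can be beaten by a solid proposal with a serious diversity plan.
proposal_b = {"experience_and_capital": 85, "revenue_potential": 82,
              "architectural_design": 84, "diversity_and_inclusion": 90}

print(overall_score(proposal_a))  # 79.25
print(overall_score(proposal_b))  # 85.25
```

Because the fourth criterion is weighted equally rather than treated as a tiebreaker, developers cannot simply outspend a poor diversity plan on the other three fronts.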

Sacrifice

Many organizations that desire greater diversity, equity, and inclusion may not be willing to invest the time, energy, resources, and commitment necessary to make it happen. Actions are often inhibited by the assumption that achieving one desired goal requires sacrificing another desired goal. But that’s not always the case. Although nothing worth having is completely free, racial equity often costs less than people may assume. Seemingly conflicting goals or competing commitments are often relatively easy to reconcile—once the underlying assumptions have been identified.

As a society, are we sacrificing public safety and social order when police routinely treat people of color with compassion and respect? No. In fact, it’s possible that kinder policing will actually increase public safety. Famously, the city of Camden, New Jersey, witnessed a 40% drop in violent crime after it reformed its police department, in 2012, and put a much greater emphasis on community policing.

The assumptions of sacrifice have enormous implications for the hiring and promotion of diverse talent, for at least two reasons. First, people often assume that increasing diversity means sacrificing principles of fairness and merit, because it requires giving “special” favors to people of color rather than treating everyone the same. But take a look at the scene below. Which of the two scenarios appears more “fair,” the one on the left or the one on the right?

People often assume that fairness means treating everyone equally, or exactly the same—in this case, giving each person one crate of the same size. In reality, fairness requires treating people equitably—which may entail treating people differently, but in a way that makes sense. If you chose the scenario on the right, then you subscribe to the notion that fairness can require treating people differently in a sensible way.

Of course, what is “sensible” depends on the context and the perceiver. Does it make sense for someone with a physical disability to have a parking space closer to a building? Is it fair for new parents to have six weeks of paid leave to be able to care for their baby? Is it right to allow active-duty military personnel to board an airplane early to express gratitude for their service? My answer is yes to all three questions, but not everyone will agree. For this reason, equity presents a greater challenge to gaining consensus than equality. In the first panel of the fence scenario, everybody gets the same number of crates. That’s a simple solution. But is it fair?

In thinking about fairness in the context of American society, leaders must consider the unlevel playing fields and other barriers that exist—provided they are aware of systemic racism. They must also have the courage to make difficult or controversial calls. For example, it might make sense to have an employee resource group for Black employees but not White employees. Fair outcomes may require a process of treating people differently. To be clear, different treatment is not the same as “special” treatment—the latter is tied to favoritism, not equity.

There is no test or interview that can invariably identify the “best candidate.” Instead, hire good people and invest in their potential.

One leader who understands the difference is Maria Klawe, the president of Harvey Mudd College. She concluded that the only way to increase the representation of women in computer science was to treat men and women differently. Men and women tended to have different levels of computing experience prior to entering college—different levels of experience, not intelligence or potential. Society treats boys and girls differently throughout secondary school—encouraging STEM subjects for boys but liberal arts subjects for girls, creating gaps in experience. To compensate for this gap created by bias in society, the college designed two introductory computer-science tracks—one for students with no computing experience and one for students with some computing experience in high school. The no-experience course tended to be 50% women whereas the some-experience course was predominantly men. By the end of the semester, the students in both courses were on par with one another. Through this and other equity-based interventions, Klawe and her team were able to dramatically increase the representation of women and minority computer-science majors and graduates.

The second assumption many people have is that increasing diversity requires sacrificing high quality and standards. Consider again the fence scenario. All three people have the same height or “potential.” What varies is the level of the field and the fence—apt metaphors for privilege and discrimination, respectively. Because the person on the far left has lower barriers to access, does it make sense to treat the other two people differently to compensate? Do we have an obligation to do so when differences in outcomes are caused by the field and the fence, not someone’s height? Maria Klawe sure thought so. How much human potential is left unrealized within organizations because we do not recognize the barriers that exist?

Finally, it’s important to understand that quality is difficult to measure with precision. There is no test, instrument, survey, or interviewing technique that will enable you to invariably predict who the “best candidate” will be. The NFL draft illustrates the difficulty in predicting future job performance: Despite large scouting departments, plentiful video of prior performance, and extensive tryouts, almost half of first-round picks turn out to be busts. This may be true for organizations as well. Research by Sheldon Zedeck and colleagues on corporate hiring processes has found that even the best screening or aptitude tests predict only 25% of intended outcomes, and that candidate quality is better reflected by “statistical bands” rather than a strict rank ordering. This means that there may be absolutely no difference in quality between the candidate who scored first out of 50 people and the candidate who scored eighth.

The big takeaway here is that “sacrifice” may actually involve giving up very little. If we look at people within a band of potential and choose the diverse candidate (for example, number eight) over the top scorer, we haven’t sacrificed quality at all—statistically speaking—even if people’s intuitions lead them to conclude otherwise.
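The band logic can be sketched in a few lines. The scores and the error margin below are invented for illustration; the idea is that any candidate whose score falls within the measurement error of the top score is statistically indistinguishable from the top scorer.

```python
# Illustrative sketch of "statistical bands": candidates whose scores
# fall within the test's margin of error of the top score are treated
# as equivalent in quality. Scores and margin are hypothetical.

def top_band(scores: dict, margin: float) -> set:
    """Return candidates statistically indistinguishable from the top scorer."""
    best = max(scores.values())
    return {name for name, s in scores.items() if best - s <= margin}

scores = {"cand_1": 88, "cand_2": 87, "cand_3": 85, "cand_8": 84, "cand_20": 71}
band = top_band(scores, margin=5.0)  # assume a 5-point standard error

# Candidate 8 sits inside the band, so choosing them over the top
# scorer sacrifices nothing, statistically speaking.
print(sorted(band))  # ['cand_1', 'cand_2', 'cand_3', 'cand_8']
```

Selecting any candidate from within the band, rather than mechanically taking the top scorer, is the sense in which the “sacrifice” may involve giving up very little.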

Managers should abandon the notion that a “best candidate” must be found. That kind of search amounts to chasing unicorns. Instead, they should focus on hiring well-qualified people who show good promise, and then should invest time, effort, and resources into helping them reach their potential.

CONCLUSION

The tragedies and protests we have witnessed this year across the United States have increased public awareness and concern about racism as a persistent problem in our society. The question we now must confront is whether, as a nation, we are willing to do the hard work necessary to change widespread attitudes, assumptions, policies, and practices. Unlike society at large, the workplace very often requires contact and cooperation among people from different racial, ethnic, and cultural backgrounds. Therefore, leaders should host open and candid conversations about how their organizations are doing at each of the five stages of the model—and use their power to press for profound and perennial progress.

A version of this article appeared in the September–October 2020 issue of Harvard Business Review.

Article link: https://hbr.org/2020/09/how-to-promote-racial-equity-in-the-workplace?utm_source=twitter&utm_medium=social&utm_campaign=hbr

Happy Labor Day!

Posted by timmreardon on 09/07/2020
Posted in: Uncategorized.

“Your work is going to fill a large part of your life and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do.”
– Steve Jobs


“The big secret in life is that there is no big secret. Whatever your goal, you can get there if you’re willing to work.”
– Oprah Winfrey


Happy Labor Day #laborday #hardwork #greatwork #goals #chestereastside

We Need Trugoogle!

Posted by timmreardon on 09/03/2020
Posted in: Uncategorized.

We need a search engine/agent/service that analyzes and categorizes content and ranks and presents it based upon the veracity of what it contains. Today, Google ranks content based upon what’s popular rather than what’s true. Someone recently pointed out that if they googled “Sandy Hook,” results like this would pop up at the top of the search: https://nymag.com/intelligencer/2016/09/the-sandy-hook-hoax.html. Is that what we want? I don’t think so. Instead, content could be searched, analyzed, and categorized according to the truth or veracity of its content and authors. Presenting what’s true rather than what’s popular has never been more important in fighting the disinformation so prevalent on the Internet and in society. Surely the technical means exist to do this type of analysis, categorization, and presentation for everyone. We need to figure out how to do it and then do it. Just one viewpoint.
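The mechanical half of the idea is simple to picture. Everything in this sketch is hypothetical—no real “veracity score” exists, and producing one reliably at scale is the genuinely hard, unsolved part—but re-ranking results by a blend of popularity and an estimated veracity signal would look something like this:

```python
# Toy sketch of veracity-weighted ranking. The veracity scores here are
# hand-assigned placeholders; computing them reliably is the hard,
# unsolved problem this post is pointing at.

def rank(results, veracity_weight=0.7):
    """Order results by a blend of popularity and estimated veracity."""
    def blended(r):
        return (1 - veracity_weight) * r["popularity"] + veracity_weight * r["veracity"]
    return sorted(results, key=blended, reverse=True)

results = [
    {"url": "hoax-theory.example", "popularity": 0.9, "veracity": 0.1},
    {"url": "news-report.example", "popularity": 0.6, "veracity": 0.9},
    {"url": "fact-check.example", "popularity": 0.4, "veracity": 0.95},
]

for r in rank(results):
    print(r["url"])
# news-report.example, fact-check.example, hoax-theory.example
```

With the weight tilted toward veracity, the highly popular hoax page drops to the bottom even though popularity alone would have ranked it first.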

OIG Finds Number of Issues with VA Health Information Exchanges – EHR Intelligence

Posted by timmreardon on 08/29/2020
Posted in: Uncategorized.


A lack of adequate staff training and communication is disrupting the way the VA is able to productively use its variety of health information exchanges.

August 11, 2020 – Editor’s Note 8/17/2020: This article has been updated with a statement from DirectTrust.

A recent report from the Office of Inspector General (OIG) found training challenges, a need for more community partners, a need for community coordinators, and technology issues that must be addressed to enhance the Department of Veterans Affairs’ (VA) ability to effectively utilize its health information exchanges and exchange patient data.

However, OIG found respondents who utilized either VA Exchange or VA Direct cited successes and said VA goals were more attainable with more community participation.

Patient data exchange and interoperability are crucial for the VA to enhance patient care. Not only does the Veterans Health Information Exchange (VHIE) program use the VA Exchange and VA Direct, which is directly connected to DirectTrust, but it also has community partnerships to promote patient data exchange and accessibility.

OIG interviewed and surveyed faculty at the 48 Level 2 and 3 Veterans Health Administration (VHA) facilities. The group also interviewed the VHIE Program Office, while meeting with the Office of Information Technology, Office of Community Care, Office of Rural Health, Cerner Corporation, and two unnamed state HIEs to gain a variety of differing perspectives.

READ MORE: OIG Finds Major Patient Safety Issues with VA EHR Implementation

According to the VHIE Program Office, 140 VA facilities have access to VA Exchange and VA Direct, but only 28 have implemented VA Direct. Facilities that did not have VA Direct access either were not adequately trained by DirectTrust, did not have community partners that used DirectTrust, or were using other HIE options.

“Expansion of VA Direct usage to all facilities would increase the instances of health information sharing and improve the timeliness of health information exchange while efforts continue with development of community partnerships through VA Exchange,” wrote OIG.

In a statement, DirectTrust responded: “The VA OIG report provides valuable insight and recommendations on how to enhance the Veterans Health Information Exchange program.”

“DirectTrust is a volunteer-driven membership organization and standards body serving as custodian of the Direct Standard, the foundation of Direct Secure Messaging. As such, DirectTrust is not set up to provide end-user training. Typically, vendors provide training on how to use Direct within their platform, as Direct Secure Messaging is implemented differently across vendor platforms.  We’re pleased to report that the DirectTrust EHR Roundtable, in which the VA participates, recognizes the variability in utilization across vendors, and is creating ‘best practices’ guidelines to advance the usability and utilization of Direct Secure Messaging.”

“The VA provides a critical service to our veterans, and coordination of care with community partners is extremely important. DirectTrust values our partnership with the VA, and we look forward to continued expansion and improvement of health information exchange and interoperability throughout all of their facilities and with their community partners.”


But according to additional survey responses and interviews from the 48 VA facilities, OIG found that 46 facilities use either VA Exchange or VA Direct, while two use neither. And of those 48 facilities, 22 said they exchange patient data by mail, fax machine, or scanner.

Respondents cited additional training, more community partners, and a better overall understanding of how to use the health information exchanges as common ways to overcome these HIE challenges.

“In addition, facilities reported technology challenges to viewing community health information through VA Exchange, including the dual sign-on requirement for VHA providers to first sign into the electronic health record and then sign into the Joint Legacy Viewer (JLV) to access community partner patient information,” wrote OIG.

“The JLV data quality was not ideal, information naming and access was not user friendly, and facilities reported a cumbersome process that resulted in delays in finding needed information.”

Currently, VA has two separate contracts that establish community coordination for VHIE, and OIG found 56 community coordinators who work to enhance infrastructure, outreach, and training.


But while there are 56 community coordinators, OIG found that coordinator engagement varied from high to little or no participation. Respondents also noted that when a coordinator leaves the position, communication lapses and training issues typically follow.

“With the addition of more training, communication, and future planned technological changes, VHA could more effectively streamline the continuity of care received by veterans,” OIG wrote.

“Electronic Health Records Modernization should alleviate some of the technology challenges currently experienced with the use of VHIE,” OIG continued. “Cerner reported the implementation of Millennium/Power Chart would eliminate the need for dual sign-in to review community care documents and allow for exchange accesses between VHA, the Department of Defense, and community providers.”

In late April, OIG found major patient safety issues and EHR capability problems centered on the new Electronic Health Record Modernization (EHRM) system, and a recent POLITICO report stated that VA leaders have acknowledged those issues.

“The OIG made four recommendations to the Under Secretary for Health related to the need for increased utilization of VA Direct, education for staff and veterans on VA Exchange and VA Direct, expansion of community partnerships, and use of contract VHIE community coordinators,” wrote OIG.

OIG said it would follow up with VA until the recommendations are completed or acknowledged.

Article link: https://ehrintelligence.com/news/oig-finds-number-of-issues-with-va-health-information-exchanges

The Pentagon’s AI Factory Gets a Powerful New Tool – Defense One

Posted by timmreardon on 08/29/2020
Posted in: Uncategorized. Leave a comment

The Joint Common Foundation aims to help the Department to standardize and secure its data and make it easier to find.

The “factory” that pumps out AI tools for the Pentagon is about to get a new tool of its own, one that leaders of the Joint Artificial Intelligence Center, or JAIC, hope will streamline their production and boost output.

The JAIC has awarded a $100 million contract to Deloitte Consulting to create the Joint Common Foundation, or JCF — basically, a tool to help organize the factory, secure it against intruders, direct its workers, and test its products.

“The Joint Common Foundation will provide an AI development environment to test, validate, and field AI capabilities at scale across the Department of Defense,” the Center said in a statement.

Moreover, the JCF is supposed to help meld various existing programming setups and ease the tasks of building tools that can work across multiple service branches.

“What we currently have is a bunch of disparate AI and ML development environments, duplicative and siloed. A lot of these products in these environments are not interoperable,”  said Col. Sang Han, the Center’s infrastructure chief.

The Joint Common Foundation will also catalogue all the data that the Defense Department has that a programmer might need to train an AI system on.

“We currently don’t have a central marketplace. We don’t have a central…repository or catalogue. That’s what we’re going to do so developers can easily search for data, easily search for AI source code, AI and ML [machine learning] models and products in the DoD,” said Han. The goal is to create a place where “developers can easily come in, pick out the part that they need to build their high-performance application.”

The data catalogue is particularly important. The Defense Department produces more data than a Fortune 500 business and does so in a dizzying array of formats, from video and voice feeds to images to spreadsheets and text, all classified at different levels. But the center exists, in part, to make sure that AI innovations that occur in one part of DoD can be used elsewhere (so long as that use is ethically justified). Some of that data is what Don Bitner, strategy chief for the JAIC’s Joint Common Foundation and Infrastructure Team, describes as “value-add” data. “We’ve done something to it, we’ve ingested it. We’ve probably pared it down,” Bitner said.

The JCF allows the Department to take that data and put it into a place where anyone, with approval, can look it up and find it. “Being able to do that type of work and have standardized training sets to at least let components and services start the work of natural language processing in a chat room alleviates a lot of the upfront work in presenting data.”

JAIC officials say that will allow the center to scale up the delivery of new AI tools to the services.

Article link: https://www.defenseone.com/technology/2020/08/pentagons-ai-factory-gets-powerful-new-tool/167708/

If AI is going to help us in a crisis, we need a new kind of ethics – MIT Technology Review

Posted by timmreardon on 08/29/2020
Posted in: Uncategorized. Leave a comment

Ethics for urgency means making ethics a core part of AI rather than an afterthought, says Jess Whittlestone.

Jess Whittlestone at the Leverhulme Centre for the Future of Intelligence at the University of Cambridge and her colleagues published a comment piece in Nature Machine Intelligence this week arguing that if artificial intelligence is going to help in a crisis, we need a new, faster way of doing AI ethics, which they call ethics for urgency.

For Whittlestone, this means anticipating problems before they happen, finding better ways to build safety and reliability into AI systems, and emphasizing technical expertise at all levels of the technology’s development and use. At the core of these recommendations is the idea that ethics needs to become simply a part of how AI is made and used, rather than an add-on or afterthought.

Ultimately, AI will be quicker to deploy when needed if it is made with ethics built in, she argues. I asked her to talk me through what this means.

This interview has been edited for length and clarity.

Why do we need a new kind of ethics for AI?

With this pandemic we’re suddenly in a situation where people are really talking about whether AI could be useful, whether it could save lives. But the crisis has made it clear that we don’t have robust enough ethics procedures for AI to be deployed safely, and certainly not ones that can be implemented quickly.

What’s wrong with the ethics we have?

I spent the last couple of years reviewing AI ethics initiatives, looking at their limitations and asking what else we need. Compared to something like biomedical ethics, the ethics we have for AI isn’t very practical. It focuses too much on high-level principles. We can all agree that AI should be used for good. But what does that really mean? And what happens when high-level principles come into conflict?

For example, AI has the potential to save lives but this could come at the cost of civil liberties like privacy. How do we address those trade-offs in ways that are acceptable to lots of different people? We haven’t figured out how to deal with the inevitable disagreements.

AI ethics also tends to respond to existing problems rather than anticipate new ones. Most of the issues that people are discussing today around algorithmic bias came up only when high-profile things went wrong, such as with policing and parole decisions.

But ethics needs to be proactive and prepare for what could go wrong, not what has gone wrong already. Obviously, we can’t predict the future. But as these systems become more powerful and get used in more high-stakes domains, the risks will get bigger.

Article link: https://www.technologyreview.com/2020/06/24/1004432/ai-help-crisis-new-kind-ethics-machine-learning-pandemic/?

Rethinking AI talent strategy as automated machine learning comes of age – McKinsey

Posted by timmreardon on 08/19/2020
Posted in: Uncategorized. Leave a comment

Do companies still need to hire a large contingent of data scientists to build machine learning models, or can AutoML reduce the demand for this elusive talent?

In recent years, as the promise of artificial intelligence (AI) crystallized across industries, organizations revamped their talent strategies to gain the skills necessary to deploy and scale AI systems. They hired legions of data scientists and other data experts to build AI applications, trained analytics translators to connect the business and technical realms, and upskilled frontline staff to use AI applications effectively.

One role in particular, the data scientist, has been especially difficult for leaders to fill as competition for this elusive talent increased. Last year, employment-related search engine Indeed.com reported that job postings on its site for data scientists had more than tripled since December 2013. McKinsey Global Institute research has also highlighted the talent shortage and the potential for hundreds of thousands of positions to go unfilled.

Incumbent companies found it especially hard to compete with start-ups and tech giants such as Google to attract or retain the best practicing data scientists and the newest crop of graduates. One multinational retail conglomerate, for example, put in place a highly attractive package last year, with education perks and salaries up to 20 percent higher than market rates, to attract the 30-plus data scientists it needed to support its strategic road map of priority AI use cases.

Certainly, some of this competition may soften as tech start-ups struggle to survive in the wake of the COVID-19 crisis, making it somewhat easier for incumbents to acquire these hard-to-get skills. But there are also new tools that have the potential to fill the data-science talent gap and increase the efficiency of analytics teams. Automated machine learning (ML) tools, commonly called AutoML, are designed to automate many steps in developing machine learning models. Business experts armed with AutoML can build some types of models that once would have needed a trained data scientist.

As one might imagine, there’s a great deal of discussion around what can or should be automated when it comes to model development. However, one thing is clear: the evolution of AutoML tools is driving a radically new way of thinking about data science, expanding its bench to include business experts with extensive domain knowledge, basic data-science skills or the willingness to learn them, and AutoML training, rather than solely filling the team with experienced data scientists.

To stay competitive, we believe companies will be best served by not putting all their resources into the fight for sparse technical talent, but instead focusing at least part of their attention on building up their troop of AutoML practitioners, who will become a substantial proportion of the talent pool for the next decade.

How AutoML tools change the data-science game

To understand this shift in AI talent needs, it’s helpful to grasp at a high level how models—the basic building blocks of AI systems—are created and where data scientists spend most of their time (exhibit).

[Exhibit: the six-step model-development workflow and where data scientists spend most of their time]

There are typically six broad steps in the model-development workflow:

  1. Understanding the business challenge and translating it into a mathematical one. This is arguably one of the most crucial steps, as the decisions data scientists make here (for example, how they account for the interplay of pricing and demand in an AI-driven price-optimization system) can determine the performance and ultimate success of the model.
  2. Understanding the data, including assessing what data are available to support the business goal and the feasibility of leveraging that data to fuel an effective analytical model for the job.
  3. Preparing the data, including cleansing the data and identifying the most important features. For example, average operating temperature of equipment and time between maintenance would be key features for helping to predict when maintenance is needed.
  4. Developing the models using programming languages such as R and Python by either leveraging one of the many readily available algorithms on open-source platforms or, in much rarer instances, developing a new tailored approach for the problem at hand.
  5. Testing and fine-tuning models for performance in meeting the original business goals as well as to address any risks, such as bias, fairness, production readiness, and so on.
  6. Deploying the new models into production, embedding them into business and decision-making workflows, and monitoring their performance, making updates as needed.
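The middle of this workflow (steps 3 through 5) can be sketched in a few lines of scikit-learn. The maintenance-style features, synthetic data, and accuracy check below are illustrative assumptions, not details from the article:

```python
# Steps 3-5 sketched with scikit-learn on synthetic, hypothetical data.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Hypothetical features: operating temperature and time since last maintenance.
X = rng.normal(size=(500, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)
X[rng.random(X.shape) < 0.05] = np.nan  # simulate missing sensor readings

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 3 (prepare the data) and step 4 (develop the model) as one pipeline.
model = Pipeline([
    ("impute", SimpleImputer(strategy="mean")),  # fill missing readings
    ("scale", StandardScaler()),                 # normalize feature ranges
    ("clf", LogisticRegression()),               # off-the-shelf algorithm
]).fit(X_train, y_train)

# Step 5: test the model against held-out data.
acc = accuracy_score(y_test, model.predict(X_test))
print(f"held-out accuracy: {acc:.2f}")
```

AutoML tools automate exactly these imputation, scaling, algorithm-selection, and tuning choices, which is why the data-preparation steps dominate a data scientist’s time while the final fit is nearly mechanical.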

Many organizations have found that 60 to 80 percent of a data scientist’s time is spent preparing the data for modeling. Once the initial model is built, only a fraction of his or her time—4 percent, according to some analyses—is spent on testing and tuning code. In essence, tuning model parameters has become a commodity, and performance is driven by data selection and preparation.

The field of AutoML aims to automate all data preparation, as well as modeling and tuning steps, so that manual technical work is no longer required. While these tools don’t automate everything yet, they are currently able to produce machine learning models that perform well enough to deliver returns. In the telecom industry, for example, some companies have successfully leveraged AutoML to build profitable churn-management models that predict with sufficient accuracy which customers have a high risk of canceling their contracts.

It’s important to note here that the push to eliminate manual data tasks isn’t new. Today, most machine learning models, including powerful ones such as deep learning, are already fully integrated into programming languages, meaning that data scientists can apply these techniques using very little code. For example, one energy company was able, once it had prepared the data, to build a model that accurately predicted customer cancellations by applying just one line of code. An active and growing open-source community also provides “snippets” of code that data scientists can copy and paste into their models to make the data-preparation and modeling part of their work easier than ever.
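The “one line of code” claim is easy to illustrate: once features and labels have been prepared, fitting even a powerful model is a single call. The dataset below is a synthetic stand-in for prepared data, not the energy company’s:

```python
# Once the data are prepared, model fitting itself is a single line.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=300, random_state=0)  # stand-in for prepared data

model = GradientBoostingClassifier(random_state=0).fit(X, y)  # the "one line"

print(f"training accuracy: {model.score(X, y):.2f}")
```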

It is unclear how far AutoML capabilities will go in automating modeling tasks; complete automation still seems far away. However, it seems certain that these capabilities will make data science ever more accessible to business experts, and, in some cases, business-domain understanding will enrich the quality of many models more than the technical skills of a data scientist.

We already see the transition happening: state-of-the-art tools are enabling AutoML practitioners to build reasonably high-performing ML pipelines that include all steps—from reading the data to tuning the parameters—without substantial knowledge of machine learning or statistics. One North American retailer, for example, retrained several hundred employees in its business-intelligence team to use an off-the-shelf AutoML platform to perform customer-segmentation tasks that were previously carried out by highly trained data scientists. The move has enabled the company to fill the talent gap between basic business-intelligence functions and very complex ML modeling tasks and save hundreds of thousands of dollars in data preparation.
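Customer segmentation of the kind the retailer automated is, at its core, a standard clustering task. A minimal sketch with scikit-learn’s KMeans, using made-up spend and purchase-frequency features (all numbers are illustrative):

```python
# A minimal customer-segmentation sketch: cluster customers by spend behavior.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical features for 200 customers: annual spend and purchase frequency,
# drawn from two distinct behavioral groups.
spend = np.concatenate([rng.normal(200, 30, 100), rng.normal(800, 60, 100)])
freq = np.concatenate([rng.normal(2, 0.5, 100), rng.normal(12, 2, 100)])
X = StandardScaler().fit_transform(np.column_stack([spend, freq]))

# Assign each customer to one of two segments.
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(f"segment sizes: {np.bincount(segments)}")
```

An AutoML platform wraps choices like scaling, the number of clusters, and the algorithm itself behind a guided interface, which is what lets retrained business-intelligence staff take on this work.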

Certainly, not all data-science challenges can be solved using AutoML tools. At present, the technology is best suited to streamlining the development of common forecasting tasks, where the goal is to predict an outcome, given a few metrics, and the use of black-box models is permissible. Models that require statistical expertise to ensure fairness or build trust—for example, customer-engagement models that help salespeople understand what a prospect is likely to buy and why—still require the expertise of trained data scientists.

The impact on hiring strategies

Given the current limitations of AutoML tools, we don’t foresee demand for substantial, functional data-science expertise going away anytime soon. Over the long term, purely technical data scientists will still be needed, but simply far fewer than most currently predict. We estimate that over the next five years, demand for AutoML practitioners is likely to be twice as high as demand for data scientists as companies build out their talent strategies with both levels of expertise:

  • AutoML practitioners, such as biochemists in pharma research, will be able to perform simpler data-science tasks.
  • Data scientists with the statistical expertise to understand which tasks can safely be automated without risk will perform highly specialized tasks that can’t be automated, such as developing new algorithms or optimizing accuracy down to the last few percentage points.

How to get started

Where should organizations begin to rethink their data-science talent needs? We recommend companies take the following steps.

Reassess your requirements

The distinction between tasks that can be left to AutoML practitioners and those that require data scientists with deep statistical expertise is not trivial. It requires experienced analytics practitioners to take stock of all the initiatives on the AI road map and triage them based on the complexity of the data and modeling techniques and the necessary level of predictive accuracy. We find the following questions can serve as a useful guide to determine how to divvy up the work on any given task:

  • Is this a nonstandard data-science task as opposed to a standard predictive task, such as classification or regression?
  • Will we need to use rich and complex data to solve the business problem?
  • Is there potential bias in the data, such as in the case of a resume-screening model that may unintentionally reflect historical prejudices?
  • Will the problem likely require deeper understanding of statistical methods, such as causal inference?
  • Would a slight difference in model performance (for example, a 1 to 2 percent bump in predictive accuracy) significantly influence the value of the model?

To handle tasks for which the answer to any of these questions is “yes,” the organization will most certainly need highly trained data scientists in its talent mix.

Upskill domain experts

The best way to get started with AutoML tools is to train your existing business experts, as opposed to recruiting new hires. Training should include education both in using AutoML tools and in the fundamentals of data science. For example, the business experts should be aware of how common modeling techniques work, what form of data (numeric or text fields) they require, and what patterns the data can (and cannot) reveal. To build out its AutoML team, a manufacturing company piloted a capability-building program for approximately 200 process engineers and line managers. The program consisted of five training days with exercises along the entire use-case life cycle, including standard coding tasks such as cleaning the data and running automated standard ML models, followed by on-the-job coaching when they applied their new skills on their own projects. While education in a technical field such as engineering, physics, or mathematics was a plus, the only prerequisite for these business experts was an interest in and curiosity about data science.

Discuss the limitations—and the opportunities

As highlighted, there are clearly limitations to AutoML technology and numerous pitfalls for companies that use it inappropriately—not least of which are the potential for faulty outputs when it’s used outside its realm of expertise, undetected biases, and lack of explainability. It’s these dangers that have led to concerns in the data-science community. However, organizations that are mindful of the issues and engage in open discussions with their data scientists about the potential of AutoML will not only be able to better deal with current talent gaps but also free up their data scientists for the tasks that really interest them. At the manufacturing company mentioned earlier, data scientists were happy they no longer needed to run every standardized task in the local plants and instead could focus on the tasks that really required their deep and specialized knowledge.


The time is ripe for companies to adjust their talent strategy to take advantage of AutoML tools. These tools enable business experts to efficiently and cost-effectively complete many of today’s simpler data-science tasks and will be even more important in the future as they improve. At the same time, expert data scientists will be freed up for the technically most challenging tasks, enabling them to use their skill set more efficiently and innovate faster, while increasing their job satisfaction—benefits for both the data scientists and the companies that seek to maximize their outputs and retention.

About the author(s)

Holger Hürtgen is a partner in McKinsey’s Düsseldorf office, where Sebastian Kerkhoff is a senior expert and Jan Lubatschowski is a specialist; Manuel Möller is a partner in the Frankfurt office.

The authors wish to thank Angus Williams for his contributions to this article.

Article link: https://www.mckinsey.com/business-functions/mckinsey-analytics/our-insights/rethinking-ai-talent-strategy-as-automated-machine-learning-comes-of-age?

Planning Doesn’t Have to Be the Enemy of Agile – HBR

Posted by timmreardon on 08/17/2020
Posted in: Uncategorized. Leave a comment


Planning has long been one of the cornerstones of management. Early in the twentieth century Henri Fayol identified the job of managers as to plan, organize, command, coordinate, and control. The capacity and willingness of managers to plan developed throughout the century. Management by Objectives (MBO) became the height of corporate fashion in the late 1950s. The world appeared predictable. The future could be planned. It seemed sensible, therefore, for executives to identify their objectives. They could then focus on managing in such a way that these objectives were achieved.

This was the capitalist equivalent of the Communist system’s five-year plans. In fact, one management theorist of the 1960s suggested that the best managed organizations in the world were the Standard Oil Company of New Jersey, the Roman Catholic Church and the Communist Party. The belief was that if the future was mapped out, it would happen.

Later, MBO evolved into strategic planning. Corporations developed large corporate units dedicated to it. They were deliberately detached from the day-to-day realities of the business and emphasized formal procedures around numbers. Henry Mintzberg defined strategic planning as “a formalized system for codifying, elaborating and operationalizing the strategies which companies already have.” The fundamental belief was still that the future could largely be predicted.

Now, strategic planning has fallen out of favor. In the face of relentless technological change, disruptive forces in industry after industry, global competition, and so on, planning seems like pointless wishful thinking.

And yet, planning is clearly essential for any company of any size. Look around your own organization. The fact that you have a place to work which is equipped for the job, and you and your colleagues are working on a particular project at a particular time and place, requires some sort of planning. The reality is that plans have to be made about the use of a company’s resources all of the time. Some are short-term, others stretch into an imagined future.

Universally valuable, but desperately unfashionable, planning waits like a spinster in a Jane Austen novel for someone to recognize her worth.

But executives are wary of planning because it feels rigid, slow, and bureaucratic. The Fayol legacy lingers. A 2016 HBR Analytics survey of 385 managers revealed that most executives were frustrated with planning because they believed that speed was important and that plans frequently changed anyway. Why engage in a slow, painful planning exercise when you’re not even going to follow the plan?

The frustrations with current planning practices intersect with another fundamental managerial trend: organizational agility. Reorganizing around small self-managing teams — enhanced by agility methods like Scrum and LeSS — is emerging as the route to the organizational agility required to compete in the fast-changing business reality. One of the key principles underpinning team-based agility is that teams autonomously decide their priorities and where to allocate their own resources.

The logic of centralized long-term strategic planning (done once a year at a fixed time) is the antithesis of an organization redesigned around teams that define their own priorities and resource allocation on a weekly basis.

But if planning and agility are both necessary, organizations have to make them work. They have to create a Venn diagram with planning on one side, agility on the other, and a practical and workable sweet spot in the middle. This is why the quest to rethink strategic planning has never been more urgent and critical. Planning twenty-first century style should be reconceived as agile planning.

Agile planning has a number of characteristics:

  • frameworks and tools able to deal with a future that will be different;
  • the ability to cope with more frequent and dynamic changes;
  • quality time invested in a true strategic conversation rather than simply a numbers game;
  • resources and funds available in a flexible way for emerging opportunities.

The intersection of planning with organizational agility generates two other paramount requirements:

A process able to coordinate and align with agile teams

Agile organizations face the challenge of keeping the local autonomy of squads (bottom-up input) consistent with the bigger picture represented by the tribe’s goals, cross-tribe interdependencies, and the strategic priorities of the organization (top-down view). Governing this tension requires new processes and routines for planning and coordination.

Consider the Dutch financial services firm ING Bank. It restructured its operations in the Netherlands by reorganizing 3,500 employees into agile squads. These are autonomous multidisciplinary teams (up to nine people per team) able to define their work and make business decisions quickly and flexibly. Squads are organized into a Tribe (of no more than 150 people), a collection of squads working on related areas.

ING Bank revisited its process and introduced routine meetings and formats to create alignment between and within tribes. Each tribe develops a QBR (Quarterly Business Review), a six-page document outlining tribe-level priorities, objectives and key results. This is then discussed in a large alignment meeting (labelled the QBR Marketplace) attended by tribe leads and other relevant leaders. At this meeting one fundamental question is addressed: when we add up everything, does this contribute to our company’s strategic goals?

The alignment within a tribe happens at what is called a Portfolio Marketplace event: representatives of each of the squads which make up the tribe come together to agree on how the set goals are going to be achieved and to address opportunities for synergies.

The ING Bank example shows how the planning process is still necessary and essential to an agile company although in a different fashion with different processes, mechanisms and routines.

As more and more companies transform into agile organizations, agile planning will likely become the new normal replacing the traditional centralized planning approach.

A process that makes use of both limitless hard data and human judgment

Planners have traditionally been obsessed with gathering hard data on their industry, markets, competitors. Soft data — networks of contacts, talking with customers, suppliers and employees, using intuition and using the grapevine — have all but been ignored.

From the 1960s onwards, planning was built around analysis.  Now, thanks to Big Data, the ability to generate data is pretty well limitless.  This does not necessarily allow us to create better plans for the future.

Soft data is also vital. “While hard data may inform the intellect, it is largely soft data that generate wisdom. They may be difficult to ‘analyze’, but they are indispensable for synthesis — the key to strategy making,” says Henry Mintzberg.

Companies need first to imagine possibilities and second, pick the one for which the most compelling argument can be made.  In deciding which is backed by the most compelling argument, they should indeed take into account all data that can be crunched. But in addition, they should use qualitative judgment.

In an agile organization, teams use design thinking and other exploratory techniques (plus data) to make rapid decisions and change the course on a weekly basis. Decision making is done by a team of people, offsetting in this way the potential biases of a single person making a decision based on her individual judgement. To some extent, an agile team-based organization enables the possibility to leverage qualitative data and judgement — combined today with infinite hard data — for better decisions.

Relying solely on hard data has unquestionably killed many potentially great businesses. Take Nespresso, the coffee pod pioneer developed by Nestlé. Nespresso took off when it stopped targeting offices and started marketing itself to households. There was little data on how households would respond to the concept, and whatever information was available suggested a perceived consumer value of just 25 Swiss centimes versus a company-wide threshold requirement of 40 centimes. The Nespresso team had to interpret the data skillfully to present a better case to top management. Because it believed strongly in the idea, it forced the company to take a bigger-than-usual risk. If Nestlé had been guided solely by quantitative market research, the concept would never have gotten off the ground.

The traditional planning approach needs to be revisited to better serve the agile enterprise of the twenty-first century. Agile planning is the future of planning, and this new approach will require two fundamental elements. First, replacing the traditional obsession with hard data and the numbers game with a balanced coexistence of hard and soft data in which judgment also plays an important role. Second, introducing new mechanisms and routines to ensure alignment between the hundreds of self-organizing, autonomous local teams and the overarching goals and direction of the company.


Alessandro Di Fiore is the founder and CEO of the European Centre for Strategic Innovation (ECSI) and ECSI Consulting. He is based in Boston and Milan. He can be reached at adifiore@ecsi-consulting.com. Follow him on twitter @alexdifiore


Article link: https://hbr.org/2018/09/planning-doesnt-have-to-be-the-enemy-of-agile?

Esper eyes $2.2 billion cut to military health care. Yet defense officials say troops and their families could be hurt – Politico

Posted by timmreardon on 08/17/2020
Posted in: Uncategorized. Leave a comment

By LARA SELIGMAN and DAN DIAMOND

08/16/2020 07:00 AM EDT

Updated: 08/16/2020 10:38 AM EDT

Pentagon officials working on Defense Secretary Mark Esper’s cost-cutting review of the department have proposed slashing military health care by $2.2 billion, a reduction that some defense officials say could effectively gut the Pentagon’s health care system during a nationwide pandemic.

The proposed cut to the military health system over the next five years is part of a sweeping effort Esper initiated last year to eliminate inefficiencies within the Pentagon’s coffers. But two senior defense officials say the effort has been rushed and driven by an arbitrary cost-savings goal, and argue that the cuts to the system will imperil the health care of millions of military personnel and their families as the nation grapples with Covid-19.

Esper and his deputies have argued that America’s private health system can pick up the slack.

Roughly 9.5 million active-duty personnel, military retirees and their dependents rely on the military health system, the Pentagon's sprawling government-run health care framework that operates hundreds of facilities around the world. The system also provides care through TRICARE, which enables military personnel and their families to obtain civilian health care outside of military networks.

Under the proposal in the latest version of Esper’s defense-wide review, the armed services, the defense health system and officials at the Office of the Secretary of Defense for Personnel and Readiness would be tasked to find savings in their budgets to the tune of $2.2 billion for military health. Officials arrived at that number recently after months of discussions with the impacted offices during the review, said a third defense official. A fourth added that the cuts will be “conditions-based and will only be implemented to the extent that the [military health system] can continue to maintain our beneficiaries access to quality care, be it through our military health care facilities or with our civilian health care provider partners.”

However, the first two senior defense officials said the cuts are not supported by program analysis nor by warfighter requirements.


The department’s effort to overhaul the military health system has recently come under scrutiny, as lawmakers pressed the Pentagon on whether the pandemic would affect those plans.

“A lot of the decisions were made in dark, smoky rooms, and it was driven by arbitrary numbers of cuts,” said one senior defense official with knowledge of the process. “They wanted to book the savings to be able to report it.”

“It imperils the ability to support our combat forces overseas,” added a second senior official, who argued that Esper’s moves are weakening the ability to protect the health of active-duty troops in military theaters abroad. “They’re actively pushing very skilled medical people out the door.”

However, a Pentagon spokesperson said the system “continually assesses how it can most effectively align its assets in support of the National Defense Strategy.

“The MHS will not waver from its mission to provide a ready medical force and a medically ready force,” said Pentagon spokesperson Lisa Lawrence. “Any potential changes to the health system will only be pursued in a manner that ensures its ability to continue to support the Department’s operational requirements and to maintain our beneficiaries access to quality health care.”

Esper rolled out the results of the first iteration of the defense-wide review in February, revealing $5.7 billion in cost savings that he said would be put toward preparing the Pentagon to better compete with Russia and China, including research into hypersonic weapons, artificial intelligence, missile defense and more.

But the proposed health cuts, in the second iteration of the defense-wide review, would degrade military hospitals to the point that they will no longer be able to sustain the current training pipeline for the military’s medical force, potentially necessitating something akin to a draft of civilian medical workers into the military, the two defense officials said.

The second official noted the challenge in finding outside doctors given longstanding complaints from some U.S. hospitals and researchers that there aren’t enough physicians to serve civilians.

As a result, the proposed reductions would hurt combat medical capability without actually saving money, the officials argued. The Pentagon is already significantly overspending on private sector care and TRICARE because patients are being pushed out of undermanned military health facilities to the private health care network, they said. The cuts also would follow nearly a decade of the Pentagon holding military health spending flat, even as spending on care for veterans and civilians has ballooned.

The officials blamed the Pentagon’s Cost Assessment and Program Evaluation office, or CAPE, under the leadership of John Whitley, who has been acting director since August 2019, for the cuts. CAPE conducts analysis and provides advice to the secretary of defense on potential cuts to the defense budget.

During Whitley’s confirmation hearing to be the permanent CAPE director last week, Sen. Doug Jones (D-Ala.) pressed him on the health cuts.

“Folks in my state have expressed some concern and opposition to some of the policies, which allow only active-duty service members to visit military treatment facilities,” Jones said. “What do I tell those folks?”

“The department does have work to do on expanding choice and access to beneficiaries,” Whitley responded. “Sometimes that’s in an MTF, sometimes that’s in the civilian health care setting.”

Whitley has specifically tried to eliminate the Murtha Cancer Center as an unnecessary expense, said one senior official.

Last fall, Whitley and CAPE also sought to close the Uniformed Services University of the Health Sciences, which prepares graduates for the medical corps, as part of the defense-wide review, the people said. Although Esper rejected that proposal at the time, CAPE is now seeking major cuts to USU as part of the $2.2 billion. The reductions include eliminating all of USU's basic research dollars for combat casualty care, infectious disease and military medicine, as well as slicing operational funds.

“What’s been proposed would be devastating, and it’s coming right out of Whitley’s shop,” said the senior official. “Instead of a clean execution, USU would be bled to death.”

The officials pointed out that USU has contributed to the Covid-19 response in recent months by graduating 230 medical officers and Nurse Corps officers early from the class of 2020 School of Medicine, leading and participating in research clinical trials for virus countermeasures and contributing to the Operation Warp Speed effort to develop a vaccine.

The cuts to USU will leave the department ill-prepared “for the next pandemic,” the first defense official said.

Officials with the Department of Health and Human Services in January raised concerns about last year’s cuts to the Pentagon’s medical corps, which were unrelated to the defense-wide review. In a Jan. 14 memo sent to the Pentagon, the health department’s top emergency-response official stressed that the private health sector would not be able to accommodate “potential casualty estimates shared with HHS.”

The U.S. civilian health system “is unable to absorb and provide sustained care for large numbers of injured service members returning from combat,” wrote Robert Kadlec, the HHS assistant secretary for preparedness and response and a retired Air Force colonel.

Kadlec’s memo came in response to the Pentagon’s announced intent to cut the active duty medical force by about 20 percent, or roughly 17,000 personnel, over the next five years. “Explicit” in the move was the expectation that the civilian health care system would pick up the slack, according to Kadlec’s memo.

CLARIFICATION: This story has been updated to make clear that the cuts are part of a proposal to Defense Secretary Mark Esper.

Article link: https://www.politico.com/news/2020/08/16/esper-eyes-22-billion-cut-military-health-care-395578

Service Chiefs to SecDef: Stop the Handover of Military Hospitals to Defense Health Agency – Military.com

Posted by timmreardon on 08/17/2020
Posted in: Uncategorized. Leave a comment

The heads of the U.S. military branches are calling on the Defense Department to stop the transfer of all medical facilities to the Defense Health Agency, saying the novel coronavirus pandemic has shown that the plan to convey the services’ hospitals and clinics to the agency is “not viable.”

In a memo sent to Defense Secretary Mark Esper on Aug. 5, the secretaries of the Army, Navy and Air Force, along with the branch chiefs of the Army, Navy, Air Force, Marine Corps and Space Force, called for the return of all military hospitals and clinics already transferred to the DHA and suspension of any planned moves of personnel or resources.

They said that the COVID-19 outbreak has demonstrated that the reform, which was proposed by Congress in the fiscal 2017 National Defense Authorization Act, “introduces barriers, creates unnecessary complexity and increases inefficiency and cost.”

“The proposed DHA end-state represents unsustainable growth with a disparate intermediate structure that hinders coordination of service medical response to contingencies such as a pandemic,” they wrote in the memo, first obtained by a reporter for Synopsis, a Capitol Hill newsletter that focuses on military and veterans health care.

The DoD launched major reforms of its health system in 2013 with the creation of the Defense Health Agency, an organization initially established to improve the quality of health care available to military personnel and family members and to consolidate functions such as administration, IT, logistics and training that existed in triplicate across the three service medical commands.

But the initiatives ballooned in 2016, with Congress passing legislation that placed the DHA in charge of military hospitals and clinics worldwide, as well as research and development, public health agencies, medical logistics and other operations run by the service medical commands.

On Oct. 1, 2019, all military hospitals and clinics in the continental United States were transferred to the DHA, with those overseas expected to move over by October 2021.

But in December, Army Secretary Ryan McCarthy asked for a temporary halt of the transfers of Army facilities and requested that the Army Public Health Center and Army Medical Research and Development Command remain permanently under the service’s control.

McCarthy said he had concerns with what he viewed as a “lack of performance and planning with respect to the transition” by the DHA and Defense Department Health Affairs, according to a memo he sent Deputy Defense Secretary David Norquist.

McCarthy’s comments were the first public statements by a military service in opposition to the transformation, which also calls for cutting roughly 18,000 military medical personnel.

In early March, the Air Force and Army surgeons general weighed in, telling the House Appropriations defense subcommittee that the reorganization is an “extremely difficult” and “complicated merger of four cultures.” They suggested that the Defense Health Agency isn’t ready for some of the coming changes.

The DHA assumed management of all domestic military treatment facilities without the staff or management capabilities to actually run them. As part of the plan, the services were to provide support and guidance for the DHA to run the hospitals and clinics in the interim, until its personnel were ready to operate them.

But then the pandemic struck. And according to a source familiar with operations at several medical treatment facilities in the Washington, D.C., region, tensions that had been bubbling since the initial facility transfer erupted.

At one facility, commanders and DHA leadership argued over who was responsible for the COVID-19 screening tents in the parking lot.

“There are definitely turf battles going on,” said the source, a DoD civilian employee. “[The services] are making it very hard.”

The COVID-19 pandemic has delayed several elements of the military health system reform effort. In March, the DoD placed a 60-day hold on a step to establish administrative markets responsible for military treatment facilities in five regions in the U.S.

In April, the department paused the rollout of its Military Health Systems Genesis electronic medical records program to several new medical facilities, although it continued to modernize the IT infrastructure needed to support the system.

And in June, the Pentagon’s top health official announced that the DoD would delay some of the changes planned for this year, including an effort to begin closing or restructuring 48 hospitals and clinics and sending at least 200,000 patients to private care.

But Assistant Secretary of Defense for Health Affairs Thomas McCaffery, a former health industry executive who took office last August, has said he remains committed to reform, which he believes will improve quality of care while also saving taxpayer dollars.

“There’s been at least 12 times since World War II where there has been efforts to change our system,” McCaffery said during a visit to military health facilities in Washington last week. “All focused on the best way to organize and manage for the mission, have a ready medical force and a medically ready force. The mission is still the same, and having a more integrated system is the way to do it.”

In their letter to Esper, the service heads said the DHA has been helpful during the pandemic in developing standardized clinical practices for the coronavirus response.

But they still asked him to suspend any transfer activity and appoint a working group to explore different options for management of the hospitals.

They also asked that all military hospitals, including two that have operated under the DHA and the National Capital Region since 2013 — Walter Reed National Military Medical Center in Maryland and Fort Belvoir Community Hospital — be returned to their respective services.

They did not say which service Walter Reed would fall under; the medical center was created after a merger between the Army’s Walter Reed Medical Center in Washington, D.C., and the Navy’s National Naval Medical Center in Bethesda, Maryland. It remains housed at Bethesda, a Navy installation.

“We look forward to working together to achieve successful reform of the military health system,” they wrote.

Lisa Lawrence, a public affairs officer at the Pentagon, said the department plans to continue pursuing reforms as spelled out in the fiscal 2017 defense policy bill.

“The Department remains focused on ensuring the Services maintain a medically ready force and a ready medical force, as well as [ensuring] all eligible beneficiaries have continued access to quality health care,” Lawrence said.

A staff member for the National Military Family Association said that it “makes sense” the pandemic would lead to a reevaluation of the military health system reforms, adding that the organization hopes the DoD, DHA and military services will continue focusing on accountability, transparency and standardization across the system.

“Whatever the outcome, our priority is that service members and families have access to high-quality health care, wherever they happen to be stationed,” said Eileen Huck, deputy director for health care at NMFA.

— Patricia Kime can be reached at Patricia.Kime@Monster.com. Follow her on Twitter @patriciakime.

Article link: https://www.military.com/daily-news/2020/08/10/service-chiefs-secdef-stop-handover-of-military-hospitals-defense-health-agency.html
