The U.S. Army Battle of the Bulge
Article link: https://www.army.mil/botb/
Archives
All posts for the month December, 2021
By ALEXANDRA KELLEY | DECEMBER 2, 2021
Officials implemented new positions to help streamline and coordinate EHR system implementations following an arduous initial deployment.
The Department of Veterans Affairs formally launched its revamped plan to modernize its electronic health records this week, following extensive reviews of the new system’s rollout.
Announced on Dec. 1, the plan implements the findings of the Comprehensive Lessons Learned report on the EHR program, which was submitted to Congress earlier this year.
The revitalized EHR system will emphasize coordinating health records with partners like the Department of Defense.
“We will do everything we can to get electronic health records right for Veterans and our health care staff, with patient safety being the key driver and nonnegotiable,” VA Deputy Secretary Donald M. Remy said in a statement. “I have incorporated the lessons learned I received during my recent meetings with our team at Mann-Grandstaff VA Medical Center in Spokane, Washington, into this new way forward. Under my direction, VA is refining EHR governance and management structures to establish additional rigor and oversight.”
The revised rollout for EHR systems across the department is planned for early 2024.
Some of the new structural features of the updated EHR system will include improved customer experience, risk management, and system development lifecycle management. The VA will also establish new positions, including a program executive director, to support the new EHR integration. The PED will report to the VA deputy secretary and oversee cross-organizational communication and EHR implementation strategies.
Terry Adirim, the Acting Assistant Secretary of Defense for Health Affairs at the Defense Health Agency, will serve as the inaugural PED.
The VA has struggled to roll out an intuitive EHR system efficiently. Execution woes under a contract valued at $16 billion include a lack of standardization, and about two-thirds of VA staff considered quitting their roles within the agency amid the new EHR system deployment.
This initial system, developed by the health technology company Cerner, was implemented at the Mann-Grandstaff VA Medical Center in 2020. Staff at Mann-Grandstaff overwhelmingly found that the system worsened their job satisfaction, which they largely attributed to poor implementation.
While the VA will continue to work with Cerner on the EHR platform, a new management structure, specifically seen in a “strengthened” Office of the Functional Champion, will ideally help streamline rollout plans and improve the user experience.
Article link: https://www.nextgov.com/emerging-tech/2021/12/va-revitalizes-plan-ehr-system-rollout/187215/

By ALEXANDRA KELLEY | NOVEMBER 19, 2021
Federal agencies using electronic health records systems hope that standardizing EHR interfaces will lead to greater cross-departmental and patient access.
Federal public health administrators are emphasizing interoperability as agencies across the government continue to implement electronic health records (EHRs).
Officials from government organizations including the Department of Health and Human Services and the Department of Defense joined a panel discussion Thursday to discuss shared health IT goals and the ongoing effort to integrate EHRs into agencies’ daily business processes.
EHRs are digital copies of patient health records that turn a patient’s medical history into a single, shareable file to reduce the administrative and cost burdens and optimize clinicians’ ability to deliver positive health care outcomes.
A major goal of the current efforts to modernize and implement EHRs across several federal agencies hinges on sophisticated data collection and cross-department sharing capabilities, with a focus on helping patients access their records. As more vendors develop their own applications to access and transfer EHRs, compatibility issues between interfaces make it challenging to share patient data. Standardizing these platforms would facilitate patient data sharing between health care providers.
Rolling out EHR platforms stands as a major challenge that involves the deployment of new software and training of the existing workforce. Agencies like the Defense Department and Veterans Affairs Department have struggled to implement new EHR systems, spending billions amid scheduling delays and numerous implementation issues.
Featured speaker Micky Tripathi, the National Coordinator for Health IT within Health and Human Services, said that his office is focusing on standardizing EHR systems within application programming interfaces (APIs) to facilitate the sharing and accessibility of medical records.
“By the end of 2022, certified EHR vendors are required to make available an API based on a particular technical standard and implementation guide,” he said. His office will formulate these industry standards to harmonize EHR application features and encourage interoperability.
He added that this effort was born of EHR vendors creating different applications to help patients and caregivers access digital records without an industry standard. Tripathi said the different apps create friction between providers trying to access the same data through different interface programs.
Standardizing the EHR platform and corresponding APIs will aid in fluid cross-agency sharing.
“We still see a lot of variation,” Tripathi said, noting that the new requirements due out next year will hopefully enable the market to develop better solutions for patients accessing their EHRs.
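The technical standard at issue is HL7 FHIR (Fast Healthcare Interoperability Resources). To illustrate what a standardized API buys, here is a minimal sketch that queries a patient’s active conditions; the server URL, token, and patient ID are hypothetical placeholders rather than a real deployment:

```python
# Minimal sketch of a standardized EHR API call. The ONC rule discussed
# above centers on the HL7 FHIR standard; the base URL, token, and
# patient ID here are hypothetical placeholders, not a real service.
import requests

FHIR_BASE = "https://ehr.example.org/fhir/R4"  # hypothetical FHIR R4 endpoint
TOKEN = "YOUR_OAUTH2_BEARER_TOKEN"             # e.g., obtained via SMART on FHIR
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Accept": "application/fhir+json"}


def get_active_conditions(patient_id: str) -> list:
    """Return the display text of a patient's active conditions."""
    resp = requests.get(
        f"{FHIR_BASE}/Condition",
        params={"patient": patient_id, "clinical-status": "active"},
        headers=HEADERS,
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()  # FHIR search results come back as a Bundle resource
    return [
        entry["resource"]["code"]["text"]
        for entry in bundle.get("entry", [])
        if entry["resource"].get("code", {}).get("text")
    ]


if __name__ == "__main__":
    print(get_active_conditions("example-patient-123"))
```

Because the resource types, search parameters, and response format are fixed by the standard, the same client code works against any conformant vendor’s server; that uniformity is precisely the interoperability the panelists describe.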
This standardization will be important for collaboration between public health agencies, such as the Centers for Medicare and Medicaid Services (CMS) and the Defense Department, which are working to develop a common EHR platform to enable a seamless transfer of care.
Alexandra Mugge, the deputy chief health informatics officer at CMS, said that providing a common application interface will remove barriers to sharing and collecting patient health information.
“We all have to coordinate to ensure that we get to better usability, to ensure that our requirements aren’t conflicting with one another,” Mugge said. “The main role of the EHR should be streamlining patient care.”
She said that Tripathi’s office has been a supportive collaborator on EHR platform standardization and on encouraging interdepartmental data sharing.
In addition to the centralization of EHR interoperability, Mugge added that other priorities related to EHR modernization include adjusting the health plan prior authorization process within APIs, linking EHRs to health plan payers in real time, and integrating APIs in other CMS programs.
Adi Gundlapalli, the chief public health informatics officer at the CDC, spoke alongside Mugge and noted that engaging a diverse body of stakeholders is key to a standardized EHR interface that caregivers and patients alike can access.
“No one agency or individual can do it alone,” he said.
Bad Ideas in National Security Series
December 10, 2021 – by Peter Modigliani

Don’t bring a knife to a gunfight
The quickest way to lose a war with a near-peer adversary in the 21st century is to fight with 20th-century systems. The average Department of Defense (DoD) aircraft is 30 years old. Further, DoD launched most of its ships, submarines, and satellites in the last century. The best way to sabotage DoD’s ability to modernize is to impose industrial age structures, processes, and culture on this massive bureaucracy. DoD executives, Combatant Commanders, and Congress have stressed the need for DoD to rapidly exploit leading technologies to retain its military advantage. Yet DoD still operates with enterprise processes and management practices designed around programs from 60 years ago. These program-centric constraints drive longer timelines, fewer quantities, and higher costs that erode DoD’s military advantage and increase operational risks.
DoD’s Program-Centric Process
Currently, DoD processes are centered on programs designed to replace or upgrade existing capabilities. These processes were designed in an era when systems could be delivered in a few years; now it takes over a decade to navigate the bureaucracy. DoD first realizes it needs new fighters, bombers, ships, submarines, vehicles, satellites, or missiles to replace aging, obsolete, or unreliable legacy systems. DoD then spends a few years defining and validating the requirements for the new system, which must remain operational for decades. Unsurprisingly, the new system often operates much like its legacy predecessor but with improved performance. DoD typically charters a team to spend a year analyzing alternatives to those requirements, with little trade space or appetite for unique solutions. The department also spends a year developing detailed cost estimates based on the shaky assumption that the program requirements are clear and stable. Even then, DoD must work concurrently with Congress to secure program funding, which requires a two-year lead time.
Once Congress appropriates the funds, DoD establishes a program office that spends a few years developing a comprehensive acquisition strategy and dozens of related documents. The program office coordinates with dozens of different oversight organizations as part of a gauntlet of reviews. The program manager (PM) presents the strategies at a major decision review with senior leaders, seeking approval to proceed with development. DoD locks down the program’s cost, schedule, and performance in an acquisition program baseline (APB) and holds the PM accountable to those benchmarks. A fierce source selection determines the winner-take-all contract that shapes the defense sector for the next decade. As a result, the loser immediately protests regardless of the integrity of the process. The contractor must report development progress using certified earned value management metrics based on a predefined work breakdown designed for an industrial age assembly line.
During the many years of development and production, congressional and DoD leaders drive significant fluctuations in the program’s budget, including delays to budget approvals for months after the start of nearly every fiscal year. Operations, threats, and technologies change regularly, but programs of this size often face too many constraints to react. The PM often will not entertain any major course correction, given the corresponding workload and the inevitable delays from updating and coordinating requirements and acquisition documents. It also takes an act of Congress to shift any meaningful funding between programs, so programs typically continue executing a suboptimal or outdated solution focused on hitting their APB targets.
After 10 to 15 years, the program delivers technology to meet yesterday’s mission. But at least, in theory, it is leaps and bounds better than the legacy systems that can be retired. DoD continues to invest in what General Hyten calls “big juicy targets” on 10- to 15-year timelines, while our adversaries regularly deliver high quantities of small and mid-sized capabilities and employ asymmetric warfare.
A Digital Age Portfolio-Centric Model
In the digital age, DoD should operate using dynamic capability portfolios. Doing so requires the development of a common portfolio structure across the requirements, budget, and acquisition enterprises. The roughly 50 program executive officers (PEOs) offer a strong introductory view of the capability portfolio structure. At the enterprise level, the PEO portfolios can be mapped across 10 notional major capability areas: aircraft systems; shipbuilding and maritime systems; ground-based systems; space-based systems; C4ISR—command, control, communications, intelligence, surveillance, and reconnaissance—and cybersecurity; missiles and munitions; missile defense programs; nuclear, chemical, and biological defense programs; business systems; and defense health systems.
These major capability areas each range from $10 billion to $50 billion in annual investments. DoD has struggled with portfolio management for years, including a major initiative by then-Secretary of Defense Donald Rumsfeld, in part due to conflicting portfolio structures, processes, and program constraints. There is a renewed focus on portfolios in the Pentagon, including new enterprise portfolio reviews focused on risk and interoperability assessments. While there is certainly a need to look across DoD’s uniformed services, the heart of capability portfolio management should be at the PEO level. To that end, DoD should consider the following pivots from program-centric to portfolio-centric structures and processes.
Instead of spending years defining detailed program requirements that are nearly guaranteed to be wrong, DoD leaders should capture enduring portfolio-level requirements and measures. These portfolio requirements can align with the Joint Capability Areas or the new Joint Warfighting Concepts. Subordinate capability requirements should be managed using dynamic, prioritized backlogs that are met with a modular suite of systems and services. Leading commercial technologies drive novel operational practices and capability requirements. Portfolio requirements and measures should also focus future government and industry research. A portfolio requirements executive for each portfolio would continually align requirements with evolving strategic direction, threats, technologies, and operations.
Instead of allocating budgets to 1,000 separate program elements, budgets should be aligned to the new portfolio structure. There would need to be clear accounting and transparency across the major platforms, programs, projects, research, and infrastructure within each budget line item. Portfolios would be designed with the flexibility to shift funding among programs and activities as priorities, performance, risks, threats, and opportunities change. Best practices from recent budget activity, such as BA 8 software pilot program funding, can be scaled. This would shift the incentives away from use-or-lose spending and strict monthly execution rates and instead focus on maximizing the portfolio’s return on investment, or its mission impact. Portfolio roadmaps would be used to align requirements, budgets, and acquisitions over the short and long term. Portfolio budget executives would collaborate with senior leadership to manage budget planning and execution across their respective areas of expertise.
Instead of setting countless promising government and commercial technology projects up for failure as they cross the Valley of Death into acquisition programs, DoD could leverage an innovation pipeline tuned to each portfolio’s specific needs. DoD lab research and commercial solutions across the National Security Innovation Base would fuel suites of new portfolio capabilities. Portfolio research directors would then be able to engage the innovation hubs and operational commands to shape strategies, investments, partnerships, and experimentation environments.
Instead of overseeing the execution of 50 separate programs, PEOs would be responsible for delivering integrated suites of capabilities to maximize portfolio measures. Portfolios would optimize interoperability and cybersecurity by harnessing digital and mission engineering, architectures, APIs, and infrastructure. Meanwhile, PEOs would develop portfolio strategies, processes, and contracts to maximize competition and enable the delivery of better capabilities sooner. Programs would no longer be locked into APBs. The key measures would include how each capability maximizes portfolio measures and mission impact. To that end, PEOs would be renamed Portfolio Acquisition Executives to align with their new requirements and budget peers.
DoD’s program-centric, industrial age bureaucracy represents its biggest risk to deterring or winning future conflicts with formidable adversaries. DoD operates in a dynamic environment where systems are networked together and employed in joint warfighting mission threads. Yet the three enterprise processes (requirements, budgets, and acquisition) are structured for standalone programs. A modern, portfolio-centric approach enables the speed, agility, and innovation needed to deliver integrated suites of war-winning capabilities.
(Photo Credit: U.S. Air Force photo by Airman 1st Class Valerie Seelye)
Article link: https://defense360.csis.org/series/bad-ideas/
Executive Summary
Securing America’s 6G Future
By Martijn Rasser, Ainikki Riikonen, and Henry Wu
Technological leadership by the United States requires forethought and organization. The plan necessary to maintain that leadership—a national technology strategy—should be broad in scope. Its range includes investments in research, nurturing human talent, revamping government offices and agencies, and ensuring that laws, regulations, and incentives provide private industry with the ability and opportunity to compete fairly and effectively on the merits of their products, capabilities, and know-how. Given that key inputs are diffused globally, this plan must also carefully consider how the United States can effectively partner with other tech-leading democracies for mutual economic and security benefit. This includes taking measures to promote norms for technology use that align with shared values.
In the context of strategic competition with China, the need to craft new approaches to technology development and deployment is increasingly apparent to government leaders. Many lawmakers grasped the stark reality that U.S. technological preeminence was eroding when they realized that China had become a global juggernaut in telecommunications, a situation exacerbated by Beijing’s push to dominate global fifth generation (5G) wireless networks. The state of play poses national and economic security risks to the United States, which, along with its allies and partners in the Indo-Pacific and Europe, has made notable headway in addressing and mitigating these risks. However, much work remains. Chinese firms continue to push for greater digital entanglement around the world, from Southeast Asia to Africa to Latin America. Given the fundamental importance to the digital economy of communications networks and the standards that govern them, the more successful Beijing’s policies are, the greater the challenge for tech-leading democracies to maintain their economic competitiveness. There is also the specter of norms: if illiberal actors come to dominate them, their power to shape how networks are used and to manipulate data flows threatens liberal democratic values the world over.
It is time for tech-leading democracies to heed lessons from the 5G experience to prepare for what comes next: Beyond 5G technologies and 6G, the sixth generation of wireless. With telecommunications operators around the world still in the early stages of rolling out 5G, it is reasonable to ask why policymakers should focus now on 6G technologies that are not expected to be commercialized until around 2030. First, governments of leading technology powers have already crafted various visions and strategic plans for 6G, and myriad research efforts, though nascent, are under way. Second, the 5G experience shows that belated attention to global developments in telecommunications resulted in vexing geopolitical problems that could have been better mitigated, or perhaps in some cases avoided altogether. Finally, because communications technologies are of fundamental importance to economic and national security, prudent and proactive policymaking in the early stages of technological development will help ensure that the United States and its allies and partners are well positioned to reap the benefits while countering the capabilities of adversarial competitors, most notably China.
To secure America’s 6G future, the U.S. executive and legislative branches should act on an array of issues. First and foremost is setting a road map for American leadership in 6G. This framework will then inform the scope and scale of the actions needed to make that vision a reality. The necessary actions range from investing in research and development (R&D) to developing infrastructure to initiating novel tech diplomacy.
Promote American Competitiveness in 6G
The White House should:
- Craft a 6G strategy. The United States needs a strategic road map that lays out a vision for American leadership in 6G and the desired international and domestic telecommunications landscape of 2030 and beyond.
- Expand R&D funding for 6G technologies. The White House should explore opportunities for additional 6G R&D funding through research grants, tax credits, and financial support.
- Leverage existing capabilities for testing, verification, and experimentation of 6G technologies. The White House, working with the interagency Networking and Information Technology Research and Development Program, can establish government 6G testbeds (in the laboratory and field) to support and build upon 5G R&D.
- Open additional experimental spectrum licenses to accelerate R&D efforts.
- Establish a U.S. 6G Spectrum Working Group. The working group should identify spectrum needs for 6G rollouts and offer recommendations for spectrum access and management.
- Promote the development of new 6G use cases by using the purchasing power of the U.S. government.
Congress should:
- Designate the Department of Commerce as a U.S. intelligence community (IC) member. Closer ties to the IC will improve information-sharing on foreign technology policy developments, such as adversaries’ strategies for challenging the integrity of standard-setting institutions. This action will also integrate the Department of Commerce’s analytical expertise and understanding of private industry into the IC.
- Enact R&D funding to solve challenges for rural 6G development. 6G offers an opportunity to develop alternatives to traditional hardware such as fiber-optic cables (for example, wireless optical solutions or non-terrestrial platforms) that can fill network gaps and more readily connect rural areas.
- Attract and retain much-needed foreign science and technology talent by initiating immigration reform, such as by raising the cap for H-1B visas, eliminating the cap for advanced STEM degree holders, and amending the Department of Labor Schedule A occupations list so that it includes high-skilled technologists.
The National Science Foundation should:
- Create an equivalent of its Resilient & Intelligent NextG Systems (RINGS) program for start-ups. RINGS, supported by government and major industry partners, offers grants for higher education institutions to find solutions for NextG resilience.1
- Expand the Platforms for Advanced Wireless Research Program, a consortium of city-scale research testbeds, so that it includes software innovation hubs.2
Collaborate with Allies and Partners
Congress should:
- Create a Technology Partnership Office at the Department of State. A new office, headed by an assistant secretary for technology, is needed to initiate, maintain, and expand international technology partnerships.
The White House should:
- Organize an international 6G Policy and Security Conference series. U.S. policymakers should work with foreign counterparts of the techno-democracies to organize regular 6G conferences to discuss key issues including technology development, security, standard setting, and spectrum.
The White House, with the support of Congress, should:
- Lead the creation of a Multilateral Digital Development Bank. In partnership with export credit and export finance entities in allied countries, the United States should lead in establishing a new organization with the mission of promoting secure and fair digital infrastructure development around the world.
The State Department should:
- Spearhead a tech diplomacy campaign. The United States and allied governments should craft clear and consistent messaging to the Majority World about the risks of using technologies from techno-autocracies, especially China.
Ensure the Security of 6G Networks
The Federal Communications Commission, with the support of relevant agencies, should:
- Identify, develop, and apply security principles for 6G infrastructure and networks. A proactive approach, with partners in industry and academia, should be undertaken to identify 6G security risks and ensure that international standards have cyber protections.
The White House and Congress should:
- Promote and support the development of open and interoperable technologies. Coordinated outreach, joint testing, industry engagement, and policy collaboration can build global momentum and communicate risks associated with untrusted vendors.
- Create a 6G security fund, building on existing efforts to ensure 5G security. This fund could be established in concert with the activities of the proposed Multilateral Digital Development Bank.
ENDNOTES
1. “NSF-Led, Multi-Sector Partnership Will Support Research That Leads to Superior Communication Networks and Systems,” National Science Foundation, press release, April 27, 2021, https://www.nsf.gov/news/special_reports/announcements/042721.jsp.
2. “About PAWR,” Platforms for Advanced Wireless Research, https://advancedwireless.org/about-pawr.
Article link: https://www.cnas.org/publications/reports/edge-networks-core-policy
AUTHORS
Martijn Rasser is a Senior Fellow and Director of the Technology and National Security Program at the Center for a New American Security (CNAS).
Ainikki Riikonen is a Research Associate for the Technology and National Security Program at CNAS.
Henry Wu is a former Joseph S. Nye, Jr. Intern for the Technology and National Security Program at CNAS.

WASHINGTON — The U.S. Defense Department is creating a new position to oversee its digital and artificial intelligence activities, with the hope the office will be able to drive faster progress in those areas and meet threats posed by China, according to a senior defense official.
The new chief digital and artificial intelligence officer, or CDAO, will directly report to the deputy defense secretary and oversee the Joint Artificial Intelligence Center, the Defense Digital Service and the DoD’s chief data officer, according to a memo released Dec. 8. Today, those offices directly report to the deputy defense secretary, something the senior defense official said has led to disjointedness.
“We’ve created the CDO, the JAIC and DDS each operating independently and as if the other ones don’t exist,” said the official, who briefed media Dec. 8 on the condition of anonymity. “That causes two kinds of inefficiencies. One, it means we don’t have the kind of integration across their lines of effort that we could really maximize the impact of the things that any one organization is doing. Two, it means we don’t take advantage of when there are overlaps in what they’re doing, or underlaps in what they are doing to drive the right kind of prioritization in these spaces.”
The official insisted this new position is not meant to create more bureaucracy, but rather serve as an integration function to better drive priorities across these related functional areas.
It is unclear who will lead this organization, but the senior official said the department is looking both inside and outside the Pentagon. The intent is to establish an initial operating capability for the office by Feb. 1, 2022, and reach full operating capability no later than June 1, 2022, the official said.
After establishing an initial operating capability, the office will work with existing authorities to integrate and align the three offices it will oversee. The CDAO will serve as the successor organization to the JAIC, the official said, meaning it will be the lead AI organization within the Pentagon. The CDAO will act as an intervening supervisor for DDS, working to scale new digital solutions and apply them to other problems.
After reaching full operating capability, the official said, the DoD will submit legislative proposals to Congress to adjust authorities and reporting lines.
The heart of JADC2
The senior official noted that this new organizational change gets to the heart of the Pentagon’s Joint All-Domain Command and Control approach, which seeks to more seamlessly connect sensor information to shooters to allow for faster decision-making.
“It is JADC2. JADC2 is the integration of disparate data sources into a common architecture that allows us to have clear, senior-leader-down-to-operator decisions to drive warfighting improvements,” the official said. “To do that, you need a range of capabilities from common data architecture to a common development and deployment environment that allows you to take your applications either digital or AI-enabled and move them to the warfighter.”
To realize this vision, the department needs a single driver inside the Office of the Secretary of Defense.
Moreover, the hope is that the new CDAO position will accelerate progress on initiatives such as common data fabrics, open architectures and open APIs — all key enablers of JADC2.
The position is expected to help the DoD identify solutions to fit these problems and build toward common foundational elements, common development environments and common deployment environments.
It will also help scale the department, which currently has several startup efforts in these areas but needs to make them full-fledged projects.
“We have a couple of startups here, and to get to the scale at the speed we need in the department, we need a central advocate who can manage the resources, manage the priorities, connect with [combatant command] commanders and service leadership to really drive the prioritization and deployment of those solutions,” the official explained.
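The “common data architecture” the official describes can be made concrete with a small sketch. The record type and field names below are invented for illustration, not an actual DoD schema; the point is simply that one agreed data shape lets any sensor publisher and any downstream digital or AI application interoperate:

```python
# Purely illustrative sketch of a "common data architecture": one agreed
# record shape that any sensor publisher or downstream AI application can
# rely on. The type and field names are invented, not an actual DoD schema.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json


@dataclass
class SensorObservation:
    sensor_id: str      # which platform produced the reading
    modality: str       # e.g., "radar", "eo/ir", "sigint"
    lat: float          # WGS 84 latitude in degrees
    lon: float          # WGS 84 longitude in degrees
    observed_at: str    # ISO 8601 UTC timestamp
    confidence: float   # detection confidence, 0.0 to 1.0

    def to_json(self) -> str:
        """Serialize to the shared wire format every consumer expects."""
        return json.dumps(asdict(self))


obs = SensorObservation(
    sensor_id="uav-042",
    modality="eo/ir",
    lat=36.17,
    lon=-115.14,
    observed_at=datetime.now(timezone.utc).isoformat(),
    confidence=0.87,
)
print(obs.to_json())
```

In practice such schemas would be governed and far richer, but the principle of shared structure instead of per-system formats is what makes sensor-to-shooter integration tractable.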
About Mark Pomerleau
Mark Pomerleau is a reporter for C4ISRNET, covering information warfare and cyberspace.
Article link: https://www.c4isrnet.com/artificial-intelligence/2021/12/08/pentagon-creates-new-digital-and-artificial-intelligence-office/?
The 80th anniversary of the attack on Pearl Harbor.


Biopharma companies should consider a new, integrated approach to evidence-generation strategies to better demonstrate the value of therapies to all stakeholders.
Understanding and improving patient outcomes through the generation of evidence is one of the most critical activities of an innovative biopharmaceutical company. Evidence that a therapy is both effective and safe is the primary requirement. But that is just the start. Clinicians need evidence to support optimal treatment decision making. Payers need evidence to support patient access. And patients need evidence to understand how a therapy might meet their needs. Increasing volumes and types of available data raise the potential to generate this evidence, but if that potential is to be realized, biopharmaceutical companies may need to change their approach to evidence generation, working far more strategically and collaboratively than they do at present.
The status quo in most companies is one in which each function draws up its own evidence-generation plan. These may well be included in a master, asset-level plan. Yet they tend to be bolted together, not integrated, and so are a far cry from the integrated evidence-generation plans (IEPs) being developed by a handful of companies. From the outset, IEPs take into account the evidence needs of different functions and geographies across the life cycle of an asset, and then collaboratively determine how to meet them using a broad range of methods and data (Exhibit 1). The end result is a significantly more efficient and effective use of resources in pursuit of better patient outcomes.

The case for change
The role that evidence generation plays among key stakeholders is evolving and becoming increasingly important.
Regulators are beginning to consider evidence beyond randomized controlled trials (RCTs). They recognize that real-world evidence (RWE) can help to accelerate drug development or indication expansion, minimize exposure of patients to placebo control arms, and help to offset rising drug-development costs. As a result, regulators around the world are beginning to incorporate RWE into their approval processes.1 In a few instances, RWE has served as a primary source of evidence for the approval of therapies. The US Food and Drug Administration (FDA), for example, approved the expansion of indications for Pfizer’s Ibrance to include male breast cancer on the basis of data from electronic health records and insurance claims.2 More commonly, RWE has been used to augment evidence from RCTs rather than being accepted on its own merit.3 The growing importance of RWE is nonetheless clear: roughly one in three new drug applications and biologic license applications in 2019 offered RWE as supportive evidence.4
Payers need to make increasingly complex decisions about coverage and reimbursement. Against a backdrop of rising healthcare spending, aging populations, and rising numbers of innovative, high-cost treatments, the strategic generation of evidence can help clarify the clinical, economic, and humanistic impact a new therapy might have versus the standard of care, bearing in mind that different payers have different evidence needs. For instance, Spain and Italy prioritize evidence of budget impact, the United Kingdom stresses cost-effectiveness, and Germany emphasizes using the right standard of care, as well as safety and efficacy in patient subpopulations, as the comparators.5
Providing the right evidence can prove powerful. In Japan, post-launch evidence showed that Boehringer Ingelheim and Eli Lilly’s Jardiance, a drug for treating type 2 diabetes, reduced the risk of cardiovascular events.6
Clinicians need more support. The more new therapies launch, the more clinicians need to understand how they differ, how best to use them, the predictors of response, and whether to switch patients to alternative therapies. Understanding clinicians’ concerns and then generating the evidence to address them is therefore critical, and biopharmaceutical companies are increasingly turning to RWE for assistance. For example, a retrospective review of real-world data from the US VICTORY Consortium demonstrated a favorable safety profile for Takeda’s Entyvio in treating ulcerative colitis and Crohn’s disease.7
Patient centricity is growing. Companies must consider patients’ needs, preferences, and concerns throughout the clinical development process and generate evidence to address them, hence the importance of patient engagement. At least ten EU member states have a mechanism for patient engagement in their health technology assessment processes, mostly at the advice and decision-making stage.8
The results of patient engagement can be impressive. Amgen’s Aimovig for migraines, for instance, was approved by the FDA with the help of a new, patient-reported instrument to assess how migraines were affecting patients’ ability to function. The information that patients reported served as secondary end points.9
Why integration matters
Together, these trends make a strong case for why an integrated approach to generating evidence is valuable. An IEP identifies evidence gaps across functions, geographies, and the asset life cycle to meet the needs of external and internal stakeholders. And it aligns priorities and resource allocation to fill those gaps. As a result, an IEP can hold down drug-development costs for an asset’s primary indications and the costs for indication expansions through the judicious use of RWE in lieu of costly RCTs. It can also avoid the duplication of evidence generation within the company or even, as sometimes occurs, the purchase of similar data sets. And the collaborative approach can more rapidly identify and fill critical evidence gaps. Increasing numbers of companies are using them to great effect:
- One company’s IEP for an oncology therapy identified no fewer than 18 critical evidence gaps that individual functions had not previously spotted. Some gaps, such as those involving data to support payer needs in specific geographies, data to support patient adherence, and data on optimal treatment duration, needed to be addressed urgently. Others, such as supporting follow-on indications and improving patient outcomes with combination therapies, related to longer-term needs. Further, the IEP process helped focus and prioritize resources, such as by reducing the number of areas of interest for investigator-initiated research from more than 50 to 15. It also identified opportunities to generate data more effectively through collaboration: medical supporting the design of health economics and outcomes research (HEOR) studies, and HEOR and clinical working together on post hoc analyses of clinical trial data for health technology assessment submissions and payer dossiers.
- A biopharma company initiating an IEP for a pipeline cell therapy uncovered high-priority evidence gaps concerning the impact of manufacturing turnaround time on patient outcomes. This resulted in prioritizing seven manufacturing studies, which ultimately reduced vein-to-vein time. Further, the process highlighted the need to demonstrate safety and efficacy versus competitors, given the increasingly crowded landscape. The IEP team identified a specific disease registry as the optimal way to generate this data and catalyzed leadership to approve the registry approximately six months earlier than expected.
- An IEP identified a follow-on indication with high potential to be developed through RWE rather than sponsored trials as originally proposed—the result of insight sharing between medical, clinical, and commercial teams.
- In a McKinsey survey of four IEP teams across functions and geographies, all respondents said the IEP had improved strategic alignment on evidence needs across stakeholders and led to the development of an effective execution plan. Over 80 percent said the IEP had increased synergies in cross-functional evidence generation.
The challenges of an integrated evidence-generation strategy
Companies that choose to transition to a more integrated evidence-generation strategy face three initial obstacles. Functions and geographies are siloed, hampering communication and collaboration; there is a tendency to focus on generating evidence to win near-term regulatory approval in key markets, rather than evidence to support the asset throughout its life cycle; and data are fragmented, making it hard to know what evidence is available and what might be needed. Understanding these challenges can help companies devise a sound plan to overcome them.
Functional and geographic silos
Integrated evidence-generation strategies depend upon transparency and coordination, with functions sharing data and information on stakeholder needs to identify evidence gaps, align on priorities, and plan how to most effectively fill the gaps. But this can be challenging for functions accustomed to focusing on their own, specific deliverables, with minimal input from other functions. And care is required to ensure legal compliance in how data are shared between functions such as commercial and medical.
Geographic coordination is also needed, though it is not the norm. Companies headquartered in the United States, for example, tend to use US-focused evidence-generation strategies. The fact that most drugs are first launched there and most evidence is generated there, plus the commercial importance of the US market, helps explain this focus. So while there is often strong US participation in developing evidence-generation strategies, other major markets are not always adequately consulted. Different languages and time zones pose further challenges to collaboration. The end result is a so-called global evidence-generation plan that often fails to address many regional or local needs, and sometimes regional or local plans that run counter to the global strategy.
A near-term focus
A tendency to focus on generating near-term evidence is another obstacle to an integrated approach. Across the industry, companies concentrate on generating evidence from RCTs to obtain regulatory approval, which is a critical near-term goal. Yet evidence is needed to inform therapy use along the entire patient journey, which means that other functions, such as medical, HEOR, and market access, need to provide input when clinical trials are designed to enable optimal patient use and access once therapies are approved (Exhibit 2).

A near-term focus can also diminish support for investigator-initiated trials (IITs) for development-stage assets. The perceived or actual regulatory risk associated with conducting IITs prior to approval means biopharmaceutical companies often choose not to fund them until after launch, at which point clinical development priorities shift to different development-stage assets. Yet postapproval asset evidence plans can benefit tremendously from continued support and input from clinical development.
More broadly, a focus on the near term can prioritize the launch indication and launch geographies for a development-stage asset over longer-term needs. Hence, evidence-generation plans may not fully reflect the potential to use RWE for life-cycle management, or additional secondary end points or comparator arms in trial designs to allow approval or uptake in additional geographies.
Fragmented data
Rarely is there a complete data catalog for an asset or disease, as different data sit with different functions and geographic units. In addition, each function typically uses its own system to track study execution, be that with the help of an ad hoc Excel spreadsheet, shared database, or sophisticated portfolio tools. And there is no guarantee that data are kept up to date—seldom is there a single accountable owner of the data even within a single function. As a result, obtaining a complete, cross-functional view of the data being generated for a specific asset or disease is hard, making it difficult to leverage all knowledge, know where evidence gaps might lie, or prevent the duplication of efforts to generate new evidence.
Making the shift
Companies that make the most progress implementing integrated evidence-generation strategies take account of these challenges. To that end, they do four things. They make sure the planning process is cross-functional and begins early in the asset life cycle. They foster change management to further encourage collaboration and break down functional and geographic silos. They establish governance processes that support a new way of working. And they invest not just in data platforms and advanced analytics capabilities but also in skills that facilitate cross-functional engagement.
Institute a cross-functional planning process two to three years before launch
Ideally, companies should begin building an IEP for an asset two to three years before launch. At that time, the strategic direction of the clinical development program will be clear but there remains an opportunity to expand evidence generation to support the asset along the entire patient journey and to support planned life cycle indications, not just the launch indication.
The work is typically led by global medical affairs, which is well placed to integrate the perspectives of all internal and external stakeholders. In some organizations, however, R&D plays a lead role prelaunch and then hands the task over to medical affairs postlaunch. Likewise, while in most organizations IEPs are distinct plans that synergistically build on clinical development plans (CDPs) to support an asset’s launch and full life cycle, in some organizations the CDP evolves into an IEP as the product progresses in development.
The typical IEP process generally entails the following steps (Exhibit 3):
- Convene a cross-functional IEP team and develop and agree on the work plan: The medical-affairs lead assembles the team and introduces the process and framework for the IEP.
- Develop strategic context: Clinical, medical, launch, and life-cycle management strategies need to inform the IEP.
- Assemble an evidence catalog: Develop a centralized catalog of current and planned evidence-generation efforts that can be shared across functions, with appropriate safeguards in place, such as those that keep commercial and medical affairs legally compliant.
- Identify and prioritize evidence gaps: Identify, without imposing any initial constraints, potential evidence gaps across the patient journey and then prioritize those gaps that are both feasible to fill and that will have the most potential impact. Be sure to consider the evidence needs of both internal and external stakeholders across geographies.
- Develop plans to fill priority evidence gaps: Think beyond RCTs and consider and weigh the trade-offs between different types of data and approaches (for example, secondary analyses of existing data sets, Phase IV trials, investigator-initiated trials, HEOR studies, RWE, advanced analytics). Tactically, ensure that the right combinations of internal and external expertise are leveraged to design, plan, and execute studies for optimal value creation.
- Integrate input from additional stakeholders, then sign off: Additional input can strengthen the plan. Stakeholders in different markets, as well as cross-functional partners that were not part of the IEP team, can provide insights, as can external stakeholders, often as part of an advisory board. A cross-functional governance body should give final approval to the plan (as discussed in “Establish effective governance,” below).
- Keep the plan updated: IEPs should be living documents that reflect changes in the asset strategy and study-execution updates. Strategic updates are made after major data readouts. Executional ones, such as updates on study status, occur every six to 12 months.

Foster a change-management program
To develop an IEP, people will need to change the way they work—hence the need for a change-management program to break down functional and geographic silos and any cultural resistance. A change-management program should include the following elements:
- Leadership role modeling: Executive sponsors, such as the CEO and head of global medical affairs, should be seen and heard supporting the new approach and driving change, as should the heads of each function.
- A compelling value story: A change story that makes clear the rationale for integrated evidence generation should be cascaded through the organization, potentially through town halls, asset-level workshops, and one-on-one meetings. Showcasing evidence of how the approach has benefited brand teams will also encourage adoption.
- Reinforcement mechanisms: Personal-development plans that encourage integrated evidence generation, such as key performance indicators (KPIs) for global disease teams that include the development and updating of IEPs, can accelerate adoption. Publicly celebrating the success of an IEP can also serve as an important reinforcement mechanism.
- Confidence and skill building: Formal training to understand the new processes and governance will build capabilities as well as confidence in the new approach.
Establish effective governance
Cultural resistance can be an issue if clarity on leadership and governance is lacking.
There is much to clarify. Who, for example, is best placed to lead cross-functional planning, and who has final authority to decide which projects to prioritize? Should a scientific, medical, commercial, or hybrid body be responsible for governance? How frequently should an IEP be renewed? And who has signing-off rights for new IEPs or updates? While in some companies the cross-functional asset team signs off on the IEP, the ideal is a cross-functional, executive-level governing body. Bear in mind, however, that the ideal functional leader and the makeup of the governance body may change depending on the stage of development of an asset and the input required.
Whatever the choices made, they need to be set out clearly in a governance framework that eases the transition to the new approach. Importantly, there should also be agile funding mechanisms in place to ensure that prioritized studies are adequately resourced.
Build new skills and capabilities
Companies wishing to adopt an integrated approach to the generation of evidence across the entire portfolio will need to build new skills and capabilities.
Data and analytics. For a successful evidence-generation strategy, companies will need an integrated data solution that seamlessly links the data catalog of completed studies (translational data, clinical trial data, and RWE, for example) with ongoing or planned studies across functions (with role-based access control, traceability, and auditability). Such a solution not only facilitates the rapid development and updates of IEPs but also can be used to power an analytics engine to generate hypotheses.
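As a hypothetical illustration of such a solution’s core data structure, the sketch below models a single evidence-catalog entry with role-based access control; the field names, enum values, and roles are assumptions drawn from the article’s description, not a reference implementation:

```python
# Hypothetical sketch of an integrated evidence catalog entry. Field names,
# enum values, and the role check are assumptions drawn from the article's
# description, not a reference implementation.
from dataclasses import dataclass, field
from enum import Enum


class StudyType(Enum):
    RCT = "randomized controlled trial"
    RWE = "real-world evidence study"
    HEOR = "health economics and outcomes research"
    IIT = "investigator-initiated trial"


@dataclass
class EvidenceCatalogEntry:
    study_id: str
    asset: str                   # therapy or pipeline asset
    study_type: StudyType
    owning_function: str         # e.g., "clinical", "medical", "HEOR"
    geographies: list            # markets the study covers
    status: str                  # "planned", "ongoing", or "completed"
    evidence_gap: str            # the prioritized gap this study addresses
    allowed_roles: set = field(default_factory=set)

    def readable_by(self, role: str) -> bool:
        """Role-based access control, e.g., to keep commercial and
        medical affairs appropriately separated."""
        return role in self.allowed_roles


entry = EvidenceCatalogEntry(
    study_id="ONC-2024-007",
    asset="oncology-asset-x",
    study_type=StudyType.RWE,
    owning_function="HEOR",
    geographies=["US", "DE", "JP"],
    status="planned",
    evidence_gap="optimal treatment duration",
    allowed_roles={"medical", "HEOR"},
)
print(entry.readable_by("commercial"))  # False: commercial role not permitted
```

A shared store of such entries gives every function one cross-functional view of what evidence exists, what is planned, and where gaps remain.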
Talent. Attracting, retaining, and building the right talent will also be key. Experts will be needed in areas such as non-registrational data generation. But to overcome cultural roadblocks, soft leadership skills that facilitate cross-functional buy-in and engagement are critical too. There may also be a shortage of strategic or creative thinkers who can look beyond their asset or function to disease or franchise-level needs and identify opportunities for innovative methods of evidence generation. Training and coaching may be required to build the skills that lead to high-quality, integrated evidence-generation strategies and their execution.
Agile delivery. A playbook for generating integrated evidence-generation plans helps to ensure a consistent approach and speeds development, with periodic updates allowing newly discovered best practices to be codified and shared. For maximum impact, this can be paired with an agile project-management function dedicated to coordinating and communicating integrated evidence planning across the organization.
IEPs represent a fundamental shift in the way evidence is generated across functions, geographies, and phases of the asset life cycle, and as such will entail considerable effort to implement. But growing numbers of companies are discovering it is an effort well worth making in the pursuit of improved patient outcomes.
Article link: https://www.mckinsey.com/industries/life-sciences/our-insights/integrated-evidence-generation-a-paradigm-shift-in-biopharma?
ABOUT THE AUTHOR(S)
Palak Amin is a consultant in McKinsey’s Philadelphia office; Sarah Nam is an associate partner in the Washington, DC, office; and Lucy Pérez is a senior partner in the Boston office, where Jeff Smith is a partner.
The authors wish to thank David Champagne, Alex Davidson, Mattias Evers, Tomoko Nagatani, Brandon Parry, Lydia The, and Jan van Overbeeke for their contributions to this article.
14 May 2021
- Andrea Willige, Senior Writer, Formative Content
- To ensure social distancing and avoid infection, healthcare practices in many countries shifted from in-person consultations to telemedicine.
- Nearly two-thirds of healthcare providers across 14 global markets are now investing heavily in digital health.
- In developing countries, digital healthcare is also helping by giving patients remote access to specialists.
Senior healthcare leaders from 14 countries say strengthening resilience and preparing for future crises is a top priority, according to a new report commissioned by Royal Philips.
Medical services leaders in countries including the US, Germany and India were asked about their plans for digitalization over the next three years.
The pandemic has seen many countries shift from in-person medical consultations to telemedicine, using apps, phone and video appointments. Industry analyst IDC predicts that by 2023 nearly two-thirds of patients will have accessed healthcare via a digital front end.
Taking telemedicine beyond the pandemic
Improving resilience and planning for future crises is the top priority for more than two-thirds of senior healthcare leaders surveyed, with France, the Netherlands and Germany scoring the highest. Second in line is the continued shift to remote and virtual care (42%), led by India, the Netherlands and the US.

Accordingly, 64% of healthcare leaders are investing heavily in digital health technology at the moment, but the figure drops to 40% when they are asked about their investment levels in three years’ time. This may be because respondents expect solid foundations to have been laid by then, or because of continued uncertainty about healthcare funding beyond the pandemic.
Digital health needs AI
A major focus for future health technology investments is the deployment of Artificial Intelligence (AI) and machine learning.
At present, 19% of the healthcare leaders polled by Royal Philips said they are prioritizing investments in AI, but 37% said they plan to do so over the next three years. The aim is to have AI help with clinical decision-making and predict clinical outcomes.
This ties in with a growing shift from volume-based care targets to value-based care, where predicting patient outcomes will play a key role.
In value-based healthcare models, providers get paid for improving health outcomes rather than for the volume of patients treated. The focus is on treating illnesses and injuries more quickly and on avoiding chronic conditions such as diabetes or high blood pressure. The results are better health outcomes and lower costs for both the healthcare system and the patient, thanks to fewer doctor’s visits, tests, interventions and prescriptions.
IDC has forecast that by 2026 two-thirds of medical imaging processes will use AI to detect diseases and guide treatment. A growing number of healthcare leaders believe that investing in AI technology is important for the future of their medical facility, according to the Royal Philips report.
HEALTHCARE
What is the World Economic Forum doing about healthcare value and spending?
Each year, $3.2 trillion is spent on global healthcare while making little or no impact on good health outcomes.
To address this issue, the World Economic Forum created the Global Coalition for Value in Healthcare to accelerate value-based health systems transformation.
This coalition partners with governments, leading companies, academia, and experts from around the world to co-design and pilot innovative new approaches to person-centered healthcare.
Overcoming barriers to digital health
While healthcare leaders are clearly aware of the value of their digital investments, there are still many barriers to the sector’s digital transformation.
A lack of technology experience among staff is one major obstacle, highlighting the need for more digital training for those at the front line of healthcare provision. At the same time, governance, interoperability and data security challenges need to be overcome.
Resolving those challenges will not be easy, which is why 41% of respondents highlighted the importance of forming strategic partnerships with technology companies or other healthcare facilities to jointly roll out new digital technology.
Freeing up hospitals
The formation of technology-enabled ecosystems is expected to contribute to offloading around a quarter of routine care from hospitals. Over the next three years, across the 14 markets surveyed, healthcare services at walk-in clinics and in-patient treatment centres will grow by around 10% each, pharmacies by 4% and home care by 6% on average.
This trend is stronger in countries where healthcare provision is more likely to be in a rural setting, such as India and China.
This may be because digital technology has the potential to bridge healthcare gaps in underserved rural communities, especially in emerging markets. For example, an all-female health provider in Pakistan, Sehat Kahani, has e-health clinics around the country where, for a cost of $0.66, patients can see one of a network of 1,500 doctors via a digital platform.
GOVERNANCE
What is the World Economic Forum doing about healthcare data privacy?
The Healthcare Data Project at the World Economic Forum Centre for the Fourth Industrial Revolution Japan grapples with the question of how societies should balance the interests of individual citizens, businesses and the public at large when it comes to sensitive healthcare issues. An improved approach to governance during a number of health crises, including pandemics, can help build trust and possibly even save lives.
The Centre for the Fourth Industrial Revolution has developed an approach to data governance, Authorized Public Purpose Access (APPA), that seeks to balance human rights such as privacy with the interests of data-collecting organizations and the public interest, that is, the needs of whole societies.
Additionally, a recent white paper examined existing data-governance models and found that most are biased toward the interests of one of the three major stakeholder groups. The white paper revealed the need for a balanced governance model designed to maximize the socially beneficial potential of data while protecting individual rights such as privacy and the legitimate interests of data holders.
Written by
Andrea Willige, Senior Writer, Formative Content
The views expressed in this article are those of the author alone and not the World Economic Forum.