What is the nature of the critical materials problem?
How did the current rare earth element (REE) supply chain form and what lessons learned are applicable to other critical materials, such as those found in LIB materials?
What are the potential risks of a disruption to the critical material supply chain?
What should be the aim of policies used to prevent or mitigate the effects of shocks to critical material supply chains?
The ongoing coronavirus disease 2019 pandemic and the Russian invasion of Ukraine highlight the vulnerabilities of supply chains that lack diversity and depend on foreign inputs. This report presents a short, exploratory analysis summarizing the state of critical materials — materials essential to economic and national security — through two case studies, along with the policies available to the U.S. Department of Defense (DoD) to increase the resilience of its supply chains in the face of disruption.
China is the largest producer and processor of rare earth oxides (REOs) worldwide and a key producer of lithium-ion battery (LIB) materials and components. China’s market share of REO extraction has decreased, but it still exerts considerable influence over the downstream supply chain: processing and magnet manufacturing. China’s market share of the LIB supply chain mirrors these REO supply bottlenecks. If it desired, China could effectively cut off 40 to 50 percent of global REO supply, affecting U.S. manufacturers and suppliers of DoD systems and platforms.
Although a deliberate disruption is unlikely, resilience against supply disruption and building domestic competitiveness are important. The authors discuss plausible REO disruption scenarios and their hazards and synthesize insights from a “Day After . . .” exercise and structured interviews with stakeholders to identify available policy options for DoD and the U.S. government to prevent or mitigate the effects of supply disruptions on the defense industrial base (DIB) and broader U.S. economy. They explore these policies’ applicability to another critical material supply chain — LIB materials — and make recommendations for policy goals.
Key Findings
China has used a variety of economic practices to capture a large portion of the REE supply chain. It has used this disproportionate market share to manipulate the availability and pricing of these materials outside China.
China has usually exercised economic coercion by denying access to its domestic market. REOs have been the lone exception: China threatened to restrict access to its exports to pressure a U.S. partner nation (Japan) for geopolitical ends.
Projects planned to increase the extraction and processing capacity of REOs fall short of meeting estimated future demand outside China; there is not enough supply outside China to mitigate a disruption event.
China could effectively cut off 40–50 percent of global REO supply, which would affect manufacturers and suppliers of advanced components used in DoD systems and platforms. The DIB has a limited time frame in which to respond to a disruption before industrial readiness suffers.
A disruption could affect the broader U.S. economy and the DIB’s ability to procure critical materials, and in some cases could interrupt military operations.
DoD has a variety of policies available to mitigate the effects of a disruption. They can be categorized as proactive, reactive, or both. Each policy has an effective time to impact: the time needed for implementation and for benefits to materialize.
Policy options used thus far have yielded mixed results.
Recommendations
Proactive policies should aim to diversify critical material supply chains away from Chinese industry by expanding extraction capacity or increasing material recycling efforts.
Proactive efforts should aim to co-locate the upstream and downstream sectors to better leverage industrial efficiencies.
Reactive policies should aim to increase the DIB’s resiliency in the face of supply disruption by reducing its time to recover and increasing its time to survive.
Both proactive and reactive policy options with the longest time to impact should be implemented sooner rather than later to realize benefits.
Policies, planning, and coordination should also aim to reduce the time to impact for both proactive and reactive policies.
Both proactive and reactive policies should leverage U.S. ally and partner capabilities, or build relationships with nontraditional countries, to establish free access to critical materials at a fair market price wherever possible.
Working with nontraditional partners will be necessary because these countries have geographic access to critical materials and extraction capacity. Depending on traditional allies and partners alone may divert only part of the supply chain away from Chinese industry.
Chinese disinformation campaigns should be expected in other critical material supply chains. This sector should work with cybersecurity experts and the U.S. intelligence community to educate executives and local governments about risks. Businesses should communicate their plans — and any influence operations underway — to local communities. The intelligence community should educate policymakers, U.S. allies and partners, and the public about the extent of Chinese interference in critical material supply chains.
You would not be reading this if you did not realize that it is important for the Department of Defense (DoD) to get software right. There are two sides to the coin of ubiquitous software for military systems. On one side lies untold headaches—new cyber vulnerabilities in our weapons and supporting systems, long development delays and cost overruns, endless upgrade requirements for software libraries and underlying infrastructure, challenges in modernizing legacy systems, and unexpected and undesirable bugs that emerge even after successful operational testing and evaluation. On the other side lies a vast potential for future capability, with surprising new military capabilities deployed to aircraft during and between sorties, seamless collaboration between military systems from different services and domains, and rich data exchange between allies and partners in pursuit of military goals. This report offers advice to help maximize the benefits and minimize the liabilities of the software-based aspects of acquisition, largely through structuring acquisition to enable rapid changes across diverse software forms.
This report features a narrower focus and more technical depth than typical policy analysis. We believe this detail is necessary to achieve our objectives and reach our target audience. We intend this to be a useful handbook for the DoD acquisition community and, in particular, the program executive officers (PEOs)1 and program managers as they navigate a complex landscape under great pressure to deliver capability in an environment of strategic competition. All of the 83 major defense acquisition programs and the many smaller acquisition category II and III efforts that make up the other 65 percent of defense investment scattered across the 3,112 program, project, and activity (PPA) line items found in the president’s budget request now include some software activity by our accounting.2 We would be thrilled if a larger community—contracting officers, industry executives, academics, engineers and programmers, policy analysts, legislators, staff, and operational military service members—also gleaned insight from this document. But we know that some terms may come across as jargon and that not everyone is familiar with the names of common software development tools or methods. We encourage them to read this nonetheless and are confident that the core principles and insights we present are still accessible to a broader audience.
While other recent analyses have focused on the imperative for software and offered high-level visions for a better future,3 we believe most of the acquisition community already recognizes the potential of a digital future and is engaged in a more tactical set of battles and decisions: how to structure their organizations, how to manage expectations, how to structure their deliverables, and how to write solicitations and let contracts to meet requirements and strive for useful outcomes. We attempt to present background and principles that can assist them in navigating this complex landscape.
The real motivation for getting military software right is not to make the DoD more like commercial industry through digital modernization. Instead, it is to create a set of competitive advantages for the United States that stem from the embrace of distributed decision-making and mission command. The strategic context of this work stems from the observation that advantage in future military contests is less likely to come from the mere presence of robotics or artificial intelligence in military systems and is more likely to come from using these (and other) components effectively as part of a force design, and specifically from maintaining sufficient adaptability in the creation and implementation of tactics. Software, software systems, and information processing systems (including artificial intelligence and machine learning, or AI/ML) that leverage software-generated data are the critical ingredients in future force design, force employment, tactics generation, and execution monitoring. Software will bind together human decision-makers and automation support in future conflict.
This report aims to accomplish two goals:
Elucidate a set of principles that can help the acquisition community navigate the software world—a turbulent sea of buzzwords, technological change, bureaucratic processes, and external pressures.
Recruit this community to apply energy to a handful of areas that will enable better decisions and faster actions largely built around an alternative set of processes for evolutionary development.
The report is structured in chapters, each built around a core insight. Chapter 1 notes that the speed and ubiquity of digital information systems is driving military operations ever closer to capability development, and that this trend brings promise for capability and peril for bureaucratic practices.
Chapter 2 offers the military framing for software development, pointing out that it can enable greater adaptability in force employment and that the DoD cannot derive future concepts of more distributed forces and greater disaggregation of capability from requirements and specifications alone. Instead, software should enable employment tactics to evolve, especially as the elements of future combat are outside the control of any one program manager or office.
Chapter 3 introduces a framing analogy between software delivery and logistics. As logistics practitioners recognize, there is no one-size-fits-all solution to the problem of moving physical goods, but a set of principles that helps navigate many gradients of logistics systems. Similarly, the world of software is full of diversity—different use cases, computing environments, and security levels—and this heterogeneity is an intrinsic feature that the DoD needs to embrace.
Chapter 4 introduces the essential tool of the modern program office—the software factory—the tooling and processes via which operational software is created and delivered. All existing operational software was made and delivered somehow and, whether we like it or not, we will have to update it in the future. Thus, the production process matters. In many ways, the most important thing a PEO can do is identify, establish, or source an effective software factory suited to the particular needs of their unique deliverables.
Chapter 5 seeks to break down the ideas introduced earlier into actionable atomic principles that the DoD can use to navigate the tough decisions that the acquisition community faces; we refer to these ideas as ACTS. To aid the reader in understanding these, we also provide a fictitious vignette.
We recognize the inherent tension between making a report compact and accessible and covering the myriad facets of a broad and fast-changing scene. To balance this, we believe that we have focused this report on the most impactful aspects of software acquisition.
Most contractors the Department of Defense hired in the last five years failed to meet the required minimum cybersecurity standards, posing a significant risk to U.S. national security.
Managed service vendor CyberSheath on Nov. 30 released a report showing that 87% of the Pentagon supply chain fails to meet basic cybersecurity minimums. Those security gaps are exposing sizeable prime defense contractors and their subcontractors to cyberattacks from a range of threat actors, putting U.S. national security at risk.
Those risks have been well-known for some time without attempts to fix them. This independent study of the Defense Industrial Base (DIB) is the first to show that federal contractors are not properly securing military secrets, according to CyberSheath.
The DIB is a complex supply chain comprising 300,000 primes and subcontractors. The government allows these approved companies to share sensitive files and communicate securely to get their work done.
Defense contractors will soon be required to meet Cybersecurity Maturity Model Certification (CMMC) compliance to keep those secrets safe. Meanwhile, the report warns that nation-state hackers are actively and specifically targeting these contractors with sophisticated cyberattack campaigns.
“Awarding contracts to federal contractors without first validating their cybersecurity controls has been a complete failure,” Eric Noonan, CEO at CyberSheath, told TechNewsWorld.
Defense contractors have been mandated to meet cybersecurity compliance requirements for more than five years. Those conditions are embedded in more than one million contracts, he added.
Dangerous Details
The Merrill Research Report 2022, commissioned by CyberSheath, revealed that 87% of federal contractors have a sub-70 Supplier Performance Risk System (SPRS) score. The metric shows how well a contractor meets Defense Federal Acquisition Regulation Supplement (DFARS) requirements.
DFARS has been law since 2017 and requires a score of 110 for full compliance. Critics of the system have anecdotally deemed 70 to be “good enough.” Even so, the overwhelming majority of contractors still come up short.
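For readers unfamiliar with the scoring, the underlying arithmetic is simple: under the DoD Assessment Methodology for NIST SP 800-171, a contractor starts at the maximum score of 110 and subtracts a weighted deduction for each unimplemented control. The sketch below illustrates the mechanics in Python; the specific controls and weights shown are illustrative examples, not an authoritative mapping.

```python
# Illustrative sketch of how a NIST SP 800-171 self-assessment score is
# derived under the DoD Assessment Methodology: start at the maximum of
# 110 and subtract a weighted deduction for each unimplemented control.
# The controls and weights below are examples, not the full catalog.
MAX_SCORE = 110

# (control, deduction if not implemented); deductions are weighted 5, 3, or 1
unimplemented = [
    ("3.5.3 multifactor authentication", 5),
    ("3.13.1 boundary protection", 5),
    ("3.3.1 audit logging", 3),
]

score = MAX_SCORE - sum(weight for _, weight in unimplemented)
print(f"SPRS score: {score}")  # 110 - 13 = 97; scores can go negative
```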
“The report’s findings show a clear and present danger to our national security,” said Eric Noonan. “We often hear about the dangers of supply chains that are susceptible to cyberattacks.”
The DIB is the Pentagon’s supply chain, and we see how woefully unprepared contractors are despite being in threat actors’ crosshairs, he continued.
“Our military secrets are not safe, and there is an urgent need to improve the state of cybersecurity for this group, which often does not meet even the most basic cybersecurity requirements,” warned Noonan.
More Report Findings
The survey data came from 300 U.S.-based DoD contractors, with accuracy tested at the 95% confidence level. The study was completed in July and August 2022, with CMMC 2.0 on the horizon.
Roughly 80% of DIB users failed to monitor their computer systems around the clock and lacked U.S.-based security monitoring services. Other deficiencies were evident in the following categories, which will be required to achieve CMMC compliance:
80% lack a vulnerability management solution
79% lack a comprehensive multi-factor authentication (MFA) system
73% lack an endpoint detection and response (EDR) solution
70% have not deployed security information and event management (SIEM)
These security controls are legally required of the DIB, and because they are not being met, there is a significant risk to the DoD and its ability to conduct armed defense. In addition to being largely non-compliant, 82% of contractors find it “moderately to extremely difficult to understand the governmental regulations on cybersecurity.”
Confusion Rampant Among Contractors
Some defense contractors across the DIB have focused on cybersecurity only to be stalled by obstacles, according to the report.
When asked to rate DFARS reporting challenges on a scale from 1 to 10 (with 10 being extremely challenging), about 60% of all respondents rated “understanding requirements” a 7 or higher. Also high on the list of challenges were routine documentation and reporting.
The primary obstacles contractors listed are challenges in understanding the necessary steps to achieve compliance, the difficulty with implementing sustainable CMMC policies and procedures, and the overall cost involved.
Unfortunately, those results closely paralleled what CyberSheath expected, admitted Noonan. He noted that the research confirmed that even fundamental cybersecurity measures like multi-factor authentication had been largely ignored.
“This research, combined with the False Claims Act case against defense giant Aerojet Rocketdyne, shows that both large and small defense contractors are not meeting contractual obligations for cybersecurity and that the DoD has systemic risk throughout their supply chain,” Noonan said.
No Big Surprise
Noonan believes the DoD has long known that the defense industry is not addressing cybersecurity. News reporting of seemingly never-ending nation-state breaches of defense contractors, including large-scale incidents like the SolarWinds and False Claims Act cases, proves that point.
“I also believe the DoD has run out of patience after giving contractors years to address the problem. Only now is the DoD going to make cybersecurity a pillar of contract acquisition,” said Noonan.
He noted the planned new DoD principle would be “No cybersecurity, no contract.”
Noonan admitted that some of the struggles that contractors voiced about difficulties in understanding and meeting cyber requirements have merit.
“It is a fair point because some of the messaging from the government has been inconsistent. In reality, though, the requirements have not changed since about 2017,” he offered.
What’s Next
Perhaps the DoD will pursue a get-tougher policy with contractors. If contractors had complied with what the law required in 2017, the entire supply chain would be in a much better place today. Despite some communication challenges, the DoD has been incredibly consistent about what is required for defense contractor cybersecurity, Noonan added.
The current research now sits atop a mountain of evidence that proves federal contractors have a lot of work to do to improve cybersecurity. It is clear that work will not be done without enforcement from the federal government.
“Trust without verification failed, and now the DoD appears to be moving to enforce verification,” he said.
DoD Response
TechNewsWorld submitted written questions to the DoD about the supply chain criticism in the CyberSheath report. A spokesperson for the DoD CIO’s office replied, stating that it would take a few days to dig into the issues. We will update this story with any response we receive.
Update: Dec. 9, 2022 – 3:20 PM PT DoD Spokesperson and U.S. Navy Commander Jessica McNulty provided this response to TechNewsWorld:
CyberSheath is a company that has been evaluated by the Cyber Accreditation Body (Cyber AB) and met the requirements to become a Registered Practitioner Organization, qualified to advise and assist Defense Industrial Base (DIB) companies with implementing CMMC. The Cyber AB is a 501(c)(3) that authorizes and accredits third-party companies conducting assessments of companies within the DIB, according to U.S. Navy Commander Jessica McNulty, a Department of Defense spokesperson.
McNulty confirmed that the DoD is aware of this report and its findings. The DoD has not taken any action to validate the findings, nor does the agency endorse this report, she said.
However, the report and its findings are generally not inconsistent with other prior reports (such as the DoD Inspector General’s Audit of Protection of DoD Controlled Unclassified Information on Contractor-Owned Networks and Systems, ref. DODIG-2019-105) or with the results of compliance assessments performed by the DoD, as allowed or required by DFARS clause 252.204-7020 (when applicable), she noted.
“Upholding adequate cybersecurity standards, such as those defined by the National Institute of Standards and Technology (NIST) and levied as contractual requirements through application of DFARS 252.204-7012, is of the utmost importance for protecting DoD’s controlled unclassified information. DoD has long recognized that a mechanism is needed to assess the degree to which contract performers comply with these standards, rather than taking it on faith that the standards are met,” McNulty told TechNewsWorld.
For this reason, the DoD’s Cybersecurity Maturity Model Certification (CMMC) program was initiated, and the DoD is working to codify its requirements in part 32 of the Code of Federal Regulations, she added.
“Once implemented, CMMC assessment requirements will be levied as pre-award requirements, where appropriate, to ensure that DoD contracts are awarded to companies that do, in fact, comply with underlying cybersecurity requirements,” McNulty concluded.
Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open-source technologies. He is an esteemed reviewer of Linux distros and other open-source software. In addition, Jack extensively covers business technology and privacy issues, as well as developments in e-commerce and consumer electronics.
WASHINGTON — Without working software, the F-35 stealth fighter is a trillion-dollar lawn ornament.
Called “a computer that happens to fly” by one former Air Force chief, with algorithms running everything from basic flight controls to long-range targeting, the F-35 runs on eight million lines of code. That’s actually less than a late-model luxury car like the 2020 Mercedes S-Class, which has over 30 million lines, notes a forthcoming report from a national security think tank, the Hudson Institute.
Yet, co-authors Jason Weiss and Dan Patt told Breaking Defense that even as private-sector software surges ahead, a Pentagon bureaucracy built to handle industrial-age hardware still struggles to get all the fighter’s code to work.
And that’s in peacetime. What if war broke out tomorrow — say a desperate Vladimir Putin lashes out at NATO, or Xi Jinping decides to seize Taiwan — and the stress of combat reveals some unexpected glitch? Imagine, for instance, that enemy anti-aircraft radars reveal a new wartime-only mode that doesn’t register in the F-35’s threat-detection algorithms. In that nightmare scenario, every day without a software update means people die.
But the Pentagon bureaucracy has simply not kept pace with the evolution of military hardware from heavy metal to software-dependent, let alone with lightning progress in the private sector. As a result, program managers cannot update warfighting software in everything from command posts to combat vehicles at the speed that will be required in a fast-moving future conflict, according to an exclusive preview of a Hudson Institute report to be published later today.
Cover of the forthcoming Hudson Institute report. (Hudson Institute graphic, courtesy of the authors)
What’s needed, Weiss and Patt argue, is a system that updates software so rapidly, implementing lessons learned and enabling new tactics, that commanders can “shift on demand with the ease of sliding a finger across the capacitive glass screen of a mobile phone.”
When done right, rapid software updates can save lives. In their war with Russia, for example, Ukrainian troops have come to depend on Elon Musk’s Starlink satellite network. So Russian electronic warfare jammed its transmissions — for a few hours, until SpaceX updated its software to bypass the interference. “When Starlink came under attack in the Ukraine, they didn’t deploy new satellites at a speed of months or years,” said Weiss, a Navy veteran and entrepreneur who served as the Pentagon’s chief software officer. “They merely reconfigured the behavior of the software, in hours.”
But even hours may be too slow sometimes, Weiss and Patt went on in a preview of their report for Breaking Defense. When reconfiguring a well-designed, zero-trust cybersecurity system to stop a newly detected intrusion, they said, or changing how a drone shares data with Low-Earth Orbit satellites, changes can happen “in minutes.”
As the armed services seek to coordinate their disparate forces through an all-encompassing, AI-enabled meta-network, rapid software updates become even more important. What’s called Joint All-Domain Command & Control (JADC2) aims to pass targeting data so quickly from “sensors” to “shooters” that the time between target detected and shots fired is only seconds. In a battle between two such highly networked militaries trying to outguess each other — China is developing what it calls an “informationized” force of its own — the winner may be the one that updates its code the quickest.
Jason Weiss (courtesy of the author)
Unfortunately, the Defense Department mostly handles major software updates the same way it handles hardware procurements: through a time-honored, time-consuming Planning, Programming, Budgeting, and Execution (PPBE) process that locks in major decisions eight years in advance. (The annual Future Years Defense Plan, or FYDP, sets spending levels five years out, but it takes two years for the Pentagon to build a FYDP and another year for Congress to review, amend, and enact it).
“It is impossible to predict out seven or eight years where software technology will be,” Weiss said. What’s worse, the formal requirements for how a piece of equipment must perform — covering everything from armor protection on a tank to cybersecurity in software — can be locked in a decade or more before the equipment is delivered to actual troops.
So, the report snarks, “the program manager must defend program execution to an obsolete baseline, an obsolete requirement, and an obsolete prediction of the future.”
Dan Patt (Hudson Institute photo)
Here lies what Weiss and Patt call Pentagon procurement’s “original sin”: a “predilection for prediction.” Projecting eight years out is workable, sort of, for old-school, industrial-age, heavy-metal hardware. It really does take years to build physical prototypes, field-test them, set up assembly lines, and ramp up mass production, and the pace of all that happening is predictable. Even America’s “arsenal of democracy” in World War II, a miracle of industrial mobilization, was only possible because far-sighted planners got to work years before Pearl Harbor, and it still took, for example, until 1944 to build enough landing craft for D-Day.
But software update cycles spin much faster — if you set up the proper architecture. That requires what techies call a “software factory,” but it’s really “a process,” the report says, “that provides software development teams with a repeatable, well-defined path to create and update production software.” If you have a team of competent coders, if they can get rapid feedback from users on what works and what glitches, and if they can push out software patches rapidly over a long-range wireless network, then you can write upgraded code ASAP and push it out over the network quickly at negligible marginal cost.
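To make that “repeatable, well-defined path” concrete, here is a minimal sketch of a software-factory pipeline expressed as a plain Python script. The stage names and commands are assumptions chosen for illustration (the deploy step in particular is hypothetical); a real factory would run equivalent stages inside a CI/CD system with signing, artifact storage, and automated delivery.

```python
# Minimal sketch of a software-factory pipeline: a repeatable sequence of
# build, test, security-scan, and release stages. Stage commands are
# placeholders; a real factory would run these inside a CI system.
import subprocess
import sys

STAGES = [
    ("build", ["python", "-m", "build"]),     # compile and package
    ("test", ["pytest", "-q"]),               # fast feedback from automated tests
    ("scan", ["pip-audit"]),                  # dependency vulnerability scan
    ("release", ["python", "deploy.py", "--env", "staging"]),  # hypothetical deploy step
]

def run_pipeline() -> bool:
    for name, cmd in STAGES:
        print(f"--- stage: {name}: {' '.join(cmd)}")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print(f"pipeline stopped at stage '{name}'", file=sys.stderr)
            return False  # fail fast so fixes can ship quickly
    return True

if __name__ == "__main__":
    sys.exit(0 if run_pipeline() else 1)
```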
The Pentagon has some promising factories already, Weiss and Patt point out, like the Air Force’s famous Kessel Run. “Over the last few years, the number of cleverly named software factories across the DoD has rapidly grown to at least 30 today,” they write. “Others across the Air Force, then across each of the other services, and finally across the DoD’s fourth estate quickly followed the trail that Kessel Run blazed.”
But how does DoD scale up these promising start-ups to the massive scale required for major war?
The cycle of developing and updating weapons increasingly resembles the Development-Operations (DevOps) cycle used for software. (Hudson Institute graphic, courtesy of the authors)
Cycles Converge: DevOps And The OODA Loop
What’s crucial is that connection between coders and users, between the people who develop the technology and the people who operate it. The faster you go, the more you automate the exchange of data, the more the boundary between these two groups dissolves. The industry calls this fusion “DevOps” – development and operations – or sometimes “DevSecOps” – emphasizing the need for cybersecurity updates to be part of the same cycle.
Cybersecurity updates are a crucial part of the cycle because software is under constant attack, even on civilian products as innocuous as baby monitors. Coders need diagnostic data in near-real-time on the latest hack.
Military equipment, however, comes under all sorts of other attacks as well: not just cyber, but also electronic warfare (jamming and spoofing), and of course physical attack. Even a new form of camouflage or decoy — like the sun-warmed pans of water the Serbs used to draw NATO infrared sensors away from their tanks in 1999 — is a form of “attack” that military systems must take into account.
The need for constant adaptation to lessons from combat — of measure, countermeasure, and counter-countermeasures — is as old as warfare itself. Alexander the Great drilled his pikemen to outmaneuver Persia’s scythed chariots, Edward III combined longbowmen and dismounted knights to crush French heavy cavalry, and a peasant visionary named Joan of Arc convinced aristocratic French commanders to listen to common-born cannoneers about how to bring down castle walls.
Historically, however, it was much quicker to adapt your tactics than to update your technology. In 1940, for instance, when the Germans realized their anti-tank shells just bounced off the heavy armor of many French and British tanks, Rommel’s immediate response was to repurpose existing 88 mm anti-aircraft guns, which were lethal but highly vulnerable to enemy fire. It took two years to deploy an 88 mm cannon on an armored vehicle, the notorious Tiger.
The German 88 mm anti-aircraft gun proved equally capable as an anti-tank weapon. (Bundesarchiv photo)
Col. John Boyd called this process of adaptation the OODA loop: A combatant must Observe, Orient, Decide, and Act — then observe the results of their action, re-orient themselves to the changed situation, decide what to do next, and take further action. Boyd, a fighter pilot, developed the OODA concept to describe how American mental quickness won dogfights in Korea against technologically superior MiGs. But military theorists soon applied the OODA loop to larger scales of combat. In essence, conflicts consist of nested OODA loops, with speed decreasing as size increases: a sergeant might change his squad’s scheme of fire in seconds, a major might redeploy his battalion in hours, a general might bring up supplies and reinforcements over days and weeks, a government might train new troops in months and develop new weapons over years.
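As a rough illustration of the concept, the cycle can be sketched as a software control loop in which each action changes the situation the next observation sees. Everything in the sketch below is a placeholder: the observe/orient/decide/act functions stand in for sensors, data fusion, planning, and effects.

```python
# Toy sketch of Boyd's OODA cycle as a control loop. Each function is a
# placeholder: observe() stands in for sensors, orient() for data fusion,
# decide() for planning, and act() for effects that change the situation.
def observe(world):
    return world["sensor_reading"]

def orient(observation, context):
    return {"threat_detected": observation > 0.5, **context}

def decide(picture):
    return "evade" if picture["threat_detected"] else "continue_mission"

def act(world, decision):
    world["last_action"] = decision
    # The environment reacts, so the next observation differs.
    world["sensor_reading"] = 0.2 if decision == "evade" else 0.7

world = {"sensor_reading": 0.7}
context = {"mission": "patrol"}
for cycle in range(3):  # each iteration closes one loop
    picture = orient(observe(world), context)
    act(world, decide(picture))
    print(f"cycle {cycle}: {world['last_action']}")
```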
But with the software-driven weapons of the 21st century, such as the F-35, these different OODA loops begin to merge. When you can push out a software patch in hours or minutes, technological updates, once the slowest kind of military change, become part of tactical adaptation — the fastest. The OODA loop becomes a DevOps cycle, where the latest combat experience drives rapid changes in technology.
“Exercise of military capability bears increasing resemblance to DevOps cycle,” Weiss and Patt write in their report. “The speed and ubiquity of digital information systems is driving military operations ever closer to capability development… This trend brings promise for capability and peril for bureaucratic practices.”
So how does the Pentagon overcome that peril and get its bureaucracy up to speed?
Six principles, or “Acquisition Competency Targets,” for Pentagon software development (Hudson Institute graphic, courtesy of the authors)
Acquisition Reform For The Digital Age
First, Weiss and Patt warn, do no harm: “Reforms” that impose some top-down, one-size-fits-all standard will just make software acquisition worse. So, instead of trying to merge “redundant” software development teams, toolkits, and platforms, as some reformers have proposed, they argue the Pentagon needs to build a zoo of diverse approaches, able to handle problems ranging from financial management algorithms to F-35 code.
Future warfare is so complex, involving such a wide array of different systems working together — from submarines to jets, tanks to satellites, all increasingly connected by what may one day evolve into a JADC2 meta-network — that no “single contractor or … program office” can solve the problem on its own, the report says. Likewise, it argues, “modern software is no longer some monolithic thing. It exists in no singular repository… It is no longer built from a singular programming language [and] is rarely designed for a singular type of hardware.”
In fact, Weiss and Patt calculate that, looking at all the possible choices from what programming language to use, what contract structure, what mix of government and contractor coders, to whether to optimize to run on tablets or desktops or to assume cloud-based connectivity or enemy jamming, etcetera ad (nearly) infinitum, “there are more than six billion combinations.”
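The arithmetic behind a figure like that is easy to reproduce: independent design decisions multiply. The option counts below are invented for illustration (the report does not publish its exact inputs), but even modest per-decision choices compound past six billion.

```python
# Back-of-the-envelope illustration of combinatorial explosion in software
# acquisition decisions. Every option count below is invented for
# illustration; the point is the product, not the individual numbers.
from math import prod

design_choices = {
    "programming language": 12,
    "contract structure": 8,
    "gov/contractor coder mix": 5,
    "target hardware": 6,
    "connectivity assumption": 4,   # cloud, intermittent, jammed, air-gapped
    "deployment cadence": 5,
    "security level": 4,
    "data architecture": 7,
    "UI form factor": 4,            # desktop, tablet, embedded, headless
    "toolchain/platform": 30,
    "cyber requirements profile": 8,
    "fielding environment": 5,
}

total = prod(design_choices.values())
print(f"{total:,} possible combinations")  # 7,741,440,000 -- past six billion
```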
Yet certain principles still apply across the board, they say.
First of all, “the essential tool of the modern program office” is a software factory. Again, that’s not a physical place, but a process and a team, bringing together coders, users, the latest operational data, and a capability to push out rapid updates. While program managers should use existing software factories where possible, rather than reinvent the wheel, the sheer variety of different programs means the Pentagon must maintain a variety of different ones.
A “digital twin” of AFRL’s Gray Wolf prototype cruise missile. (Air Force Research Laboratory photo)
What’s more, and more controversially, Weiss and Patt argue that the government shouldn’t simply contract these factories out, nor try to run them entirely in-house. Instead, they recommend a hybrid called Government-Owned, Contractor-Operated. This GOCO model, already in widespread use for everything from ammunition plants to nuclear labs, allows the government to keep the lights on and sustain software support, even as workloads and profit margins ebb and flow, but still draw on contractor talent to ramp up rapidly when needed, for instance, during a war.
The Pentagon should also make extensive use of a new, streamlined process called the Software Pathway (SWP), created by then-acquisition undersecretary Ellen Lord in 2019. Some programs may exist entirely as SWPs, but even hardware-heavy programs like Army vehicles can use SWP for design and simulation functions such as digital twins. “The traditional acquisition process can execute in parallel for the bent metal and hardware aspects,” Weiss said, “[but] every new program should employ the SWP for its software needs.”
The full report goes into six broad principles and dozens of specific recommendations for everything from recruiting talent to defining requirements. But the ultimate message is simple: The key to victory is adaptability, the secret to adaptability is software, and how the Pentagon procures it needs to change.
The U.S. government’s mandates around the creation and delivery of SBOMs (software bills of materials) to help mitigate supply chain attacks have run into strong objections from big-name technology vendors.
A lobbying outfit representing big tech is calling on the federal government’s Office of Management and Budget (OMB) to “discourage agencies” from requiring SBOMs, arguing that “it is premature and of limited utility” for vendors to accurately provide a nested inventory of the ingredients that make up software components.
The trade group, called ITI (Information Technology Industry Council), counts Amazon, Microsoft, Apple, Intel, AMD, Lenovo, IBM, Cisco, Samsung, TSMC, Qualcomm, Zoom and Palo Alto Networks among its prominent members.
In a recent letter to the OMB, the group argues that SBOMs are not currently scalable or consumable.
“We recognize and appreciate the value of flexibility built into the OMB process. Given the current level of (im-)maturity, we believe that SBOMs are not suitable contract requirements yet. The SBOM conversation needs more time to move towards a place where standardized SBOMs are scalable for all software categories and can be consumed by agencies,” the ITI letter read.
“At this time, it is premature and of limited utility for software producers to provide an SBOM. We ask that OMB discourage agencies from requiring artifacts until there is a greater understanding of how they ought to be provided and until agencies are ready to consume the artifacts that they request,” the group added.
At its core, an SBOM is meant to be a definitive record of the supply chain relationships between components used when building a software product. It is a machine-readable document that lists all components in a product, including all open source software, much like the mandatory ingredient list seen on food packaging.
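What does such a machine-readable ingredient list actually look like? Below is a minimal sketch that emits a simplified SBOM shaped loosely after the CycloneDX JSON format; the component entries are examples, and a real SBOM would also carry hashes, licenses, and dependency relationships.

```python
# Minimal sketch of a machine-readable SBOM, loosely following the
# CycloneDX JSON shape (simplified; real SBOMs also record hashes,
# licenses, and dependency relationships between components).
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "components": [
        {   # one entry per "ingredient", including transitive open source
            "type": "library",
            "name": "openssl",
            "version": "3.0.7",
            "purl": "pkg:generic/openssl@3.0.7",
        },
        {
            "type": "library",
            "name": "requests",
            "version": "2.28.1",
            "purl": "pkg:pypi/requests@2.28.1",
        },
    ],
}

print(json.dumps(sbom, indent=2))
```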
The National Telecommunications and Information Administration (NTIA) has been busy issuing technical documentation, corralling industry feedback, and proposing the use of existing formats for the creation, distribution and enforcement of SBOMs.
In its objections, the trade group is adamant that SBOMs are not yet suitable contract requirements. “Currently available industry tools create SBOMs of varying degrees of complexity, quality, completeness. The presence of multiple, at times inconsistent or even contradictory, efforts suggests a lacking maturity of SBOMs,” the group said.
The ITI letter cautioned that this is evident in a series of practical challenges related to implementation, including naming, identification, scalability, delivery and access, the linking to vulnerability information, as well as the applicability to cloud services, platforms and legacy software.
“These challenges make it difficult to effectively deploy and utilize SBOMs as a tool to foster transparency. The SBOM conversation needs more time to mature and move towards a place where SBOMs are scalable and consumable,” the group added.
The tech vendors also flagged concerns around the security of sensitive proprietary information that may be collected via SBOMs and held by federal agencies and called for clarifications around the definition of artifacts and what protections will be afforded to safeguard sensitive information.
The U.S. Commerce Department’s NTIA has been out front advocating for SBOMs with a wide range of new documentation including:
SBOM at a glance – an introduction to the practice of SBOM, supporting literature, and the pivotal role SBOMs play in providing much-needed transparency for the software supply chain.
A detailed FAQ document that outlines information, benefits, and commonly asked questions.
A two-page overview that provides high-level information on SBOM’s background, the ecosystem-wide solution, the NTIA process, and an example of an SBOM.
Separately, the open-source Linux Foundation has released a batch of new industry research, training, and tools aimed at accelerating the use of a Software Bill of Materials (SBOM) in secure software development.
Ryan Naraine is Editor-at-Large at SecurityWeek and host of the popular Security Conversations podcast series. Ryan is a veteran cybersecurity strategist who has built security engagement programs at major global brands, including Intel Corp., Bishop Fox and GReAT. He is a co-founder of Threatpost and the global SAS conference series. Ryan’s past career as a security journalist included bylines at major technology publications including Ziff Davis eWEEK, CBS Interactive’s ZDNet, PCMag and PC World. Ryan is a director of the Security Tinkerers non-profit, an advisor to early-stage entrepreneurs, and a regular speaker at security conferences around the world. Follow Ryan on Twitter @ryanaraine.
Who remembers when floppy disks provided a new level of capability for the Department of Defense to support the operations of strategic forces across the world?
It was only in the last few years that DoD retired those 8-inch floppies in favor of modern computer capabilities, and the push is on to accelerate modernization of systems that rely on older technology in order to address modern threats.
This accumulation of older software and infrastructure, known as technical debt, takes a substantial amount of budget and resources to maintain, which puts pressure on new innovation required for enterprise functions such as cybersecurity.
The Pentagon recognizes the impact of technical debt on protecting systems and data from cyberattacks. It intends to deploy a zero trust strategy across the whole department by 2027. By assuming every application, network, connection, and user can become a threat, a zero trust framework validates access through control policies for specific functions – a significant advancement from perimeter-based security that can allow unrestricted access once an attacker gets inside the network.
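As a concrete illustration of validating access “through control policies for specific functions,” consider the toy access check below. Every name and attribute in it is hypothetical; the point is that each request is evaluated on identity, device posture, and the specific function requested, rather than being trusted because it originates inside the network.

```python
# Toy sketch of a zero-trust access decision: every request is evaluated
# against policy (identity, device posture, and the specific function
# requested) rather than trusted for being inside the perimeter.
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    mfa_verified: bool
    device_compliant: bool
    function: str  # the specific action being requested

# Hypothetical policy tables: which roles may invoke which functions
ROLE_FUNCTIONS = {"analyst": {"read_reports"}, "admin": {"read_reports", "patch_system"}}
USER_ROLES = {"jsmith": "analyst", "mlee": "admin"}

def authorize(req: Request) -> bool:
    if not (req.mfa_verified and req.device_compliant):
        return False  # never trust, always verify
    role = USER_ROLES.get(req.user)
    return req.function in ROLE_FUNCTIONS.get(role, set())

print(authorize(Request("jsmith", True, True, "patch_system")))  # False: not permitted
print(authorize(Request("mlee", True, True, "patch_system")))    # True
```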
Technical Debt Hinders Cybersecurity
The Biden Administration’s executive order on cybersecurity in May 2021 jump started the movement toward a zero trust architecture. It called for modernized cyber defenses, improved information sharing and stronger responses to attacks, all of which start with zero trust and depend on modern technologies such as identity management, cloud, artificial intelligence, machine learning and data analytics.
To accelerate this implementation of zero trust, DoD must continue to pay down its technical debt.
The Pentagon can look to the Department of Labor as a model to balance legacy system needs and innovation. As CIO Gundeep Ahluwalia explained in a recent interview, “When I joined the department six years ago, we invested only 10% of funds in modernization and development. Now we allocate 25% of our overall funds, and ideally this will increase to 40% in the near future for modernization and development. Modernization is a continuum, and this investment is essentially paying down that technical debt while preventing it from building up again.”
Technical debt has taken on new importance with its reference in the National Defense Authorization Act of 2022, which directs the DoD to study it and make recommendations on its impact on software-intensive systems. DoD CIO John Sherman recently emphasized the connection between technical debt and zero trust as a cyber defense, and he confirmed the department’s commitment to addressing it as adversarial threats increase.
The scope of the technical debt problem is difficult to calculate, but if historical patterns have continued, then roughly 75 percent of the federal IT budget goes to the operations and maintenance of legacy systems. That is older technology that is not prepared or hardened for today’s cyberattacks.
Technology That Stays Ahead of Evolving Threats
There are ways to surround legacy systems with newer, resilient technologies and provide a layered defense of mission critical applications and data.
Identity management, for example, establishes a digital identity, while authentication confirms that identity to allow permission-based access to networks, applications, and data. Moreover, operating within a zero trust environment prevents users from gaining access unless they have the appropriate authentication and authorization.
The sheer amount of data from cyberattacks requires innovations in artificial intelligence, machine learning and analytics so that data quickly gets aggregated and filtered, patterns are detected, and threats are elevated for further review.
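A minimal sketch of that aggregate, filter, and elevate flow is below. The event data and threshold are invented for illustration; a production system would replace the fixed threshold with baselines learned by ML over far larger volumes.

```python
# Minimal sketch of the aggregate -> filter -> elevate flow described
# above: collapse raw security events into per-source counts and elevate
# only the sources whose activity exceeds an (illustrative) threshold.
from collections import Counter

events = [
    {"src": "10.0.0.5", "type": "failed_login"},
    {"src": "10.0.0.5", "type": "failed_login"},
    {"src": "10.0.0.5", "type": "failed_login"},
    {"src": "10.0.0.9", "type": "failed_login"},
]

THRESHOLD = 3  # illustrative; real systems learn baselines with ML
counts = Counter(e["src"] for e in events if e["type"] == "failed_login")
elevated = [src for src, n in counts.items() if n >= THRESHOLD]
print(elevated)  # ['10.0.0.5'] -- flagged for analyst review
```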
Because the Pentagon is such a large target for cyberattacks, the Defense Department needs these technologies and a zero trust methodology to eliminate both older technology and processes performed manually, which will reduce threats and fend off penetration.
While perimeter security has become standard for many networks, the executive order on cybersecurity and the DoD’s strategy for zero trust represent the government’s intention to change that. It will require more than technology to nullify existing threats and prevent unknown ones from becoming attacks.
For DoD to reach its full potential for cybersecurity, it must instill a shift in thinking that goes beyond perimeter defenses. Users need to embrace requirements such as multiple logins for enterprise-wide zero trust as well as information sharing that allows for better threat detection and response.
As the Pentagon tackles its technical debt to improve cybersecurity and overall asset protection, it can take the opportunity to renew users’ passion for the mission and good cyber behavior.
Kynan Carver is Defense Cybersecurity Lead at Maximus, an IT services management company focused on the federal government.
Over the past several years, the U.S. Department of Defense and the military services have spent enormous amounts of time extending weapons ranges and creating new sensors, algorithms and hardware to support the next generation of warfighting. With these new capabilities, interconnectivity complexities increase exponentially, frequently due to siloed invention and disparate integration.
Because operations environments are not pristine and uniform across all formations, these variants wreak havoc on G2 intelligence-gathering and G6 cyber operations teams during exercises and combat situations. The department needs to focus on data logistics, specifically connectivity and hierarchy, to succeed.
Data is the lifeblood of any digital transactional or analytical system. Getting data to systems or sensors becomes a critical pathway to success for any digital organization. For this article, data logistics are all efforts necessary to move data from source systems to their destination.
Sometimes this means many endpoints based on the type of data. Other times this means moving data back to the source for further data gathering or messages that create actions. While data logistics differs from physical logistics in its details, the two are similar at an abstract level.
A promising approach is Zhamak Dehghani’s “data mesh.” Principles defined in data mesh include domain-oriented decentralized data ownership and architecture, data as a product, self-serve data infrastructure as a platform, and federated computational governance. These principles align with the independent structure of the DoD’s program executive offices.
However, appropriate enterprise technologies must exist for data mesh to be successful. These include architectural patterns, data catalogs, change data capture, immutable audit logs and data product APIs. At the core is the event streaming backbone, which shuttles data to interested systems and applications.
Commercial cloud providers, well entrenched in the DoD, already have these capabilities integrated into their offerings. AWS offers Kinesis and Azure provides Stream Analytics, to name two. One industry standard is Kafka, available as an open-source project or as a managed service from cloud providers or Confluent.io. Many companies have adopted Kafka due to its in-stream processing capabilities, which provide real-time analytics for rapid decision-making and event triggers for humans and machines.
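To make the event-streaming backbone concrete, here is a minimal publish/subscribe sketch using the open-source kafka-python client. The broker address, topic name, and message fields are assumptions for illustration; any Kafka-compatible managed service would work the same way.

```python
# Minimal event-streaming sketch with the kafka-python client: one
# process publishes sensor readings to a topic; another subscribes and
# reacts in near-real time. Broker address and topic name are examples.
import json
from kafka import KafkaProducer, KafkaConsumer

BROKER, TOPIC = "localhost:9092", "sensor.readings"

# Producer side: a sensor or source system pushes events to the backbone.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"sensor": "tpq53-07", "bearing": 214.5})
producer.flush()

# Consumer side: any interested system subscribes independently.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)
for msg in consumer:
    print(msg.value)  # trigger analytics or an action here
    break
```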
Private industry has solved many problems related to the rapid onboarding of new sensors and sites. The use of automation from networks to applications will enable the Services and OSD to rapidly create the logical integrations necessary to achieve Joint All-Domain Command and Control, or JADC2. As units move to support combatant commands, unit assets need to reduce the time to connectivity. Additionally, there needs to be a focus on automating connections between data distribution nodes and any relevant data backbones.
DoD needs to press forward with a technical strategy requiring current and future systems to achieve discoverability. An example is the human-presence function found on messaging systems, reused and applied to HIMARS launchers, F-35s, Abrams tanks and TPQ-53 radars.
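One plausible way to sketch that presence function is a periodic heartbeat message published to the same backbone, so other systems can discover what is online. The message fields and platform identifiers below are invented for illustration.

```python
# Sketch of the "presence" idea applied to a platform: a periodic
# heartbeat announcing identity, status, and location to the data
# backbone so other systems can discover it. Message fields are invented.
import json
import time
import uuid

def heartbeat(platform_id: str, status: str, position: tuple) -> str:
    return json.dumps({
        "msg_id": str(uuid.uuid4()),
        "platform": platform_id,   # e.g., a HIMARS launcher or an F-35
        "status": status,          # "ready", "engaged", "degraded"
        "position": position,
        "ts": time.time(),
    })

# A discovery service consuming these messages can maintain a live
# roster of reachable systems, much like a chat client's buddy list.
print(heartbeat("himars-031", "ready", (35.14, -79.01)))
```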
The Army’s Unified Network Operations has made positive efforts to improve connectivity timelines. A key opportunity exists with zero-trust security. By divesting the network edge, services at every echelon can become interoperable at a more flexible technical level.
This will require unified efforts across the DoD. Doing so will synchronize internal technical efforts and policies that will onboard systems and sensors and allow, for example, an F-35 and HIMARS to talk directly to each other without human intervention.
Another focus necessary to enable rapid data logistics is rearchitecting the hierarchical distribution model. According to Conway’s Law,
“Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization’s communication structure.” — Melvin E. Conway
A shift in thought and design for data distribution architectures is necessary to enable rapid decision-making by humans and systems. The latency between systems and sensors will be critical, even when robust networks and computing exist.
Flattening these architectures will also reduce digital choke points for moving data inter-echelon and intra-echelon. Designing our systems to replicate a flat and interconnected organizational model is necessary for minimizing these risks.
For the Pentagon to enable data for decision-making by a human, sensor, or system, the DoD must focus on data logistics. The DoD must reduce the time to connect at the network, data, application, and system levels. Using industry-proven technologies and architectures enables warfighters and decision-makers, whether human or machine, by improving resiliency in the entire architecture.
Jock Padgett is the Chief Data Officer and Senior Digital Advisor at the XVIII Airborne Corps, US Army. During the Afghanistan withdrawal and Ukraine support mission, he served as the Chief Technology Officer at the 82nd Airborne Division.
The opinions expressed are his and do not represent the Department of Defense’s position or approach.
The Securing Software Supply Chain Series is an output of the Enduring Security Framework (ESF), a public-private cross-sector working group led by NSA and CISA.
The guidance released today, along with the accompanying fact sheet, provides recommended practices for software customers to ensure the integrity and security of software during the procuring and deployment phases.
This series complements other U.S. government efforts underway to help the software ecosystem secure the supply chain, such as the software bill of materials (SBOM) community.