healthcarereimagined

Envisioning healthcare for the 21st century


Inside a radical new project to democratize AI – MIT Technology Review

Posted by timmreardon on 01/02/2023
Posted in: Uncategorized.


A group of over 1,000 AI researchers has created a multilingual large language model bigger than GPT-3—and they’re giving it out for free.

By Melissa Heikkilä

July 12, 2022

PARIS — This is as close as you can get to a rock concert in AI research. Inside the supercomputing center of the French National Center for Scientific Research, on the outskirts of Paris, rows and rows of what look like black fridges hum at a deafening 100 decibels. 

They form part of a supercomputer that has spent 117 days gestating a new large language model (LLM) called BLOOM that its creators hope represents a radical departure from the way AI is usually developed.

Unlike other, more famous large language models such as OpenAI’s GPT-3 and Google’s LaMDA, BLOOM (which stands for BigScience Large Open-science Open-access Multilingual Language Model) is designed to be as transparent as possible, with researchers sharing details about the data it was trained on, the challenges in its development, and the way they evaluated its performance. OpenAI and Google have not shared their code or made their models available to the public, and external researchers have very little understanding of how these models are trained. 

BLOOM was created over the last year by over 1,000 volunteer researchers in a project called BigScience, which was coordinated by AI startup Hugging Face using funding from the French government. It officially launched on July 12. The researchers hope developing an open-access LLM that performs as well as other leading models will lead to long-lasting changes in the culture of AI development and help democratize access to cutting-edge AI technology for researchers around the world. 

The model’s ease of access is its biggest selling point. Now that it’s live, anyone can download it and tinker with it free of charge on Hugging Face’s website. Users can pick from a selection of languages and then type in requests for BLOOM to do tasks like writing recipes or poems, translating or summarizing texts, or writing programming code. AI developers can use the model as a foundation to build their own applications. 

At 176 billion parameters (variables that determine how input data is transformed into the desired output), it is bigger than OpenAI’s 175-billion-parameter GPT-3, and BigScience claims that it offers similar levels of accuracy and toxicity to other models of the same size. For languages such as Spanish and Arabic, BLOOM is the first large language model of this size. 
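To make the idea of a "parameter" concrete: each fully connected layer in a neural network contributes a weight matrix plus a bias vector, and a model's parameter count is the sum across all its layers. The sketch below is purely illustrative (the layer sizes are made up and far smaller than anything in BLOOM); it just shows the arithmetic behind counts like "176 billion."

```python
# Illustrative only: "parameters" are the learned values a model adjusts
# during training. One dense layer mapping n inputs to m outputs has
# n*m weights plus m biases; stacking layers, the counts add up. Models
# like BLOOM stack many very large layers to reach the hundreds of billions.

def linear_layer_params(n_in: int, n_out: int) -> int:
    """Parameter count of one dense layer: weight matrix plus bias vector."""
    return n_in * n_out + n_out

# A toy two-layer network: 512 -> 2048 -> 512
toy_total = linear_layer_params(512, 2048) + linear_layer_params(2048, 512)
print(toy_total)  # 512*2048 + 2048 + 2048*512 + 512 = 2,099,712
```

Even this toy stack has about two million parameters; real LLMs repeat much wider layers dozens of times, which is why training them demands so much computing power.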

But even the model’s creators warn it won’t fix the deeply entrenched problems around large language models, including the lack of adequate policies on data governance and privacy and the algorithms’ tendency to spew toxic content, such as racist or sexist language.

Out in the open

Large language models are deep-learning algorithms that are trained on massive amounts of data. They are one of the hottest areas of AI research. Powerful models such as GPT-3 and LaMDA, which produce text that reads as if a human wrote it, have huge potential to change the way we process information online. They can be used as chatbots or to search for information, moderate online content, summarize books, or generate entirely new passages of text based on prompts. But they are also riddled with problems. It takes only a little prodding before these models start producing harmful content.

The models are also extremely exclusive. They need to be trained on massive amounts of data using lots of expensive computing power, which is something only large (and mostly American) technology companies such as Google can afford. 

Most big tech companies developing cutting-edge LLMs restrict their use by outsiders and have not released information about the inner workings of their models. This makes it hard to hold them accountable. The secrecy and exclusivity are what the researchers working on BLOOM hope to change.

Meta has already taken steps away from the status quo: in May 2022 the company released its own large language model, Open Pretrained Transformer (OPT-175B), along with its code and a logbook detailing how the model was trained. 

But Meta’s model is available only upon request, and it has a license that limits its use to research purposes. Hugging Face goes a step further. The meetings detailing its work over the past year are recorded and uploaded online, and anyone can download the model free of charge and use it for research or to build commercial applications.  

A big focus for BigScience was to embed ethical considerations into the model from its inception, instead of treating them as an afterthought. LLMs are trained on tons of data collected by scraping the internet. This can be problematic, because these data sets include lots of personal information and often reflect dangerous biases. The group developed data governance structures specifically for LLMs that should make it clearer what data is being used and who it belongs to, and it sourced different data sets from around the world that weren’t readily available online.  

The group is also launching a new Responsible AI License, which is something like a terms-of-service agreement. It is designed to deter the use of BLOOM in high-risk sectors such as law enforcement or health care, or to harm, deceive, exploit, or impersonate people. The license is an experiment in self-regulating LLMs before laws catch up, says Danish Contractor, an AI researcher who volunteered on the project and co-created the license. But ultimately, there’s nothing stopping anyone from abusing BLOOM.

The project had its own ethical guidelines in place from the very beginning, which worked as guiding principles for the model’s development, says Giada Pistilli, Hugging Face’s ethicist, who drafted BLOOM’s ethical charter. For example, it made a point of recruiting volunteers from diverse backgrounds and locations, ensuring that outsiders can easily reproduce the project’s findings, and releasing its results in the open. 

All aboard

This philosophy translates into one major difference between BLOOM and other LLMs available today: the vast number of human languages the model can understand. It can handle 46 of them, including French, Vietnamese, Mandarin, Indonesian, Catalan, 13 Indic languages (such as Hindi), and 20 African languages. Just over 30% of its training data was in English. The model also understands 13 programming languages.

This is highly unusual in the world of large language models, where English dominates. That’s another consequence of the fact that LLMs are built by scraping data off the internet: English is the most commonly used language online.

The reason BLOOM was able to improve on this situation is that the team rallied volunteers from around the world to build suitable data sets in other languages even if those languages weren’t as well represented online. For example, Hugging Face organized workshops with African AI researchers to try to find data sets such as records from local authorities or universities that could be used to train the model on African languages, says Chris Emezue, a Hugging Face intern and a researcher at Masakhane, an organization working on natural-language processing for African languages.

Including so many different languages could be a huge help to AI researchers in poorer countries, who often struggle to get access to natural-language processing because it uses a lot of expensive computing power. BLOOM allows them to skip the expensive part of developing and training the models in order to focus on building applications and fine-tuning the models for tasks in their native languages. 

“If you want to include African languages in the future of [natural-language processing] … it’s a very good and important step to include them while training language models,” says Emezue.

Handle with caution

BigScience has done a “phenomenal” job of building a community around BLOOM, and its approach of involving ethics and governance from the beginning is a thoughtful one, says Percy Liang, director of Stanford’s Center for Research on Foundation Models. 

However, Liang doesn’t think it will lead to significant changes to LLM development. “OpenAI and Google and Microsoft are still blazing ahead,” he says.

Ultimately, BLOOM is still a large language model, and it still comes with all the associated flaws and risks. Companies such as OpenAI have not released their models or code to the public because, they argue, the sexist and racist language that has gone into them makes them too dangerous to use that way. 

BLOOM is also likely to incorporate inaccuracies and biased language, but since everything about the model is out in the open, people will be able to interrogate the model’s strengths and weaknesses, says Margaret Mitchell, an AI researcher and ethicist at Hugging Face.

BigScience’s biggest contribution to AI might end up being not BLOOM itself, but the numerous spinoff research projects its volunteers are getting involved in. For example, such projects could bolster the model’s privacy credentials and come up with ways to use the technology in different fields, such as biomedical research.  

“One new large language model is not going to change the course of history,” says Teven Le Scao, a researcher at Hugging Face who co-led BLOOM’s training. “But having one good open language model that people can actually do research on has a strong long-term impact.”

When it comes to the potential harms of LLMs, “Pandora’s box is already wide open,” says Le Scao. “The best you can do is to create the best conditions possible for researchers to study them.”

Article link: https://www-technologyreview-com.cdn.ampproject.org/c/s/www.technologyreview.com/2022/07/12/1055817/inside-a-radical-new-project-to-democratize-ai/amp/

Time for Resilient Critical Material Supply Chain Policies – RAND

Posted by timmreardon on 12/30/2022
Posted in: Uncategorized.

by Fabian Villalobos, Jonathan L. Brosmer, Richard Silberglitt, Justin M. Lee, Aimee E. Curtright


Research Questions

  1. What is the nature of the critical materials problem?
  2. How did the current rare earth element (REE) supply chain form, and what lessons learned are applicable to other critical materials, such as lithium-ion battery (LIB) materials?
  3. What are the potential risks of a disruption to the critical material supply chain?
  4. What should be the aim of policies used to prevent or mitigate the effects of shocks to critical material supply chains?

The ongoing coronavirus disease 2019 pandemic and Russian invasion of Ukraine highlight the vulnerabilities of supply chains that lack diversity and are dependent on foreign inputs. This report presents a short, exploratory analysis summarizing the state of critical materials — materials essential to economic and national security — using two case studies and policies available to the U.S. Department of Defense (DoD) to increase the resilience of its supply chains in the face of disruption.

China is the largest producer and processor of rare earth oxides (REOs) worldwide and a key producer of lithium-ion battery (LIB) materials and components. China’s market share of REO extraction has decreased, but it still has a large influence over the downstream supply chain: processing and magnet manufacturing. Chinese market share of the LIB supply chain mirrors REO supply bottlenecks. If it desired, China could effectively cut off 40 to 50 percent of global REO supply, affecting U.S. manufacturers and suppliers of DoD systems and platforms.

Although a deliberate disruption is unlikely, resilience against supply disruption and building domestic competitiveness are important. The authors discuss plausible REO disruption scenarios and their hazards and synthesize insights from a “Day After . . .” exercise and structured interviews with stakeholders to identify available policy options for DoD and the U.S. government to prevent or mitigate the effects of supply disruptions on the defense industrial base (DIB) and broader U.S. economy. They explore these policies’ applicability to another critical material supply chain — LIB materials — and make recommendations for policy goals.

Key Findings

  • China has used a variety of economic practices to capture a large portion of the REE supply chain. It has used this disproportionate market share to manipulate the availability and pricing of these materials outside China.
  • Economic coercion by China has usually been executed through denying access to its domestic markets. REOs have been the lone exception: China threatened to restrict access to Chinese exports to pressure U.S. partner nations (such as Japan) toward geopolitical ends.
  • Projects planned to increase the extraction and processing capacity of REOs fall short of meeting estimated future demand outside China; there is not enough supply outside China to mitigate a disruption event.
  • China could effectively cut off 40–50 percent of global REO supply, which would affect manufacturers and suppliers of advanced components used in DoD systems and platforms. The DIB has a limited time frame in which to respond to a disruption before industrial readiness suffers.
  • The potential risks associated with a disruption could affect the broader U.S. economy and the DIB’s ability to procure critical materials, as well as interrupt military operations in some cases.
  • DoD has a variety of policies available to mitigate the effects of disruption. They can be categorized as proactive, reactive, or both. Each policy has an effective time to impact: the time needed for implementation and for benefits to materialize.
  • Policy options used thus far have yielded mixed results.

Recommendations

  • Proactive policies should aim to diversify critical material supply chains away from Chinese industry by expanding extraction capacity or increasing material recycling efforts.
  • Proactive efforts should aim to co-locate the upstream and downstream sectors to better leverage industrial efficiencies.
  • Reactive policies should aim to increase the DIB’s resiliency in the face of supply disruption by reducing its time to recover and increasing its time to survive.
  • Both proactive and reactive policy options with the longest time to impact should be implemented sooner rather than later to realize benefits.
  • Policies, planning, and coordination should also aim to reduce the time to impact for both proactive and reactive policies.
  • Both proactive and reactive policies should leverage U.S. ally and partner capabilities, or build relationships with nontraditional countries, to establish free access to critical materials at a fair market price wherever possible.
  • Working with nontraditional partners will be necessary because these countries have geographic access to critical materials and extraction capacity. Depending only on traditional allies and partners may only divert part of the supply chain away from Chinese industry.
  • Chinese disinformation campaigns should be expected in other critical material supply chains. This sector should work with cybersecurity experts and the U.S. intelligence community to educate executives and local governments about risks. Businesses should communicate their plans — and any influence operations underway — to local communities. The intelligence community should educate policymakers, U.S. allies and partners, and the public about the extent of Chinese interference in critical material supply chains.

Article link: https://www.rand.org/pubs/research_reports/RRA2102-1.html

Software Defines Tactics – Hudson Institute

Posted by timmreardon on 12/22/2022
Posted in: Uncategorized. Leave a comment

Structuring Military Software Acquisitions for Adaptability and Advantage in a Competitive Era

Jason Weiss & Dan Patt


Executive Summary

You would not be reading this if you did not realize that it is important for the Department of Defense (DoD) to get software right. There are two sides to the coin of ubiquitous software for military systems. On one side lies untold headaches—new cyber vulnerabilities in our weapons and supporting systems, long development delays and cost overruns, endless upgrade requirements for software libraries and underlying infrastructure, challenges in modernizing legacy systems, and unexpected and undesirable bugs that emerge even after successful operational testing and evaluation. On the other side lies a vast potential for future capability, with surprising new military capabilities deployed to aircraft during and between sorties, seamless collaboration between military systems from different services and domains, and rich data exchange between allies and partners in pursuit of military goals. This report offers advice to help maximize the benefits and minimize the liabilities of the software-based aspects of acquisition, largely through structuring acquisition to enable rapid changes across diverse software forms.

This report features a narrower focus and more technical depth than typical policy analysis. We believe this detail is necessary to achieve our objectives and reach our target audience. We intend this to be a useful handbook for the DoD acquisition community and, in particular, the program executive officers (PEOs)1 and program managers as they navigate a complex landscape under great pressure to deliver capability in an environment of strategic competition. All of the 83 major defense acquisition programs and the many smaller acquisition category II and III efforts that make up the other 65 percent of defense investment scattered across the 3,112 program, project, and activity (PPA) line items found in the president’s budget request now include some software activity by our accounting.2 We would be thrilled if a larger community—contracting officers, industry executives, academics, engineers and programmers, policy analysts, legislators, staff, and operational military service members—also gleaned insight from this document. But we know that some terms may come across as jargon and that not everyone is familiar with the names of common software development tools or methods. We encourage them to read this nonetheless and are confident that the core principles and insights we present are still accessible to a broader audience.

While other recent analyses have focused on the imperative for software and offered high-level visions for a better future,3 we believe most of the acquisition community already recognizes the potential of a digital future and is engaged in a more tactical set of battles and decisions: how to structure their organizations, how to manage expectations, how to structure their deliverables, and how to write solicitations and let contracts that meet requirements and strive for useful outcomes. We attempt to present background and principles that can assist them in navigating this complex landscape.

The real motivation for getting military software right is not to make the DoD more like commercial industry through digital modernization. Instead, it is to create a set of competitive advantages for the United States that stem from the embrace of distributed decision-making and mission command. The strategic context of this work stems from the observation that advantage in future military contests is less likely to come from the mere presence of robotics or artificial intelligence in military systems and is more likely to come from using these (and other) components effectively as part of a force design, and specifically from maintaining sufficient adaptability in the creation and implementation of tactics. Software, software systems, and information processing systems (including artificial intelligence and machine learning, or AI/ML) that leverage software-generated data are the critical ingredients in future force design, force employment, tactics generation, and execution monitoring. Software will bind together human decision-makers and automation support in future conflict.

This report aims to accomplish two goals:

  • Elucidate a set of principles that can help the acquisition community navigate the software world—a turbulent sea of buzzwords, technological change, bureaucratic processes, and external pressures.
  • Recruit this community to apply energy to a handful of areas that will enable better decisions and faster actions largely built around an alternative set of processes for evolutionary development.

The report is structured in chapters, each built around a core insight. Chapter 1 notes that the speed and ubiquity of digital information systems are driving military operations ever closer to capability development, and that this trend brings promise for capability and peril for bureaucratic practices.

Chapter 2 offers the military framing for software development, pointing out that it can enable greater adaptability in force employment and that the DoD cannot derive future concepts of more distributed forces and greater disaggregation of capability from requirements and specifications alone. Instead, software should enable employment tactics to evolve, especially as the elements of future combat are outside the control of any one program manager or office.

Chapter 3 introduces a framing analogy between software delivery and logistics. As logistics practitioners recognize, there is no one-size-fits-all solution to the problem of moving physical goods, but a set of principles that helps navigate many gradients of logistics systems. Similarly, the world of software is full of diversity—different use cases, computing environments, and security levels—and this heterogeneity is an intrinsic feature that the DoD needs to embrace.

Chapter 4 introduces the essential tool of the modern program office—the software factory—the tooling and processes via which operational software is created and delivered. All existing operational software was made and delivered somehow and, whether we like it or not, we will have to update it in the future. Thus, the production process matters. In many ways, the most important thing a PEO can do is identify, establish, or source an effective software factory suited to the particular needs of their unique deliverables.

Chapter 5 seeks to break down the ideas introduced earlier into actionable atomic principles that the DoD can use to navigate the tough decisions that the acquisition community faces; we refer to these ideas as ACTS. To aid the reader in understanding these, we also provide a fictitious vignette.

We recognize the inherent tension between making a report compact and accessible and covering the myriad facets of a broad and fast-changing scene. To balance this, we believe that we have focused this report on the most impactful aspects of software acquisition.


Article link: https://www.hudson.org/national-security-defense/software-defines-tactics

Pentagon Supply Chain Fails Minimal Standards for US National Security – Technewsworld

Posted by timmreardon on 12/16/2022
Posted in: Uncategorized.
By Jack M. Germain

December 9, 2022 10:37 AM PT

Most contractors the Department of Defense hired in the last five years failed to meet the required minimum cybersecurity standards, posing a significant risk to U.S. national security.

Managed service vendor CyberSheath on Nov. 30 released a report showing that 87% of the Pentagon supply chain fails to meet basic cybersecurity minimums. Those security gaps expose sizeable prime defense contractors and their subcontractors to cyberattacks from a range of threat actors, putting U.S. national security at risk.

Those risks have been well-known for some time without attempts to fix them. This independent study of the Defense Industrial Base (DIB) is the first to show that federal contractors are not properly securing military secrets, according to CyberSheath.

The DIB is a complex supply chain comprising 300,000 primes and subcontractors. The government allows these approved companies to share sensitive files and communicate securely to get their work done.

Defense contractors will soon be required to meet Cybersecurity Maturity Model Certification (CMMC) compliance to keep those secrets safe. Meanwhile, the report warns that nation-state hackers are actively and specifically targeting these contractors with sophisticated cyberattack campaigns.

“Awarding contracts to federal contractors without first validating their cybersecurity controls has been a complete failure,” Eric Noonan, CEO at CyberSheath, told TechNewsWorld.

Defense contractors have been mandated to meet cybersecurity compliance requirements for more than five years. Those conditions are embedded in more than one million contracts, he added.

Dangerous Details

The Merrill Research Report 2022, commissioned by CyberSheath, revealed that 87% of federal contractors have a sub-70 Supplier Performance Risk System (SPRS) score. The metric shows how well a contractor meets Defense Federal Acquisition Regulation Supplement (DFARS) requirements.

DFARS has been law since 2017 and requires a score of 110 for full compliance. Critics of the system have anecdotally deemed 70 to be “good enough.” Even so, the overwhelming majority of contractors still come up short.
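For readers unfamiliar with how an SPRS score is produced: under the NIST SP 800-171 DoD Assessment Methodology, an assessment starts from the maximum of 110 and subtracts a weighted deduction for each unimplemented security control, which is why a "sub-70" score implies many unmet requirements. The sketch below illustrates that arithmetic only; the control names and weights shown are hypothetical placeholders, not the real deduction tables.

```python
# Sketch of SPRS-style scoring: start at the 110-point maximum and subtract
# a weighted deduction per unmet control (weights of 1, 3, or 5 in the real
# methodology, so scores can fall well below 70, or even below zero).
# The control names and weights below are hypothetical examples.

MAX_SCORE = 110

def sprs_score(unmet_controls: dict) -> int:
    """Return the score: 110 minus the summed deductions for unmet controls."""
    return MAX_SCORE - sum(unmet_controls.values())

# Hypothetical contractor missing a few controls:
unmet = {
    "multi-factor authentication": 5,   # high-weight control
    "24/7 security monitoring": 3,
    "vulnerability scanning": 1,
}
print(sprs_score(unmet))  # 110 - 9 = 101
print(sprs_score({}))     # fully compliant: 110
```

Because high-impact controls carry 5-point deductions, a contractor missing even a handful of the fundamentals (MFA, monitoring, EDR) can quickly drop below the informal 70-point threshold the article describes.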

“The report’s findings show a clear and present danger to our national security,” said Eric Noonan. “We often hear about the dangers of supply chains that are susceptible to cyberattacks.”

The DIB is the Pentagon’s supply chain, and we see how woefully unprepared contractors are despite being in threat actors’ crosshairs, he continued.

“Our military secrets are not safe, and there is an urgent need to improve the state of cybersecurity for this group, which often does not meet even the most basic cybersecurity requirements,” warned Noonan.

More Report Findings

The survey data came from 300 U.S.-based DoD contractors, with accuracy tested at the 95% confidence level. The study was completed in July and August 2022, with CMMC 2.0 on the horizon.

Roughly 80% of DIB users failed to monitor their computer systems around the clock and lacked U.S.-based security monitoring services. Other deficiencies were evident in the following categories, which will be required for CMMC compliance:

  • 80% lack a vulnerability management solution
  • 79% lack a comprehensive multi-factor authentication (MFA) system
  • 73% lack an endpoint detection and response (EDR) solution
  • 70% have not deployed security information and event management (SIEM)

These security controls are legally required of the DIB, and since they are not met, there is a significant risk facing the DoD and its ability to conduct armed defense. In addition to being largely non-compliant, 82% of contractors find it “moderately to extremely difficult to understand the governmental regulations on cybersecurity.”

Confusion Rampant Among Contractors

Some defense contractors across the DIB have focused on cybersecurity only to be stalled by obstacles, according to the report.

When asked to rate DFARS reporting challenges on a scale from one to 10 (with 10 being extremely challenging), about 60% of all respondents rated “understanding requirements” a seven out of 10 or higher. Also high on the list of challenges were routine documentation and reporting.

The primary obstacles contractors listed are challenges in understanding the necessary steps to achieve compliance, the difficulty with implementing sustainable CMMC policies and procedures, and the overall cost involved.

Unfortunately, those results closely paralleled what CyberSheath expected, admitted Noonan. He noted that the research confirmed that even fundamental cybersecurity measures like multi-factor authentication had been largely ignored.

“This research, combined with the False Claims Act case against defense giant Aerojet Rocketdyne, shows that both large and small defense contractors are not meeting contractual obligations for cybersecurity and that the DoD has systemic risk throughout their supply chain,” Noonan said.

No Big Surprise

Noonan believes the DoD has long known that the defense industry is not addressing cybersecurity. News reporting of seemingly never-ending nation-state breaches of defense contractors, including large-scale incidents like the SolarWinds and False Claims Act cases, proves that point.

“I also believe the DoD has run out of patience after giving contractors years to address the problem. Only now is the DoD going to make cybersecurity a pillar of contract acquisition,” said Noonan.

He noted the planned new DoD principle would be “No cybersecurity, no contract.”

Noonan admitted that some of the struggles that contractors voiced about difficulties in understanding and meeting cyber requirements have merit.

“It is a fair point because some of the messaging from the government has been inconsistent. In reality, though, the requirements have not changed since about 2017,” he offered.

What’s Next

Perhaps the DoD will pursue a get-tougher policy with contractors. If contractors had complied with what the law required in 2017, the entire supply chain would be in a much better place today. Despite some communication challenges, the DoD has been incredibly consistent about what is required for defense contractor cybersecurity, Noonan added.

The current research now sits atop a mountain of evidence that proves federal contractors have a lot of work to do to improve cybersecurity. It is clear that work will not be done without enforcement from the federal government.

“Trust without verification failed, and now the DoD appears to be moving to enforce verification,” he said.

DoD Response

TechNewsWorld submitted written questions to the DoD about the supply chain criticism in the CyberSheath report. A spokesperson for CYBER/IT/DOD CIO for the Department of Defense replied, stating that it would take a few days to dig into the issues. We will update this story with any response we receive.


Update: Dec. 9, 2022 – 3:20 PM PT
DoD Spokesperson and U.S. Navy Commander Jessica McNulty provided this response to TechNewsWorld:

CyberSheath is a company that has been evaluated by the Cyber Accreditation Body (Cyber AB) and met the requirements to become a Registered Practitioner Organization, qualified to advise and assist Defense Industrial Base (DIB) companies with implementing CMMC. The Cyber AB is a 501(c)(3) that authorizes and accredits third-party companies conducting assessments of companies within the DIB.

McNulty confirmed that the DoD is aware of this report and its findings. The DoD has not taken any action to validate the findings, nor does the agency endorse this report, she said.

However, the report and its findings are generally not inconsistent with other prior reports (such as the DoD Inspector General’s Audit of Protection of DoD Controlled Unclassified Information on Contractor-Owned Networks and Systems, ref. DODIG-2019-105) or with the results of compliance assessments performed by the DoD, as allowed/required by DFARS clause 252.204-7020 (when applicable), she noted.

“Upholding adequate cybersecurity standards, such as those defined by the National Institute of Standards and Technology (NIST) and levied as contractual requirements through application of DFARS 252.204-7012, is of the utmost importance for protecting DoD’s controlled unclassified information. DoD has long recognized that a mechanism is needed to assess the degree to which contract performers comply with these standards, rather than taking it on faith that the standards are met,” McNulty told TechNewsWorld.

For this reason, the DoD’s Cybersecurity Maturity Model Certification (CMMC) program was initiated, and the DoD is working to codify its requirements in title 32 of the Code of Federal Regulations, she added.

“Once implemented, CMMC assessment requirements will be levied as pre-award requirements, where appropriate, to ensure that DoD contracts are awarded to companies that do, in fact, comply with underlying cybersecurity requirements,” McNulty concluded.

Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open-source technologies. He is an esteemed reviewer of Linux distros and other open-source software. In addition, Jack extensively covers business technology and privacy issues, as well as developments in e-commerce and consumer electronics.

Article link: https://www.technewsworld.com/story/pentagon-supply-chain-fails-minimal-standards-for-us-national-security-177497.html

Make Our Communities Great Again (MOCG).

Posted by timmreardon on 12/15/2022
Posted in: Uncategorized. Leave a comment

Make Our Communities Great Again (MOCG), and thus Make Our Country Great Again. It all begins with local communities.

Pentagon not prepared for software updates at the speed of war, report finds – Breaking Defense

Posted by timmreardon on 12/15/2022
Posted in: Uncategorized. Leave a comment

By Sydney J. Freedberg Jr. on December 14, 2022 at 11:26 AM

WASHINGTON — Without working software, the F-35 stealth fighter is a trillion-dollar lawn ornament.

Called “a computer that happens to fly” by one former Air Force chief, with algorithms running everything from basic flight controls to long-range targeting, the F-35 runs on eight million lines of code. That’s actually less than a late-model luxury car like the 2020 Mercedes S-Class, which has over 30 million lines, notes a forthcoming report from a national security think tank, the Hudson Institute.

Yet, co-authors Jason Weiss and Dan Patt told Breaking Defense that even as private-sector software surges ahead, a Pentagon bureaucracy built to handle industrial-age hardware still struggles to get all the fighter’s code to work.

RELATED: F-35 modernization program’s costs, schedule keeps growing: GAO

And that’s in peacetime. What if war broke out tomorrow — say a desperate Vladimir Putin lashes out at NATO, or Xi Jinping decides to seize Taiwan — and the stress of combat reveals some unexpected glitch? Imagine, for instance, that enemy anti-aircraft radars reveal a new wartime-only mode that doesn’t register in the F-35’s threat-detection algorithms. In that nightmare scenario, every day without a software update means people die.

But the Pentagon bureaucracy has simply not kept pace with the evolution of military hardware from heavy metal to software-dependent systems, let alone with lightning progress in the private sector. As a result, program managers cannot update warfighting software in everything from command posts to combat vehicles at the speed that will be required in a fast-moving future conflict, according to an exclusive preview of a Hudson Institute report to be published later today.

Hudson Institute graphic

Cover of the forthcoming Hudson Institute report. (Hudson Institute graphic, courtesy of the authors)

What’s needed, Weiss and Patt argue, is a system that updates software so rapidly, implementing lessons learned and enabling new tactics, that commanders can “shift on demand with the ease of sliding a finger across the capacitive glass screen of a mobile phone.”

When done right, rapid software updates can save lives. In their war with Russia, for example, Ukrainian troops have come to depend on Elon Musk’s Starlink satellite network. So Russian electronic warfare jammed its transmissions — for a few hours, and then SpaceX updated its software to bypass the interference. “When Starlink came under attack in the Ukraine, they didn’t deploy new satellites at a speed of months or years,” said Weiss, a Navy veteran and entrepreneur who served as the Pentagon’s chief software officer. “They merely reconfigured the behavior of the software, in hours.”

RELATED: SpaceX beating Russian jamming attack was ‘eyewatering’: DoD official

But even hours may be too slow sometimes, Weiss and Patt went on in a preview of their report for Breaking Defense. When reconfiguring a well-designed, zero-trust cybersecurity system to stop a newly detected intrusion, they said, or changing how a drone shares data with Low-Earth Orbit satellites, changes can happen “in minutes.”

As the armed services seek to coordinate their disparate forces through an all-encompassing, AI-enabled meta-network, rapid software updates become even more important. What’s called Joint All-Domain Command & Control (JADC2) aims to pass targeting data so quickly from “sensors” to “shooters” that the time between target detected and shots fired is only seconds. In a battle between two such highly networked militaries trying to outguess each other — China is developing what it calls an “informationized” force of its own — the winner may be the one that updates its code the quickest.

Jason Weiss photo

Jason Weiss (courtesy of the author)

Unfortunately, the Defense Department mostly handles major software updates the same way it handles hardware procurements: through a time-honored, time-consuming Planning, Programming, Budgeting, and Execution (PPBE) process that locks in major decisions eight years in advance. (The annual Future Years Defense Plan, or FYDP, sets spending levels five years out, but it takes two years for the Pentagon to build a FYDP and another year for Congress to review, amend, and enact it).

“It is impossible to predict out seven or eight years where software technology will be,” Weiss said. What’s worse, the formal requirements for how a piece of equipment must perform — covering everything from armor protection on a tank to cybersecurity in software — can be locked a decade or more before it’s delivered to actual troops.

So, the report snarks, “the program manager must defend program execution to an obsolete baseline, an obsolete requirement, and an obsolete prediction of the future.”

(Hudson Institute photo)

Dan Patt (Hudson Institute photo)

Here lies what Weiss and Patt call Pentagon procurement’s “original sin”: a “predilection for prediction.” Projecting eight years out is workable, sort of, for old-school, industrial-age, heavy-metal hardware. It really does take years to build physical prototypes, field-test them, set up assembly lines, and ramp up mass production, and the pace of all that happening is predictable. Even America’s “arsenal of democracy” in World War II, a miracle of industrial mobilization, was only possible because far-sighted planners got to work years before Pearl Harbor, and it still took, for example, until 1944 to build enough landing craft for D-Day.

But software update cycles spin much faster — if you set up the proper architecture. That requires what techies call a “software factory,” though it’s really “a process,” the report says, “that provides software development teams with a repeatable, well-defined path to create and update production software.” If you have a team of competent coders, if they can get rapid feedback from users on what works and what glitches, and if they can push out patches over a long-range wireless network, then you can write upgraded code and deploy it quickly, at negligible marginal cost.
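The “repeatable, well-defined path” a software factory provides can be sketched as a gated pipeline: every change takes the same sequence of stages from code to deployed update. The stage names and checks below are illustrative, not any specific DoD pipeline.

```python
# Hypothetical sketch of a software-factory pipeline: a repeatable path
# from code change to fielded update. Each stage is a gate; a failure
# anywhere stops the release. Stage contents are notional placeholders.
from typing import Callable

Stage = Callable[[str], bool]

def build(change: str) -> bool:
    return True  # compile and unit-test the change

def security_scan(change: str) -> bool:
    return True  # static analysis, dependency audit (the "Sec" in DevSecOps)

def field_feedback(change: str) -> bool:
    return True  # gate on telemetry and user reports from the last release

def deploy(change: str) -> bool:
    return True  # sign the update and push it out over the network

PIPELINE: list[Stage] = [build, security_scan, field_feedback, deploy]

def run_pipeline(change: str) -> bool:
    """Every change takes the same path; any failed stage stops the release."""
    return all(stage(change) for stage in PIPELINE)

print(run_pipeline("patch-threat-library"))  # → True
```

The point of the abstraction is that adding a new check (say, a jamming-resilience test) means adding one stage to the list, not redesigning the process.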

The Pentagon has some promising factories already, Weiss and Patt point out, like the Air Force’s famous Kessel Run. “Over the last few years, the number of cleverly named software factories across the DoD has rapidly grown to at least 30 today,” they write. “Others across the Air Force, then across each of the other services, and finally across the DoD’s fourth estate quickly followed the trail that Kessel Run blazed.”

But how does DoD scale up these promising start-ups to the massive scale required for major war?

Hudson Institute Graphic

The cycle of developing and updating weapons increasingly resembles the Development-Operations (DevOps) cycle used for software. (Hudson Institute graphic, courtesy of the authors)

Cycles Converge: DevOps And The OODA Loop

What’s crucial is that connection between coders and users, between the people who develop the technology and the people who operate it. The faster you go, and the more you automate the exchange of data, the more the boundary between these two groups dissolves. The industry calls this fusion “DevOps” – development plus operations – or sometimes “DevSecOps,” emphasizing that cybersecurity updates must be part of the same cycle.

Cybersecurity updates are a crucial part of the cycle because software is under constant attack, even on civilian products as innocuous as baby monitors. Coders need diagnostic data in near-real-time on the latest hack.

Military equipment, however, comes under all sorts of other attacks as well: not just cyber, but also electronic warfare (jamming and spoofing), and of course physical attack. Even a new form of camouflage or decoy — like the sun-warmed pans of water the Serbs used to draw NATO infrared sensors away from their tanks in 1999 — is a form of “attack” that military systems must take into account.

The need for constant adaptation to lessons from combat — of measure, countermeasure, and counter-countermeasures — is as old as warfare itself. Alexander the Great drilled his pikemen to outmaneuver Persia’s scythed chariots, Edward III combined longbowmen and dismounted knights to crush French heavy cavalry, and a peasant visionary named Joan of Arc convinced aristocratic French commanders to listen to common-born cannoneers about how to bring down castle walls.

Historically, however, it was much quicker to adapt your tactics than to update your technology. In 1940, for instance, when the Germans realized their anti-tank shells just bounced off the heavy armor of many French and British tanks, Rommel’s immediate response was to repurpose existing 88 mm anti-aircraft guns, which were lethal but highly vulnerable to enemy fire. It took two years to deploy an 88 mm cannon on an armored vehicle, the notorious Tiger.

Bundesarchiv

The German 88 mm anti-aircraft gun proved equally capable as an anti-tank weapon. (Bundesarchiv photo)

Col. John Boyd called this process of adaptation the OODA loop: A combatant must Observe, Orient, Decide, and Act — then observe the results of their action, re-orient themselves to the changed situation, decide what to do next, and take further action. Boyd, a fighter pilot, developed the OODA concept to describe how American mental quickness won dogfights in Korea against technologically superior MiGs. But military theorists soon applied the OODA loop to larger scales of combat. In essence, conflicts consist of nested OODA loops, with speed decreasing as size increases: a sergeant might change his squad’s scheme of fire in seconds, a major might redeploy his battalion in hours, a general might bring up supplies and reinforcements over days and weeks, a government might train new troops in months and develop new weapons over years.
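The nested-loops idea can be made concrete with a toy model: each echelon runs the same observe–orient–decide–act cycle, but the cycle time grows with scale. The figures below are notional, drawn from the examples in the paragraph above.

```python
# Illustrative model of nested OODA loops: every echelon runs
# observe → orient → decide → act, with cycle time growing with scale.
# Cycle times are notional, taken from the text's examples.
OODA_STEPS = ["observe", "orient", "decide", "act"]

ECHELONS = {               # notional full-cycle time, in seconds
    "squad":      30,            # seconds: shift the scheme of fire
    "battalion":  3 * 3600,      # hours: redeploy
    "theater":    7 * 86400,     # days/weeks: supplies and reinforcements
    "nation":     365 * 86400,   # months/years: new troops and weapons
}

def cycles_per_day(echelon: str) -> float:
    """How many full OODA loops an echelon can close in 24 hours."""
    return 86400 / ECHELONS[echelon]

print(f"squad: {cycles_per_day('squad'):.0f} loops/day")       # prints "squad: 2880 loops/day"
print(f"battalion: {cycles_per_day('battalion'):.1f} loops/day")
```

The authors’ argument, in these terms, is that software updates used to sit in the slowest loop and are migrating into the fastest one.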

But with the software-driven weapons of the 21st century, such as the F-35, these different OODA loops begin to merge. When you can push out a software patch in hours or minutes, technological updates, once the slowest kind of military change, become part of tactical adaption — the fastest. The OODA loop becomes a DevOps cycle, where the latest combat experience drives rapid changes in technology.

“Exercise of military capability bears increasing resemblance to DevOps cycle,” Weiss and Patt write in their report. “The speed and ubiquity of digital information systems is driving military operations ever closer to capability development… This trend brings promise for capability and peril for bureaucratic practices.”

So how does the Pentagon overcome that peril and get its bureaucracy up to speed?

Hudson Institute Graphic

Six principles, or “Acquisition Competency Targets,” for Pentagon software development (Hudson Institute graphic, courtesy of the authors)

Acquisition Reform For The Digital Age

First, Weiss and Patt warn, do no harm: “Reforms” that impose some top-down, one-size-fits-all standard will just make software acquisition worse. So, instead of trying to merge “redundant” software development teams, toolkits, and platforms, as some reformers have proposed, they argue the Pentagon needs to build a zoo of diverse approaches, able to handle problems ranging from financial management algorithms to F-35 code.

Future warfare is so complex, involving such a wide array of different systems working together — from submarines to jets, tanks to satellites, all increasingly connected by what may one day evolve into a JADC2 meta-network — that no “single contractor or … program office” can solve the problem on its own, the report says. Likewise, it argues, “modern software is no longer some monolithic thing. It exists in no singular repository… It is no longer built from a singular programming language [and] is rarely designed for a singular type of hardware.”

In fact, Weiss and Patt calculate that, considering all the possible choices, from what programming language to use and what contract structure to adopt, to what mix of government and contractor coders, to whether to optimize for tablets or desktops, or to assume cloud-based connectivity or enemy jamming, etcetera ad (nearly) infinitum, “there are more than six billion combinations.”
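The billions come from multiplication: independent design dimensions compound, so even modest option counts per dimension explode combinatorially. The dimensions below follow the text, but the option counts per dimension are hypothetical, chosen only to show how quickly the product grows past six billion.

```python
# The "more than six billion combinations" arises because independent
# design choices multiply. Option counts here are hypothetical.
from math import prod

choices = {
    "programming language": 40,
    "contract structure": 25,
    "government/contractor staffing mix": 12,
    "target hardware class": 10,
    "connectivity assumption (cloud vs. jammed)": 8,
    "deployment cadence": 8,
    "hosting environment": 6,
    "data classification handling": 5,
    "test strategy": 5,
    "update-delivery mechanism": 6,
}

total = prod(choices.values())
print(f"{total:,} combinations")  # → 6,912,000,000 combinations
```

This is why the authors argue against a single one-size-fits-all standard: no fixed template can cover a design space that large.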

Yet certain principles still apply across the board, they say.

First of all, “the essential tool of the modern program office” is a software factory. Again, that’s not a physical place, but a process and a team, bringing together coders, users, the latest operational data, and a capability to push out rapid updates. While program managers should use existing software factories where possible, rather than reinvent the wheel, the sheer variety of different programs means the Pentagon must maintain a variety of different ones.

Gray Wolf digital twin, AFRL image

A “digital twin” of AFRL’s Gray Wolf prototype cruise missile. (Air Force Research Laboratory photo)

What’s more, and more controversially, Weiss and Patt argue that the government shouldn’t simply contract these factories out, nor try to run them entirely in-house. Instead, they recommend a hybrid called Government-Owned, Contractor-Operated. This GOCO model, already in widespread use for everything from ammunition plants to nuclear labs, allows the government to keep the lights on and sustain software support, even as workloads and profit margins ebb and flow, but still draw on contractor talent to ramp up rapidly when needed, for instance, during a war.

The Pentagon should also make extensive use of a new, streamlined process called the Software Pathway (SWP), created by then-acquisition undersecretary Ellen Lord in 2019. Some programs may exist entirely as SWPs, but even hardware-heavy programs like Army vehicles can use SWP for design and simulation functions such as digital twins. “The traditional acquisition process can execute in parallel for the bent metal and hardware aspects,” Weiss said, “[but] every new program should employ the SWP for its software needs.”

The full report goes into six broad principles and dozens of specific recommendations for everything from recruiting talent to defining requirements. But the ultimate message is simple: The key to victory is adaptability, the secret to adaptability is software, and how the Pentagon procures it needs to change.

Article link: https://breakingdefense-com.cdn.ampproject.org/c/s/breakingdefense.com/2022/12/exclusive-pentagon-not-prepared-for-software-updates-at-the-speed-of-war-report-finds/?amp=1

Big Tech Vendors Object to US Gov SBOM Mandate – SecurityWeek

Posted by timmreardon on 12/08/2022
Posted in: Uncategorized. Leave a comment

By Ryan Naraine on December 07, 2022

The U.S. government’s mandate around the creation and delivery of SBOMs (software bills of materials) to help mitigate supply chain attacks has run into strong objections from big-name technology vendors.

A lobbying outfit representing big tech is calling on the federal government’s Office of Management and Budget (OMB) to “discourage agencies” from requiring SBOMs, arguing that “it is premature and of limited utility” for vendors to accurately provide a nested inventory of the ingredients that make up software components.

The trade group, called ITI (Information Technology Industry Council), counts Amazon, Microsoft, Apple, Intel, AMD, Lenovo, IBM, Cisco, Samsung, TSMC, Qualcomm, Zoom and Palo Alto Networks among its prominent members.

In a recent letter to the OMB, the group argues that SBOMs are not currently scalable or consumable. 

“We recognize and appreciate the value of flexibility built into the OMB process. Given the current level of (im-)maturity, we believe that SBOMs are not suitable contract requirements yet. The SBOM conversation needs more time to move towards a place where standardized SBOMs are scalable for all software categories and can be consumed by agencies,” the ITI letter read.

[ READ: Microsoft Releases Open Source Toolkit for Generating SBOMs ]

“At this time, it is premature and of limited utility for software producers to provide an SBOM. We ask that OMB discourage agencies from requiring artifacts until there is a greater understanding of how they ought to be provided and until agencies are ready to consume the artifacts that they request,” the group added.

At its core, an SBOM is meant to be a definitive record of the supply chain relationships between components used when building a software product. It is a machine-readable document that lists all components in a product, including all open source software, much like the mandatory ingredient list seen on food packaging.
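As a concrete illustration of that “ingredient list,” here is a minimal SBOM for a hypothetical product, loosely following the shape of the CycloneDX JSON format (fields simplified, components invented for the example):

```python
# A minimal machine-readable SBOM, loosely modeled on CycloneDX JSON.
# The product and its components are hypothetical.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "components": [
        {"type": "library", "name": "openssl", "version": "3.0.7",
         "purl": "pkg:generic/openssl@3.0.7"},
        {"type": "library", "name": "log4j-core", "version": "2.19.0",
         "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.19.0"},
    ],
}

def lists_component(doc: dict, name: str) -> bool:
    """A consumer's most basic query: is this 'ingredient' in the product?"""
    return any(c["name"] == name for c in doc["components"])

print(lists_component(sbom, "log4j-core"))  # → True
print(json.dumps(sbom, indent=2))           # the machine-readable document itself
```

A query like this is exactly what agencies would run after a disclosure such as Log4Shell: which of our products contain the affected component?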

The National Telecommunications and Information Administration (NTIA) has been busy issuing technical documentation, corralling industry feedback, and proposing the use of existing formats for the creation, distribution and enforcement of SBOMs.

In their objections, the big vendors are adamant that SBOMs are not yet suitable contract requirements. “Currently available industry tools create SBOMs of varying degrees of complexity, quality, completeness. The presence of multiple, at times inconsistent or even contradictory, efforts suggests a lacking maturity of SBOMs,” the group said.

[ Supply Chain Security Panel: A Civil Discourse on SBOMs]

The ITI letter cautioned that this is evident in a series of practical challenges related to implementation, including naming, identification, scalability, delivery and access, the linking to vulnerability information, as well as the applicability to cloud services, platforms and legacy software. 

“These challenges make it difficult to effectively deploy and utilize SBOMs as a tool to foster transparency. The SBOM conversation needs more time to mature and move towards a place where SBOMs are scalable and consumable,” the group added.

The tech vendors also flagged concerns around the security of sensitive proprietary information that may be collected via SBOMs and held by federal agencies and called for clarifications around the definition of artifacts and what protections will be afforded to safeguard sensitive information. 

The SBOM mandate was included in a cybersecurity executive order issued last May, sending security leaders scrambling to understand the ramifications and prepare for downstream side-effects.

The U.S. Commerce Department’s NTIA has been out front advocating for SBOMs with a wide range of new documentation including:

  • SBOM at a glance – an introduction to the practice of SBOM, supporting literature, and the pivotal role SBOMs play in providing much-needed transparency for the software supply chain.
  • A detailed FAQ document that outlines information, benefits, and commonly asked questions.
  • A two-page overview that provides high-level information on SBOM’s background, the ecosystem-wide solution, the NTIA process, and an example of an SBOM.
  • A series of SBOM Explainer Videos on YouTube.

Separately, the open-source Linux Foundation has released a batch of new industry research, training, and tools aimed at accelerating the use of a Software Bill of Materials (SBOM) in secure software development.

Article link: https://www.securityweek.com/big-tech-vendors-object-us-gov-sbom-mandate

Related: Cybersecurity Leaders Scramble to Decipher SBOM Mandate

Related: Microsoft Releases Open Source Toolkit for Generating SBOMs

Related: One Year Later: Log4Shell Remediation Slow, Painful Slog

Related: Video: A Civil Discourse on SBOMs

Ryan Naraine is Editor-at-Large at SecurityWeek and host of the popular Security Conversations podcast series. Ryan is a veteran cybersecurity strategist who has built security engagement programs at major global brands, including Intel Corp., Bishop Fox and GReAT. He is a co-founder of Threatpost and the global SAS conference series. Ryan’s past career as a security journalist included bylines at major technology publications including Ziff Davis eWEEK, CBS Interactive’s ZDNet, PCMag and PC World. Ryan is a director of the Security Tinkerers non-profit, an advisor to early-stage entrepreneurs, and a regular speaker at security conferences around the world. Follow Ryan on Twitter @ryanaraine.

Previous Columns by Ryan Naraine:

Apple Scraps CSAM Detection Tool for iCloud Photos

Apple Adding End-to-End Encryption to iCloud Backup

Big Tech Vendors Object to US Gov SBOM Mandate

Investors Pour $200 Million Into Compliance Automation Startup Drata

Balance Theory Scores Seed Funding for Secure Workspace Collaboration

Technical debt: The cybersecurity threat hiding in plain sight – C4ISRNET

Posted by timmreardon on 12/06/2022
Posted in: Uncategorized. Leave a comment

By Kynan Carver

Dec 5 at 07:39 AM

Who remembers when floppy disks provided a new level of capability for the Department of Defense to support the operations of strategic forces across the world?

It was only in the last few years that DoD retired those 8-inch floppies in favor of modern computer capabilities, and the push is on to accelerate modernization of systems that rely on older technology in order to address modern threats.

This accumulation of older software and infrastructure, known as technical debt, takes a substantial amount of budget and resources to maintain, which puts pressure on new innovation required for enterprise functions such as cybersecurity.

The Pentagon recognizes the impact of technical debt on protecting systems and data from cyberattacks. It intends to deploy a zero trust strategy across the whole department by 2027. By assuming every application, network, connection, and user can become a threat, a zero trust framework validates access through control policies for specific functions – a significant advancement from perimeter-based security that can allow unrestricted access once an attacker gets inside the network.

Technical Debt Hinders Cybersecurity

The Biden Administration’s executive order on cybersecurity in May 2021 jump started the movement toward a zero trust architecture. It called for modernized cyber defenses, improved information sharing and stronger responses to attacks, all of which start with zero trust and depend on modern technologies such as identity management, cloud, artificial intelligence, machine learning and data analytics.

To accelerate this implementation of zero trust, DoD must continue to pay down its technical debt.

The Pentagon can look to the Department of Labor as a model to balance legacy system needs and innovation. As CIO Gundeep Ahluwalia explained in a recent interview, “When I joined the department six years ago, we invested only 10% of funds in modernization and development. Now we allocate 25% of our overall funds, and ideally this will increase to 40% in the near future for modernization and development. Modernization is a continuum, and this investment is essentially paying down that technical debt while preventing it from building up again.”

Technical debt has taken on new importance with its reference in the National Defense Authorization Act of 2022, which directs the DoD to study it and make recommendations on its impact on software-intensive systems. DoD CIO John Sherman recently emphasized the connection between technical debt and zero trust as a cyber defense, and he confirmed the department’s commitment to addressing it as adversarial threats increase.

The scope of the technical debt problem is difficult to calculate, but if historical patterns have continued, then roughly 75 percent of the federal IT budget goes to operations and maintenance of legacy systems. That’s older technology that is not prepared or hardened for today’s cyberattacks.

Technology That Stays Ahead of Evolving Threats

There are ways to surround legacy systems with newer, resilient technologies and provide a layered defense of mission critical applications and data.

Identity management, for example, distinguishes a digital identity, while authentication confirms the identity to allow permission-based access to networks, applications, and data. Moreover, operating within a zero trust environment prevents users from gaining access unless they have the appropriate authentication and authorization.
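The interplay of identity, authentication, and per-function authorization can be sketched as a deny-by-default policy check that runs on every request, with no implicit trust from being “inside” the network. The policy contents and names below are hypothetical.

```python
# Hedged sketch of zero-trust access validation: identity is established,
# then every request is checked against a control policy for the specific
# function being invoked. All users, functions, and checks are notional.
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    authenticated: bool   # identity confirmed (e.g., MFA passed)
    device_trusted: bool  # endpoint posture check passed
    function: str         # the specific capability being invoked

POLICY = {  # function -> users authorized for it (illustrative)
    "read_logistics_data": {"analyst1", "planner2"},
    "push_software_update": {"release_mgr"},
}

def allow(req: Request) -> bool:
    """Deny by default; every condition must hold on every request."""
    return (req.authenticated
            and req.device_trusted
            and req.user in POLICY.get(req.function, set()))

print(allow(Request("analyst1", True, True, "read_logistics_data")))   # → True
print(allow(Request("analyst1", True, True, "push_software_update")))  # → False
```

Contrast this with perimeter security, where the first check (getting onto the network) would have been the last.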

The sheer amount of data from cyberattacks requires innovations in artificial intelligence, machine learning and analytics so that data quickly gets aggregated and filtered, patterns are detected, and threats are elevated for further review.

Because the Pentagon is such a large target for cyberattacks, the Defense Department needs these technologies and a zero trust methodology to eliminate both older technology and processes performed manually, which will reduce threats and fend off penetration.

While perimeter security has become standard for many networks, the executive order on cybersecurity and the DoD’s strategy for zero trust represent the government’s intention to change that. It will require more than technology to nullify existing threats and prevent unknown ones from becoming attacks.

For DoD to reach its full potential for cybersecurity, it must instill a shift in thinking that goes beyond perimeter defenses. Users need to embrace requirements such as multiple logins for enterprise-wide zero trust as well as information sharing that allows for better threat detection and response.

As the Pentagon tackles its technical debt to improve cybersecurity and overall asset protection, it can take the opportunity to renew users’ passion for the mission and good cyber behavior.

Kynan Carver is Defense Cybersecurity Lead at Maximus, an IT services management company focused on the federal government.

Article link: https://www.c4isrnet.com/cyber/2022/12/05/technical-debt-the-cybersecurity-threat-hiding-in-plain-sight/

Pentagon must embrace commercial technologies to win data war – C4ISRNET

Posted by timmreardon on 12/05/2022
Posted in: Uncategorized. Leave a comment

By Jock Padgett

Wednesday, Nov 16, 2022

Over the past several years, the U.S. Department of Defense and the military services have spent enormous amounts of time extending weapons ranges and creating new sensors, algorithms and hardware to support the next generation of warfighting. With these new capabilities, interconnectivity complexities increase exponentially, frequently due to siloed invention and disparate integration.

Because operational environments are not pristine and uniform across all formations, these variants wreak havoc on G2 intelligence-gathering and G6 cyber operations teams during exercises and combat. The department needs to focus on data logistics, specifically connectivity and hierarchy, to succeed.

Data is the lifeblood of any digital transactional or analytical system. Getting data to systems or sensors becomes a critical pathway to success for any digital organization. For the purposes of this article, data logistics covers all efforts necessary to move data from source systems to their destinations.

Sometimes this means many endpoints, depending on the type of data. Other times this means moving data back to the source for further gathering, or sending messages that trigger actions. While data logistics differs from physical logistics in its details, the two are similar at an abstract level.

A promising approach is Zhamak Dehghani's "data mesh." Its defining principles are domain-oriented decentralized data ownership and architecture, data as a product, self-serve data infrastructure as a platform, and federated computational governance. These principles align well with the independent structure of the DoD's Program Executive Offices.

However, appropriate enterprise technologies must exist for data mesh to be successful. These include architectural patterns, data catalogs, change data capture, immutable audit logs and data product APIs. At the core is the event streaming backbone, which shuttles data to interested systems and applications.

Commercial cloud providers, well entrenched in the DoD, already have the capabilities integrated into their offerings. AWS offers Kinesis and Azure provides Stream Analytics, to name two. One industry standard is Kafka, an open-source project or managed service by cloud providers or Confluent.io. Many companies have adopted Kafka due to its in-stream processing capabilities that provide real-time analytics for rapid decision-making and event triggers for humans and machines.
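The event streaming backbone pattern those products implement can be sketched in a few lines. This in-process toy (class and method names are my own, not Kafka's API) shows the core ideas: an append-only topic log that fans out to subscribers and can be replayed by new consumers:

```python
from collections import defaultdict

class EventBackbone:
    """Minimal in-process sketch of an event streaming backbone.
    Real deployments would use Kafka, Kinesis, or Stream Analytics."""

    def __init__(self):
        self._log = defaultdict(list)           # topic -> append-only event log
        self._subscribers = defaultdict(list)   # topic -> callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        self._log[topic].append(event)          # the audit log is never mutated
        for cb in self._subscribers[topic]:     # fan out in real time
            cb(event)

    def replay(self, topic):
        """New consumers can replay a topic's full history."""
        return tuple(self._log[topic])
```

The immutable log is what makes both the real-time triggers and the after-the-fact audit trail possible from the same data path.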

Private industry has solved many problems related to the rapid onboarding of new sensors and sites. The use of automation from networks to applications will enable the Services and OSD to rapidly create the logical integrations necessary to achieve Joint All-Domain Command and Control, or JADC2. As units move to support combatant commands, unit assets need to reduce the time to connectivity. Additionally, there needs to be a focus on automating connections between data distribution nodes and any relevant data backbones.

DoD needs to press on a technical strategy requiring current and future systems to achieve discoverability. One example is the human "presence" function familiar from messaging systems, reused and applied to platforms such as HIMARS, F-35s, Abrams tanks and TPQ-53 radars.
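A hypothetical sketch of that presence idea, with asset names, field names and the staleness window all assumed for illustration: each platform periodically heartbeats, and anything heard from recently is discoverable.

```python
import time

class PresenceRegistry:
    """Toy presence registry: assets announce themselves via heartbeats,
    and discovery returns only assets heard from within a staleness window."""

    def __init__(self, stale_after=60.0):
        self.stale_after = stale_after   # seconds before an asset goes stale
        self._last_seen = {}

    def heartbeat(self, asset_id, now=None):
        """An asset announces itself; the clock is injectable for testing."""
        self._last_seen[asset_id] = time.time() if now is None else now

    def discoverable(self, now=None):
        """Assets heard from within the staleness window, sorted by id."""
        t = time.time() if now is None else now
        return sorted(a for a, seen in self._last_seen.items()
                      if t - seen <= self.stale_after)
```

The point is that discovery becomes a property of the data layer itself, rather than a manual configuration step each time a unit moves.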

The Army’s Unified Network Operations has made positive efforts to improve connectivity timelines. A key opportunity exists with zero-trust security. By divesting the network edge, services at every echelon can become interoperable at a more flexible technical level.

This will require unified efforts across the DoD. Doing so will synchronize internal technical efforts and policies that will onboard systems and sensors and allow, for example, an F-35 and HIMARS to talk directly to each other without human intervention.

Another focus necessary to enable rapid data logistics is rearchitecting the hierarchical distribution model. According to Conway’s Law,

“Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization’s communication structure.” — Melvin E. Conway

A shift in thought and design for data distribution architectures is necessary to enable rapid decision-making by humans and systems. The latency between systems and sensors will be critical, even when robust networks and computing exist.

Flattening these architectures will also reduce digital choke points for data moving inter-echelon and intra-echelon. Designing our systems to replicate a flat, interconnected organizational model is necessary to minimize these risks.
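To make the hop-count intuition concrete, here is a toy comparison of a strict echelon tree against a direct peer link (the topology and depths are illustrative, not any real force structure):

```python
def tree_hops(depth_a, depth_b, common_depth):
    """Hops between two nodes in a strict tree that share their nearest
    common ancestor at common_depth: up from A to the ancestor, then
    down to B. Every hop is a potential choke point."""
    return (depth_a - common_depth) + (depth_b - common_depth)

# Two battalion-level systems three echelons deep, sharing only the
# corps-level root (depth 0), versus a direct peer-to-peer link:
hierarchical = tree_hops(3, 3, 0)   # data transits the root choke point
flat = 1                            # one direct hop
```

Even this crude model shows why latency and choke-point risk grow with hierarchy depth: every exchange between leaf units transits their common ancestor.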

For the Pentagon to enable data for decision-making by a human, sensor, or system, the DoD must focus on the data logistics. The DoD must reduce the time for connections at the network, data, application, and system levels. Using industry-proven technologies and architectures enables warfighters and decision-makers, whether human or machine, by improving resiliency in the entire architecture.

Jock Padgett is the Chief Data Officer and Senior Digital Advisor at the XVIII Airborne Corps, US Army. During the Afghanistan withdrawal and Ukraine support mission, he served as the Chief Technology Officer at the 82nd Airborne Division.

The opinions expressed are his and do not represent the Department of Defense’s position or approach.

Article link: https://www.c4isrnet.com/thought-leadership/2022/11/16/pentagon-must-embrace-commercial-technologies-to-win-data-war/
