healthcarereimagined

Envisioning healthcare for the 21st century

  • About
  • Economics

Raj Iyer: Army to Invest $1.4B in ERP System Modernization

Posted by timmreardon on 06/20/2022
Posted in: Uncategorized. Leave a comment
  • JANE EDWARDS. JUNE 13, 2022

Raj Iyer, chief information officer of the U.S. Army, said the service intends to spend $1.4 billion by fiscal year 2023 on the modernization of its enterprise resource planning systems used to manage its financial, logistics, human resources and training activities, Breaking Defense reported Friday.

He told reporters during a briefing Thursday the military branch will award multiple other transaction authority agreements that will run between 12 and 18 months to build prototypes for enterprise business systems and then award a production contract following the downselection process.

“Some of the things that we will be looking for as part of … this prototype is to look at how modular the architecture is, again, to make sure that is future-proofed,” said Iyer, a 2022 Wash100 Award winner.

“We’ll be looking at the ability to support data exchange through APIs and micro-services … We’ll be looking at the system being cloud native from the get-go and making sure that we can fully benefit from a future modern architecture … We’ll be looking at how flexible the solution will be in terms of its ability to implement Army-unique processes where we have them without the need to customize commercial-off-the-shelf products,” he added.

Iyer also discussed how the Army will spend its FY 2023 budget request of $16.6 billion for information technology and cybersecurity.

Iyer will headline GovCon Wire Events’ 2nd Annual Army IT and Digital Transformation Forum on Wednesday, June 15. Join this forum to hear from military, government and industry leaders on how the Army pursues innovation and advances digital technology adoption to help the force fight and win the battles of tomorrow.

Article link: https://www.govconwire.com/2022/06/raj-iyer-army-to-invest-1-4b-in-erp-system-modernization/

The Best Examples Of Digital Twins Everyone Should Know About – Forbes

Posted by timmreardon on 06/20/2022
Posted in: Uncategorized. Leave a comment

Bernard Marr

Contributor. Jun 20, 2022, 01:13am EDT

The digital twin is an exciting concept and undoubtedly one of the hottest tech trends right now. It fuses ideas including artificial intelligence (AI), the internet of things (IoT), metaverse, and virtual and augmented reality (VR/AR) to create digital models of real-world objects, systems, or processes. These models can then be used to tweak and adjust variables to study the effect on whatever is being twinned – at a fraction of the cost of carrying out experiments in the real world. 
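The core idea – mirror a real asset's state from sensor data, then tweak variables on the model instead of the machine – can be sketched in a few lines. This is an illustrative toy only: `PumpTwin` and its linear temperature model are invented for the example, not any real product's API.

```python
from dataclasses import dataclass

@dataclass
class PumpTwin:
    """Toy digital model mirroring a hypothetical physical pump."""
    rpm: float
    temp_c: float

    def update(self, rpm: float, temp_c: float) -> None:
        # Mirror the latest sensor readings from the physical asset.
        self.rpm = rpm
        self.temp_c = temp_c

    def predict_temp(self, new_rpm: float) -> float:
        # Invented physics: assume temperature scales linearly with speed.
        return self.temp_c * (new_rpm / self.rpm)

twin = PumpTwin(rpm=1000.0, temp_c=60.0)
twin.update(rpm=1200.0, temp_c=72.0)          # latest telemetry arrives
# What-if experiment: raise speed to 1500 rpm without touching the real pump.
projected = twin.predict_temp(1500.0)
print(round(projected, 1))  # 90.0
```

Real twins replace the one-line model with detailed physics or learned simulations, but the loop is the same: ingest telemetry, keep the model current, run experiments on the copy.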

Businesses around the globe are looking to deploy Digital Twins across a broad range of applications, ranging from engineering design of complex equipment and 3D immersive environments to precision medicine and digital agriculture. However, to date, applications have been highly customized and only accessible for high-value use cases, such as the operations of jet engines, industrial facilities and power plants. Now leading technology companies like AWS are working hard to lower the costs and simplify the deployment of this technology – with AWS IoT TwinMaker, for example – making it easier and more accessible for companies of all kinds and sizes to build their own Digital Twins.

Some truly groundbreaking digital twins have been developed in recent years that are inspiring the industry and helping to push the envelope of what is possible in the fields of science, medicine, engineering, pharmaceuticals, sports, and many more. Here are some of the most interesting and innovative examples.

The Human Brain

Let’s start with the most ambitious! The human brain is, as far as we know, the most complex structure or organism in the universe. Creating a digital simulation of it is incredibly complicated, but that hasn’t put people off trying. The EU-funded Neurotwin project aims to simulate specific human brains in order to build models that can predict the best treatments for conditions such as Alzheimer’s and epilepsy. There have been other attempts to simulate aspects of the brain in the past, but Neurotwin is the first project that focuses on modeling both the electromagnetic activity and the physiology. Clinical trials using the model are due to start in 2023.

An Entire Human

Ok, so this one is a bit of a pipe dream right now, but the science exists to make it a reality. Former GE CEO Bill Ruh predicts that one day, every human will have a digital twin at birth, which can be used to design bespoke treatments for that person when they become ill, as well as model the impact of lifestyle choices on his or her long-term health. Using that person’s unique genome, it will be possible to predict the effects of different drugs, providing insight into the best treatment options if the person is struck by conditions such as cancer or Parkinson’s disease. This will minimize the wasted cost of failed treatment programs that were never going to work due to the patient’s genetics, and lengthen lifespans.

Los Angeles Transportation 

The Los Angeles Department of Transportation has partnered with the Open Mobility Foundation to create a data-driven digital twin of the city’s transport infrastructure. To start with, it will model the movement and activity of micro-mobility solutions such as the city’s network of shared-use bicycles and e-scooters. After that, it will be expanded to cover ride-sharing services, carpools, and new mobility solutions that will appear, such as autonomous taxi drones. 

The Whole of Shanghai

The Shanghai Urban Operations and Management Center has built a digital twin of the city of 26 million inhabitants, which models 100,000 elements from refuse disposal and collection facilities to e-bike charging infrastructure, road traffic, and the size and location of apartment buildings. Its creator, 51World, uses data from satellites and drones to construct the living model, which, among other uses, is helping authorities to plan and react in the face of the Covid-19 pandemic. It can also be used to simulate the effects of natural disasters such as flooding to aid with response planning. 

A Sports Stadium

Los Angeles’ SoFi Stadium – home to NFL teams the LA Rams and LA Chargers – has its own digital twin, which models not just the stadium itself but also the 300-acre Hollywood Park campus around it. Built while the stadium was still under construction (starting in 2020), it collects real-time data from every area of the park’s operations into a single platform that can answer questions for everyone from event organizers looking to use the space to maintenance and janitorial operations. Users engage with the twin via an “app store” model, accessing applications specific to the features and functionality they need to work with.

The World’s First 3D-Printed Bridge

The 12-meter steel bridge spanning the Oudezijds Achterburgwal canal in central Amsterdam is remarkable as the first pedestrian bridge constructed entirely via 3D printing. It is also unique in having its own digital twin. A network of sensors placed across the structure, as part of a project led by the Alan Turing Institute, gathers data that is used to build the twin, which can then be used to analyze how the structure performs under the stress of everyday use. This is particularly important given that it is the first bridge ever built using this technology; more data about the safety and strength of 3D-printed structures is vital if the technique is to become a mainstream engineering tool in the future.

Every Tesla Ever Sold

Tesla creates a digital simulation of every one of its cars, using data collected from sensors on the vehicles and uploaded to the cloud. These allow the company’s AI algorithms to determine where faults and breakdowns are most likely to occur and minimize the need for owners to take their cars to servicing stations for repairs and maintenance. This reduces cost to the company of servicing cars that are under warranty and improves user experience, leading to more satisfied customers and a higher chance of winning repeat business.
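The fleet-twin idea described above – compare each vehicle's telemetry against the rest of the fleet to spot likely faults early – can be illustrated with a minimal outlier check. The vehicle names, readings, and z-score threshold below are all invented for the sketch; a production system would use far richer models than a mean comparison.

```python
import statistics

# Hypothetical battery-temperature telemetry (°C), one series per vehicle.
fleet_telemetry = {
    "car_a": [31, 32, 33, 31, 32],
    "car_b": [30, 31, 32, 30, 31],
    "car_c": [30, 31, 45, 52, 58],  # trending hot
}

def flag_outliers(telemetry, z_threshold=1.0):
    """Flag vehicles whose mean reading deviates sharply from the fleet.

    z_threshold is deliberately aggressive for this tiny example.
    """
    means = {vin: statistics.mean(vals) for vin, vals in telemetry.items()}
    fleet_mean = statistics.mean(means.values())
    fleet_sd = statistics.stdev(means.values())
    return [vin for vin, m in means.items()
            if abs(m - fleet_mean) > z_threshold * fleet_sd]

print(flag_outliers(fleet_telemetry))  # ['car_c']
```

The payoff is exactly the one the article describes: the anomalous vehicle is identified from uploaded data alone, before the owner ever visits a service station.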

Article link: https://www.forbes.com/sites/bernardmarr/2022/06/20/the-best-examples-of-digital-twins-everyone-should-know-about/amp/


Tech Modernization Fund Launches Fresh $100 Million for CX Projects – Nextgov

Posted by timmreardon on 06/19/2022
Posted in: Uncategorized. Leave a comment

By ALEXANDRA KELLEY. JUNE 16, 2022

The TMF is accepting proposals from federal agencies looking to modernize digital services with a human-centric design.

$100 million from the federal Technology Modernization Fund will be allocated toward improving customer experiences for civilian end users interacting with U.S. government digital services.

On Thursday, the Office of Management and Budget and the General Services Administration—two agencies that help oversee the TMF—announced they are investing large sums into improving federal digital services. Improving user experience with government services has been one of the priority items on the Biden-Harris administration’s agenda.

“Federal service delivery has not kept pace with the needs and expectations of the public. The American people deserve a government that puts people at the center of everything it does,” said Federal Chief Information Officer and TMF Board Chair Clare Martorana. “With this funding, we will deploy secure technology that reduces costs for agencies, eliminates burdens for the federal workforce and those it serves and powers services that meet the public’s expectations.”

Some of the areas the injection of funding will look to tackle include cutting waiting times, avoiding administering duplicative paperwork and streamlining access to government services. 

The $100 million will be disbursed across a group of projects selected from federal agencies; interested agencies must apply by September 30, with approved projects selected on a rolling basis.

TMF board members will look at initiatives that focus on modernizing government digital tools and services, bolstered by customer research and data. Selected projects will also include measurable milestones and deliverables to accurately gauge progress, and the resulting digital tools must apply human-centered design to new federal software.

“Government technology and websites can and must work better for the people and communities we serve,” said GSA Administrator Robin Carnahan. “Targeted TMF funding focused on making their lives easier when they need government services is a no brainer. It’s also a smart way to invest tax dollars to ensure the American people are getting the most for their money.”

So far, the TMF has disbursed about $400 million in funding across 12 projects from its reserve of $1 billion for the fiscal year, partially spurred into action by President Joe Biden’s executive order on improving federal customer experience, signed earlier this year.

Similar to the TMF’s project funding goals, Biden’s order emphasized intuitive technology to help connect Americans to crucial government services.

Article link: https://www.nextgov.com/it-modernization/2022/06/tech-modernization-fund-launches-fresh-100-million-cx-projects/368273/

VA updates RFI for Enterprise Cloud Capacity Program – Fed Health IT

Posted by timmreardon on 06/19/2022
Posted in: Uncategorized. Leave a comment

By Jackie Gilbert

 June 13, 2022

Updated June 13, 2022

Notice ID: 36C10B21Q0551

“The purpose of this RFI was to gather additional information to help VA determine the most suitable acquisition strategy to maintain and evolve its existing VA Enterprise Cloud (VAEC) which is currently built upon the cloud services of Microsoft Azure Government (MAG) and Amazon Web Services (AWS) GovCloud. This included contemplating the complete replacement of one or both of VA’s existing Cloud Service Providers (CSPs) and/or expanding the VAEC to include additional FedRAMP High certified CSPs. Currently, VAEC requires access through the purchase of cloud credits to only FedRAMP High Government Community Cloud (GCC) service providers with VA Authority to Operate for the full suite of Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS) models. Further, VA anticipates any expansion effort involving the integration of additional CSPs within the VAEC will be executed through separate brand name acquisitions. VA has architected and operates its enterprise cloud, and any additional CSP would need to be architected in accordance with VA technical requirements. These facts do not restrict competition for other SaaS and PaaS cloud offerings which are outside the scope of the VAEC. Based on the above, the Request for Quotation will be posted to National Aeronautics and Space Administration Solutions for Enterprisewide Procurement Governmentwide Acquisition Contract for limited competition among resellers for the required brand name AWS cloud service capabilities and professional services.”

Read more here.


Posted September 17, 2021

Notice ID: 36C10B21Q0551

“Background: The Department of Veterans Affairs (VA) has successfully stood up its own VA Enterprise Cloud (VAEC) over the past 3-4 years consisting of two Federal Risk and Authorization Management Program (FedRAMP) High Cloud Service Providers (CSP), Amazon Web Services (AWS) and Microsoft Azure. Additional security and tooling has been established on top of these Government clouds. During this time, VA has amassed significant, Veteran-facing and VA-facing cloud IT solutions for its three administrations, the Veterans Health Administration, Veterans Benefits Administration and National Cemetery Administration, as well as for its Office of Information & Technology and VA Central Offices. Significant investment has been made in customizing workloads for the underlying CSPs. Security controls were inherited by the CSP’s FedRAMP Authority to Operate and VA added security controls to both clouds equally but separately…”

“The vehicles through which VA is procuring its cloud capacity and associated services are expiring in fiscal year 2022. As such, VA is in the planning stages of developing the acquisition strategies to meet these continuing requirements. VA is seeking input from FedRAMP High certified CSPs to determine the impact (i.e., cost, operational, schedule) of replacing one or both of VA’s existing CSPs, and/or the value and benefit (i.e., cost, technical) of expanding its existing VAEC to include additional CSPs. Initial market research and technical analysis efforts have revealed that replacement efforts could result in duplicative costs, schedule and operational impacts. Similarly, introducing any new CSPs into VAEC could result in duplicative costs, and significant time per additional CSP just to replicate all of the required tooling, security controls and monitoring capabilities that are already present within and around VAEC-Azure and VAEC-AWS for successful operation within the FedRAMP High VAEC architecture…”

Read more here.

Article link: https://www.fedhealthit.com/2022/06/va-rfi-enterprise-cloud-capacity-program/

Artificial intelligence is creating a new colonial world order – MIT Tech Review

Posted by timmreardon on 06/18/2022
Posted in: Uncategorized. Leave a comment

An MIT Technology Review series investigates how AI is enriching a powerful few by dispossessing communities that have been dispossessed before.

Illustration: “chasm concept,” by Edel Rodriguez

by Karen Hao

This story is the introduction to MIT Technology Review’s series on AI colonialism, which was supported by the MIT Knight Science Journalism Fellowship Program and the Pulitzer Center. Read the full series here.

My husband and I love to eat and to learn about history. So shortly after we married, we chose to honeymoon along the southern coast of Spain. The region, historically ruled by Greeks, Romans, Muslims, and Christians in turn, is famed for its stunning architecture and rich fusion of cuisines.

Related Story

South Africa’s private surveillance machine is fueling a digital apartheid

As firms have dumped their AI technologies into the country, it’s created a blueprint for how to surveil citizens and serves as a warning to the world.

Little did I know how much this personal trip would intersect with my reporting. Over the last few years, an increasing number of scholars have argued that the impact of AI is repeating the patterns of colonial history. European colonialism, they say, was characterized by the violent capture of land, extraction of resources, and exploitation of people—for example, through slavery—for the economic enrichment of the conquering country. While it would diminish the depth of past traumas to say the AI industry is repeating this violence today, it is now using other, more insidious means to enrich the wealthy and powerful at the great expense of the poor.

I had already begun to investigate these claims when my husband and I began to journey through Seville, Córdoba, Granada, and Barcelona. As I simultaneously read The Costs of Connection, one of the foundational texts that first proposed a “data colonialism,” I realized that these cities were the birthplaces of European colonialism—cities through which Christopher Columbus traveled as he voyaged back and forth to the Americas, and through which the Spanish crown transformed the world order.

In Barcelona especially, physical remnants of this past abound. The city is known for its Catalan modernism, an iconic aesthetic popularized by Antoni Gaudí, the mastermind behind the Sagrada Familia. The architectural movement was born in part from the investments of wealthy Spanish families who amassed riches from their colonial businesses and funneled the money into lavish mansions.

Related Story

How the AI industry profits from catastrophe

As the demand for data labeling exploded, an economic catastrophe turned Venezuela into ground zero for a new model of labor exploitation.

One of the most famous, known as the Casa Lleó Morera, was built early in the 20th century with profits made from the sugar trade in Puerto Rico. While tourists from around the world today visit the mansion for its beauty, Puerto Rico still suffers from food insecurity because for so long its fertile land produced cash crops for Spanish merchants instead of sustenance for the local people.

As we stood in front of the intricately carved façade, which features flora, mythical creatures, and four women holding the four greatest inventions of the time (a lightbulb, a telephone, a gramophone, and a camera), I could see the parallels between this embodiment of colonial extraction and global AI development.

The AI industry does not seek to capture land as the conquistadors of the Caribbean and Latin America did, but the same desire for profit drives it to expand its reach. The more users a company can acquire for its products, the more subjects it can have for its algorithms, and the more resources—data—it can harvest from their activities, their movements, and even their bodies.

Neither does the industry still exploit labor through mass-scale slavery, which necessitated the propagation of racist beliefs that dehumanized entire populations. But it has developed new ways of exploiting cheap and precarious labor, often in the Global South, shaped by implicit ideas that such populations don’t need—or are less deserving of—livable wages and economic stability.

MIT Technology Review’s new AI Colonialism series digs into these and other parallels between AI development and the colonial past by examining communities that have been profoundly changed by the technology. In part one, we head to South Africa, where AI surveillance tools, built on the extraction of people’s behaviors and faces, are re-entrenching racial hierarchies and fueling a digital apartheid. 

In part two, we head to Venezuela, where AI data-labeling firms found cheap and desperate workers amid a devastating economic crisis, creating a new model of labor exploitation. The series also looks at ways to move away from these dynamics. In part three, we visit ride-hailing drivers in Indonesia who, by building power through community, are learning to resist algorithmic control and fragmentation. In part four, we end in Aotearoa, the Māori name for New Zealand, where an Indigenous couple are wresting back control of their community’s data to revitalize its language.

Together, the stories reveal how AI is impoverishing the communities and countries that don’t have a say in its development—the same communities and countries already impoverished by former colonial empires. They also suggest how AI could be so much more—a way for the historically dispossessed to reassert their culture, their voice, and their right to determine their own future.

That is ultimately the aim of this series: to broaden the view of AI’s impact on society so as to begin to figure out how things could be different. It’s not possible to talk about “AI for everyone” (Google’s rhetoric), “responsible AI” (Facebook’s rhetoric), or “broadly distribut[ing]” its benefits (OpenAI’s rhetoric) without honestly acknowledging and confronting the obstacles in the way.

Now a new generation of scholars is championing a “decolonial AI” to return power from the Global North back to the Global South, from Silicon Valley back to the people. My hope is that this series can provide a prompt for what “decolonial AI” might look like—and an invitation, because there’s so much more to explore.

Article link: https://www.technologyreview.com/2022/04/19/1049592/artificial-intelligence-colonialism/

Read MIT Technology Review’s series on AI Colonialism here.

The Collapse of Complex Software – Read the Tea Leaves

Posted by timmreardon on 06/17/2022
Posted in: Uncategorized. Leave a comment

In 1988, the anthropologist Joseph Tainter published a book called The Collapse of Complex Societies. In it, he described the rise and fall of great civilizations such as the Romans, the Mayans, and the Chacoans. His goal was to answer a question that had vexed thinkers over the centuries: why did such mighty societies collapse?

In his analysis, Tainter found the primary enemy of these societies to be complexity. As civilizations grow, they add more and more complexity: more hierarchies, more bureaucracies, deeper intertwinings of social structures. Early on, this makes sense: each new level of complexity brings rewards, in terms of increased economic output, tax revenue, etc. But at a certain point, the law of diminishing returns sets in, and each new level of complexity brings fewer and fewer net benefits, dwindling down to zero and beyond.
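Tainter's diminishing-returns argument can be sketched as a toy model. The numbers below are invented purely for illustration; only the shape of the curve matters: marginal benefit shrinks with each layer while the carrying cost of a layer stays fixed, so cumulative net benefit peaks and then declines.

```python
# Toy model of Tainter's argument: each added layer of complexity
# yields less marginal benefit, while its carrying cost stays fixed.
def marginal_benefit(layer: int) -> float:
    return 100 / layer  # diminishing returns: 100, 50, 33.3, ...

COST_PER_LAYER = 30  # invented fixed cost of sustaining one more layer

net = 0.0
for layer in range(1, 7):
    gain = marginal_benefit(layer) - COST_PER_LAYER
    net += gain
    print(f"layer {layer}: marginal gain {gain:+6.1f}, cumulative {net:6.1f}")
```

By layer 4 the marginal gain turns negative, yet a society (or engineering org) that keeps "doing what worked" continues adding layers anyway – which is exactly the trap Tainter describes.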

But since complexity has worked so well for so long, societies are unable to adapt. Even when each new layer of complexity starts to bring zero or even negative returns on investment, people continue trying to do what worked in the past. At some point, the morass they’ve built becomes so dysfunctional and unwieldy that the only solution is collapse: i.e., a rapid decrease in complexity, usually by abolishing the old system and starting from scratch.

What I find fascinating about this (besides the obvious implications for modern civilization) is that Tainter could have been writing about software.

Anyone who’s worked in the tech industry for long enough, especially at larger organizations, has seen it before. A legacy system exists: it’s big, it’s complex, and no one fully understands how it works. Architects are brought in to “fix” the system. They might wheel out a big whiteboard showing a lot of boxes and arrows pointing at other boxes, and inevitably, their solution is… to add more boxes and arrows. Nobody can subtract from the system; everyone just adds.


“EKS is being deprecated at the end of the month for Omega Star, but Omega Star still doesn’t support ISO timestamps.” We’ve all been there. (Via Krazam)

This might go on for several years. At some point, though, an organizational shakeup probably occurs – a merger, a reorg, the polite release of some senior executive to go focus on their painting hobby for a while. A new band of architects is brought in, and their solution to the “big diagram of boxes and arrows” problem is much simpler: draw a big red X through the whole thing. The old system is sunset or deprecated, the haggard veterans who worked on it either leave or are reshuffled to other projects, and a fresh-faced team is brought in to, blessedly, design a new system from scratch.

As disappointing as it may be for those of us who might aspire to write the kind of software that is timeless and enduring, you have to admit that this system works. For all its wastefulness, inefficiency, and pure mendacity (“The old code works fine!” “No wait, the old code is terrible!”), this is the model that has sustained a lot of software companies over the past few decades.

Will this cycle go on forever, though? I’m not so sure. Right now, the software industry has been in a nearly two-decade economic boom (with some fits and starts), but the one sure thing in economics is that booms eventually turn to busts. During the boom, software companies can keep hiring new headcount to manage their existing software (i.e. more engineers to understand more boxes and arrows), but if their labor force is forced to contract, then that same system may become unmaintainable. A rapid and permanent reduction in complexity may be the only long-term solution.

One thing working in complexity’s favor, though, is that engineers like complexity. Admit it: as much as we complain about other people’s complexity, we love our own. We love sitting around and dreaming up new architectural diagrams that can comfortably sit inside our own heads – it’s only when these diagrams leave our heads, take shape in the real world, and outgrow the size of any one person’s head that the problems begin.

It takes a lot of discipline to resist complexity, to say “no” to new boxes and arrows. To say, “No, we won’t solve that problem, because that will just introduce 10 new problems that we haven’t imagined yet.” Or to say, “Let’s go with a much simpler design, even if it seems amateurish, because at least we can understand it.” Or to just say, “Let’s do less instead of more.”

Simplicity of design sounds great in theory, but it might not win you many plaudits from your peers. A complex design means more teams to manage more parts of the system, more for the engineers to do, more meetings and planning sessions, maybe some more patents to file. A simple design might make it seem like you’re not really doing your job. “That’s it? We’re done? We can clock out?” And when promotion season comes around, it might be easier to make a case for yourself with a dazzling new design than a boring, well-understood solution.

Ultimately, I think whether software follows the boom-and-bust model, or a more sustainable model, will depend on the economic pressures of the organization that is producing the software. A software company that values growth at all cost, like the Romans eagerly gobbling up more and more of Gaul, will likely fall into the “add-complexity-and-collapse” cycle. A software company with more modest aims, that has a stable customer base and doesn’t change much over time (does such a thing exist?) will be more like the humble tribe that follows the yearly migration of the antelope and focuses on sustainable, tried-and-true techniques. (Whether such companies will end up like the hapless Gauls, overrun by Caesar and his armies, is another question.)

Personally, I try to maintain a good sense of humor about this situation, and to avoid giving in to cynicism or despair. Software is fun to write, but it’s also very impermanent in the current industry. If the code you wrote 10 years ago is still in use, then you have a lot to crow about. If not, then hey, at least you’re in good company with the rest of us, who probably make up the majority of software developers. Just keep doing the best you can, and try to have a healthy degree of skepticism when some wild-eyed architect wheels out a big diagram with a lot of boxes and arrows.

Article link: https://nolanlawson.com/2022/06/09/the-collapse-of-complex-software/

Latest Cyberspace Solarium Commission 2.0 Report focuses on cyber workforce – CSO

Posted by timmreardon on 06/16/2022
Posted in: Uncategorized. Leave a comment

The June 2022 report offers recommendations to the private sector, U.S. Congress, and the federal government to build up the nation’s cybersecurity talent pool.

By Christopher Burgess

CSO

JUN 6, 2022 2:00 AM PT

The Cyberspace Solarium Commission 2.0 released its most recent report on June 02, 2022. This iteration re-affirmed the continued need for public-private partnership in cybersecurity, including the development of shared resources and increased investment in a cyber workforce. Additionally, the report included a plethora of recommendations for action by the U.S. national cyber director concerning education and development of the national cyber workforce, as well as expanding hiring authorities for cyber positions and establishing “special pay rates for the most in-demand roles.” The 43-page report included seven detailed recommendations for the national cyber director, U.S. Congress, and the private sector, which, if adopted, would enhance the recruitment, retention, and performance of the nation’s cyber workforce in both the public and private sectors.

The report’s review of the current state of affairs highlights what every CISO in both government and private entities knows: There is a talent shortage. The lack of talent, however, doesn’t always equate to less being accomplished. One may envision Lucille Ball and the chocolate confection conveyor belt as an accurate analogy, as over time more and more is expected.

The lack of personnel has and will continue to create a national security concern, “particularly when they occur in critical-infrastructure systems or supply chains upon which that infrastructure exists,” said the report.

For over a decade, forecasts of shortages and their impending impact have been the topic of many a story. In its report, the Commission notes that over 600,000 cybersecurity positions across all sectors, including government, remain empty. Not mincing words, the Commission notes, “the cybersecurity community is out of time.”

National cyber director cybersecurity recommendations

  • Establish a process for ongoing cyber workforce data collection and evaluation.
  • Establish leadership and coordination structures.
  • Review and align cyber workforce budgets.
  • Create a cyber workforce development strategy for the federal government.
  • Revamp cyber hiring authorities and pay flexibilities government-wide.

Congressional cybersecurity recommendations

  • Amend the Federal Cybersecurity Workforce Assessment Act of 2015.
  • Increase support for the CyberCorps: Scholarship for Service Program.
  • Provide incentives to develop entry-level employees into mid-career talent.
  • Strive for clarity in roles and responsibilities for cyber workforce development.
  • Exercise oversight of federal cyber workforce development in each department and agency.
  • Establish cyber excepted service authorities government-wide.
  • Expand appropriations for existing efforts in cyber workforce development.

Private Sector cybersecurity recommendations

  • Increase investment in the cyber workforce.
  • Develop shared resources.

CISO takeaways from the Solarium Commission report

The Commission’s observation that counting open billets is a flawed primary measure of understaffing is spot-on. CISOs would be well served to take on board the recommendation to measure actual need: identifying the optimal number of employees to conduct the tasks at hand. Doing so may reveal a delta between funded positions and needed positions, making underfunding itself a measurable shortcoming. Whether in government or the private sector, such a discussion may be contentious, as every organization fights internal battles for resources.

While my time in government was many moons ago, the feeling was always that, largely due to the long administrative tail and complicated procurement paths, the private sector was a generation or two ahead. CISOs may not have opportunities to participate directly in the intra-governmental working groups and committees, yet several national cyber workforce evolution opportunities are available, and CISOs are encouraged to participate.

The report highlights the general lack of diversity within the federal government’s cyber workforce, particularly at the leadership level, noting that “the average federal worker is more likely to be older, male, and possess a college degree relative to the rest of the U.S. labor force.” This characterization should not be taken as a signal that diversity within the private sector is where it should be, but rather as an observation that the U.S. government is trailing. There is much that can and should be done to keep diversifying the national workforce.

CISOs have benefited from the “pay gap” in the race for talent, as only limited parts of the government have the means to create pay flexibility to bring in needed talent. With the recommendation to change the status quo and bring pay for government cyber employees closer to that of the private sector, CISOs may wish to ensure that total compensation packages for their current and future employees are competitive. Working for the federal government will become more attractive, as “service to the nation” can close a narrower pay gap for many individuals.

The report also calls for congressional action to support the national cyber workforce. While many companies engage lobbyists to bring their corporate messages, wants, and desires to the legislative branch of the U.S. government, direct outreach from practitioners, the CISO, and their staff provides legislators with a ground-truth view as the lawmakers take on various actions designed to enhance, grow, and sustain the national cyber talent pool.

Article link: https://www-csoonline-com.cdn.ampproject.org/c/s/www.csoonline.com/article/3663014/latest-cyberspace-solarium-commission-2-0-report-focuses-on-cyber-workforce.amp.html

OMB and GSA Announce Technology Modernization Fund Will Designate $100 Million to Improve Customer Experiences with the Federal Government

Posted by timmreardon on 06/16/2022
Posted in: Uncategorized.


JUNE 16, 2022 | PRESS RELEASES

The funding will support innovative technology projects focused on reducing burdens on the American public and the federal workers who serve them.

Today, the Office of Management and Budget and the General Services Administration announced that $100 million of the Technology Modernization Fund (TMF)—an innovative investment program for Federal technology modernization projects—will be designated to help streamline and improve digital services to deliver a better customer experience to the American people. The $100 million is a portion of the $1 billion in American Rescue Plan (ARP) funding previously provided by Congress to help secure and modernize Government technology after the pandemic accelerated the need to improve online access to Government services for the public.  To date, the TMF has invested nearly $400 million of the $1 billion in ARP funding in 12 projects.

The TMF Board, the members of which provide funding recommendations and monitor progress and performance of approved modernization projects, will prioritize investing in projects that span across agencies and will cut down on frustrating wait times, duplicative paperwork, and bureaucratic barriers people too often face when interacting with their Government. The Board will also prioritize projects aimed at improving a wide range of essential Federal Government capabilities and systems.

All Federal agencies and High-Impact Service Providers (HISPs), which serve the largest percentage of people, conduct the greatest volume of transactions annually, and have an outsized impact on the lives of the individuals they serve, are eligible to apply for the designated funding. The effort will also help deliver on commitments outlined in both the President’s Executive Order on Transforming Federal Customer Experience and Service Delivery to Rebuild Trust in Government and the second priority of the President’s Management Agenda, focused on delivering excellent, equitable, and secure Federal services and customer experience.

“Federal service delivery has not kept pace with the needs and expectations of the public. The American people deserve a Government that puts people at the center of everything it does,” said Federal Chief Information Officer and TMF Board Chair Clare Martorana. “With this funding, we will deploy secure technology that reduces costs for agencies, eliminates burdens for the Federal workforce and those it serves, and powers services that meet the public’s expectations.”

“Government technology and websites can and must work better for the people and communities we serve,” said GSA Administrator Robin Carnahan. “Targeted TMF funding focused on making their lives easier when they need government services is a no brainer. It’s also a smart way to invest tax dollars to ensure the American people are getting the most for their money.”

Projects that are selected will be supported by customer research and data, cut across agencies and systems, address immediate security gaps, and improve the public’s ability to access and manage Government services. Moreover, the agencies proposing the projects chosen will have technology teams and systems that are capable of rapidly designing, prototyping, and deploying modern digital tools and services based on human-centered design. Projects will also be required to have measurable goals to ensure TMF investments are addressing real customer pain points and gaps in accessibility and equity.

Any Federal agencies that provide public-facing information, benefits, services, and programs can apply for this funding by Monday, August 1 for expedited consideration, or by September 30, on a rolling basis.

Since the President signed the customer experience Executive Order, HISPs have made important progress delivering on specific service delivery improvement commitments, and have launched five “life experience” teams to better integrate services for people at critical moments in their lives.

About the TMF: The Technology Modernization Fund is working to transform the way the Government uses technology to deliver for the American public in an equitable, secure, and user-centric way. The TMF invests in technology projects across Government, providing incremental funding, technical assistance, and oversight throughout execution to ensure the success of its investments.

The TMF has invested in projects ranging from providing a single secure login experience for Government websites, to digitizing temporary worker visa programs and modernizing systems that support crop inspection and certification. Since receiving the $1 billion in ARP funding, the TMF has invested in 12 projects that address urgent IT modernization challenges, bolster cybersecurity defenses, and respond to the COVID-19 pandemic. For more information, visit tmf.cio.gov.

Article link: https://www.whitehouse.gov/omb/briefing-room/2022/06/16/omb-and-gsa-announce-technology-modernization-fund-will-designate-100-million-to-improve-customer-experiences-with-the-federal-government/

DoD software deliveries are lagging behind industry standards – Breaking Defense

Posted by timmreardon on 06/15/2022
Posted in: Uncategorized.

By Jaspreet Gill on June 15, 2022 at 12:58 PM

WASHINGTON: The Pentagon is still struggling to deliver working software for its weapon systems in a timely manner, with programs lagging behind commercial standards that call for deliveries as frequently as every two weeks, according to a new watchdog report.

In its Weapon Systems Annual Assessment, released June 8, the Government Accountability Office surveyed 59 Major Defense Acquisition Programs (MDAPs) and Middle Tier Acquisition (MTA) programs and found the department needs to rapidly increase software delivery for a majority of those programs.

The report comes a few months after Deputy Defense Secretary Kathleen Hicks approved a new software modernization strategy outlining three key objectives the Pentagon wants to achieve, including turning its 29 software factories into one overarching, department-wide “ecosystem” to acquire and deliver software at speed.

The centralized software hub is anticipated to help streamline control points for end-to-end software delivery and speed innovation into warfighter hands. 

DoD programs aren’t delivering software frequently enough 

In its report, the GAO reviewed 39 MDAPs and MTA programs and found that the majority of those reporting use of a modern software development approach actually deliver working software for user feedback more slowly than industry’s “agile” practices recommend, which call for delivery of software as frequently as every two to six weeks. As a result, those programs may lose out on some of the benefits of using a modern approach.

Seventy-nine percent of MTA programs reported using modern approaches, compared to 60 percent of MDAPs. Some of the 39 programs reported delivering software to users as infrequently as once a year: 22 programs delivered software every 12 months or less, and only six had a software delivery timeframe of three months or less.

“However, software deliveries for user feedback at a frequency of six months to a year do not align with the Agile principle of delivering working software frequently and would not attain the benefits from fast iterative feedback cycles,” the report states. 

The Pentagon struggling with agile software deliveries isn’t new: Last year, the GAO’s annual report concluded that only six of the 42 major weapons programs reviewed actually met the industry-level standard of delivering software updates to users in a six-week timeframe.

RELATED: DoD ‘Agile’ Software Development Still Too Slow: GAO

A majority of the MDAP and MTA programs surveyed in this year’s report — 40 out of 59 — identified software development as a risk, with the largest contributing factor being initial software integration with hardware. Other reported risks included the initial planned software effort proving more difficult than expected, hardware design changes that required additional software development, and requirements changes.

Workforce challenges also plague the programs; the most commonly reported challenge is finding staff with the proper training and level of expertise needed to advance software development efforts.

As part of the GAO’s work on the department’s implementation of software acquisition reforms, it plans to examine DoD’s workforce issues. The report states that a 2020 RAND study presented options for DoD to track and manage its software acquisition workforce.

According to the GAO report, DoD initiated three software acquisition pilot programs in response to requirements set forth in the 2018 National Defense Authorization Act. To date, DoD has completed one of the pilot programs and is currently implementing another pilot program. 

DoD officials said they could not implement a third pilot on open source software due, in part, to the sensitivity of releasing weapon system software, according to the GAO. 

However, the GAO states DoD is continuing to mature its implementation of modern software development approaches. 

According to the report, as of February 2022 DoD is tracking 35 programs using the software acquisition pathway established in 2020. The programs include a “wide array of software intensive systems to include command and control, cybersecurity, business systems, training, and software embedded weapon programs,” the report states. 

DoD weapon system cybersecurity concerns

The GAO found that the department’s cybersecurity practices remain “generally consistent” with its prior assessment, which found that all 59 programs surveyed reported either having an approved cybersecurity strategy or planning to have one in the future.

The report also states that the number of programs reporting key requirements addressing cybersecurity increased this year: 39 out of 59 programs reported at least one key system attribute addressing cybersecurity, compared to 37 programs last year.

All DoD acquisition programs are required to execute cybersecurity testing and evaluation throughout the program’s life cycle, and GAO found that “this year, the percentages of programs that completed cybersecurity testing during developmental or operational testing changed since last year … Specifically, an increased percentage of programs this year reported conducting cooperative vulnerability and adversarial assessments during developmental testing, while a decreased percentage of programs reported conducting cooperative vulnerability and adversarial assessments during operational testing.”

However, the GAO did report the F-15EX program faces a cybersecurity vulnerability risk stemming from its design, “derived from FMS aircraft and, according to the program, not designed to U.S. Air Force cybersecurity requirements.”

“The program office plans to bring subject matter experts together in April 2022 to conduct a tabletop exercise in which they talk through how they would respond to simulated scenarios identifying vulnerabilities,” according to the report. “Subsequently, the program office plans to conduct other cybersecurity assessments, with results from the tabletop exercise determining the scope and dates of these additional assessments.”

Article link: https://breakingdefense-com.cdn.ampproject.org/c/s/breakingdefense.com/2022/06/dod-software-deliveries-are-lagging-behind-industry-standards/amp/

More Reality Checks Could Help Keep Defense Programs on Time and Budget, GAO Says – Government Executive

Posted by timmreardon on 06/13/2022
Posted in: Uncategorized.

The Pentagon has long espoused “knowledge-based acquisition,” but doesn’t insist on it.

PATRICK TUCKER | JUNE 10, 2022

Is the lack of reality checks making Defense Department weapons programs cost too much and take too long? A new report released on Wednesday from the Government Accountability Office suggests so. 

The report looked at how often and how well Pentagon officials who design and execute arms programs use knowledge-based acquisition processes—essentially, a set of steps to make sure that program managers’ expectations are in line with reality and that they aren’t fooling themselves (and potentially the whole Department) about how costly and complicated certain things will be to make. 

Those practices include using analysis to gauge the current state of technology before starting a program that relies on new or emerging technologies, identifying gaps between requirements and resources, and establishing key points in the process, called knowledge points, by which certain things should be known before continuing to the next phase.

GAO has previously found that using knowledge practices, such as completing a preliminary design review before launching a new project, can reduce cost growth by 36 percent and schedule growth by 31 percent. Indeed, the Pentagon itself has endorsed these practices for at least two decades.

The problem is that the Defense Department makes only sporadic use of them.

“We continue to find that many [major defense acquisition programs] missed opportunities at key acquisition milestones to make knowledge-based decisions that can lead to improved cost and schedule outcomes,” the report states, and “over half of 40 major defense acquisition programs did not implement key knowledge practices.” It’s one reason more than half of the programs GAO examined reported delays in reaching initial operational capability.

With statistics like those, the decision not to make knowledge-based decisions seems indefensible. Unfortunately, humans in management positions typically fall into the trap of underestimating how long and how costly projects will be, a propensity common even in high-functioning individuals that Nobel Prize-winning psychologist Daniel Kahneman describes in his 2011 book Thinking, Fast and Slow.

Like most GAO reports, this one includes a response from the Defense Department, but it does not address the office’s conclusions about the use of knowledge-based acquisition. Defense One has reached out to the Defense Department for comment. 

Building off previous work, the Government Accountability Office also “continued to find programs not fully implementing recommended cybersecurity practices, such as testing.”

Article link: https://www.defenseone.com/technology/2022/06/more-reality-checks-could-help-keep-dod-programs-time-and-budget-gao-says/367973/
