An electronic health record (EHR) is software used to securely document, store, retrieve, share, and analyze information about individual patient care. In effect, it is a digital version of a patient's health record. The Federal Electronic Health Record Modernization (FEHRM) office, along with other federal agencies, is implementing a single, common federal EHR to enhance patient care and provider effectiveness.
A federal EHR means that, as a provider, you can access patient information entered into the EHR by different doctors, pharmacies, labs, and other points of care throughout your patient's treatment. There is broad recognition that the federal EHR saves providers time and enables more standardized workflows that support enhanced clinical decision-making and patient safety.
This means you benefit:
Your patients spend less time repeating their health history, undergoing duplicative tests and managing printed health records.
You have access to patient data such as service treatment records, Service medals and honors, housing status and other information to ensure patients receive all their earned benefits as they transition to civilian life in a seamless, timely fashion.
You can start conversations more productively with your patients since you already have a more comprehensive picture of their health before their appointments.
You will make more informed decisions about your patient’s care as you have access to more relevant data.
You will have a more seamless care experience. Regardless of whether your patient gets care from the Department of Defense, Department of Veterans Affairs, Department of Homeland Security’s U.S. Coast Guard or another health care system participating in the joint health information exchange, you will be able to easily access and exchange patient health information to enhance quality of care and satisfaction.
WASHINGTON — Despite spending over $800 billion a year, the Defense Department chronically under-invests in cutting-edge technology and struggles to leverage the even greater investments being made by private-sector innovators in fields from AI and networks to space and biotech.
That’s the verdict of a new report by the Reagan Foundation and Institute, released Monday. But there’s good news: Both in the report and at a conference the foundation hosted Tuesday, experts from the Pentagon, Capitol Hill, and industry were full of ideas for improvements.
The central insight from the report and morning panels: For all the reforms of recent years, the US government as a whole — not just the Defense Department — is still an unappealing customer for the most innovative firms in the economy. Even a modest government investment in a nascent, promising technology at a critical, early phase could encourage matching or even greater investment from the private sector, but the government isn’t good at making those targeted investments. So Congress needs to provide more stable, flexible funding, freed from the restrictions and delays of a frequently gridlocked appropriations process — and bureaucrats need the courage to use the authorities Congress has already granted.
“What happens many times is those tools are there, [but] they don’t get used,” said Rep. Rob Wittman, R-Va., chairman of the tactical land and air forces subcommittee. “There are OTAs and MTAs — Other Transaction Authorities and Mid-Tier [Acquisition] — that aren’t used enough.”
“The problem is sometimes, too, Congress says, ‘whoa, whoa, wait a minute’… because we like to have the control,” Wittman continued. That forces Pentagon programs to slow down and wait for a budget cycle that takes a year to grind through Congress, on top of years of planning and infighting at the Pentagon.
“It’s the culture, it’s the incentives,” emphasized retired House Armed Services Committee chairman Mac Thornberry, who advised on the scorecard. “It’s not just the Pentagon. It’s Congress too.”
Structured like a report card, the Reagan report grades the government in 10 major areas and dozens of subcategories, all related to what it calls the “National Security Innovation Base” — a construct carefully crafted to include innovative companies, laboratories, and universities outside the traditional defense industry. The scorecard is actually somewhat sanguine about America’s overall technological leadership in the world, giving it an A- despite a surge of patents, PhDs, and tech investment in China. But it all goes downhill from there.
Along with a bunch of Bs and Cs, the Reagan scorecard gave the country as a whole a D+ when it comes to fostering top-flight technical talent — a complex issue tied to education and immigration policies like the H-1B visa. Still worse, it gave the US a blunt D in what it calls “customer clarity”: In essence, how well the government connects to industry, not just through issuing policy statements and publishing strategies, but, most importantly, by providing tangible funding that gives innovators an incentive to work on tech the military wants.
A section of the Reagan Institute’s “scorecard” (Reagan Institute graphic)
That D for being a lousy customer breaks down into three subcategories, the scorecard continues. The government gets a C+ for patchy communication of policy, hampered by many agencies’ slowness in issuing public strategies and Congress’s slowness to confirm key officials, leaving key leadership slots vacant. The government gets a C- for making only limited use of streamlined acquisition processes that bypass the sclerotic traditional system. And it gets an F — the only failing grade in the entire report card — for providing erratic and ill-focused funding, with Congress routinely passing appropriations weeks or months late and frequently plussing-up traditional programs instead of new technology.
“The big companies have this problem with this inconsistent demand signal from government,” making it hard to plan long-term investments, said Eric Fanning, former Army Secretary (and Acting Air Force Secretary) who’s now CEO of the Aerospace Industries Association. “The smaller companies have all sorts of other barriers,” especially with the administrative overhead necessary to meet government requirements. And, he said, innovators whose technology has wide commercial applications — like drones or artificial intelligence — can afford to give up on government sales if the bureaucracy proves too difficult a customer.
Being a good customer is crucial, agreed Sreenivas Ramaswamy, a senior advisor at the Commerce Department. “I’ve been in this role for two years,” he said, “and one of things I’ve seen … is how much people underestimate the role of the Defense Department in issuing the demand signal.”
A Government Solution
So how to fix this? The government needs to make “bigger, bolder, more flexible bets” on emerging, high-potential technologies, the report recommends, and it needs to “foster stronger relationships” with the private-sector investors that drive most innovation in the modern economy, using government funding as seed money to draw larger private investments.
This does not require massive government purchases to build up entire industries, Ramaswamy emphasized. Instead, he argued, the government can help kickstart new sectors by buying a relatively small amount of new technology early on, when the price is still too high for most commercial applications.
But encouraging and exploiting private-sector investment is not something the government usually thinks about. “It wasn’t until I became chairman that I became focused on outside investment and talked to investors,” said Thornberry, “[because] we don’t vote on that.”
“Government… as a whole does not understand the private capital that is leveraged by the Department of Defense,” said Fanning. A modest Pentagon purchase can incentivize the private sector to make its own investments in the hope of landing future sales — if those investors feel they can rely on that future government funding to materialize.
So what mechanisms would work for this kind of funding? Congress and the Pentagon have enacted significant reforms, like Mid-Tier Acquisition — although there’s been recent pushback about MTA on Capitol Hill — and the Software Pathway. But for both fiscal and cultural reasons, the experts said at today’s conference, it’s been a struggle to scale these up to have a major impact.
“The government is not an easy customer to move from prototype … to production,” said Fanning, “especially if you’ve developed it yourself” outside the government’s R&D pipelines.
“Starting prototype programs has never been easier. This is a good thing,” said assistant Army secretary Douglas Bush, the service’s chief acquisition officer. “[But] it’s still difficult to just carve out the money to take those prototypes and put them into production because again, when you get to that stage, you’re now competing with everything else, fighting for those limited production dollars… It’s a shark tank.”
“Congress often does some of the hard work of pushing innovations and innovative solutions on the Department,” Bush added. “That’s good, [but] we need to do a better job on that on our own.”
“There’s no question, there is a cultural challenge [about] being able to take a risk,” Thornberry said. But often, he said, Congress reinforces bureaucratic over-caution by hammering officials who tried something new that didn’t work out, or by imposing strict controls up front.
“We need to do our funding a little differently,” he said, “so that there is a pool of money for a particular purpose” — for example, artificial intelligence — that isn’t tied to a particular program, so Pentagon officials can rapidly respond to emerging opportunities in fast-paced tech sectors.
That way, “you can move fast [without] asking ‘Mother may I?’ two years in advance,” Thornberry said. “There’s got to be parameters around it, but I’d really like to give Doug [Bush] some pots of money for a purpose and then let him make the adjustments as the technology is developed.”
“That would be amazing,” Bush said. The audience burst into laughter.
As our industry continues to move toward more people-focused care, we’re driven to improve patient engagement and enhance outcomes to create a better healthcare experience for everyone. By making it more convenient for people to access their health data wherever and whenever they need it, we can help empower individuals and clinicians to make better health and well-being decisions.
“We imagine a world where individuals are empowered and have their healthcare data at their fingertips,” says Rob Helton, vice president of platform development, Oracle Health. “They have ownership of their data and are able to review and add to it. They can choose to share it, and who it gets shared to. We believe by putting patients in control of their data, they’ll become active participants in managing their health.”
The US federal government is continuing its push to place patients at the center of their own care, including ensuring patients have access to all of their electronic health information (EHI). The 21st Century Cures Act Information Blocking Rule, among other things, ensures patients have electronic access to their EHI in the manner the patient chooses, a right originating in HIPAA’s individual right of access.
Additionally, the federal government is setting minimum standards for health IT (HIT) that chooses to participate in the HIT certification program under the Office of the National Coordinator for HIT. While most healthcare organizations are striving to comply, some struggle to stay on top of the extra work needed to reach the full potential of interoperability. Luckily, technology can help streamline the process, reducing the regulatory burden on providers and making it easier for people to securely access their data.
Automating patient health data access
Historically, for a patient to use a health app of their choosing, a health system would have to review and approve the health app beforehand. However, with Oracle Health’s newly released functionality, called consumer automatic provision, this process is streamlined, giving patients easier access to their health data.
Consumer automatic provision allows consumer-facing health apps to be immediately provisioned for FHIR API endpoints upon registration. From there, health data stored within the electronic health record (EHR) system can be retrieved by the app upon explicit authorization from the patient or their authorized representative. Patients maintain full control over authorizing an app to access any of their EHI or only certain data types.
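Consumer-facing FHIR apps typically obtain that explicit patient authorization through an OAuth-style flow such as SMART App Launch, in which the app redirects the patient to the EHR's authorization server with the data scopes it is requesting. Here is a minimal sketch of building such an authorization URL; all endpoint URLs and the client ID are hypothetical placeholders, not Oracle Health specifics:

```python
from urllib.parse import urlencode

def build_authorize_url(auth_endpoint, client_id, redirect_uri, scopes, state):
    """Build a SMART-on-FHIR-style authorization URL.

    The patient is sent to this URL to log in and explicitly approve
    (or refuse) the app's access to each requested data category.
    """
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),
        "state": state,
        "aud": "https://fhir.example.org/r4",  # hypothetical FHIR base URL
    }
    return f"{auth_endpoint}?{urlencode(params)}"

url = build_authorize_url(
    auth_endpoint="https://auth.example.org/authorize",  # hypothetical
    client_id="my-consumer-app",
    redirect_uri="https://app.example.org/callback",
    # Patient-scoped SMART scopes: the patient may grant all or only some
    scopes=["openid", "launch/patient", "patient/Observation.read",
            "patient/MedicationRequest.read"],
    state="xyz123",
)
print(url)
```

After the patient approves, the app exchanges the returned authorization code for an access token scoped only to the data categories the patient actually granted.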
Advancing meaningful health insights and maintaining regulatory compliance
When people can use apps on their smartphones to measure and record critical information like physical activity, weight, sleep, heartbeat, and blood sugar level or track their medicine to ensure they don’t miss a dose, they can make better decisions about their health.
The bottom line of these regulations is that patients have the right to access their information electronically for any purpose they choose, and health IT developers and healthcare providers should facilitate that right. Automatic provisioning of successfully registered consumer apps for your FHIR API endpoint helps patients access health information through any app of their choice in a timely fashion, without any additional action by your organization.
Protecting patient health data
Protections are built into the FHIR APIs to help keep health data private and secure, but there are potential risks involved with authorizing the disclosure of EHI to a third-party app. While these risks are largely mitigated by protections built into the API connections and access, we all have a responsibility to help educate patients on how to keep their data safe. Although the constraints and protections that Oracle Health can put in place under the 21st Century Cures Act regulations are limited, here are the actions we’re taking to further protect consumer privacy and make the health app ecosystem more reliable and secure for consumers.
Employing industry-standard privacy and security protections as a built-in feature of the APIs
Developing enhancements to provide consumers with details of an app’s posture in relation to certain security-related best practices as part of the authorization user interface
Instructing developers connected with Oracle Health APIs to align with industry best practices for the safe and ethical handling of EHI communicated by the CARIN Alliance
Implementing patient-directed scope selection and redaction in the consumer app authorization workflow
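The last item, patient-directed scope selection, amounts to granting only the intersection of what the app requested and what the patient approved on the authorization screen. A minimal sketch (scope names are illustrative, not an Oracle Health API):

```python
def redact_scopes(requested, patient_approved):
    """Grant only scopes both requested by the app and approved by the
    patient, preserving the order in which the app requested them."""
    approved = set(patient_approved)
    return [scope for scope in requested if scope in approved]

requested = [
    "patient/Observation.read",
    "patient/Immunization.read",
    "patient/MedicationRequest.read",
]
# The patient declined to share immunization records on the consent screen
granted = redact_scopes(requested, {"patient/Observation.read",
                                    "patient/MedicationRequest.read"})
print(granted)
```

The access token the app ultimately receives carries only the granted scopes, so the redaction is enforced server-side on every subsequent API call rather than trusted to the app.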
The CARIN Alliance provides useful tips for consumers including:
Only download from trusted app stores
Always review app privacy terms and conditions
Know the app developer
Find out if the app is sharing or selling your data
Streamlining patients’ ability to access personal health data through an app of their choice is an important part of prioritizing the human experience in healthcare. Consumer automatic provision is one step of several that we’re taking to help patients control their own health and care, enhance clinical decision-making, and improve regulatory compliance for greater interoperability.
The official cautioned that “modernization would require VistA be rewritten almost from scratch.”
The Department of Veterans Affairs must continue to maintain its legacy health information system as it moves forward with its multi-billion dollar electronic health record modernization project, even though the decades-old software is too outdated to serve as a viable long-term solution, a VA official said during a House Veterans’ Affairs subcommittee hearing on Tuesday.
Daniel McCune, VA’s executive director of software product management, told members of the committee’s Technology Modernization Subcommittee that the Veterans Health Information Systems and Technology Architecture—or VistA—“has served VA and veterans for over 40 years, and we are aware of its limitations.” But he added that, as the agency continues to slowly deploy the new Oracle Cerner EHR system, “VistA remains our authoritative source of veteran data.”
Modernizing the EHR system has been a priority for VA for some time, albeit one with few tangible results until now. Prior to awarding Cerner the contract for the new EHR system—known as Millennium—in 2018, VA spent years attempting to modernize VistA. During a House Veterans’ Affairs Committee hearing on Feb. 28, U.S. Comptroller General Gene Dodaro told lawmakers that VA has spent “over $1.7 billion dollars for failed predecessor electronic healthcare record systems that either failed, or did not come to fruition.”
Rep. Matt Rosendale, R-Mont.—the chairman of the subcommittee—said VistA “must be maintained,” and called for the VA to identify “key areas of VistA that need to be modernized and are feasible to undertake.” Rosendale previously introduced legislation in January that would terminate the VA’s transition to the Oracle Cerner system if the agency “cannot demonstrate significant improvement” in its deployment.
“The reality is, regardless of whether the Oracle Cerner implementation can be accomplished, and regardless of how we feel about that, the VA will probably continue to rely on VistA for at least another decade, and some of the elements of VistA will probably never go away because no replacement even exists,” Rosendale added.
Rep. Sheila Cherfilus-McCormick, D-Fla.—the panel’s ranking member—also agreed that maintaining VistA “is imperative” for the VA, noting that “there are functions of VistA that are not related to the EHR and would likely exist long after the EHR is replaced.”
But Cherfilus-McCormick expressed concerns about the current state of VistA, including cybersecurity risks, data management issues and a code-base “considered to be obsolete by many.”
“I’m not here to say that the Oracle Cerner approach in EHRM is going well, but I’m not sure returning to VistA is correct either,” Cherfilus-McCormick added.
McCune agreed that maintaining VistA during the deployment of the new Oracle Cerner EHR system was essential, noting that VA has already been working to enhance the legacy system by “standardizing VistA code,” implementing an API gateway and transitioning portions of the system to a cloud environment. He noted that “20 instances of VistA have been moved to cloud, with an additional 54 planned this year.”
But McCune told lawmakers that VistA is “an old technology ill-suited for the modern digital age,” and added that “modernization would require VistA be rewritten almost from scratch, at a great cost and great risk.”
McCune highlighted the fact that VistA is written in an old programming language called Mumps, which he noted “is not taught in computer science classes.” While he said VA had been able to retain Mumps programmers “much longer than a typical workforce,” he added that “approximately 70% of our Mumps programmers today are retirement eligible, and we have few options to hire or contract additional ones.”
The rollout of the new Oracle Cerner EHR system, however, has been fraught with technical issues, patient safety concerns, delays and cost overruns since it first went live at the Mann-Grandstaff VA Medical Center in Spokane, Washington in 2020. A report released in July 2022 by the VA’s Office of Inspector General found that the new software deployed at the Spokane facility inadvertently routed more than 11,000 orders for clinical services to an “unknown queue” without alerting clinicians, resulting in “multiple events of patient harm.”
VA announced in October that it was pausing further rollouts of the Oracle Cerner software until June 2023 to give the agency time to “fully assess performance and address every concern” with the system’s deployment. The agency announced last month that it was also pushing back the planned deployment of the Oracle Cerner EHR system at the VA Ann Arbor Healthcare System until later this year or in 2024. The go-live for Ann Arbor was originally scheduled to happen in June of this year.
Ongoing issues with the new software’s deployment have only increased lawmakers’ worries about the viability and usability of the new system. Rosendale, in particular, expressed concern during the hearing about reported disparities regarding patient safety when it came to medical facilities continuing to use VistA, and those that have transitioned to the Oracle Cerner EHR system.
Rosendale cited VA’s patient safety reports related to its EHR systems and noted that—for the 166 medical facilities still using VistA—the agency received 12,644 reports in 2020; 14,637 in 2021; and 9,211 in 2022. For the two years after Cerner went live in Spokane, Rosendale said VA received a combined 1,033 patient safety reports at that facility alone.
“That’s over 500 reports per year from one hospital using Cerner,” he said. “One hospital, compared to an average of 55 reports annually from the VistA hospitals.”
Rosendale said the data showed that, as a result of the Oracle Cerner EHR system’s beleaguered rollout, Mann-Grandstaff has “become the most dangerous VA hospital in the country.”
COMMENTARY | Many agency chief data officers would like to see the creation and appointment of a federal CDO.
Here’s the essential irony behind data in the federal government: Agencies have captured and cataloged plenty of it, but they still aren’t doing nearly enough with it to maximize its value.
The current administration is attempting to fix this: In its Federal Data Strategy 2021 Action Plan, the White House calls for agencies to pursue a “robust [and] integrated approach to managing and using data” while still protecting security, privacy and confidentiality. Its ten-year vision includes goals such as the initiation of data governance, planning and infrastructure; the optimization of self-service analytics capabilities; and the execution of proactive, evidence-based decisions.
Then, late last year, the Data Foundation released its annual survey of federal chief data officers to get a sense of agencies’ responses to the plan. Findings revealed that less than one-quarter of CDOs are making notable progress in implementing it.
Another intriguing finding which stood out was that three of five survey participants would like to see the creation and appointment of a federal CDO. Such leadership could lend needed vision and support to help them achieve key components of success, which include the development of data strategy and governance (as cited by two-thirds of the CDOs); the facilitation of data-driven decision-making at all levels (nearly one-half); and the enabling of data-sharing (also nearly one-half).
This leads to a central question: If the White House established a federal CDO position, what should be the top priorities on his or her “to do” list? I’d recommend the following:
Encourage agency CDOs to “think different”
To borrow from the iconic Apple ad slogan, a federal CDO could start by encouraging agency-level CDOs to shift their focus. Too many spend most of their time attempting to catalog their data, and then set policies for it. True, those CDOs need to play a leadership role in governance. But they should also drive teams to leverage data for innovation, by helping them discover how to securely share it and maximize its value. Unfortunately, this isn’t happening nearly enough.
Much of the problem is rooted in fixed mindsets and organizational culture. Data owners build a “hoarding mentality” about data—they want to gather lots of it to create impressive inventories and then lord over this digital fiefdom. But that’s as far as it goes. Agency CDOs do not take the critical next step of making data accessible and shareable. A federal CDO must champion the idea that cataloging data is really just a starting point, and hardly the endgame.
Direct agencies to liberate users with free-flowing—but secure—access to data
Agency CDOs often resist opening up data because they fear the consequences. They worry about all of the “what ifs,” e.g., “what if ‘top secret’ information trickles down to ‘secret?’” This freezes any forward progress. They don’t want to overshare and invite potential risks.
Much of the problem is rooted in the continued dependence on siloed legacy systems that require endless paperwork and labor-intensive oversight. Then, there are the highly antiquated security approaches that are still in play. To successfully protect massive amounts of data while still making it readily accessible and shareable, CDOs need to adopt an attribute-based access control model.
Simply defined, ABAC makes dynamic, real-time data access decisions and, coupled with tools such as data-use agreements and data-activity logging, ensures authorized users can proceed without security-based bottlenecks or friction while avoiding new risks for their agencies. ABAC bases each decision on user and data attributes, determining who should have access to sensitive data and whether they are seeking it for the right reasons, and it grants access to sensitive data, applications and databases dynamically, at the moment of the request.
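The core of an ABAC decision is a policy evaluated over attributes of the user and the data at request time, with every decision logged for later audit. A toy sketch under simplified assumptions (the attribute names, clearance levels, and purpose strings here are illustrative, not any product's schema):

```python
from dataclasses import dataclass

@dataclass
class Request:
    user: dict      # user attributes: clearance, purpose of access, ...
    resource: dict  # data attributes: classification, permitted purposes, ...

# Illustrative ordering of classification levels
CLEARANCE_RANK = {"public": 0, "secret": 1, "top secret": 2}

def abac_decide(req: Request, audit_log: list) -> bool:
    """Allow access only if the user's clearance dominates the data's
    classification AND the stated purpose is one the data owner permits.
    Every decision, allow or deny, is appended to the activity log."""
    allowed = (
        CLEARANCE_RANK[req.user["clearance"]]
            >= CLEARANCE_RANK[req.resource["classification"]]
        and req.user["purpose"] in req.resource["permitted_purposes"]
    )
    audit_log.append((req.user["id"], req.resource["id"], allowed))
    return allowed

log = []
req = Request(
    user={"id": "analyst-7", "clearance": "secret", "purpose": "fraud-review"},
    resource={"id": "ds-42", "classification": "secret",
              "permitted_purposes": {"fraud-review", "audit"}},
)
print(abac_decide(req, log))
```

Because the policy is evaluated per request rather than baked into static role assignments, changing who may see a data set means changing an attribute or a rule, not re-provisioning accounts.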
Issue report cards that grade real progress
Whenever the government establishes a new “fed czar,” we see lots of lofty proclamations and visionary statements. But we don’t see enough follow-through that results in actual, meaningful progress. A federal CDO needs to be empowered enough to inspire change.
Fully funded programs aimed at the improved securing, sharing and leveraging of data would help, obviously. And the federal CDO could make agency counterparts more accountable by developing metrics which measure how effectively they are deploying data sets, enabling end users and so on, and then by issuing regular report cards that publicly grade agencies on these key capabilities.
Certainly, the appointment of a federal CDO could generate significant value. But it might end up as yet another lost opportunity. A federal CDO who is genuinely empowered should lead a charge that goes far beyond governance and proclamations, toward genuine innovation. Doing so requires enforcing security controls so people can fully utilize and share data sets. Without this, data is worthless. But with it, the federal CDO would truly drive agencies to not only “think different,” but do different.
Chris Brown is the chief technology officer for Immuta.
As private sector companies and academia shoot for the moon when it comes to exploring the potential of artificial intelligence, in the federal government, discussions are more down to earth.
The heights to which AI technology has ventured are both awesome and scary, panelists said at a conference last month hosted by the Advanced Technology Academic Research Center, which focuses on technology and government. Take recent headlines about ChatGPT, Microsoft’s uncanny chatbot, and Stanford’s “Woebot” robo-therapist, for example. In the government, AI is a nascent novelty, though agencies and the military say they already see use cases within reach.
“What I’m most excited about is not necessarily the sexiest AI technologies … things that are showing up in science magazines,” said Greg Singleton, chief artificial intelligence officer at the U.S. Department of Health and Human Services. “What I’m really excited about for the department is seeing where we can apply these technologies to the general business that we do.”
That might mean holding off on the self-driving cars and robot secretaries for now and using AI instead to pick the low-hanging fruit of automating burdensome tasks that have grown for agencies while their workforces have not.
HHS employs 80,000 people and saw a nearly 30% increase in Medicaid/CHIP enrollment right before the pandemic.
“Our workforce is dedicated, they want to do a good job, but you have challenges when people are overloaded by processes, overloaded by volume, overloaded by batch transactional communications,” Singleton said at the ATARC conference.
The department has identified dozens of scenarios that could benefit from AI, including an automatic tool for finding chemical names in biomedical literature, a virtual assistant for finding grants, and a bot to ensure disability accommodations follow an employee who is being reassigned or promoted.
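A production version of something like the chemical-name finder would likely use a trained named-entity recognition model, but the concept can be illustrated with a crude heuristic. Everything below — the suffix list, the length cutoff, the sample sentence — is a hypothetical simplification, not HHS's actual tool.

```python
import re

# Crude illustration: flag words whose endings are common in chemical
# nomenclature. A real biomedical-literature tool would use a trained
# NER model, not a suffix heuristic.
CHEM_SUFFIXES = ("ol", "ine", "ide", "ase", "ate")

def find_chemical_candidates(text: str) -> list[str]:
    """Return words ending in a chemistry-style suffix (toy heuristic)."""
    words = re.findall(r"[A-Za-z]+", text)
    return [w for w in words if w.lower().endswith(CHEM_SUFFIXES) and len(w) > 4]

sentence = "Treatment with ethanol and chloride salts altered kinase activity."
print(find_chemical_candidates(sentence))  # ['ethanol', 'chloride', 'kinase']
```

Even this toy shows the shape of the automation win: a task a human would do by skimming thousands of papers becomes a filter a machine applies in milliseconds, with humans reviewing only the candidates.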
Other agencies face similar problems of growing workloads and shrinking manpower. The Social Security Administration’s workforce is at a 25-year low while the number of Americans receiving benefits has grown 20% since 2010. At the Federal Deposit Insurance Corporation, 38% of the workforce will be eligible to retire by 2027.
These examples present opportunities for AI to render simple, clerical tasks more efficient through existing technology that is relatively quick and easy to scale, panelists said.
Failure is not an option
In October, President Joe Biden signed a bill requiring a regularly updated, government-led AI training program for the acquisition workforce. The government is also buying AI technology experimentally, favoring research-based deals over hardware or software contracts, an approach that is more mindful of risk, according to analysis by the Brookings Institution.
“The federal government is not a Silicon Valley start-up; it does not look kindly on failure, no matter the potential future payoff,” wrote Corin Stone, a former Office of the Director of National Intelligence senior official, in an article for Just Security. “The government is understandably hesitant to take risk, especially when it comes to national security activities and taxpayer dollars.”
Panelists noted that appetites for AI differ across agencies, as do budgets and resources. Congress has said that regardless, the government overall needs to play catch-up on AI, with Sen. Joe Manchin saying in a hearing last year that it would be “disastrous” if the U.S. failed to be ready for China and Russia’s scaling up of cyber warfare.
“It’s not just about rudimentary tasks,” said Bonnie Evangelista, who works in the Office of the Chief Digital and Artificial Intelligence Officer at the U.S. Department of Defense. “Even from a government perspective, I think there are some of us who are trying to go big and go bold, but it can be hard from a workforce perspective. It’s hard to scale because of culture.”
AI may also draw concerns from those who have misgivings about its ethical use and from union representatives who may want to fully understand repercussions for the workforce, including any possible reduction of work hours.
AI can “outperform workers in an increasing set of complex tasks mainly done by educated workers,” a report for the White House’s Council of Economic Advisers found, though technology has also been recognized as a source of new kinds of jobs. Panelists said AI has the potential to free up employees’ workloads so they can devote time to creative or innovative projects and “upskill,” or get up to speed on new tech.
Evangelista also said agencies are building out new teams and positions devoted to exploring AI. Last April, the Department of Defense hired Craig Martell to lead the Chief Digital and Artificial Intelligence Office, which was created just two months earlier.
Meikle Paschal, a program manager for robotic process automation at the U.S. Department of Homeland Security, said the easiest way to standardize a process is to automate it, and that solving problems this way is an alternative to hiring more people to transcribe or push paper.
“This is like a generational gift that we’re going to give to the people who come after us in the workforce,” he said. “And if we hold back at this point because we’re scared to make a mistake, then the next three or four generations are going to be behind the ball.”
Molly Weisner is a staff reporter for Federal Times where she covers labor, policy and contracting pertaining to the government workforce. She made previous stops at USA Today and McClatchy as a digital producer, and worked at The New York Times as a copy editor. Molly majored in journalism at the University of North Carolina at Chapel Hill.
Webinar to facilitate expert technical exchange of ideas on practical uses for noisy quantum systems
OUTREACH@DARPA.MIL 3/7/2023
Although fault-tolerant quantum computers are projected to be years to decades away, processors made from tens to hundreds of quantum bits have made significant progress in recent years, especially when working in tandem with a classical computer. These hybrid quantum/classical systems could enable technical disruption soon by superseding the best classical-only supercomputers in solving difficult optimization challenges and related problems of interest to defense, security, and industry.
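The hybrid loop described above can be sketched in miniature. This is a toy: the “quantum processor” is replaced by a closed-form simulation of a single qubit, and the variational setup (an RY rotation whose Z-basis expectation is cos θ, optimized via the parameter-shift rule) is a standard textbook example, not any specific DARPA or ONISQ workload.

```python
import numpy as np

# Toy hybrid quantum/classical loop. The "quantum" half is simulated:
# one qubit rotated by RY(theta) and measured in the Z basis has
# expectation value <Z> = cos(theta). The classical half is a gradient
# descent loop that tunes theta to minimize <Z>.

def expectation_z(theta: float) -> float:
    """Stand-in for running the circuit on hardware and averaging shots."""
    return np.cos(theta)

def parameter_shift_grad(theta: float) -> float:
    """Gradient estimated from circuit evaluations only (hardware-friendly)."""
    s = np.pi / 2
    return (expectation_z(theta + s) - expectation_z(theta - s)) / 2

theta, lr = 0.5, 0.4
for _ in range(100):                   # classical outer loop
    theta -= lr * parameter_shift_grad(theta)

print(round(expectation_z(theta), 3))  # approaches -1.0, the minimum of <Z>
```

The division of labor is the point: the quantum device only evaluates the circuit at requested parameter values, while a conventional computer runs the optimizer, which is why such loops are feasible on today’s noisy intermediate-scale hardware.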
DARPA is sponsoring a live webinar on Tuesday, April 11, 2023, to highlight an Advanced Research Concept (ARC) topic called Imagining Practical Applications for a Quantum Tomorrow (IMPAQT). Registrants will have the opportunity to hear from government experts, university professors, and industry-leading quantum hardware providers as well as participate in live question-and-answer sessions.
“We’re billing the webinar as a help day for quantum algorithmists,” said DARPA Innovation Fellow Alex Place, who is leading the event. “Building on successes of DARPA’s ONISQ (Optimization with Noisy Intermediate-Scale Quantum devices) program, the webinar’s goal is to spark innovative ideas and discuss new concepts for making near-term intermediate scale quantum computers, as well as sought-after fault tolerant processors, practical and useful for solving real problems. We’re encouraging teams from academia and industry who have expertise in quantum algorithms or a practical problem that could be mapped to a quantum processor to engage with IMPAQT.”
IMPAQT is the first of many anticipated DARPA ARC topics. The ARC initiative is designed to speed the pace of innovation by rapidly exploring and analyzing a high volume of promising new ideas. For more information about ARC, to view the open IMPAQT solicitation, and to see new topics as they become available, visit www.darpa.mil/arc. The ARC topics are managed by DARPA’s innovation fellows, who include recent Ph.D. graduates (within five years of receiving a doctorate) and active-duty military with STEM degrees. To learn more about the DARPA Innovation Fellowship, current fellows, and how you can apply to become a fellow, visit www.darpa.mil/innovationfellowship.
Image caption: A superconducting quantum bit (qubit) ready to be cooled to -273.14°C.
Media with inquiries should contact DARPA Public Affairs at outreach@darpa.mil