Two instances of the homegrown electronic health record have been moved to the cloud as part of a pilot project, but it’s not clear if funding or support exists for more cloud migration as VA transitions to a commercial electronic health record.
Kurt DelBene, the new chief information officer at the Department of Veterans Affairs, said he planned to continue support of the agency’s homegrown electronic health records system during the transition to a commercial system scheduled to be completed in 2028.
The modernization project, launched in 2017 with a planned $16 billion budget, has its own section of the VA’s budget outside the Office of Information and Technology, and its leader reports to the agency’s deputy secretary, Donald Remy. But OI&T is deeply enmeshed in the shift to the same Cerner commercial electronic health record system, providing support for infrastructure development and readiness.
At the same time, OI&T is responsible for keeping the Veterans Health Information Systems and Technology Architecture (Vista) operational. For now, most patient care is handled through Vista; the Cerner product is up and running at the Mann-Grandstaff VA Medical Center in Spokane, Wash., and a few more deployments are scheduled in the near term.
“We’re going to be using Vista for a period of time, and during that we can’t falter on delivering great care to veterans,” DelBene said on a Feb. 17 call with reporters. “So I think of Vista as a system that we absolutely have to continue to modernize. The health care landscape changes, and Vista needs to change and continue with the great support as it has had before.”
Todd Simpson, VA’s deputy assistant secretary of DevSecOps, said on the same call that two of the 130 separate instances of Vista have been put into VA’s Amazon Web Services cloud as part of a pilot. In addition, all the backup data for Vista has been moved into the cloud.
Simpson said it was possible that more instances of Vista could be moved to the cloud. “It comes down to funding and overall strategic direction driven from [Electronic Health Record Modernization] leadership and, of course, the customer,” meaning the Veterans Health Administration.
Funding is a big issue for Vista, in part because its costs are difficult to measure. The homegrown application now has 130 separate instances in use at different VA facilities.
Because of the way the legacy system has sprawled into multiple systems, “VA lacks a comprehensive definition of Vista and cannot accurately report Vista costs,” the agency stated in its 2022 budget request.
Additionally, DelBene said plans to sunset Vista will have to wait for the EHRM project to advance. “Part of that is further out,” DelBene said. “My major commitment is to continue to support necessary changes. Then we’ll figure out what transition looks like.”
An influx of money from the CARES Act helped the Department of the Navy (DON) make major strides in modernizing its networks. Now, leaders say, it’s time to focus on bolstering those networks’ cybersecurity, including with a major pivot from compliance-driven approaches to a new philosophy called “Cyber Ready,” which focuses instead on continuous monitoring and ongoing risk assessments.
The idea of keeping constant tabs on a system’s cyber health certainly isn’t new — indeed, it’s central to the National Institute of Standards and Technology Risk Management Framework DoD already uses to authorize systems on its networks, as a theoretical matter anyway.
But Navy IT leaders think they’ve hit on a new framing that will help with the cultural changes needed to actually, finally, move away from checklist-based, single-point-in-time security approvals.
Aaron Weis, the Navy Department’s CIO, said describing the problem in terms of “readiness,” akin to the way the military measures its service members’ and weapons systems’ ability to execute missions on a day-to-day basis, has found a good deal of resonance in the Navy and Marine Corps.
“The whole approach is that we measure readiness in a very holistic way, and it’s something that you dynamically manage,” he said this week at the Navy Department’s annual IT conference in San Diego. “It’s taking the way that we measure ourselves in a very complete way and bringing that to the topic of cybersecurity, and my sense is we’ve really hit on a message that resonates — an approach that can bear fruit — and we’ve gotten really favorable engagement on this idea.”
Exactly what “Cyber Ready” looks like in practice is still to be determined. The concept has only been under discussion since October, and the DON plans to test it with a handful of pilot projects in the coming year before drawing any broad policy conclusions.
But Weis said a key objective is to move away from the current practice of granting systems an Authority to Operate (ATO) once every three years. That approach, he said, incentivizes “bad behavior.”
“One of the things that we’re saying as part of Cyber Ready is that the idea of a three-year ATO is wrongheaded — you fill out a giant spreadsheet and do 10,000 pushups and then you get an ATO that’s good for three years,” he said. “People get that ATO and they go, ‘Yeah, job done.’ And then what happens? Over the next three years, that system hasn’t evolved or been updated. It’s no longer secured, and it ends up as a high-risk escalation that ends up on my desk. We need to get to this idea that you’re always earning your ATO. There are a lot of snazzy phrases for that, like continuous ATO, but the idea is you’re always earning and re-earning your ATO every day.”
And Navy IT leaders think their technology developers are hungry for changes along these lines.
Jane Rathbun, the deputy assistant secretary of the Navy for information warfare and enterprise services, said the acquisition community is often frustrated by the process of building new systems and only turning them over to cybersecurity officials for review and approval once they’re ready to deploy.
She said developers would actually prefer to have the CISO community embedded in their programs from the start.
“They also see a need for common platforms so that they can abstract the actual capability from the IT infrastructure, so that that can be prepared and ATO’d — and not repeated every time we want to add real capability to the Navy and to the Marine Corps,” she said. “They want inheritance. If we’ve gotten Marine Corps approval on some platform or solution that we now want to use in the Navy, we’re all one big happy family. We don’t need to be doing things twice … They do agree that they must comply with cybersecurity, and they want to do that. That’s never been the problem, it’s just about how we get there.”
Cyber Ready also dovetails with an effort the Navy has had underway for the past two years to streamline its implementation of the Risk Management Framework. Over that period, the Navy has reduced the number of security controls it considers “critical” from about 600 to just 72.
And Rear Adm. Susan BryerJoyner, the Navy’s chief information security officer, said her service is looking for ways to automate security checks for many of those remaining controls.
“We also have a commercial solutions opening for the tools that allow us to automate the workflows associated with the compliance checks, because there will still be compliance checks,” she said. “This is not the ticket to say nobody ever has to scan or patch again. This is the ticket that says we’re going to figure out a better way of understanding how well you’re scanning and patching. And on a near-real-time basis, we’re going to identify the vulnerabilities the Fleet Cyber Command commander needs to know about to order a network maneuver when we have unacceptable risk on the network.”
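The kind of automated, near-real-time compliance workflow BryerJoyner describes can be sketched in a few lines. This is a minimal, hypothetical illustration; the field names, thresholds, and host labels are assumptions for the example, not any Navy schema or tool:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical scan record; fields are illustrative only.
@dataclass
class ScanResult:
    host: str
    last_patched: datetime
    cvss_score: float  # severity of the worst open finding

def flag_unacceptable_risk(scans, max_patch_age_days=30, max_cvss=7.0, now=None):
    """Return hosts whose patch age or finding severity exceeds the risk thresholds."""
    now = now or datetime.utcnow()
    flagged = []
    for s in scans:
        stale = (now - s.last_patched) > timedelta(days=max_patch_age_days)
        severe = s.cvss_score >= max_cvss
        if stale or severe:
            flagged.append(s.host)
    return flagged
```

In practice a job like this would run continuously against live scan feeds, surfacing only the exceptions that warrant an operational decision, rather than producing a point-in-time compliance report.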
The new approach is also likely to lean heavily on automated approaches to red-teaming and penetration testing, according to Renata Spinks, the Marine Corps’ CISO.
While there’s definitely no intention to eliminate human red teams, the sea services do need ways to test their defenses against real-world threats on a much more ongoing basis, she said. And the frequency of those checks will often be determined by how much risk the DON is willing to accept for a particular system or segment of the network.
“If we use real-time information from the people who are closest to the network, our policy should align to what’s been proven. We shouldn’t set policy and try to make people retrofit their systems with some grand new idea, because that policy is going to be reversed,” she said. “Pen testing will not only tell me how ready you are, but it will also teach me the new things that the adversary is doing. What are some of the things that maybe we’ve become complacent about? Or, do passwords need to be 16 characters? Maybe that’s no longer effective. We need to know how to be ready for the next thing, not if it occurs, but when it occurs.”
The Department of Defense must take space into consideration while developing a zero-trust security framework across its enterprise, CIO John Sherman said Thursday.
The DOD’s evolving mission in space defense with the recent standup of the U.S. Space Force will be inherently digitally focused. And as the U.S. competes more with near-peer adversaries like China and Russia, securing and defending assets in space from cyberthreats is a growing concern for Sherman, he said at AFCEA NOVA’s Space Force IT Day.
“I want to extend [zero trust] to not only when you think terrestrial domain and kind of traditional networks,” he said. “I think the space piece has got to be a critical part of this discussion. And I look forward to engaging the Department of the Air Force and our Space Force colleagues as we see how this can be applied.”
Sherman said the adoption of a zero-trust framework is a top priority for his office, adding that the topic is “very much prominent right now with the president’s [cybersecurity] executive order of last year.” He highlighted the Defense Information Systems Agency’s Thunderdome program and the move to Microsoft Office 365 as “examples of what a zero-trust framework can look like for the entire Department of Defense.”
And there’s no reason space should be thought of any differently, Sherman said.
“How does zero trust relate when we’re talking about things in orbit? I think it relates in many ways,” he said. “As you look at the entire segment, from the space segment down to the ground, there’s plenty of opportunities for us to microsegment, to think differently about access, about multifactor authentication, about not trusting any piece of that in a way that we have in the past and assuming a very sophisticated adversary is trying to get to our most precious data and dataflows on there.”
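The deny-by-default posture Sherman describes can be illustrated with a toy access check. Everything here, including the segment names and request attributes, is a hypothetical sketch of the zero-trust idea, not a DOD policy or system:

```python
# Minimal deny-by-default access decision: nothing is trusted unless the
# request explicitly passes identity, device, and segmentation checks.
# Segment names are invented for the example.
ALLOWED_FLOWS = {
    ("ground-station", "telemetry-db"),
    ("ops-console", "telemetry-db"),
}

def allow(request):
    """Grant access only when every condition holds; any missing field denies."""
    return (
        request.get("mfa_verified") is True
        and request.get("device_compliant") is True
        and (request.get("source_segment"), request.get("target_segment")) in ALLOWED_FLOWS
    )
```

The point of the sketch is the default: an unrecognized flow or an unverified factor fails closed, rather than being implicitly trusted because it originates inside the perimeter.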
Sherman said the department is dealing with a host of technical debt after it spent much of the past decade “kick[ing] the can down the road a bit” while conducting operations in Iraq, Afghanistan and the Middle East. Encryption is one area in particular that “we have to get after with alacrity now,” Sherman said, if the U.S. is to maintain its defense prowess.
“Cybersecurity has to be considered a survivability imperative, just like chaff, flares, armor or other capabilities,” he said. “And as we get to on-orbit capabilities, which get into other types of classifications, you can let your imagination run in terms of other types of defensive things we have to have. Cyber cannot be an option anymore.”
Sherman also hopes the Cybersecurity Maturity Model Certification (CMMC) — which was recently transferred to Sherman’s oversight — will “help raise the waterline” for the security of defense information handled by DOD’s industrial partners. And while contractors may be concerned by the costs imposed by CMMC, Sherman said it’s more costly to deal with the consequences of not being secure.
“You know what else costs money: exfiltration of data you and your company have spent blood, sweat, tears, and a lot of money to develop that’s going to end up on a weapon platform in Shanghai or outside of Moscow,” Sherman said. “We’re not gonna let that happen on our watch. This is basic hygiene to raise the water level to make sure we can protect our sensitive data so when our service members have to go into action, they’re not going to have an unfair position because our adversaries have already stolen key data and technologies that will put them at an advantage.”
I have followed the evolution of cybersecurity for almost three decades. The one constant is that as quickly as the underlying technology advances, so, too, does the cyber threat.
To understand how things might play out in the coming years, I spoke with four cybersecurity experts, each of whom brings a different lens: a national security leader; a pioneering technologist; a veteran CISO inside one of the most sophisticated technology companies; and a prominent cryptography professor whose students will invariably shape the course of the field.
They are unified on one point: cybersecurity has never been more central and more complex. No one can afford to fall behind.
The National Security Dilemma
Richard A. Clarke’s experience in cybersecurity and counterterrorism stretches across three decades of service at the State Department, the Pentagon, and as counselor to three US presidents. Today he remains an indispensable advisor to countries and businesses on cyber risk and one of the preeminent thought leaders in the space.
As a national security matter, Clarke believes the US government is well-organized for cyber defense, but persistently falls short of providing adequate funding. In a recent conversation with me, he suggested that most informal, criminal hacking organizations around the world could probably be shut down by a combination of the NSA, CIA, FBI, and Cyber Command – “if only the US was willing to expand the resources it now devotes to counter-cyber warfare.”
Nation-state cyber terror is a different issue. “Iran, Russia, China all have cyber vulnerability. But so do we,” he points out. In the current conflict between Russia and Ukraine, he worries that every non-cyber move by the US – say, shutting down Russian access to the SWIFT messaging system – could trigger a damaging retaliatory strike against US critical infrastructure.
“The problem is that we don’t know how to handle the escalation of cyber warfare between countries,” he told me. New strategies need to be developed. He cites strategist Herman Kahn’s seminal 1965 work, On Escalation, which addressed how major powers could contain and manage the risks of nuclear conflict. “We need a similar roadmap for managing escalation in cyber attacks.”
Clarke’s chief concern for businesses is a repeat of the SolarWinds attack, which went undetected for months. “The biggest threat to most companies is a cyber attack that comes through the software supply chain. That’s what happened to SolarWinds. Today, every company gets a staggering number of software updates every month.”
Companies are vulnerable with no clear place to seek help. “The US government would likely come to the aid of a major defense contractor hit by a cyber attack,” Clarke said. “Large banks might also expect support. But other companies need more clarity about whether or when US government resources would be deployed to help them recover from an attack.”
The Problem of Cybersecurity Complexity
Nir Zuk, the legendary founder and CTO of Palo Alto Networks, remains frustrated by a fundamental truth of cybersecurity: customers have no credible way of knowing whether the products they’ve purchased actually work. Failures are only discovered after an attack has breached their security.
Zuk believes this is one reason why cybersecurity conversations have moved up the ladder of the enterprise hierarchy, from engineers to the CISO to the CEO and the board. He sees growing awareness at all levels of business that simply buying the latest vendor “solution” is no longer a viable strategy. Enterprises must understand why cybersecurity is growing both more sophisticated and more difficult to manage.
According to Zuk, operationalizing cyber systems is the bottleneck. Customers can’t keep up with the volume of information generated by cloud and machine learning technology. An alert about a potential breach might show the whole chain of the attack, stretching back into the architecture of the interconnected components in the cloud. “It’s very hard for any human to absorb and respond to all that information,” Zuk says.
This dynamic makes automation of security crucial and inevitable. But Zuk worries most vendors and companies will get it backwards. Rather than “adding one more automated feature to human tools,” he advocates thinking about automated security the way Tesla thinks about autonomous driving: first create the autonomous products, then add the human factor.
Two threats concern him. First, ransomware continues to spread with impunity. No foolproof system exists against an attacker who only needs to be lucky enough to breach your system once. The best antidote, he argues, is to turn the tables by focusing on how to detect a breach once it has penetrated the system; that’s when the attacker must hide 100% of the time. But he quickly concedes that a good backup and data protection plan may still be the best strategy.
Supply chain attacks are the second major threat and are hard to prevent because the enterprise that is victimized is not the first target of attack. Instead, hackers are going after the vendors in their supply chain, exactly what happened in the SolarWinds attack.
The problems, Zuk believes, are knowable. The challenge is how companies will respond.
The New Corporate Imperative
Phil Venables was already an established and highly respected figure in cybersecurity when he joined Alphabet as Google Cloud CISO. He spent over two decades at Goldman Sachs as both CISO and chief operational risk officer.
When he looks at today’s risk landscape, he sees many companies still thinking about cybersecurity the wrong way. “Companies are rushing to invest in cyber software without modernizing their underlying technology,” says Venables. “They are effectively trying to build a fortress on sand.”
Venables argues the cloud should be viewed as a “digital immune system.” He concedes this may sound self-interested for Google’s Cloud CISO. But his case is hard to refute. Writing recently in Forbes, he described the cloud’s persistent ability to update, adapt, and respond to shifting threats as “an accelerating feedback loop” for enterprise IT leaders.
In the coming years both executives and corporate directors will need to become more sophisticated, Venables believes. Not about the technology itself, but about how to build security into products and processes. Venables argues that business leaders should be prepared to talk about the digital underpinnings and security of a product just as knowledgeably as they would about supply chains or customer relationships. “Think about secure products, not security products.”
Venables proposes an exercise for a board. Instead of quizzing CEOs and their teams about patch updates or the latest security scanners, directors should ask simply how often the organization updates its software. Not long ago, IT teams boasted about quarterly updates. Venables says that leading-edge companies are typically updating software multiple times a day, or more. That’s the reality of an agile approach to cybersecurity.
The Next Frontier
Dan Boneh is a leading professor in applied cryptography and the co-director of Stanford’s computer security lab. He enjoys a distinct advantage in the world of cybersecurity: he sees what new problems fascinate his students.
Not surprisingly, they are gravitating to a set of problems around blockchain security. One involves the scalability of cryptocurrencies such as Bitcoin or Ethereum, which are currently limited to processing about 15 transactions a second. Yet as demand goes up, this limitation is causing transaction fees to rise. The research question is how to move far beyond the 15 transaction-per-second limit without compromising the integrity of the system.
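A quick back-of-the-envelope calculation shows why that ceiling matters. Using the roughly 15 transactions per second cited above (actual limits vary by chain and change over time):

```python
# Rough throughput math for a chain capped at ~15 tx/s, the figure cited above.
TX_PER_SECOND = 15
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# Network-wide daily capacity: about 1.3 million transactions per day,
# shared by every user of the chain worldwide.
daily_capacity = TX_PER_SECOND * SECONDS_PER_DAY
```

A conventional card network handles thousands of transactions per second, so when demand approaches a fixed cap like this, users effectively bid against each other in fees for the limited block space, which is the fee pressure described above.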
The other security issue with blockchain is privacy. While the virtual ledger offers efficiency and accountability for all types of enterprise transactions, the very nature of blockchain requires that the information can be viewed by others. This is a challenge for companies that want to pay suppliers or even employees through a blockchain system. Researchers are exploring how this can be done securely, without compromising competitive or personal information.
Boneh and his students are also focused on a threat that he believes remains overlooked by most enterprises: adversarial machine learning. For some time, engineers have been refining machine learning algorithms so that a robot or a vehicle can reliably recognize patterns: say, defects in a product or the difference between a stop sign and a yield sign. But Boneh points out that “a growing number of results show how to attack these models.”
Some attackers are tampering with the training data that makes machine learning possible. Others are extracting the model, effectively stealing it so that those with malicious intent can query it for the purpose of infiltration. As machine learning becomes more essential to advanced business operations, a new front of vulnerability opens.
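The fragility Boneh describes can be shown in miniature. Below is a hedged sketch of a standard “fast gradient sign” style attack against a toy logistic classifier; all numbers, names, and the model itself are invented for illustration and come from the general adversarial ML literature, not from any system discussed here:

```python
import math

def predict(w, b, x):
    """Toy logistic classifier: probability that x belongs to class 1."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))

def fgsm(w, b, x, y, eps):
    """One-step perturbation: nudge each feature of x by eps in the
    direction that increases the classifier's loss for true label y."""
    p = predict(w, b, x)
    grad = [(p - y) * wi for wi in w]  # d(cross-entropy)/dx for this model
    sign = lambda g: (g > 0) - (g < 0)
    return [xi + eps * sign(gi) for xi, gi in zip(x, grad)]
```

With a perturbation of modest size per feature, an input the model confidently classifies one way can be pushed across the decision boundary, which is exactly the stop-sign-versus-yield-sign failure mode mentioned above.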
He sees other technical vulnerabilities in the world of cyber defense: how to secure code depositories such as GitHub or how to protect package management systems that automate the uploading and updating of software. The fundamental problem, he argues, is that “the security industry is reactive. It is always focused on last year’s problems.” His research and students are a valuable counterweight to that tendency.
Cybersecurity Remains Foundational
While each of these experts has a distinct vantage point, they affirm that, in the midst of much technology innovation, cybersecurity remains foundational, growing, and increasingly complex. As they point out, AI and machine learning, the balance of security and privacy, the vulnerability of supply chains, the growth of the cloud and blockchain, and the demand for automation are fueling frenzied activity in this space.
We see it in venture capital. During the past two years, there has been a surge of entrepreneurs with new approaches to cybersecurity, and nearly all of them are magnets for capital. Barely a day goes by when I don’t hear from a fledgling cyber start-up. Today there are over two dozen pure-play cybersecurity companies that have gone public. More will inevitably follow. The demand is unrelenting.
The reasons are obvious. Cybersecurity has become a kind of virtuous circle. At one time, a breach of a legacy server inside a corporation was disruptive, but its consequences were limited. In a world increasingly dependent on interconnected services and users, any single breach has deep ramifications and the potential to create havoc. Cybersecurity remains a long game.
The Department of Veterans Affairs (VA) has faced a troubled rollout of its Electronic Health Records Modernization (EHRM) program, including underreported costs, deficiencies in training, and diminished employee morale. But VA’s recently installed CIO says the agency is coming to grips with those problems and looking for better performance in coming expansions of the program.
Kurt DelBene, who took over in December as VA’s Chief Information Officer (CIO) and Assistant Secretary for Information and Technology, acknowledged during a Feb. 17 call with reporters that the agency could have done a better job in areas such as training. But he also made clear that the rollout itself is “an enormous” task that’s bound to come with lessons learned along the way.
“At its heart, I think that it was always going to be a challenging rollout because electronic health record systems are difficult things to deploy, generally speaking,” DelBene said. “I don’t think there’s anybody in the industry [who] wouldn’t say that.”
The new EHRM program aims to provide a seamless experience for veterans as they transition from receiving care from the Department of Defense (DoD) while on active duty, to receiving care at VA facilities. However, replacing an EHR system that was developed by the VA in the 1980s is not a simple task.
DelBene explained the EHR system that is being replaced – the Veterans Health Information Systems and Technology Architecture (VistA) – “tailored itself deeply to the way that the VA works in particular.” He emphasized the transition for VA clinicians early on “was going to be disruptive.”
VA Office of the Inspector General reports revealed just how disruptive it was, finding “significant deficiencies” in training for the program at the first EHRM deployment site: Mann-Grandstaff VA Medical Center in Spokane, Wash. The OIG reports also found worsened employee morale as a result of the EHRM program and employee concerns about the Cerner Millennium system it runs on.
The Road Ahead for EHRM
DelBene acknowledged the VA’s missteps at the first EHRM launch site and said the agency “could do better in terms of training” going forward.
The assistant secretary also explained that getting the balance right between workflow changes and system changes “is really, really difficult and we could have done a better job there as well.”
“I think our level of preparedness overall for Mann-Grandstaff could have been stronger. And I also think that the clear criteria that we should meet before we go live could have been crisper,” DelBene said.
Neil Evans, a VA physician and the former acting CIO of the agency, said the DoD also “struggled with some of their early sites” with their Cerner rollout, but their lessons learned “are now paying dividends as they’re rolling out.” The DoD has now deployed the same EHR solution at over 50 percent of their sites, according to Evans.
“Transitions are hard and the first place you go, there are going to be lessons learned,” Evans said. “A lot of this is that we’re still relatively early in this transformation journey and … coming at this with a learning mindset and applying lessons learned as we move forward.”
DelBene is hopeful as the VA looks ahead to its next deployment site in Walla Walla, Wash. The VA has “done a lot better job in terms of getting those criteria set and established,” as well as “being collaborative with the clinicians,” he said.
“I feel particularly good about where we are right now with Walla Walla. But again, we’re going to look and see if anything changes that would require us to move that,” he added. “We are on track, but we’re going to make sure that we’re responsive to the stakeholders when we make those decisions along the way.”
Laura Prietula, the new deputy CIO for the EHRM program, said the VA is also establishing a deployment assurance team within the DCIO group. Prietula said the deployment assurance team will be focused on ensuring VA sites have a deep understanding and awareness of the EHRM transformation – from both a local IT perspective and that of the overarching VA IT community.
“This is not just a simple technology implementation,” Prietula said. “And so we want to make sure that the facilities are also ready for those changes from their own process perspective, not just, ‘Are you ready to receive the system?’ So, we’re working very, very closely with the facilities, and we’re setting up different practices for change management all around.”
“I think the actual transformation is going to be as much about change management and people systems and how those come in good alignment – how we work comes in alignment with how the system works – and both are going to adapt to get us to a good place,” DelBene said.
The Department of Defense’s Joint Artificial Intelligence Center recently launched new AI education pilots for thousands of DOD employees that range from executive education for general officers to in-depth coding bootcamps.
The most recent cohort of participants started taking an “AI 101” course in early February through a partnership with the Massachusetts Institute of Technology while another recently entered an AI coding bootcamp. The range of educational offerings from the AI accelerator is designed to eventually be transitioned to other DOD institutions for tens or even hundreds of thousands of people to learn about AI, Greg Allen, the JAIC’s head of policy and strategy, told FedScoop.
“We are running training pilots to really test,” Allen said. “We partner with the broader Department of Defense … to help them deliver education material at scale.”
Allen added that “ultimately there is going to be hundreds of thousands of folks” getting some form of AI training, out of the DOD’s nearly 3 million employees and service members.
The work stems from a congressional mandate for the JAIC to develop an AI workforce and education strategy in the fiscal 2020 National Defense Authorization Act. The JAIC is now implementing that strategy through educational pilots, Allen said.
In that strategy, the JAIC identified six archetypes of AI learner: Lead AI, Drive AI, Create AI, Employ AI, Facilitate AI and Embed AI. Each type of DOD employee needs a different level of detail on AI, so the JAIC is leaning on different platforms to teach them.
For general and flag officers at the highest ranks of the military, Lead AI is an in-person seminar on the basics of what AI can do and how it will impact the capabilities they oversee. On the other end of the spectrum is Create AI, for the coders who will be developing machine learning models for the military and need specialized, in-depth technical training.
One of the adjustments the JAIC has made to its offerings for the Create AI category is a new coding bootcamp on the Python coding language that is often used to develop AI.
“These are folks who actually need all the skills to meet their current and future operational needs,” Allen said.
By 2023 the JAIC hopes to have all these lesson plans transitioned to other organizations, like the Defense Acquisition University or the Air Force’s Digital University.
“The No. 1 thing that brings the most joy to us is when we hear back from past participants … they put what they learned to practice in their jobs,” Allen said.
Since its creation in 1865, the United States Secret Service has been tasked with safeguarding the integrity of the nation’s economy. Initial efforts focused on putting an end to rampant counterfeiting taking place in the years following the American Civil War, a time in which as much as 30% of circulating currency was thought to be illicit. As individuals and organized networks evolved and sought new ways to defraud the public, special agents and criminal investigators with the Secret Service have responded, applying their experience and expertise to preventing other financially motivated crimes such as illicit credit card schemes, fraudulent wire transfers, and computer fraud and abuse. Most recently, that work has expanded to the illicit use of digital assets, including cryptocurrencies, to facilitate crimes like ransomware attacks.
What are digital assets?
The term digital assets refers broadly to representations of value in digital form, regardless of legal tender status. For example, digital assets include cryptocurrencies, stablecoins and nationally backed central bank digital currencies. Regardless of the label used, or the various definitions ascribed to them, digital assets can be used as a form of money or be a security, a commodity or a derivative of either. Digital assets may be exchanged across digital asset trading platforms, including centralized and decentralized finance platforms, or through peer-to-peer technologies.
Why does the Secret Service investigate the use of digital assets?
The Secret Service is responsible for detecting, investigating, and arresting any person who violates certain laws related to financial systems. In recent years digital assets have increasingly been used to facilitate a growing range of crimes, including various fraud schemes and the use of ransomware.
While the United States has been a leader in setting standards for regulating and supervising the use of digital assets for anti-money laundering and countering the financing of terrorism purposes, poor or nonexistent implementation of those standards in some jurisdictions abroad is presenting significant illicit financing risks that harm the American people and our foreign partners. For example, transnational organized criminals often launder and cash out their illicit proceeds using digital asset service providers in jurisdictions that have not yet effectively implemented international standards.
The ongoing growth in decentralized financial ecosystems, peer-to-peer payment activity and obscured blockchain ledgers presents additional risks to the American people and our foreign partners. When digital assets are abused and used in illicit ways, it is in the national interest to mitigate these risks through law enforcement action and other government authorities. The U.S. Secret Service is committed to doing its part to safeguard the nation from illicit activity involving digital assets.
Useful Definitions
Address
Used to receive and transfer digital assets between users. Addresses are managed and stored within wallets. Addresses range in character length and type, but usually consist of long alphanumeric strings.
Altcoin
Typically, any cryptocurrency that is not Bitcoin. An altcoin may be a variant of the Bitcoin consensus model or a cryptocurrency tied to a different blockchain or proof-of-work scheme. For example, Ether is one of the most well-known altcoins and is associated with the Ethereum network.
Central bank digital currencies
A form of digital money, denominated in the national unit of account, which is a direct liability of the central bank.
Stablecoins
A category of digital currencies with mechanisms aimed at maintaining a stable value.
Cryptocurrency
Refers to a digital asset, which may be a medium of exchange, for which ownership records are generated and supported through a decentralized distributed ledger technology that relies on cryptography, such as a blockchain.
Wallet
Software programs that interface with blockchains and generate and/or store public and private keys used to send and receive cryptocurrency. A public key or an address is akin to an email address, and a private key is akin to a password.
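The key relationship described above can be sketched in a few lines of Python. This is a toy illustration only: real cryptocurrencies derive public keys from private keys using elliptic-curve cryptography (Bitcoin uses secp256k1) and apply specific encoding schemes to produce addresses, none of which is reproduced here. The point being illustrated is the one-way derivation: the address can be computed from the private key, but not the reverse.

```python
import secrets
import hashlib

# Toy sketch, NOT a real address derivation: the hash stands in for
# elliptic-curve public-key derivation purely to show the one-way chain
# private key -> public key -> address.
private_key = secrets.token_bytes(32)              # kept secret, like a password
public_key = hashlib.sha256(private_key).digest()  # shareable, derived one-way
address = hashlib.sha256(public_key).hexdigest()[:40]  # short public identifier

print(f"private key: {private_key.hex()}")
print(f"address:     {address}")
```

Anyone can send funds to the address, but only the holder of the matching private key can authorize spending from it, which is why losing a private key generally means losing the funds.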
Bitcoin
A type of cryptocurrency. Payments or transfers of value made with bitcoin are recorded in the Bitcoin blockchain and thus are not maintained by any single administrator or entity.
Digital asset service provider
A custodian that allows users to buy, sell, store and trade digital assets. Forms of digital asset service providers include exchanges, digital asset ATMs and over-the-counter brokers. There are many such providers operating in the United States.
Blockchain
A decentralized distributed ledger technology where data are shared across network participants and the data are typically linked using cryptography to maintain the integrity of the ledger and execute other functions, including transfer of ownership or value.
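The cryptographic linking described in this definition can be shown with a minimal, self-contained sketch: each block commits to the hash of the previous block, so altering any historical entry breaks every later link. This toy omits consensus, signatures, and networking; it illustrates only the ledger-integrity property.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    """Verify every link; tampering with any block invalidates later links."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(is_valid(chain))                   # True
chain[0]["data"] = "Alice pays Bob 500"  # tamper with history
print(is_valid(chain))                   # False
```

In a real blockchain, the same property is enforced across many independent network participants, which is what removes the need for a single trusted administrator.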
Virtual currency
A medium of exchange that can operate like currency but does not have all the traditional attributes of “real” currency, including legal tender status.
Atos’ BullSequana XH3000 to enable exascale-class supercomputers in Q4 2022
Atos has introduced its new BullSequana XH3000 hybrid computing platform that enables exascale-class supercomputers. The new high-performance computing platform is agnostic to the hardware architectures it uses, so it is compatible with CPUs from AMD, Intel, Nvidia, and SiPearl, as well as compute GPUs from AMD, Intel, and Nvidia. In addition, the platform uses liquid cooling and can handle nodes with up to 1000W power consumption. The first systems are due in Q4.
Atos’ BullSequana XH3000 hybrid computing platform will offer maximum flexibility for systems with performance from 1 FP64 PetaFLOPS to 1 FP64 ExaFLOPS (and up to 10 FP8/FP16 ExaFLOPS for AI applications), leveraging hardware due to be released over the next six years. In addition to traditional x86 CPUs from AMD and Intel, it also supports upcoming Arm-based Grace processors from Nvidia and Rhea high-performance system-on-chips from SiPearl. Furthermore, as suggested by its hybrid nature, the platform can support next-generation compute GPUs (or accelerators), including AMD’s CDNA 2/3-based designs, Intel’s Ponte Vecchio, and Nvidia’s next-gen GPU solutions.
The future CPUs and GPUs for datacenters and high-performance computing (HPC) promise to be extremely power-hungry, which is why the BullSequana XH3000 is designed for liquid cooling by default. The XH3000 will use Atos’ Direct Liquid Cooling (DLC) solution that promises to provide ‘over 50% more cooling power than previous generations’ and enable nodes that consume up to 1000W. While 1kW of power per node may sound extreme, it looks like this will be a fairly standard energy-efficient HPC node configuration in the future. For example, rumor has it that AMD’s Instinct MI250X consumes up to 550W, whereas the industry-standard OAM form-factor for such units is designed to supply up to 700W of power.
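A back-of-envelope sketch shows why a 1 kW per-node budget is plausible for exascale machines. The per-node performance figure below is an assumption chosen for illustration, not an Atos specification; only the 1 EFLOPS target and 1,000 W node ceiling come from the article.

```python
# Rough estimate: node count and compute-only power draw for an exascale
# system built from 1 kW accelerated nodes. NODE_FLOPS is a hypothetical
# figure for illustration, not a published XH3000 number.
TARGET_FLOPS = 1e18    # 1 FP64 ExaFLOPS, the platform's upper target
NODE_FLOPS = 100e12    # assumed 100 FP64 TFLOPS per accelerated node
NODE_WATTS = 1000      # the XH3000's stated per-node cooling ceiling

nodes = TARGET_FLOPS / NODE_FLOPS
megawatts = nodes * NODE_WATTS / 1e6
print(f"{nodes:,.0f} nodes, ~{megawatts:.0f} MW for compute alone")  # 10,000 nodes, ~10 MW
```

Ten megawatts for compute alone, before cooling and interconnect overhead, is why direct liquid cooling is the default rather than an option at this scale.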
At present, we do not know a lot of specifics about the upcoming HPC solutions from AMD, Intel, Nvidia, and SiPearl, but the very fact that Atos claims compatibility with them indicates that their general characteristics (e.g., power consumption) are known to the server maker. In fact, Nvidia states that it has worked with Atos to build the platform.
“By combining the well-known expertise Atos has with Nvidia AI and HPC technologies and work at our joint lab, this platform will allow researchers to get significant insights much faster to grand challenges both in supercomputing and industrial HPC,” said John Josephakis, global vice president of sales and business development for HPC/supercomputing at Nvidia.
Given the availability timeframe of the first BullSequana XH3000 machines, it is likely that they will be among the first ones to support Nvidia’s upcoming next-gen compute GPUs due later this year.
As for interconnects, the BullSequana XH3000 supercomputer architecture will support a wide range of networking technologies, including BXI, high-speed Ethernet, and HDR and NDR InfiniBand.
All three U.S. exascale supercomputers announced to date rely on HPE’s Cray EX architecture, which can use almost any CPU and various compute accelerators. These days, Cray EX systems use AMD’s EPYC or Intel’s Xeon Scalable CPUs along with compute GPUs from AMD, Nvidia, or Intel (once Ponte Vecchio is available). By adding support for Nvidia’s Grace and SiPearl’s Rhea, Atos offers a more flexible platform (though nothing obvious would stop HPE from adapting its architecture for Grace or Rhea SoCs).
“We are extremely proud of our role as a leader in HPC and of our new BullSequana supercomputer, revealed today, which results from 15 years of R&D efforts and brings together Atos’ proven expertise and experience in high-performance computing, AI, quantum, security and digital decarbonization,” said Rodolphe Belmer, CEO of Atos. “It will no doubt enable, through the gateway of exascale, some of the key scientific and industrial innovation breakthroughs of the future.”
The Kansas City-based health system has opened an inpatient virtual nursing unit that is managed almost entirely by nurses, and is fielding calls from health systems across the country interested in the concept.
KEY TAKEAWAYS
Saint Luke’s Health System has opened a unit in one of its hospitals that is managed virtually by nurses operating from a high-tech command center in downtown Kansas City.
The wing, built on a telemedicine platform designed by Teladoc Health, allows nurses to remotely monitor patients and handle a lot of the administrative work that typically leads to stress and burnout.
The virtual nursing unit has helped the hospital increase both patient and nurse satisfaction rates while improving discharge time and reducing ED waits for a hospital bed.
A Kansas City–based health system is putting nurses at the center of the virtual care platform and seeing positive results not only in patient and nurse satisfaction, but clinical and business outcomes as well.
Saint Luke’s Health System opened its virtual nursing unit in 2018. Launched by Susie Krug, chief nursing officer at Saint Luke’s East Hospital in Lee’s Summit, Missouri, the unit sits on a telemedicine platform built by Teladoc Health and managed by nurses at the health system’s technology center in downtown Kansas City.
“It’s a new model of care,” says Jennifer Ball, the health system’s director of virtual care. “It’s there to [help] the nurses as well as the patients, with a focus on virtual care. Virtual everything is going to be our future.”
It was designed, Ball says, with the idea that nurses are often the focal point of care in the inpatient wings, handling different tasks in between and around rounds and visits made by doctors and specialists. Nurses have the most contact with patients and their families, handle the administrative and educational tasks, manage bedside devices and data-gathering, even lend a hand with everything from the meals to the TV, she says.
That kind of work is why nurses have been rated the most trusted profession for 20 straight years by Gallup, but it’s also why so many are dealing with stress, anxiety and burnout—and why health systems are having a difficult time filling those positions. Add in the challenges of protecting both patients and care providers during a pandemic, and the job becomes tougher.
Due in part to the shift to virtual care caused by COVID-19, many health systems are rethinking how that strategy can be scaled and sustained beyond the pandemic—not only outside the hospital but inside as well.
Saint Luke’s virtual nursing unit operates on the idea that many of the tasks performed by nurses in the inpatient setting not only are repetitive but inefficient, and that a telemedicine platform that connects every room in the unit can allow nurses to manage those tasks from one place. Nurses and staff on the unit would then be freed up to focus on patient-facing care, while those in the command center would monitor the patients and enter data in the medical record.
“They chart in the same system,” Ball points out, “so everything is right in” the EMR.
Early results show positive outcomes for the virtual unit. Patient satisfaction is high; patients are discharged within two hours of the discharge order, some 20% faster than in other units, and leave the hospital before noon at a 44% higher rate. This, in turn, reduces both the wait time for patients in the ED and the time to treatment.
Turning those metrics around, health system officials say the unit has boosted nurse morale as well, improving workforce engagement, reducing fatigue (physical and intellectual) and even improving Saint Luke’s recruitment capabilities.
Ball says the health system learned quickly that a virtual nursing unit is different from any other virtual care program. Workflows must be designed specifically with nursing in mind, and often go through a few iterations before working out.
“We’ve changed what the virtual nurse does several times,” she says. “It was challenging at first because this is a new model, and we had to learn what works and what doesn’t work. And while this is [modeled] as an observation unit, it has been anything but that over the past year.”
Ball says Saint Luke’s had the advantage of launching the program in a new, specially designed unit, rather than integrating it into an existing wing. She expects to extend virtual nursing into other wings in the future, and to deal with new challenges as the footprint expands.
“There will be some culture change involved,” she says.
For that reason, Saint Luke’s launched its own virtual nurse training program about a year ago, with the idea that nurses should be trained specifically in virtual care rather than brought over from another area of the hospital and introduced to it. With virtual care the emphasis is more on technology, as well as on communication. After all, sitting in a command center surrounded by six large monitors isn’t quite what nurses are taught to expect in school.
“For some of the nurses, it’s a lot, but for others not so much,” Ball says.
And that’s why education and team-building are so important to the program. Unlike many doctors, nurses work in a team setting, with the understanding that care coordination and management are group-based rather than individual goals.
“You need buy-in from nurses at the beginning,” Ball says. “You can’t start too early with education … and team building. In some cases, you have to sell what a virtual nurse can do,” but once they see what is possible, they’re invested in the program.
That goes for the patients as well. Many might wonder whether a virtual nursing unit isolates patients too much, depriving them of the in-person care that helps them adjust to being in a hospital and puts them on the path to recovery. But Ball says patients have come to appreciate the idea that they’re always being looked after, and they develop connections to their virtual nurses. They identify more closely with those nurses than with the nurse who shows up when someone pushes the help button.
Just as Saint Luke’s checked in on Ochsner Health’s virtual nursing model as it was developing its program, Ball says she’s fielding requests from other health systems who want to adopt that strategy. And she has an eye to the future as well, including integrating the virtual nursing model into existing wings and hospitals in the system.
“There’s always new technology as well,” she says, eyeing the fast-developing telemedicine landscape and the emergence of digital health tools, including wearables. “This [model] is going to be used in new ways in the future,” such as mentoring and precepting, and integrated with other services such as the pharmacy, social workers, dietitians, and chronic care management.
Eric Wicklund is the Technology Editor for HealthLeaders.
February 13, 2022 13:00, Last Updated: February 16, 2022 14:39
By Andrew Thornebrooke
Bureaucracy and waste are hamstringing U.S. military development and adversely affecting the nation’s military readiness, according to the former chief software officer of the Air Force. That means the United States is less prepared for a potential conflict with China.
“Any bureaucracy which slows down outcomes for the sake of bureaucracy is going to ensure we get behind China,” Nicolas Chaillan, who resigned in September, said in a recent interview.
“China doesn’t let complacency or bureaucracy get in the way of [its military],” he told The Epoch Times.
He made the comments amid increasing tensions between the United States and the Chinese Communist Party (CCP) concerning the de facto independence of Taiwan, mass intellectual property theft, and human rights abuses in Xinjiang and elsewhere.
As that competition becomes more adversarial, the CCP has focused on technological development in sectors that its leadership considers vital, such as artificial intelligence and machine learning (AI/ML).
According to Chaillan, it’s these sectors, and IT more broadly, that U.S. military bureaucracy is negatively affecting the most.
A ‘Toxic’ Budgeting System
There are long-running concerns that the Pentagon is encumbered with red tape, redundant oversight measures, and safety protocols that slow military development to a snail’s pace. Indeed, bureaucracy has become almost synonymous with the Department of Defense (DoD), according to its own leadership.
Gen. John Hyten, vice chair of the Joint Chiefs of Staff at the time, said in October 2021 that the Department’s bureaucracy was “brutal,” and that a risk-averse culture among military leadership was stifling technological development and allowing the CCP to seize the advantage in critical sectors such as hypersonic weapons development.
“The pace [China is] moving and the trajectory that they’re on will surpass Russia and the United States if we don’t do something to change it,” Hyten said. “It will happen.
“We can go fast if we want to. But the bureaucracy we’ve put in place is just brutal.”
When asked if he agreed with that assessment, Chaillan’s response was matter-of-fact.
“Yes, without a doubt,” he said. “Everything from the budget process with Congress … to the acquisition process that is cumbersome and prevents agile acquisition due to reporting requirements and [the fact that] budgets are allocated so far ahead that we are stuck in time.”
He added that the sheer amount of compartmentalization within the DoD makes it difficult to build consensus among leadership and to get anything done.
Chaillan also said that such difficulties were particularly burdensome in terms of developing cutting-edge technologies, as it was difficult within the DoD to upgrade to yesterday’s technologies, much less tomorrow’s.
He noted that the budgeting process as a whole ultimately is the responsibility of Congress to fix. The process’s knock-on effects throughout the DoD were dangerous nonetheless, he said. Key among them, he said, was the need to allot finances for specific projects many years in advance.
Currently, Pentagon leaders develop a five-year program that outlines the scope and funding of their endeavors. This program then informs the department’s budget request to Congress, which dissects the Pentagon’s stated needs and allocates funds for future use.
The problem, of course, is that funding for military responses to emerging threats and for IT development cannot be allocated years in advance, meaning that many programs come into existence only to prove unnecessary, while other needed technologies go unfunded.
“[The budget process] isn’t the Pentagon’s fault, but it has toxic ripple effects across the building as the Pentagon is now working the 2024 to 2029 budget,” Chaillan said.
“How could this even work out? No one knows what IT will look like in 2025, let alone 2029.”
As such, he said, current acquisition processes hinder the integration of expert industry experience into the military, effectively preventing it from expanding its talent pool. The Pentagon cannot effectively train and continuously fund IT workers whose expertise must keep pace with the technology, because the budget process locks them into long-term, semi-static programs agreed to, at times, years in advance.
Such statements echo similar remarks made by Michael Sekora last month. Sekora, who spearheaded Project Socrates, a Reagan-era Defense Intelligence Agency program designed to increase U.S. competitive advantage, lambasted what he referred to as a “finance-planning” model of defense.
In such a system, Sekora said, the government merely allocates funds year by year, in the hope that the funds will somehow be transformed into the technologies that are needed when they are needed.
The CCP, meanwhile, was pursuing a technology-based strategy whereby specific technologies were created and deployed in a whole-of-society effort to address real problems in real time.
“China understands that exploiting technology more effectively than the competition is the foundation of all competitive advantage,” Sekora said.
“Anything else is a guaranteed exercise in futility.”
Talent Retention ‘Worse Than I’ve Ever Seen’
To that end, Chaillan noted that a vital aspect of working in technology-oriented fields is the need to be forever learning, improving, and iterating. Unfortunately, he said, making continuous learning a part of the job is not something that the military bureaucracy allows much of.
“We don’t invest enough in our talent,” Chaillan said. “We must understand that with the pace of IT, the only answer is continuous learning.”
Chaillan said that when he was chief software officer for the Air Force, he would give his team an hour every day to dedicate to learning new concepts. That model isn’t in vogue with the Pentagon, however, which typically approaches IT-related fields from a managerial perspective.
“Usually, learning is seen as a yearly thing or worse,” Chaillan said. “Additionally, the DoD believes in this concept of ‘knowing everything’ where no one is supposed really to become an expert at something, they just learn to manage and they supposedly can manage anything from a [fighter] wing to a maintenance crew to an IT team.”
This managerial approach was negatively affecting IT in the military, which requires a more active and entrepreneurial approach, Chaillan said. Moreover, it encourages placing military officers in leadership roles based purely on rank rather than in-field qualifications.
“We are setting up critical infrastructure to fail,” Chaillan wrote in a separate open letter in September that explored his reasons for resigning from the Pentagon.
“We would not put a pilot in the cockpit without extensive flight training; why would we expect someone with no IT experience to be close to successful?”
That problem was compounded, Chaillan said, by a lack of opportunities for IT professionals in the military to actually apply themselves at the jobs they were trained to do.
“Obviously, that doesn’t work with IT,” Chaillan said.
That means that many IT professionals in the military are prevented by bureaucratic processes from developing further professionally, eventually letting their skill sets go “stale.” It’s a problem that Chaillan himself cited as a reason for his own departure from the Pentagon.
Perhaps because of this, many of the companies at the forefront of new technological development simply won’t work with the DoD, Chaillan said.
“Unfortunately, many companies still refuse to work with DoD, which isn’t helping us get access to best-of-breed talent,” Chaillan said. “We also have a very tough time retaining talent right now, particularly in IT. It’s worse than I’ve ever seen.”
Chaillan lamented the fact that chief information officers in the DoD weren’t being fully utilized, and were effectively being treated as lottery tickets whose projects may or may not be picked for funding rather than as drivers of innovation.
“DoD Chief Information Officers are just seen as policy shops instead of actual doers,” Chaillan said. “That would never happen on the commercial side. That must be changed.”
Indeed, in his open letter, Chaillan wrote that the very same happened to him, much to the detriment of U.S. defense needs and equally to the advantage of the CCP. He was underutilized, he said, and spent the majority of his professional time trying to convince others to consider more efficient solutions to well-known problems.
“The DoD should stop pretending they want industry folks to come and help if they are not going to let them do the work,” he wrote. “While we wasted time in bureaucracy, our adversaries moved further ahead.”
‘Four Cents of Real Value on a Dollar’
That mismanagement and the adverse effects it has on military readiness are only the tip of the iceberg, however. By Chaillan’s estimation, the Pentagon spends only 4 percent of its funds on actual solutions.
“Overall, I believe we get four cents of value out of one dollar of taxpayer money spent,” Chaillan said.
“First, probably 60 cents are wasted due to cumbersome acquisition processes and requirements which were made to prevent fraud or conflicts of interest or bad behavior, but really ended up creating a massive bureaucracy which created more waste than it is preventing.
“Second, because we are stuck in time and get requirements that are five to 10 years too old, we effectively buy many things that are obsolete or aren’t a good fit to the current mission and should have been voided but we don’t because we spent years trying to get the contract done.
“That’s probably another 30 cents. Another six cents is wasted by making mistakes, which are fair and common.
“That’s four cents of real value on a dollar. With $750 billion funding, that means $30 billion is actually real tangible value.”
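Chaillan’s arithmetic is internally consistent, as a quick sketch confirms: the waste shares he lists sum to 96 cents, leaving 4 cents on the dollar, and 4 percent of the $750 billion figure is indeed $30 billion.

```python
# Reproducing the back-of-envelope estimate quoted in the article.
budget = 750e9                           # total DoD funding cited, in dollars
waste = {
    "acquisition bureaucracy": 0.60,     # "probably 60 cents"
    "obsolete requirements":   0.30,     # "probably another 30 cents"
    "ordinary mistakes":       0.06,     # "another six cents"
}
value_share = 1 - sum(waste.values())    # 0.04 -> "four cents of real value"
print(f"value share: {value_share:.2f}")                       # 0.04
print(f"tangible value: ${budget * value_share / 1e9:.0f}B")   # $30B
```

The sketch only restates the article’s numbers; it says nothing about whether the underlying percentages are accurate, which Chaillan himself concedes below.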
Chaillan was careful to point out that the figure is only his opinion but said that most of his professional acquaintances would agree that the DoD loses at least 60 percent of its funds to such wastage.
Regardless, bureaucratic waste remains widespread, and outdated or otherwise unneeded technologies remain a persistent problem. The fact is that parts of the military, for its trillions of dollars received over the years, simply don’t have adequately functioning technology.
That problem was highlighted in an open letter to the DoD penned in January by Michael Kanaan, director of operations for the Air Force-MIT Artificial Intelligence Accelerator, a joint AI research endeavor, in which he castigated military leadership over the fact that its IT professionals were working with computers that were decades old.
“Want innovation? You lost literally HUNDREDS OF THOUSANDS of employee hours last year because computers don’t work,” Kanaan, who was Air Force’s first chairman for AI, wrote. “Fix our computers.”
“It’s not a money problem, it’s a priority problem.”
US Approaching ‘The Point of No Return’
Such problems, and the waste that underlies them, are not new, however, nor are they a secret. Indeed, in many instances, such waste is baked into the U.S. defense-industrial complex by excessive congressional oversight.
In 2015, then-Navy Secretary Ray Mabus said that 20 percent of the defense budget was “pure overhead.” Former Secretary of Defense Robert Gates, meanwhile, set that number at closer to 40 percent.
Likewise, another former Navy secretary, John Lehman, said that “roughly half of all uniformed personnel serve on staffs that spend most of their time going to meetings and responding to tasks from the hundreds of offices that have grown like mold throughout the vast Defense Department.”
In 2015, a DoD report was compiled to examine the problem, and found that $125 billion, roughly a quarter of its annual budget at the time, was lost on administrative waste over the course of five years.
The Pentagon ultimately buried the report, classifying its data out of a fear that Congress would cut its budget.
Likewise, accounting firm Ernst & Young found in 2018 that the Defense Logistics Agency, the Pentagon’s main procurement wing, failed to properly document more than $800 million in construction projects.
Perhaps unsurprisingly, the Pentagon has also failed all three of the department-wide audits it has undergone since its first independent audit in 2018.
In 2020, then-DoD Comptroller Thomas Harker said that the department likely wouldn’t pass an audit until at least 2027.
In sum, the Pentagon is failing to invest in relevant new technologies because its bureaucracy is slowing the investments it does make. Meanwhile, its funding structure and excessive administrative oversight are hemorrhaging most of the funds it receives before the money gets to projects that might be outdated or unnecessary by the time the budget is approved.
According to Chaillan, the United States must offset China’s larger population by becoming more efficient, more innovative, and more agile. Under the current bureaucratic structure, however, that is fast becoming impossible, he says.
Even as the CCP invests in AI/ML, autonomous systems, space warfare, and quantum computing, the U.S. military is struggling with a lack of basic, functioning IT equipment. Moreover, Chaillan said, it’s failing to deliver on the tangible products required to win a war.
“The biggest threats to readiness today [are] our lack of access to basic IT capabilities like Edge computing, Edge cloud, proper connectivity, modern devices, and ensuring we can connect all the military domains to get a full picture of our current situation,” Chaillan said.
“More importantly, the lack of investment in AI/ML tangible military outcomes instead of focusing on AI/ML ethics will be what leads us to getting behind to the point of no return if that doesn’t change by December 2022.”
Andrew Thornebrooke is a reporter for The Epoch Times covering China-related issues with a focus on defense, military affairs, and national security. He holds a master’s in military history from Norwich University.