healthcarereimagined

Envisioning healthcare for the 21st century

Addressing DOD’s Tech Focus Areas Requires New Approaches – DOD News

Posted by timmreardon on 03/04/2022
Posted in: Uncategorized.

MARCH 3, 2022 | BY C. TODD LOPEZ, DOD NEWS

Earlier this year, the Defense Department’s chief technology officer, Heidi Shyu, released a list of 14 technology areas deemed most critical for investment, including biotechnology, advanced materials, trusted artificial intelligence and microelectronics.

A handful of products related to those focus areas — such as hypersonics and directed energy weapons — are almost exclusively military-related, but the majority are already being developed for the commercial market by private companies that may not have done business with the federal government. Those innovations include next-generation wireless communications, microelectronics, and human-machine interfaces. 

Because the private sector will pursue these technologies with or without DOD involvement, the department will need to do a better job of engaging with those companies if it wants its own needs addressed.

DOD’s Defense Innovation Unit is one segment of the government already on board with locating companies involved in the development of critical technologies and helping them become suppliers. 

“We’re really trying to look at what all of the innovative companies are doing around the country … because most of what we need to do to modernize the Defense Department is led by industry now; it’s commercial technologies,” Mike Brown, director of the DIU, said during a discussion Wednesday at George Mason University’s Center for Government Contracting in Northern Virginia. 

“We have to be harnessing what the private industry is doing if we’re going to be giving our warfighters the capability that they need,” Brown said. 

DIU, Brown said, has been working to accelerate adoption of commercial technology using “other transaction authority,” which is different from classic procurement contracts and is instead used for things like research or prototyping. 

The approach DIU uses is called “commercial solutions open,” and Brown said this includes things like agile work statements, modular contracts and working at commercial speeds. 

“We don’t start with requirements, which often dictates how the department might start to bring in a new capability — a process that’s well-honed if you’re going to build a new aircraft or tank,” Brown said. “If you’re going to look at commercial technology, you don’t need to start with requirements. The commercial market has already built that.” 

When DIU was looking at counterdrone technologies, Brown said, it didn’t need to specify requirements because the commercial market had already developed things DOD could use. 

With modular contracts, he said, comes the flexibility to bring in different vendors and have them work together and go from a successful prototyping effort directly into production. 

Finally, modular contracts can limit the challenging requirements for intellectual property that might delay the transition of a capability from the private sector into warfighter hands, he said. 

“We’re trying to get companies on contract in 60 to 90 days, in commercial terms — that means no onerous IP [intellectual property] requirements for companies that we work with,” Brown said. 

Budgeting is also an issue, Brown said. Traditional budgeting requires planning as much as two years in advance before dollars can be spent. 

“That’s not the agile process we need to compete with China in technology,” he said. “We need to be able to move not the whole $750 billion defense budget, but we need some flexibility at the edges to respond to emerging threats and plug in new commercial technology solutions that address those threats.”

The Commission on Planning, Programming, Budgeting, and Execution Reform, as directed in the 2022 National Defense Authorization Act, is looking now at better solutions to budgeting so that the department can be more agile in the technology it procures. That’s something Brown said he’s glad to see.

Article link: https://www.defense.gov/News/News-Stories/Article/Article/2953893/addressing-dods-tech-focus-areas-requires-new-approaches/

What Is Blockchain? – Gartner

Posted by timmreardon on 03/04/2022
Posted in: Uncategorized.

March 02, 2022

Contributor: David Furlonger and Christophe Uzureau

Blockchain is already building trusted digital environments, and one day, it will underpin Web3 and the Metaverse. Don’t get left behind.

In short: 

  • Blockchain enables two or more people, businesses or computers that may or may not know each other to exchange value in digital environments — in a monetary transaction, information or other exchange of assets — without an intermediary.
  • Blockchains are a form of distributed (digital) ledger. Complete blockchains combine five design elements to authenticate users, validate transactions and record information to the ledger in a way that can’t be corrupted or changed later, eliminating the need for a central administrator.
  • Blockchain solutions today are continuing to mature, e.g. in terms of scalability, but will ultimately underpin new social and economic models.

Have you bought a house lately? Imagine if you could have transacted with the seller directly, even though you had never met, confident that the deal would be recorded in a way that neither of you could change or rescind later. You wouldn’t have to reconcile rafts of personal information with a real-estate agent, mortgage broker, insurance agent, property inspector and title company. Enter blockchain: A way to directly and securely connect and record value chain interactions.

Admittedly, this is just one records-management scenario inspired by blockchain. Once further developed, however, blockchain technology will be able to layer on the intelligent decision-making ability of artificial intelligence (AI) and the sensory powers of the Internet of Things (IoT) to create whole new social and economic constructs in the peer-to-peer age of Web3 and the Metaverse.

But let’s not get ahead of ourselves.

What is blockchain?

Blockchain combines existing technologies and techniques including distributed digital ledgers, encryption, immutable records management, asset tokenization and decentralized governance to capture and record information that participants in a network need to interact and transact. There are no intermediaries, like banks, validating and protecting the transactions.

While the recorded event could be a monetary transaction, it could also be an exchange of information, such as metadata attached to records like medical history, personal identity and supply chain logistics. In a blockchain-enabled future, your government ID won’t be floating around dozens of databases without your knowledge. 

Blockchain’s component technologies enable it to:

  • Confirm the pseudonymous identity of the participants
  • Validate that the participants own the information or assets they want to exchange
  • Authenticate and approve the transaction
  • Record the transaction information to the ledger, a copy of which is independently updated and held by each node on the network

In this digital environment, records are unalterable, time-stamped, encrypted and linked to each other in blocks, where each block is a cluster of roughly 1,500 transaction records (for Bitcoin; other ledgers vary in block size). The ledger grows as participants transact, with roughly seven transactions processed every second (again for Bitcoin; other ledgers vary in throughput).
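
That tamper evidence comes from chaining: each block carries the hash of its predecessor, so altering any old record invalidates every later block. A minimal Python sketch of the idea follows; it is a toy illustration, not any production ledger, and the `make_block` helper is purely hypothetical.

```python
import hashlib, json, time

def make_block(records, prev_hash):
    """Bundle records into a time-stamped block linked to its predecessor."""
    block = {
        "timestamp": time.time(),
        "records": records,      # Bitcoin packs roughly 1,500 transactions per block
        "prev_hash": prev_hash,  # the link that makes tampering detectable
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block(["alice pays bob 1.0"], prev_hash="0" * 64)
block2 = make_block(["bob pays carol 0.5"], prev_hash=genesis["hash"])

# Tamper with history: the recomputed hash no longer matches block2's link.
genesis["records"][0] = "alice pays bob 99.0"
payload = json.dumps({k: genesis[k] for k in ("timestamp", "records", "prev_hash")},
                     sort_keys=True).encode()
assert hashlib.sha256(payload).hexdigest() != block2["prev_hash"]
```

Real ledgers add consensus rules and digital signatures on top of this chaining, but the hash links are what let every node detect retroactive edits.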

The business potential of blockchain 

The potential impact of blockchain on business is massive. Imagine all the deals your firm won’t or can’t do today because you don’t know who is on the other end of the transaction and can’t be certain they own the assets they want to trade. 

For millions of potential trading partners, asset types and transactions, that uncertainty will cease to matter. The blockchain will identify participants, ensure all elements of a transaction are valid, enforce the ecosystem rules and guarantee everyone holds to them. 

Gone will be the slow, expensive, analog-based methods we have relied on to establish identity and legal status in commercial transactions since the 19th century.

Equally important is blockchain’s ability to enable faster and more diverse transactions — in both type and size — than is possible with traditional centralized systems.

For generations, businesses have relied on centralized infrastructures, such as payment systems, insurance, delivery and logistics services, and governments, to execute commercial transactions and manage risk. But these systems weren’t designed to handle the complexity, size and scale of the machine-to-machine transactions made possible by digital platforms. 

Single units of data and digital business assets, such as cryptocurrency, reward points or fractions of an asset, are (or soon will be) tradable over digital networks as part of the programmable economy. Units worth less than $0.01 can be traded by the millions or trillions. Current payment systems can’t cost-effectively and securely process transactions below a certain value, nor can they handle the volumes now possible.

Businesses need a different way to deal with new digital assets and interactions without involving an intermediary that collects data on every party and takes a cut of the value. Blockchain promises a solution.

How blockchain works

Blockchains are a specific type of distributed (digital) ledger in which records can only be created and read, never updated or deleted. Not all distributed ledgers are blockchains, but all blockchains are distributed ledgers.

Complete blockchains combine five design elements to authenticate users, validate transactions and record that information in a way that can’t be corrupted by a single participant or changed after the fact. They also enable the adoption of decentralized ecosystem governance and deployment of programmable capabilities, including the creation and use of digital business assets or new forms of money.

While complete blockchain solutions do exist, many of today’s enterprise initiatives only include some of the elements — distribution, encryption and immutability. Often missing are tokenization to exchange value, and decentralization to enable consensus-driven governance. 

These incomplete or “blockchain inspired” solutions can still add value by digitalizing manual processes, such as real-estate purchases, or by enabling more efficient information exchange in multiparty transactions, such as insurance claims. But without all five elements, their value is limited in terms of new revenue growth. 

Five key elements of blockchain

A complete blockchain incorporates all five elements:

  • Distribution. Blockchain participants, connected on a distributed network, operate nodes (computers) that run a program to enforce the business rules of the blockchain. Nodes also keep a full copy of the ledger, which updates independently when new transactions occur.
  • Encryption. Blockchain uses technologies such as public and private keys to record data securely and semi-anonymously. During the process of creating a Bitcoin wallet, for example, the blockchain generates an address for the participant that is visible to all network participants but provides pseudonymity.
  • Immutability. Completed transactions are cryptographically signed, time-stamped and sequentially added to the ledger. Records can’t be changed unless all participants agree to do so. Such an agreement is known as a fork. 
  • Tokenization. Value is exchanged in the form of tokens, which can represent a wide variety of asset types, including monetary assets, units of data or user identities. Token use can be programmed via smart contracts. Tokenization, or the creation of tokens, is the way a blockchain represents and enables trade via digital business assets.
  • Decentralization. No single entity controls a majority of the nodes or dictates the rules. A consensus mechanism verifies and approves transactions, eliminating the need for a central intermediary to govern the network. Decentralization — and its inverse, centralization — comprises three core elements: technology, economics and decision making. Each can be adjusted to vary the manner in which governance is applied to the ecosystem.
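
To make the encryption and pseudonymity elements above concrete, consider how a shareable address can be derived from a key pair. The sketch below is illustrative only: real blockchains derive the public key from the private key with elliptic-curve math, and Bitcoin hashes it with RIPEMD-160 plus a checksum, both of which are approximated here with plain SHA-256.

```python
import hashlib, secrets

# Stand-in key pair: a real chain derives the public key from the private
# key via elliptic-curve math; this sketch fakes that step with a hash.
private_key = secrets.token_bytes(32)              # known only to the participant
public_key = hashlib.sha256(private_key).digest()  # safe to reveal

# The address other participants see is a hash of the public key: usable
# for sending value, visible to the whole network, but pseudonymous.
address = hashlib.sha256(public_key).hexdigest()[:40]
print("share this address:", address)
```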

Executives: Don’t ignore blockchain

Blockchain has already started to revolutionize ways of doing business, but even CIOs aren’t totally on board, let alone the rest of the executive leadership team. Gartner research shows that from 2016 to 2021, on average, 45% of CIOs said their organization had no interest in blockchain. 

Startups and leading digital enterprises are nevertheless deploying blockchain to solve problems and create value that conventional, centralized technologies and processes can’t. The cryptocurrency market is now worth around $2 trillion, and nation states and public and private companies own nearly $30 billion of Bitcoin alone.

Large companies in fashion, sports, consumer packaged goods, music and gaming are using nonfungible tokens (NFTs) to bridge online and offline worlds, especially in the Metaverse as a way to enhance their brands and engage more effectively with digital customers. In the supply chain space, FoodTrust now has over 300 members. In shipping, TradeLens is processing over 700 million events and 6 million documents per year. In financial services, the Depository Trust & Clearing Corporation (DTCC) plans to launch its Project ION stock settlement solution in 1Q 2022.

Blockchain is clearly heading out of the Gartner Hype Cycle’s Trough of Disillusionment, and now is the time to act. CIOs should alert and educate their board of directors, CEO and fellow business leaders about the fact that enterprises must accelerate blockchain initiatives as part of the enterprise digital transformation, or risk falling so far behind that they permanently lose a competitive edge.

Article link: https://www.gartner.com/en/articles/what-is-blockchain?

You don’t need to rewrite acquisition regulations to improve DoD buying

Posted by timmreardon on 03/04/2022
Posted in: Uncategorized.

By Jerry McGinn

Mar 2, 09:35 AM

Under Secretary of Defense for Research and Engineering Heidi Shyu recently outlined the Pentagon’s critical technology areas. Not surprisingly, 11 of the 14 are heavily commercial areas such as biotechnology, artificial intelligence, quantum science and advanced materials.

The Department of Defense’s increased engagement with commercial companies in recent years clearly demonstrates the critical role these and other technologies will play in future military systems. Unfortunately, DoD has struggled to adopt commercial technology at scale, in large part due to its industrial-age acquisition practices.

Defense Secretary Lloyd Austin acknowledged this in his remarks at December’s Reagan National Defense Forum when he noted the department “has to do better” at helping innovative companies cross the valley of death from “inception to prototype to adoption.”

Indeed, DoD must do better. To ensure strategic advantage against adversaries like Russia and China, the Pentagon needs an acquisition system that innovates, iterates and scales to develop and field effective military capabilities. This is true for commercial or dual-use technologies, but it’s also critical for the major defense programs that use software and advanced technologies to create the next generation of ships, vehicles and aircraft.

Getting to this objective end state requires first and foremost a change in mindset, a paradigm shift. At the Center for Government Contracting, we call this mindset Acquisition Next.

Defense acquisition today is optimized for the assembly lines of the industrial age, developing programs to build platforms and acquire services based on precisely designed military-specific requirements. Meanwhile, the broader economic system has entered the digital age.

Modern engineering and business practices have dramatically accelerated the product development cycle. Commodity production has been replaced by software, data and product design — intangible capital that requires knowledge labor. For example, in 1970, most of the value of a newspaper was in the paper, ink and printing equipment. By the 1990s, the news moved online, but firms still had to own servers. By the 2010s, almost the entire tech stack was virtualized. The largest and most innovative firms today are software natives.

For the Department of Defense to keep pace, it needs to adopt approaches based on modularity and iteration as opposed to linearity and prediction. We looked for this approach in a year-long study of real-world programs. Interviews with more than 75 government and industry professionals from a variety of backgrounds offered evidence that a paradigm shift is already underway.

Importantly, we focused on practices that can be undertaken today and do not require any changes in authority and regulations. Congress has given DoD numerous new and expanded authorities in recent years and DoD has been actively pursuing innovative approaches through initiatives such as the Adaptive Acquisition Framework. Moreover, there is actually a tremendous amount of flexibility in the much-maligned Federal Acquisition Regulation.

Our final report suggests a way to implement the Acquisition Next mindset through a series of “plays” or leading practices in defense acquisition programs. The first three apply at the total program level:

  • Requirements. Focus on short statements of outcomes to increase flexibility in solution design and create a stakeholder process for requirements iteration over time.
  • Market research. Develop an organizational capability for continuously engaging with industry to identify technologies and vendors that can increase program value.
  • Master the baseline. Determine which system elements are technically separable and pursue traditional contracting approaches for technologies with slower cycle times, while faster-moving applications receive modular contracts.

These practices enable the second group of three plays, which are suited to contracts with software intensive content:

  • Agile work statements. Separate technical direction from contract requirements and use a living roadmap adjusted to the product backlog and user feedback.
  • Modular contracts. Start with broad and flexible solicitations, then transition to multiple-award contract vehicles with recurring task orders and streamlined procedures.
  • Intellectual property. Rather than focus on specific standards, influence a microservices architecture with rights to interfaces and operational data.

Taking an Acquisition Next approach can help government programs become leaders in innovation again.

Many of these practices are being used in programs today, but widespread adoption will help drive culture change across the acquisition community. All that’s required is top cover from leadership and support up and down the chain of command.

While there is a need for structural reform in defense budgeting, our playbook demonstrates the flexibility in today’s authorities and regulations. To accelerate the next generation of military capabilities, we need to adopt an Acquisition Next mindset focused on modularity, speed, iteration and competition. Let’s get rolling.

Article link: https://www.defensenews.com/opinion/commentary/2022/03/02/you-dont-need-to-rewrite-acquisition-regulations-to-improve-dod-buying/

Jerry McGinn is executive director of the Center for Government Contracting in George Mason’s School of Business and a former senior DoD acquisition official.

How AI is reinventing what computers are – MIT Technology Review

Posted by timmreardon on 03/02/2022
Posted in: Uncategorized.

Three key ways artificial intelligence is changing what it means to compute.

by Will Douglas Heaven

October 22, 2021

Fall 2021: the season of pumpkins, pecan pies, and peachy new phones. Every year, right on cue, Apple, Samsung, Google, and others drop their latest releases. These fixtures in the consumer tech calendar no longer inspire the surprise and wonder of those heady early days. But behind all the marketing glitz, there’s something remarkable going on. 

Google’s latest offering, the Pixel 6, is the first phone to have a separate chip dedicated to AI that sits alongside its standard processor. And the chip that runs the iPhone has for the last couple of years contained what Apple calls a “neural engine,” also dedicated to AI. Both chips are better suited to the types of computations involved in training and running machine-learning models on our devices, such as the AI that powers your camera. Almost without our noticing, AI has become part of our day-to-day lives. And it’s changing how we think about computing.

What does that mean? Well, computers haven’t changed much in 40 or 50 years. They’re smaller and faster, but they’re still boxes with processors that run instructions from humans. AI changes that on at least three fronts: how computers are made, how they’re programmed, and how they’re used. Ultimately, it will change what they are for. 

“The core of computing is changing from number-crunching to decision-making,” says Pradeep Dubey, director of the parallel computing lab at Intel. Or, as MIT CSAIL director Daniela Rus puts it, AI is freeing computers from their boxes. 

More haste, less speed

The first change concerns how computers—and the chips that control them—are made. Traditional computing gains came as machines got faster at carrying out one calculation after another. For decades the world benefited from chip speed-ups that came with metronomic regularity as chipmakers kept up with Moore’s Law. 

But the deep-learning models that make current AI applications work require a different approach: they need vast numbers of less precise calculations to be carried out all at the same time. That means a new type of chip is required: one that can move data around as quickly as possible, making sure it’s available when and where it’s needed. When deep learning exploded onto the scene a decade or so ago, there were already specialty computer chips available that were pretty good at this: graphics processing units, or GPUs, which were designed to display an entire screenful of pixels dozens of times a second. 
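
The difference in workload is easy to see in miniature. The NumPy sketch below (an illustration, not code from any chipmaker) runs the same matrix multiply, the core operation of a neural-network layer, at 32-bit and at 16-bit precision; the low-precision result is slightly off but needs half the data movement, which is exactly the trade these chips are built around.

```python
import numpy as np

# A neural-network layer is mostly one big matrix multiply: huge numbers of
# independent multiply-accumulate operations that run in parallel on AI chips.
activations = np.random.rand(512, 1024).astype(np.float32)
weights = np.random.rand(1024, 256).astype(np.float32)

exact = activations @ weights  # the "precise" 32-bit result

# The same computation at half precision: each value carries fewer bits, so
# more values can move through memory and arithmetic units per second.
approx = activations.astype(np.float16) @ weights.astype(np.float16)

# Neural networks tolerate the small error that low precision introduces.
rel_error = np.abs(exact - approx.astype(np.float32)).max() / np.abs(exact).max()
print(f"worst-case relative error: {rel_error:.4%}")
```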

Now chipmakers like Intel, Arm, and Nvidia, which supplied many of the first GPUs, are pivoting to make hardware tailored specifically for AI. Google and Facebook are also forcing their way into this industry for the first time, in a race to find an AI edge through hardware.

For example, the chip inside the Pixel 6 is a new mobile version of Google’s tensor processing unit, or TPU. Unlike traditional chips, which are geared toward ultrafast, precise calculations, TPUs are designed for the high-volume but low-precision calculations required by neural networks. Google has used these chips in-house since 2015: they process people’s photos and natural-language search queries. Google’s sister company DeepMind uses them to train its AIs. 

In the last couple of years, Google has made TPUs available to other companies, and these chips—as well as similar ones being developed by others—are becoming the default inside the world’s data centers.

AI is even helping to design its own computing infrastructure. In 2020, Google used a reinforcement-learning algorithm—a type of AI that learns how to solve a task through trial and error—to design the layout of a new TPU. The AI eventually came up with strange new designs that no human would think of—but they worked. This kind of AI could one day develop better, more efficient chips. 

Show, don’t tell

The second change concerns how computers are told what to do. For the past 40 years we have been programming computers; for the next 40 we will be training them, says Chris Bishop, head of Microsoft Research in the UK. 

Traditionally, to get a computer to do something like recognize speech or identify objects in an image, programmers first had to come up with rules for the computer.

With machine learning, programmers no longer write rules. Instead, they create a neural network that learns those rules for itself. It’s a fundamentally different way of thinking. 
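
As a toy illustration of that shift (not drawn from Bishop’s work), the snippet below contrasts a hand-written rule with a one-neuron “network” that recovers the same rule purely from labeled examples.

```python
import numpy as np

# The old way: a programmer writes the rule explicitly.
def hand_coded_rule(x):
    return x[0] + x[1] > 1.0

# The new way: a one-neuron "network" learns the same rule from examples.
rng = np.random.default_rng(0)
X = rng.random((1000, 2))
y = (X[:, 0] + X[:, 1] > 1.0).astype(float)  # labels generated by the rule

w, b, lr = np.zeros(2), 0.0, 0.5
for _ in range(2000):  # plain gradient descent on logistic loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid prediction
    w -= lr * X.T @ (p - y) / len(X)
    b -= lr * (p - y).mean()

accuracy = ((1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5) == y).mean()
print(f"learned weights {w}, bias {b:.2f}, accuracy {accuracy:.1%}")
```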

Examples of this are already commonplace: speech recognition and image identification are now standard features on smartphones. Other examples made headlines, as when AlphaZero taught itself to play Go better than humans. Similarly, AlphaFold cracked open a biology problem—working out how proteins fold—that people had struggled with for decades. 

For Bishop, the next big breakthroughs are going to come in molecular simulation: training computers to manipulate the properties of matter, potentially making world-changing leaps in energy usage, food production, manufacturing, and medicine. 

Breathless promises like this are made often. It is also true that deep learning has a track record of surprising us. Two of the biggest leaps of this kind so far—getting computers to behave as if they understand language and to recognize what is in an image—are already changing how we use them.

Computer knows best

For decades, getting a computer to do something meant typing in a command, or at least clicking a button.

Machines no longer need a keyboard or screen for humans to interact with. Anything can become a computer. Indeed, most household objects, from toothbrushes to light switches to doorbells, already come in a smart version. But as they proliferate, we are going to want to spend less time telling them what to do. They should be able to work out what we need without being told.

This is the shift from number-crunching to decision-making that Dubey sees as defining the new era of computing.  

Rus wants us to embrace the cognitive and physical support on offer. She imagines computers that tell us things we need to know when we need to know them and intervene when we need a hand. “When I was a kid, one of my favorite movie [scenes] in the whole world was ‘The Sorcerer’s Apprentice,’” says Rus. “You know how Mickey summons the broom to help him tidy up? We won’t need magic to make that happen.”

We know how that scene ends. Mickey loses control of the broom and makes a big mess. Now that machines are interacting with people and integrating into the chaos of the wider world, everything becomes more uncertain. The computers are out of their boxes.

Article link: https://www-technologyreview-com.cdn.ampproject.org/c/s/www.technologyreview.com/2021/10/22/1037179/ai-reinventing-computers/amp/

Cyber Talk – Leveraging Breakthroughs in Privacy-Enhancing Technologies – DCO Defensive Cyber Operations

Posted by timmreardon on 03/01/2022
Posted in: Uncategorized.

Our next Cyber Talk is on how operations and intelligence analysis methods involving searches and analytics performed on external datasets can be very revealing. This exposure includes not only attribution (who is performing the operation), but also the content of the operation itself (what is being searched for), which may include sensitive information that would be extremely damaging to national security if exposed. Recent advances in an increasingly visible technology category known as Privacy Enhancing Technologies (PETs) are changing the paradigm of secure data usage by allowing sensitive indicators to be securely processed while remaining in the untrusted domain, extending the boundaries of trusted compute.

PETs are transformative because the mission-enabling capabilities they deliver are not making something better; they are making something entirely new possible. Users can encrypt the content of their search, analytic, or machine learning model, ensuring their interests and intents remain secure and private throughout the processing lifecycle. This allows U.S. Government organizations to leverage publicly available, open-source, and/or low-side, government-curated data sources in ways that were never before possible.
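
As rough intuition for how sensitive indicators can be processed while remaining in an untrusted domain, here is a toy keyed-hash “blind matching” sketch. It is not one of the specific PETs the talk covers (such as homomorphic encryption), and it omits the oblivious key-exchange step a real deployment would need so that the data holder never learns the key.

```python
import hmac, hashlib

def blind(indicator: str, key: bytes) -> str:
    """Keyed hash: comparable for equality, unreadable without the key."""
    return hmac.new(key, indicator.encode(), hashlib.sha256).hexdigest()

key = b"analyst-held secret"  # real PETs keep this from the data holder

# The analyst blinds a sensitive indicator before it leaves the trusted domain.
query = blind("203.0.113.42", key)

# The external dataset is blinded the same way (via an oblivious protocol in
# practice), so matching happens over opaque digests, not raw indicators.
blinded_dataset = {blind(ip, key) for ip in ("198.51.100.7", "203.0.113.42")}
print("match" if query in blinded_dataset else "no match")
```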

Event link: https://www.linkedin.com/events/https-www-zoomgov-com-j-16124336904437132174467072/

RUSSIA’S WAGNER GROUP AND THE RISE OF MERCENARY WARFARE – USMA MWI

Posted by timmreardon on 02/28/2022
Posted in: Uncategorized.

Kyle Atwell and Daphne McCurdy | 12.04.22


What role do private military companies (PMCs) such as Russia’s Wagner Group play on the modern battlefield? How should US policymakers and US and allied troops in conflict zones manage threats from armed groups when Russia denies their existence? Is war by private armies a rising trend in modern conflict?

Our two guests argue that the Wagner Group plays an important role in Russian military strategy. Russia’s use of PMCs has implications both operationally for US troops on the ground, as well as for US foreign policy more broadly. We discuss at length a February 2018 incident in which the Wagner Group and Syrian partner forces engaged in a multi-hour firefight with US special operations forces in Syria, allegedly resulting in hundreds killed on the Russian/Syrian side.

The conversation concludes with recommendations for both policymakers and practitioners on how to address Russia’s use of plausible deniability and PMCs. The answer is made more pressing by the expanded use of PMCs in combat arms and kinetic roles not just by Russia but other countries, in places such as Nagorno-Karabakh, Libya, and beyond.

Dr. Robert Hamilton is an associate professor of Eurasian studies at the US Army War College and a Black Sea fellow at the Foreign Policy Research Institute. In a thirty-year career in the US Army, spent primarily as a Eurasian foreign area officer, he served overseas in Saudi Arabia, Iraq, Germany, Belarus, Qatar, Afghanistan, the Republic of Georgia, Pakistan, and Kuwait. He is a graduate of the German Armed Forces Staff College and the US Army War College and holds a bachelor of science degree from the United States Military Academy, and a master’s degree in contemporary Russian studies and PhD in political science, both from the University of Virginia. In 2017 he led the US cell for ground deconfliction with Russia in Syria.

Candace Rondeaux is a professor of practice at the School of Politics and Global Studies and a senior fellow with the Center on the Future of War at Arizona State University. She has served as a strategic advisor to the US special inspector general for Afghanistan reconstruction and senior program officer at US Institute of Peace where she launched the RESOLVE Network, a global research consortium on violent extremism. She spent five years living and working in South Asia where she served as South Asia bureau chief for the Washington Post and as senior analyst on Afghanistan for the International Crisis Group. A graduate of Sarah Lawrence College, she holds a bachelor’s degree in Russian area studies, a master’s degree in journalism from New York University, and a master’s degree in public policy from the Woodrow Wilson School of Public and International Affairs at Princeton University. She is the author of the report “Decoding the Wagner Group: Analyzing the Role of Private Military Security Contractors in Russian Proxy Warfare.”

The Irregular Warfare Podcast is a collaboration between the Modern War Institute and Princeton University’s Empirical Studies of Conflict Project. You can listen to the full episode below, and you can find it and subscribe on Apple Podcasts, Stitcher, Spotify, TuneIn, or your favorite podcast app. And be sure to follow the podcast on Twitter!

Article link: https://mwi.usma.edu/russias-wagner-group-and-the-rise-of-mercenary-warfare/

How China built a one-of-a-kind cyber-espionage behemoth to last – MIT Technology Review

Posted by timmreardon on 02/28/2022
Posted in: Uncategorized.

A decade-long quest to become a cyber superpower is paying off for China.

By Patrick Howell O’Neill

February 28, 2022

The “most advanced piece of malware” that China-linked hackers have ever been known to use was revealed today. Dubbed Daxin, the stealthy back door was used in espionage operations against governments around the world for a decade before it was caught.

But the newly discovered malware is no one-off. It’s yet another sign that a decade-long quest to become a cyber superpower is paying off for China. While Beijing’s hackers were once known for simple smash-and-grab operations, the country is now among the best in the world thanks to a strategy of tightened control, big spending, and an infrastructure for feeding hacking tools to the government that is unlike anything else in the world.

This change has been going on for years, driven right from the very top. Soon after he ascended to power, President Xi Jinping began a reorganization of China’s military and intelligence agencies that prioritized cyberwarfare and initiated a “fusion” of military and civilian organizations geared toward boosting the nation’s cyber capabilities.

The results are new tools and tactics that have rapidly become more sophisticated and ambitious over the past decade. For example, Chinese government hackers have exploited more powerful zero-day vulnerabilities—previously undiscovered weaknesses in technology for which there is no known defense—than any other nation, according to congressional testimony from Kelli Vanderlee, an intelligence analyst at the cybersecurity firm Mandiant. Research shows that Beijing exploited six times as many such powerful vulnerabilities in 2021 as in 2020.

China’s offensive cyber capabilities “rival or exceed” those of the United States, said Winnona DeSombre, a research fellow at the Harvard Belfer Center, in congressional testimony on China’s cyber capabilities on February 17. “And its cyber defensive capabilities are able to detect many US operations—in some cases turning our own tools against us.”

Powerful tools

Daxin is just the latest powerful tool linked to China over the past year. It works by hijacking legitimate connections to hide its communications in normal network traffic. The result provides stealth and, on highly secure networks where direct internet connectivity is impossible, allows hackers to communicate across infected computers. The researchers who discovered it, from the cybersecurity firm Symantec, compare it to advanced malware they’ve seen that’s been linked to Western intelligence operations. It’s been in use at least as recently as November 2021.   

And in February of last year,  a massive hacking spree against Microsoft Exchange servers by multiple Chinese groups, beginning with zero-day exploits known as ProxyLogon vulnerabilities, showcased Beijing’s ability to coordinate an offensive so large in scale it seemed chaotic and reckless to outside observers. The onslaught effectively left a door wide open on tens of thousands of vulnerable email servers for any hacker to step through.

A quieter campaign uncovered in May saw multiple Chinese hacking groups use another zero-day vulnerability to successfully hack military, government, and tech industry targets across the United States and Europe.

People at the highest levels of power in China appreciate the importance of cyber capabilities. The CEO of Qihoo 360, the country’s biggest cybersecurity company, famously criticized Chinese researchers doing work outside the country and implored them to “stay in China” to realize the “strategic value” of powerful software vulnerabilities used in cyber-espionage campaigns. Within months, his company was linked to a hacking campaign against the country’s Uyghur minority. 

A wave of stricter regulations followed, tightening the government’s control of the cybersecurity sector and prioritizing the state’s security and intelligence agencies over all else—including the companies whose software is insecure. 

“The Chinese have a unique system reflecting the party-state’s authoritarian model,” says Dakota Cary, an analyst at Georgetown’s Center for Security and Emerging Technology. 

Chinese cyber researchers are effectively banned from attending international hacking events and competitions, tournaments they once dominated. A hacking contest pits some of the world’s best security researchers against one another in a race to find and exploit powerful vulnerabilities in the world’s most popular tech, like iPhones, Teslas, or even the kind of human-machine interfaces that help run modern factories. Prizes worth hundreds of thousands of dollars incentivize people to identify security flaws so that they can be fixed.  

Now, however, if Chinese researchers want to go to international competitions, they require approval, which is rarely granted. And they must submit everything to government authorities beforehand—including any knowledge of software vulnerabilities they might be planning to exploit. No other country exerts such tight control over such a vast and talented class of security researchers.

This mandate was expanded with regulation requiring all software security vulnerabilities to be reported to the government first, giving Chinese officials unparalleled early knowledge that can be used for defensive or offensive hacking operations.

“All of the vulnerability research goes through an equities process where the Chinese government gets right of first refusal,” says Adam Meyers, senior vice president of intelligence at the cybersecurity company CrowdStrike. “They get to choose what they’ll do with this, really increasing the visibility they have into the research being conducted and their ability to find utility in all of it.”

We’ve seen one exception to this rule: an employee of the Chinese cloud computing giant Alibaba reported the famous Log4j vulnerability to developers at Apache instead of first delivering it to Chinese government authorities. The result was a public punishment of Alibaba and an implicit warning for anyone else thinking of making a similar move.

China’s stricter policies have an impact well outside the country itself.

Over the last decade, the “bug bounty” model has provided millions of dollars to build a global ecosystem of researchers who find software security vulnerabilities and are paid to report them. Multiple American companies host marketplaces where any tech firm can put its own products up for close examination in exchange for bounties to the researchers. 

By any measurement, China ranks at or near the top in alerting American firms to vulnerabilities in their software. In his congressional testimony last week, Cary said an unnamed large American firm had disclosed to him that Chinese researchers received $4 million in 2021. The American companies benefit from the participation of these Chinese researchers. When the researchers report a bug, the companies can fix it. That’s been the status quo since the bounty programs began booming in popularity a decade ago.

However, as the Chinese government tightens control, this multimillion-dollar ecosystem is now delivering a steady stream of software vulnerabilities to Chinese authorities—effectively funded by the companies and at no cost to Beijing.

“China’s policy that researchers must submit vulnerabilities to the Ministry of Industry and Information Technology creates an incredibly valuable pipeline of software capabilities for the state,” says Cary. “The policy effectively bought at least $4 million worth of research for free.”

Robot Hacking Games

In 2016, a powerful machine called Mayhem won the Cyber Grand Challenge, a cybersecurity competition held by the US Defense Advanced Research Projects Agency.

Mayhem, which belongs to a Pittsburgh company called ForAllSecure, won by automatically detecting, patching, and exploiting software security vulnerabilities. The Pentagon is now using the technology in all military branches. Both the defensive and offensive possibilities were immediately obvious to everyone watching—including Chinese officials.

DARPA hasn’t run a similar program since 2016. China, on the other hand, has put on at least seven “Robot Hacking Games” competitions since 2017, according to Cary’s research. Chinese academic, military, and private-sector teams have all been drawn to competitions overseen by the Chinese military. Official documents tie automated discovery of software vulnerabilities directly to China’s national goals.
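
The loop such contests automate can be caricatured in a few lines. The toy fuzzer below plants a deliberately shallow bug and throws random inputs at it until one crashes; real systems like Mayhem add coverage feedback and symbolic execution to reach far deeper flaws.

```python
import random

def fragile_parser(data: bytes) -> int:
    """Stand-in target with a planted bug, like a contest binary."""
    if len(data) > 3 and data[0] == 0x7F:  # looks like a magic-byte check
        return data[100]                   # out-of-range read: the bug
    return 0

random.seed(1)
for attempt in range(1, 1_000_001):
    blob = bytes(random.randrange(256) for _ in range(random.randrange(1, 8)))
    try:
        fragile_parser(blob)               # most inputs are harmless
    except IndexError:
        print(f"crash after {attempt} attempts: {blob!r}")
        break
```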

As the Robot Hacking Games were beginning, the CEO of Qihoo 360 said automated vulnerability discovery tools were an “assassin’s mace” for China.

“Whoever masters the automatic vulnerability mining technology will have the first opportunity to attack and defend the network,” he said. Claiming that his own company had developed “a fully autonomous automatic vulnerability mining system,” he argued that the technology is the “‘killer’ of network security.”

The Robot Hacking Games are one example of the way Chinese officials at the highest level have been able to see an American success and then smartly make it their own.

“Time and again, China has studied the US system, copied its best attributes, and in many cases expanded the scope and reach,” says Cary.

As the US-China rivalry continues to function as the defining geopolitical relationship of the 21st century, cyber will play an outsize role in what China’s leaders rightfully call a “new era.” It touches everything from commercial competition to technological advancement and even warfare. 

In that new era, Xi’s stated goal is to make China a “cyber superpower.” By any measure, he’s done it.

Article link: https://www.technologyreview.com/2022/02/28/1046575/how-china-built-a-one-of-a-kind-cyber-espionage-behemoth-to-last/

The US must harness the power of Silicon Valley to spur military innovation – TechCrunch

Posted by timmreardon on 02/27/2022
Posted in: Uncategorized.

Steve Blank, Joe Felter, Raj Shah | 11:06 AM EST, February 24, 2022

The Department of Defense, other U.S. government agencies and a bipartisan consensus in Congress realize that China is strategically leveraging diplomacy, information and intelligence, its military might and economic strength, and all other instruments of its national power to redefine the future world order.

Given China’s stated goals and objectives, we should expect continuity in this assessment in the coming decades.

To any dispassionate observer, U.S. responses to China’s aggressive whole-of-government efforts to dominate – especially in the military domain – have been piecemeal and ineffective. The systems (and people) we have in place to respond (requirements, acquisition, budgeting) were designed to optimize lifecycle cost and manage 30-year DOTMLPF processes.

Yet, we need the opposite to compete with our strategic rival – speed, urgency, scale, short life cycles, and attritable systems. Existing DoD systems are not designed to effectively tap into a commercial technology ecosystem that’s now driving most of the DoD-relevant advanced tech (AI/ML, autonomy, biotech, quantum, access to space, semiconductors, etc.).

Many among the DoD’s senior military and civilian leaders understand this and have established well-intended innovation initiatives. But the enduring funding and efficacy of such initiatives often hinge on support from visionary individual leaders and are at risk when these key leaders’ tenures end.

The result is that our systems, organizations, headcount, and budget can’t scale to meet the challenge of China and other potential rivals. Our adversaries are innovating faster than our traditional systems can respond.

Many have written about reform of existing DoD systems (fix the planning, programming, budgeting and execution (PPBE) process; scale DoD accelerators and the Defense Innovation Unit; better utilizing existing acquisition authorities, etc.).

All are fine ideas, but they miss a core problem – the DoD has not engaged with commercial industry at scale. Harnessing the extraordinary potential of the U.S. private sector and bringing its vastly superior resources to bear more effectively is the key to prevailing in this strategic competition.

Silicon Valley is ready to get back in the game

For the first two decades after World War II, Silicon Valley was really “defense valley.” It built chips and systems for the DoD and intelligence community. Innovation in Silicon Valley started post-World War II with funding to Stanford University from the Office of Naval Research and then follow-on contracts from all the services to build advanced microwave and electronic systems.

The first major contract for the fledgling semiconductor companies was for the guidance systems for the Minuteman II intercontinental ballistic missile and then the Apollo spacecraft. During the Cold War, Lockheed was Silicon Valley’s largest employer, building three generations of submarine-launched ballistic missiles, satellites and other weapons systems. Arguably, our ability to mobilize the resources of Silicon Valley was critical to the U.S. ultimately prevailing in the Cold War competition with the Soviet Union.

Prevailing in this century’s strategic competition will require us to bring similar resources and talent to bear. Today, Silicon Valley sits at ground zero of a technology ecosystem that dwarfs the DoD, its prime contractors and federal labs. This ecosystem thrives on the toughest problems, moves with speed and urgency and, when incentivized, can bring capital and people at an enormous scale to solve these problems.

But the DoD is reluctant to acknowledge that this is a resource to tap at scale and speed. And because it hasn’t fully acknowledged it, it hasn’t considered what would be possible if you could marshal those resources.

And because it hasn’t imagined it, it hasn’t thought about what types of incentives could move the more than $300 billion per year in VC investments (versus $112 billion for DoD’s Research, Development, Test and Evaluation (RDT&E) Program or $132 billion in procurement) into areas that support dual-use activities with the potential to bolster our national security. (Think massive tax incentives, etc.)

The adage “to a hammer everything looks like a nail” aptly describes the prevailing approach to problem sets like threats to sovereignty and international law. Most thinking has been restricted to having more/better versions of existing weapon systems (ships, carriers, boomers, etc.) rather than alternate operational concepts and weapons that can be rapidly deployed to deter or win a war in the South China Sea or in the Baltics or Kaliningrad corridor.

It almost seems that conversations about new systems and concepts built around small, cheap, attritable, autonomous, lethal, mass, distributed, short-lifecycle projects from new vendors are prohibited. Yet, these are exactly the solutions an innovation ecosystem would rally around. Imagine, for example, 50 SpaceX equivalents helping to build a 21st-century DoD.

Bringing America’s best and brightest to strategic competition

Focusing and unleashing the power of the United States’ private sector and of Silicon Valley, in particular, with its unmatched innovation and extraordinary capital investment potential, can reverse the U.S. slide in capabilities relative to China and maintain our edge across a range of critical technologies.

Beyond that, we must more aggressively and deliberately harness the vast untapped potential of our world-renowned institutions of higher learning, namely the brilliant, innovative, and creative students and faculty that flock to America’s flagship universities.

We’ve succeeded at this before. Stanford and nearly every other major U.S. research university were integral to the military innovation ecosystem during the Cold War. What was unique in Silicon Valley, however, was that Stanford’s engineering department actively encouraged professors and graduate students to start military electronics companies, taking the best people and commercializing the technology to help win the race against the Soviet Union.

Inspired by this historic precedent, we recently established the Stanford Gordian Knot Center for National Security Innovation to harness Silicon Valley’s technology, talent, capital, speed, and passion for tough problems and help the United States prevail in this new era of strategic competition.

We must further advance efforts that coordinate resources across Silicon Valley and other innovation ecosystems. At our top universities, we must scale national security innovation education; train national security innovators; offer insight, integration, and policy outreach; and provide a continual output of minimal viable products that can act as catalysts for solutions to the toughest problems.

The stakes are too high not to bring all our resources and our very best human capital to the table.

Article link: https://techcrunch.com/2022/02/24/the-us-must-harness-the-power-of-silicon-valley-to-spur-military-innovation/

RISC-V AI Chips Will Be Everywhere – IEEE Spectrum

Posted by timmreardon on 02/26/2022
Posted in: Uncategorized.

Esperanto Technologies’ chip heralds new era in open-source architecture; Intel set to cash in

SAMUEL K. MOORE

24 FEB 2022

The adoption of RISC-V, a free and open-source computer instruction set architecture first introduced in 2010, is taking off like a rocket. And much of the fuel for this rocket is coming from demand for AI and machine learning. According to the research firm Semico, the number of chips that include at least some RISC-V technology will grow 73.6 percent per year to 2027, when there will be some 25 billion AI chips produced, accounting for US $291 billion in revenue.

The increase from what was still an upstart idea just a few years ago to today is impressive, but for AI it also represents something of a sea change, says Dave Ditzel, whose company Esperanto Technologies has created the first high-performance RISC-V AI processor intended to compete against powerful GPUs in AI-recommendation systems. According to Ditzel, during the early mania for machine learning and AI, people assumed general-purpose computer architectures—x86 and Arm—would never keep up with GPUs and more purpose-built accelerator architectures. 

“We set out to prove all those people wrong,” he says. “RISC-V seemed like an ideal base to solve a lot of the kinds of computation people wanted to do for artificial intelligence.” 

With the company’s first silicon—a 1,092-core AI processor—in the hands of a set of early partners and a major development deal with Intel [see sidebar], he might soon be proved right. 

Ditzel’s entire career has been defined by the theory behind RISC-V. RISC, as you may know, stands for reduced instruction set computer. It was the idea that you could make a smaller, lower-power but better-performing processor by slimming down the core set of instructions it can execute. IEEE Fellow David Patterson coined the term in a seminal paper in 1980. Ditzel, his student, was the coauthor. Ditzel went on to work on RISC processors at Bell Labs and Sun Microsystems before cofounding Transmeta, which made a low-power processor meant to compete against Intel by translating x86 code for a RISC architecture. 

With Esperanto, Ditzel saw RISC-V as a way to accelerate AI with relatively low power consumption. At a basic level, a more complex instruction set architecture means you need more transistors on the silicon to make up the processor, each one leaking a bit of current when off and consuming power when it switches states. “That was what was attractive about RISC-V,” he says. “It had a simple instruction set.” 

The Core

The core of RISC-V is a set of just 47 instructions. The actual number of x86 instructions is oddly difficult to enumerate, but it’s likely near 1,000. Arm’s instruction set is thought to be much smaller, but still considerably larger than RISC-V’s. But simply using a slim set of instructions wouldn’t be enough to achieve the computing power Esperanto was aiming for, says Ditzel. “Most of the RISC-V cores out there aren’t that small or that energy efficient. So it’s not just a question of us taking a RISC-V core and slapping 1,000 of them on a chip. We had to completely redesign the CPU so that it would fit into those very tough constraints.”

Notably missing from the RISC-V instruction set at the time Ditzel and his colleagues started work were the “vector” instructions needed to efficiently do the math of machine learning, such as matrix multiplication. So Esperanto engineers came up with their own. As embodied in the architecture of the processor core, the ET-Minion, these included units that do 8-bit integer vectors and both 32- and 16-bit floating-point vectors. There are also units that do more complex “tensor” instructions, and systems related to the efficient movement of data and instructions related to the arrangement of ET-Minion cores on the chip. 
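
The flavor of those integer-vector operations can be sketched in NumPy. This shows the general int8 multiply-accumulate pattern such units implement in hardware, not Esperanto’s actual instructions: quantize to 8 bits, multiply, and accumulate in a wider 32-bit register.

```python
import numpy as np

def int8_dot(a, b, scale_a, scale_b):
    """8-bit integer dot product with 32-bit accumulation, as ML vector units do."""
    a_q = np.clip(np.round(a / scale_a), -128, 127).astype(np.int8)
    b_q = np.clip(np.round(b / scale_b), -128, 127).astype(np.int8)
    acc = np.dot(a_q.astype(np.int32), b_q.astype(np.int32))  # wide accumulator
    return acc * scale_a * scale_b                             # rescale to real units

rng = np.random.default_rng(0)
a = rng.normal(size=256).astype(np.float32)
b = rng.normal(size=256).astype(np.float32)
print("fp32 result:", float(np.dot(a, b)))
print("int8 result:", int8_dot(a, b, scale_a=4 / 127, scale_b=4 / 127))
```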

The resulting system-on-chip, ET-SoC-1, is made up of 1,088 of the ET-Minion cores along with four cores called ET-Maxions, which help govern the Minions’ work. The chip’s 24 billion transistors take up 570 square millimeters. That puts it at around half the size of the popular AI accelerator Nvidia A100. Those two chips follow very different philosophies. 

The ET-SoC-1 was designed to accelerate AI in power-constrained data centers at the heart of boards that fit into the peripheral component interconnect express (PCIe) slot of already installed servers. That meant the board had only 120 watts of power available, but it would have to provide at least 100 trillion operations per second to be worthwhile. Esperanto managed more than 800 trillion in that power envelope. 

Most AI accelerators are built around a single chip that uses the bulk of the board’s power budget, Esperanto.ai principal architect Jayesh Iyer told technologists at the RISC-V Summit in December. “Esperanto’s approach is to use multiple low-power chips, which still fits within the power budget,” he said. 

Each chip consumes 20 W when performing a recommender-system benchmark neural network—less than one-tenth what the A100 can draw—and there are six on the board. That combination of power and performance was achieved by reducing the chips’ operating voltages without the expected sacrifice in performance. (Generally, a higher operating voltage means you can run the chip’s clock faster and get more computing done.) At 0.75 volts, the nominal voltage for the ET-SoC-1’s manufacturing process, a single chip would blow way past the board’s power budget. But when dropping the voltage to about 0.4 V, you can run six chips from the board’s 120 W and achieve better than a fourfold boost in recommender-system performance over the single higher-voltage chip. At that voltage, each ET-Minion core is consuming only about 10 milliwatts. 
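
The arithmetic behind that trade follows from the rule of thumb that dynamic CMOS power scales with frequency times voltage squared. With illustrative numbers (not Esperanto’s published figures), six slow, low-voltage chips beat one fast chip on throughput within the same 120 W budget:

```python
# Rule of thumb: dynamic power ~ frequency * voltage^2.
# Illustrative numbers only, not Esperanto's published measurements.
def relative_power(v, f, v0=0.75, f0=1.0):
    return (f / f0) * (v / v0) ** 2

fast = relative_power(v=0.75, f=1.0)  # one chip at nominal voltage: 1.00x budget
slow = relative_power(v=0.40, f=0.5)  # lower voltage forces a slower clock
print(f"per-chip power at 0.4 V: {slow:.2f}x")             # about 0.14x
print(f"six chips: {6 * slow:.2f}x power, {6 * 0.5:.1f}x throughput")
```

The exact crossover depends on leakage and on how far the clock must drop at low voltage, which is why low-voltage operation drove the core’s design, as Iyer explains next.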

“Low voltage operation is the key differentiator for Esperanto’s ET-minion [core] design,” said Iyer. It informed architectural and circuit-level decisions, he said. For instance, the core’s pipeline for the RISC-V integer instructions uses as few logic gates per clock cycle as possible, allowing a higher clock rate at the reduced voltage. And when the core is performing long tensor computations, that pipeline is shut down to save energy.

Other AI Processors

Other recently developed AI processors have also turned to a combination of RISC-V and their own custom machine-learning acceleration. For example, Ceremorphic, which recently came out of stealth with its Hierarchical Learning Processor, uses both a RISC-V and an Arm core along with its own custom machine-learning and floating-point arithmetic units. And Intel’s upcoming Mobileye EyeQ Ultra will have 12 RISC-V cores with its neural-network accelerators in a chip meant to provide the intelligence for Level 4 autonomous driving. 

Intel’s Big Bet on RISC-V

In perhaps its biggest move to jump-start its manufacturing business, Intel said it would create a US $1 billion fund to support startups and other potential customers of Intel Foundry Services. Though the company didn’t say where the first funds will go when it announced the fund, it is planning “investments and offerings that will strengthen the ecosystem and help drive further adoption of RISC-V.”

Putting its weight behind RISC-V was a smart move, because it makes the company seem more like a manufacturing partner than a competitor, according to Semico Research principal analyst Rich Wawrzyniak. “All at once they not only validated the RISC-V architecture, they said they’re open for business,” he says. “They’re not going to restrict people in terms of the CPU designs they want to build. That itself is probably more important than anything else.”

Intel says it will help RISC-V companies innovate faster by prioritizing their manufacturing runs and working with them to codevelop manufacturing technology that works for their designs, among other things. Intel, which is now a member of RISC-V International, singled out four major collaborations with RISC-V heavyweights that it expects will attract foundry customers:

Andes Technology: The San Jose–based company designs RISC-V processor cores for embedded systems, and through the new deal, Intel Foundry Services customers can integrate its low-power cores into their designs. Intel’s foundry capacity “ensures that SoC designs based on Andes RISC-V processor cores will achieve the production ramp and volume successfully,” said CEO Frankwell Lin in a press release.

Esperanto Technologies: The company’s current 1,092-core chip is made using TSMC’s N7 process. Through a strategic partnership, “Esperanto is getting access to Intel’s advanced technology nodes,” says CEO Dave Ditzel. The company could make the next generation of its SoC using the Intel 3 process, he says. “That’s not one but actually two generations ahead of its current design.” With that, Esperanto could increase the number of cores on its chips or make those chips smaller. The latter is more likely, he says. Different numbers of 1,092-core chiplets could then be combined in a package using Intel’s chiplet packaging technologies to tackle a variety of tasks.

SiFive: The Silicon Valley designer of RISC-V cores and development tools partnered with Intel Foundry Services to produce SiFive’s P550 multicore processors in the foundry’s Intel 4 manufacturing process, which is set to come online later this year. The resulting chips will go to developer boards meant to drive growth in RISC-V. It’s worth noting that SiFive extended an existing partnership with Samsung last April to integrate custom AI accelerator blocks with SiFive’s RISC-V blocks using Samsung’s foundry technology.

Ventana Microsystems: The startup has a deal to make its data-center-class tech available to Intel Foundry Services customers, who will be able to integrate Ventana’s cores into their SoCs using Intel’s “leading manufacturing process” or use its multicore chiplets to rapidly build custom systems.

Turning to RISC-V was both a business and technical move for embedded AI processor firm Kneron. The company has been selling chips and intellectual property using Arm CPU cores and its custom accelerator infrastructure. But last November Kneron released its first RISC-V-based tech in the KL530, aimed at supporting autonomous driving with a relatively new type of neural network called a vision transformer. According to Kneron CEO Albert Liu, the RISC-V architecture makes it easier to preprocess neural-network models so they run more efficiently. However, “it also made sense in light of the potential Arm acquisition by Nvidia last year to de-risk ourselves of any possible business decisions that could impact us,” he says. That deal fell apart in February but would have put the provider of Kneron’s previous CPU core architecture in the hands of a competitor. 

Future RISC-V processors will be able to tackle machine-learning-related operations using an open-source set of instructions agreed upon by the community. RISC-V International, the body that governs the codification of the core instruction set architecture and new extensions, ratified a set of just over 100 vector instructions in December 2021. 

With the new vector instructions, “somebody doing their own thing in AI doesn’t have to start from scratch,” says the organization’s CTO, Mark Himelstein. “They can use the instructions that other companies are using. They can use the tools that other companies are using. And then they can innovate in the implementation or power consumption, or performance, or whatever it is their added value is.” 
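
One practical consequence is the vector-length-agnostic style the ratified extension encourages: software asks the hardware how many elements it will handle per pass rather than hard-coding a vector width. The Python sketch below mimics that "strip-mining" pattern conceptually; hardware_vlen is a hypothetical stand-in for the V extension's vsetvl mechanism, not a real intrinsic.

    # Conceptual sketch of RISC-V-style vector-length-agnostic code.
    # hardware_vlen is a hypothetical stand-in for the V extension's vsetvl
    # mechanism, which tells software how many elements the hardware will
    # process on each pass of the loop.

    def hardware_vlen(remaining: int, max_lanes: int = 8) -> int:
        """Grant up to max_lanes elements for this strip of the loop."""
        return min(remaining, max_lanes)

    def vector_add(a: list[float], b: list[float]) -> list[float]:
        """Strip-mined elementwise add: each iteration models one vector op."""
        out, i, n = [], 0, len(a)
        while i < n:
            vl = hardware_vlen(n - i)   # ask hardware for this pass's width
            out.extend(x + y for x, y in zip(a[i:i + vl], b[i:i + vl]))
            i += vl
        return out

    print(vector_add([1.0, 2.0, 3.0], [0.5, 0.5, 0.5]))  # -> [1.5, 2.5, 3.5]

The payoff of this style is that the same binary runs unmodified on hardware with narrow or wide vector units, which is exactly what a shared standard is meant to enable.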

Even with the vector extensions, promoting machine learning remains a top priority for the RISC-V community, says Himelstein. Most of the development of ML-related extensions to RISC-V is happening in the organization’s graphics special interest group, which merged with the machine-learning group “because they wanted the same things,” he says. But other groups, such as those interested in high-performance and data-center computing, are also focusing on ML-related extensions. It’s Himelstein’s job to make sure the efforts converge where they can. 

Despite RISC-V’s successes, Arm leads in many of the markets where new AI functions are being added, and it is likely to still lead five years from now, with RISC-V capturing about 15 percent of total revenue in the market for CPU core designs, says Semico’s Wawrzyniak. “It’s not 50 percent, but it’s not 5 percent either. And if you think about how long RISC-V has been around, that’s pretty fast growth.”

Article link: https://spectrum.ieee.org/risc-v-ai

Vista Modernization Still a Priority at VA, but Funding Is a Question – FCW

Posted by timmreardon on 02/23/2022
Posted in: Uncategorized. Leave a comment

Two instances of the homegrown electronic health record have been moved to the cloud as part of a pilot project, but it’s not clear if funding or support exists for more cloud migration as VA transitions to a commercial electronic health record.

ADAM MAZMANIAN | FEBRUARY 22, 2022

Kurt DelBene, the new chief information officer at the Department of Veterans Affairs, said he planned to continue support of the agency’s homegrown electronic health records system during the transition to a commercial system scheduled to be completed in 2028.

The modernization project, launched in 2017 with a planned $16 billion budget, has its own section of the VA’s budget outside the Office of Information and Technology, and its leader reports to the agency’s deputy secretary, Donald Remy. But OI&T is deeply enmeshed in the shift to the Cerner commercial electronic health record system, providing support for infrastructure development and readiness.

At the same time, OI&T is responsible for keeping the Veterans Health Information Systems and Technology Architecture (Vista) operational. For now, most patient care is handled through Vista; the Cerner product is up and running at the Mann-Grandstaff VA Medical Center in Spokane, Wash., and a few more deployments are scheduled in the near term.

“We’re going to be using Vista for a period of time, and during that we can’t falter on delivering great care to veterans,” DelBene said on a Feb. 17 call with reporters. “So I think of Vista as a system that we absolutely have to continue to modernize. The health care landscape changes, and Vista needs to change and continue with the great support as it has had before.”

Todd Simpson, VA’s deputy assistant secretary of DevSecOps, said on the same call that two of the 130 separate instances of Vista have been put into VA’s Amazon Web Services cloud as part of a pilot. In addition, all the backup data for Vista has been moved into the cloud.

Simpson said it was possible that more instances of Vista could be moved to the cloud. “It comes down to funding and overall strategic direction driven from [Electronic Health Record Modernization] leadership and, of course, the customer,” meaning the Veterans Health Administration.

Funding is a big issue for Vista, in part because its costs are difficult to measure. The homegrown application now has 130 separate instances in use at different VA facilities. 

Because of the way the legacy system has sprawled into multiple systems, “VA lacks a comprehensive definition of Vista and cannot accurately report Vista costs,” the agency stated in its 2022 budget request.

Additionally, DelBene said plans to sunset Vista will have to wait for the EHRM project to advance. “Part of that is further out,” DelBene said. “My major commitment is to continue to support necessary changes. Then we’ll figure out what transition looks like.”

Article link: https://fcw.com/2022/02/vista-modernization-still-priority-va-funding-question/362219/
