healthcarereimagined

Envisioning healthcare for the 21st century

  • About
  • Economics

Public Health Data Still Facing Challenges Despite Previous Recommendations, Watchdog Says – Nextgov

Posted by timmreardon on 09/28/2022
Posted in: Uncategorized. Leave a comment

By KIRSTEN ERRICK, SEPTEMBER 26, 2022

The Government Accountability Office highlighted deficiencies in common data standards, interoperability and public health IT infrastructure.

In light of public health emergencies like the rapidly-evolving COVID-19 pandemic, the importance of health data and the ability to share it is critical, according to a Government Accountability Office report released Thursday.

The watchdog found that public health entities do not have the ability to share new data and other important information in real-time, hindering the ability to respond to health crises quickly. 

GAO stated that there are numerous challenges to the way the government manages public health data. In particular, the watchdog noted three deficiencies that the government must address: common standards for collecting data, interoperability and public health IT infrastructure.

Common data standards are “requirements for public health entities to collect certain data elements, such as patient characteristics (e.g., name, sex, and race) and clinical information (e.g., diagnosis and test results) in a specific way.” A lack of such standards can make it difficult for federal agencies to compare data across different states, a challenge the Centers for Disease Control and Prevention faced in collecting information on the COVID-19 pandemic.

Interoperability is “the ability of data collection systems to exchange information with and process information from other systems.” GAO noted that the lack of interoperability during the early days of COVID-19 forced health officials and hospitals to manually input data into several systems, affecting the CDC’s ability to make timely decisions.

Lastly, public health IT infrastructure refers to “the computer software, hardware, networks and policies that enable health entities to report and retrieve data and information.” Not having a public health IT infrastructure can hinder the timeliness and completeness of information shared during public health emergencies. For example, during the early stages of the COVID-19 pandemic, some states manually collected, processed and transferred data, which increased the likelihood of errors. 

While the Department of Health and Human Services launched the HHS Protect platform in April 2020 to provide a centralized source of information reflecting COVID-19 numbers, public health and state organizations questioned the completeness and accuracy of the data.

The report noted that federal law requires HHS to establish a national public health situational awareness network with a standard data format that provides secure, real-time information to help detect and respond to infectious diseases. However, this network has not been implemented, nor have previous issues identified by GAO been addressed. According to a previous GAO report, HHS Protect does not satisfy all of the legal requirements for this network. As a result, the watchdog states that the legally mandated network does not currently exist.

Specifically, the report stated, “We agree that HHS has implemented several systems related to public health situational awareness and biosurveillance [including HHS Protect]. However, the law called for a near real-time electronic nationwide public health situational awareness and biosurveillance capability through an interoperable network of systems to share data and information. This capability was to be made up of interoperable systems that would enable the simultaneous sharing of information needed to enhance situational awareness at the federal, state, local and tribal levels of public health. This capability does not exist today.” 

According to GAO, “having near real-time access to these data could significantly improve our nation’s preparedness for public health emergencies and potentially save lives. Without the network, federal, state and local health departments, hospitals and laboratories are left without the ability to easily share health information in real-time to respond effectively to diseases.”

GAO stated that it previously made several recommendations related to the three challenges, but these have not been implemented.

For common data standards, GAO recommended that HHS establish an expert committee for data collection and reporting standards and for the CDC to define specific actions and a timeframe for its data modernization efforts. 

To address the issue of interoperability, GAO recommended that HHS ensure that its public health situational awareness network includes how interoperability standards will be used. 

For public health IT infrastructure, GAO recommended that HHS prioritize development of the network by establishing near- and long-term actions to demonstrate progress, identify an office to oversee the network’s development, and identify and document challenges to information sharing and lessons learned from the COVID-19 pandemic.

The most recent report did not include how the agencies responded to GAO’s recommendations. However, the previous recommendations were met with mixed agreement by HHS.

Article link: https://www.nextgov.com/analytics-data/2022/09/public-health-data-still-facing-challenges-despite-previous-recommendations-watchdog-says/377665/

Artificial intelligence reduces a 100,000-equation quantum physics problem to only four equations

Posted by timmreardon on 09/26/2022
Posted in: Uncategorized. Leave a comment

by Thomas Sumner , Simons Foundation

Using artificial intelligence, physicists have compressed a daunting quantum problem that until now required 100,000 equations into a bite-size task of as few as four equations—all without sacrificing accuracy. The work, published in the September 23 issue of Physical Review Letters, could revolutionize how scientists investigate systems containing many interacting electrons. Moreover, if scalable to other problems, the approach could potentially aid in the design of materials with sought-after properties such as superconductivity or utility for clean energy generation.

“We start with this huge object of all these coupled-together differential equations; then we’re using machine learning to turn it into something so small you can count it on your fingers,” says study lead author Domenico Di Sante, a visiting research fellow at the Flatiron Institute’s Center for Computational Quantum Physics (CCQ) in New York City and an assistant professor at the University of Bologna in Italy.

The formidable problem concerns how electrons behave as they move on a gridlike lattice. When two electrons occupy the same lattice site, they interact. This setup, known as the Hubbard model, is an idealization of several important classes of materials and enables scientists to learn how electron behavior gives rise to sought-after phases of matter, such as superconductivity, in which electrons flow through a material without resistance. The model also serves as a testing ground for new methods before they’re unleashed on more complex quantum systems.

The Hubbard model is deceptively simple, however. Even for a modest number of electrons, and even with cutting-edge computational approaches, the problem requires serious computing power. That’s because when electrons interact, their fates can become quantum mechanically entangled: Even once they’re far apart on different lattice sites, the two electrons can’t be treated individually, so physicists must deal with all the electrons at once rather than one at a time. With more electrons, more entanglements crop up, making the computational challenge exponentially harder.

One way of studying a quantum system is by using what’s called a renormalization group. That’s a mathematical apparatus physicists use to look at how the behavior of a system—such as the Hubbard model—changes when scientists modify properties such as temperature or look at the properties on different scales. Unfortunately, a renormalization group that keeps track of all possible couplings between electrons and doesn’t sacrifice anything can contain tens of thousands, hundreds of thousands or even millions of individual equations that need to be solved. On top of that, the equations are tricky: Each represents a pair of electrons interacting.

Di Sante and his colleagues wondered if they could use a machine learning tool known as a neural network to make the renormalization group more manageable. The neural network is like a cross between a frantic switchboard operator and survival-of-the-fittest evolution. First, the machine learning program creates connections within the full-size renormalization group. The neural network then tweaks the strengths of those connections until it finds a small set of equations that generates the same solution as the original, jumbo-size renormalization group. The program’s output captured the Hubbard model’s physics even with just four equations.
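
The details live in the paper, but the underlying loop is easy to caricature. The toy sketch below (Python with NumPy and SciPy, not the authors’ code) fits a small surrogate system of four coupled equations so that its solution reproduces an observable of a much larger coupled system; a derivative-free optimizer stands in for the neural network that “tweaks the strengths of those connections.” All sizes, dynamics, and observables are made up for illustration.

```python
# Toy sketch only (not the authors' code): fit a small surrogate system so its
# solution reproduces an observable of a much larger coupled system, loosely
# mirroring "tweak the couplings until the tiny model matches the big one."
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

N, n_small, T, steps = 100, 4, 1.0, 200      # toy stand-ins for the real sizes
A = -np.eye(N) + 0.05 * rng.standard_normal((N, N))   # "full" couplings
x0 = rng.standard_normal(N)

def trajectory(M, y0):
    """Forward-Euler integration of dy/dt = M y; returns |y(t)| over time."""
    y, dt, out = y0.copy(), T / steps, []
    for _ in range(steps):
        y = y + dt * (M @ y)
        out.append(np.linalg.norm(y))
    return np.array(out)

target = trajectory(A, x0)                   # observable of the full system

y0_small = rng.standard_normal(n_small)
y0_small *= np.linalg.norm(x0) / np.linalg.norm(y0_small)   # match initial scale

def mismatch(flat_b):
    B = flat_b.reshape(n_small, n_small)     # learnable couplings of the surrogate
    return np.mean((trajectory(B, y0_small) - target) ** 2)

# A derivative-free optimizer stands in for the machine-learning step.
result = minimize(mismatch, (-np.eye(n_small)).ravel(), method="Nelder-Mead",
                  options={"maxiter": 4000})
print("surrogate mismatch after fitting:", result.fun)
```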

“It’s essentially a machine that has the power to discover hidden patterns,” Di Sante says. “When we saw the result, we said, ‘Wow, this is more than what we expected.’ We were really able to capture the relevant physics.”

Training the machine learning program required a lot of computational muscle, and the program ran for entire weeks. The good news, Di Sante says, is that now that they have their program coached, they can adapt it to work on other problems without having to start from scratch. He and his collaborators are also investigating just what the machine learning is actually “learning” about the system, which could provide additional insights that might otherwise be hard for physicists to decipher.

Ultimately, the biggest open question is how well the new approach works on more complex quantum systems such as materials in which electrons interact at long distances. In addition, there are exciting possibilities for using the technique in other fields that deal with renormalization groups, Di Sante says, such as cosmology and neuroscience.

Article link: https://phys-org.cdn.ampproject.org/c/s/phys.org/news/2022-09-artificial-intelligence-equation-quantum-physics.amp

White House: U.S. agencies have 90 days to create inventory of all software

Posted by timmreardon on 09/24/2022
Posted in: Uncategorized. Leave a comment

The White House released new guidance this week ordering federal agencies to create a full inventory of the software they use within 90 days. 

In a letter to all heads of executive departments and agencies, White House Office of Management and Budget (OMB) director Shalanda Young said a wide-ranging cybersecurity executive order handed down last May by President Joe Biden directed the National Institute of Standards and Technology (NIST) to publish guidance on how the agencies can better protect government systems through more secure software. 

Now that NIST has finished creating its guidance, the OMB wants all agencies to implement it for any third-party software used with an organization’s computer systems. The rules do not apply to software developed by agencies themselves. 

One of the key tenets of the NIST guidance is having agencies ask the producers of software to show they have followed a “risk-based approach for secure software development,” and agencies are now banned from using software that does not comply with the NIST guidelines. 

The software vendors will need to send agencies a “self-attestation” letter about the product’s security features, recent changes and more. The vendors also have to attest to following “secure development practices.”

In addition to the 90-day deadline for creating an inventory of all software used, agency chief information officers have 120 days to build out a process for communicating the new requirements to software vendors. Within 270 days, agencies need to collect letters from vendors about “critical” software. Agencies will need to have letters from vendors about all software — critical and otherwise — by next September. 

The letter gives agency CIOs one more task: they have six months to train employees to validate what the software companies claim in their letters. Any extensions to the timelines need to be applied for within 30 days of the deadline.

Chris DeRusha, Federal Chief Information Security Officer and Deputy National Cyber Director, said it is important for agencies to embark on this process because they handle “everything from tax returns to veterans’ health records.”

DeRusha specifically cited the SolarWinds scandal as one of the reasons why the effort was paramount for agencies, noting that the 2020 incident saw several federal agencies and corporations compromised by malicious code that was added into SolarWinds software.

The small change created a backdoor into the digital infrastructure of federal agencies and private sector companies.

“This incident was one of a string of cyber intrusions and significant software vulnerabilities over the last two years that have threatened the delivery of Government services to the public, as well as the integrity of vast amounts of personal information and business data that is managed by the private sector,” DeRusha said.

He added that the goal of the effort is to ensure that “millions of lines of code that underpin Federal agencies’ work are built with industry security standards in place.”

Young, the OMB director, said these tasks are part of a larger effort to get agencies to regularly use NIST guidance when choosing what software to use. 

Young also outlined a process where the “self-attestation” letters from vendors can be supplemented or replaced by a third-party assessment provided by a federally-certified organization or government agency. 

The letter to agencies adds that they can require a Software Bill of Materials (SBOMs) — an ingredient list of parts that make up a piece of software. 

In recent months, the Cybersecurity and Infrastructure Security Agency (CISA) has led an industry-wide push to make SBOMs a common part of releasing any piece of software, with the hope that the practice will make it easier to find potentially vulnerable features. Cybersecurity experts and lawmakers say that SBOMs would make it easier for organizations to deal with issues like the Log4j vulnerability uncovered in December 2021, because it would help identify whether they’re at risk and how to mitigate the threat.
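
For readers unfamiliar with the format, the sketch below shows roughly what a minimal SBOM can look like, using CycloneDX-style JSON fields emitted from Python; the component names, versions, and package URLs are made up for illustration and are not drawn from the OMB guidance.

```python
# Minimal illustration (not an official template): emit a CycloneDX-style SBOM
# listing the third-party components a build pulls in. Names, versions, and
# purls below are hypothetical examples.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "log4j-core",            # hypothetical dependency
            "version": "2.17.1",
            "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.17.1",
        },
        {
            "type": "library",
            "name": "requests",              # hypothetical dependency
            "version": "2.28.1",
            "purl": "pkg:pypi/requests@2.28.1",
        },
    ],
}

with open("sbom.cdx.json", "w") as fh:
    json.dump(sbom, fh, indent=2)
```

A consumer holding such a file can search it for an affected component the moment a new vulnerability like Log4j is announced, which is the use case cited above.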

OMB is working with CISA and the General Services Administration to create a centralized repository for software attestations. CISA was also given a range of tasks to complete over the next year that include creating some of the guidelines agencies will need to follow going forward. 

DeRusha explained that the guidance released this week was created with input from the public and private sector as well as academia. It will make it easier for federal authorities to quickly identify security gaps when new vulnerabilities are discovered, he added.

“Not too long ago, the only real criteria for the quality of a piece of software was whether it worked as advertised,” DeRusha said. “With the cyber threats facing Federal agencies, our technology must be developed in a way that makes it resilient and secure, ensuring the delivery of critical services to the American people while protecting the data of the American public and guarding against foreign adversaries.”

“Long Overdue”

Rick McElroy, principal cybersecurity strategist for VMware, told The Record that while the timeline “seems aggressive,” all of the goals listed in the advisory are “worthy and long overdue.”

He added that it will have a major impact on any provider of technology services or software to governmental agencies, noting that suppliers will now need to be prepared to respond to the requirements outlined. 

Additionally, because the government spends billions of dollars on software, the requirements might have a downstream effect — software developers might decide to offer these features to regular customers.

Immersive Labs’ Kev Breen explained that executives’ focus on rapidly shipping new products to market means that cybersecurity is not always the top priority, potentially exposing companies to millions in lost revenue and damaged brand reputations.

Former Obama administration cybersecurity commissioner Tom Kellermann told The Record that software supply chains are now under siege thanks to a deluge of cybercriminals and spies specifically targeting software development, integration, and delivery infrastructure.

“Given the sophistication of recent software supply chain cyberattacks, ensuring software integrity is paramount to protecting Federal systems from systemic cyberattacks,” he said.

DeRusha said the software changes are part of a larger effort by the Biden-Harris Administration to modernize agency cybersecurity practices, improve detection and response to threats as well as “quickly investigate and recover from cyberattacks.”

“The guidance released today will help us build trust and transparency in the digital infrastructure that underpins our modern world and will allow us to fulfill our commitment to continue to lead by example while protecting the national and economic security of our country,” he said.

Jonathan Greig 

Jonathan has worked across the globe as a journalist since 2014. Before moving back to New York City, he worked for news outlets in South Africa, Jordan and Cambodia. He previously covered cybersecurity at ZDNet and TechRepublic.

Article link: https://therecord-media.cdn.ampproject.org/c/s/therecord.media/white-house-u-s-agencies-have-90-days-to-create-inventory-of-all-software/amp/

VA Official Has ‘Deep Concerns’ About Agency’s EHR Deployment – Nextgov

Posted by timmreardon on 09/23/2022
Posted in: Uncategorized. Leave a comment

By EDWARD GRAHAM, SEPTEMBER 22, 2022

But leadership at the agency pushed back against projected cost overruns and delays moving forward.

Senators pressed officials from the Veterans Affairs Department to address cost and usability issues hindering the rollout of VA’s multi-billion dollar Oracle-Cerner Millennium electronic health record system during a Senate Appropriations Subcommittee on Military Construction, Veterans Affairs and Related Agencies hearing on Wednesday.

Dr. Shereef Elnahal, VA’s undersecretary for health, said he has “deep concerns about the system as it’s functioning for frontline employees and service to veterans,” noting that he visited the VA Central Ohio Healthcare System in Columbus earlier this year following the medical center’s EHR deployment and “saw folks struggling with this system deeply.”

“Among the most concerning things that I saw was a phenomenon whereby our frontline clinicians, when they put in an order or were trying to interface with the system, they were not confident in many cases and in many clinical settings that those orders were actually getting where they needed to go,” Elnahal said, adding that the experience led to a broader review within VA of how workflows and the configuration of the EHR system could be improved.

Sen. Martin Heinrich, D-N.M., who chairs the subcommittee, reiterated his support for VA’s EHR modernization efforts, calling it “an extremely important effort to solve a decades-long problem.” But Heinrich expressed concern about VA’s transparency throughout the EHR deployment process, citing cost overruns that were not initially conveyed to Congress when the program was launched and an implementation process that has been hindered by delays and functionality concerns. 

“I am glad VA is not rushing deployments until there’s more confidence in the likelihood of success, but the department needs to be straightforward with Congress about what is reasonable and achievable,” Heinrich said.

VA signed a $10 billion, 10-year contract with Cerner in 2018 to implement new, interoperable EHR software across the department’s national network of 171 medical centers, replacing its legacy system. The EHR system rollout, however, has been plagued by software outages, logistical delays and patient safety issues that have hampered the deployment process. A highly critical watchdog report released by the VA Inspector General’s office in July found that the EHR system deployed at the department’s initial rollout site—the Mann-Grandstaff VA Medical Center in Spokane, Washington—routed more than 11,000 clinical orders for veterans to an “unknown queue” without alerting clinicians. 

Sen. John Boozman, R-Ark., the subcommittee’s ranking member, noted that approximately $8.5 billion has already been appropriated to launch the EHR system over the past five years, and that VA is seeking another $1.75 billion for the program for fiscal year 2023. 

“At the outset, we were told that this program would cost no more than $16 billion and would be complete in 10 years,” Boozman said. “In the years since, VA has deployed the new system at only a small handful of sites, and those rollouts have been challenging, to say the least.”

VA’s beleaguered EHR deployment has garnered bipartisan criticism from lawmakers, who passed legislation—the VA Electronic Health Record Transparency Act—earlier this year requiring that the VA Secretary submit reports to Congress on cost, performance and patient safety issues related to the system’s deployment. President Joe Biden signed the bill into law in June. The Senate subcommittee hearing was held one day after lawmakers on two House Veterans’ Affairs subcommittees faulted VA for its major acquisition failures, including its EHR system deployment. 

Sen. Jon Tester, D-Mont., who chairs the Senate Veterans’ Affairs Committee, criticized VA’s lack of progress deploying the new EHR system and said “I don’t know if we’ve got a return on investment to speak of at all.”

“We’re into this damn near five years—it’ll be five years in May—and we still haven’t done a damn thing,” Tester said. “I mean, we’ve implemented, and it’s been a trainwreck, in my opinion.”

The VA had rolled out the EHR system at five medical sites before announcing in July that it was postponing future deployments of the software until next year. VA initially delayed a planned EHR rollout at Boise VA Medical Center in June, before indefinitely postponing the site’s rollout in July after an assessment determined that “more could be done to ensure a safe and successful deployment.” VA’s EHR deployment schedule currently has the software slated to go live at 25 medical sites, including Boise, in fiscal year 2023.

VA Deputy Secretary Donald Remy said the department is “looking closely at the schedule” and that “there are issues that need to be resolved before we can go live.”

Remy said that VA uses a site readiness checklist—which includes issues related to training, infrastructure and patient safety protections—to determine whether or not the EHR system is ready for deployment. That review process, he said, will largely guide future rollouts. 

“As we’re working with a site for potential deployment, we work through these issues to make sure they have them covered,” Remy added. “An example of the effectiveness of that checklist was Boise recently, where we determined we weren’t going live as we went through the checklist.”

A cost estimate provided to Congress by the Institute for Defense Analyses—or IDA—in July estimated that the lifecycle cost of the EHR system’s implementation would be more than $50 billion over 28 years, with the EHR deployment process at all of VA’s medical centers taking 13 years. As was noted during the hearing, VA has not yet allowed IDA to publicly release its cost estimate. 

Dr. Brian Rieksts, a research staff member in IDA’s cost analysis and research division, said that VA’s 10-year, $16 billion implementation and cost estimate did not take into account assumed productivity losses at medical centers deploying the EHR software, as well as sustainment costs needed to keep the EHR systems operational. And Rieksts told Boozman that IDA’s additional three-year deployment estimate was based on risk analysis.

“We estimate a range of one to five additional years over the 10-year period that will be required, and that’s based on both looking at historical programs and the challenges that they’ve had, and then events that have happened with the current program that have led to delays that don’t indicate that this program would behave differently than historical programs,” Rieksts said. 

Remy said, however, that VA still believes it can accomplish the EHR rollout within the 10-year contract period—although he added the caveat that the department is committed to doing so in a “safe, effective manner for our clinicians and our veterans.”

“If that needs to go beyond 10 years, we’re working through the process of determining what that time period might be,” Remy said. 

Mike Sicilia, executive vice president for industries at Oracle, told the senators that proper implementation of the EHR system was Oracle’s “most important” priority and added that the company has moved 2,000 employees onto the project to complement the existing Cerner team. Sicilia said Oracle—which acquired Cerner in June—still believes it “can deliver this system within the budget envelope and without the need for any additional funds.” 

Sicilia cited Oracle’s recent partnership with Accenture to “evaluate the current training program,” as well as the launch this week of a new dashboard “that catalogs our to-do list and progress being made” on the project, as examples of the company’s commitment to ensuring the success of the EHR system’s deployment moving forward. 

“As we have examined the underlying causes for these delays and challenges, our conclusion is that we have found nothing that can’t be addressed in reasonably short order to get us back on a workable schedule and within budget,” Sicilia said. “We know we have a lot to prove with deployments next year at larger, more complex sites. We view the next year as a key window for building momentum and turning the corner.”

Article link: https://www.nextgov.com/it-modernization/2022/09/va-official-has-deep-concerns-about-agencys-ehr-deployment/377508/

VA Acquisition Management: Action Needed to Ensure Success of New Oversight Framework – GAO

Posted by timmreardon on 09/22/2022
Posted in: Uncategorized. Leave a comment

Fast Facts

For over a decade, VA has tried with little success to implement a framework for managing and overseeing how it purchases goods and services. We added VA acquisition management to our High Risk List in 2019.

VA plans to implement a new framework but hasn’t:

  • Identified major acquisition programs subject to the new framework
  • Addressed acquisition workforce needs

If VA takes steps to address these and other challenges prior to implementing the new framework, VA could acquire goods and services more efficiently to support veterans.

We recommended VA take steps to address challenges before implementing the new framework.


Highlights

What GAO Found

For over a decade, the Department of Veterans Affairs (VA) has worked to implement a framework for managing how it purchases goods and services, with little success. GAO found that VA has not used its Acquisition Program Management Framework, its current framework, which has been in place since 2017. This framework includes features—such as phases, key documents, and identified decision authorities—that could provide standardized management and oversight of VA’s major acquisitions. These features generally align with GAO-identified acquisition leading practices. However, VA’s major acquisition programs that GAO reviewed instead use program-specific approaches that vary widely in robustness.

VA plans to implement its proposed acquisition framework—the Acquisition Lifecycle Framework—in 2022. Plans for the new framework include features and processes similar to those of the current one. However, VA and its acquisition programs are not well-positioned to successfully implement the new framework because VA plans to implement it before addressing challenges that hindered adoption of its predecessor. For example:

  • Identifying programs subject to the framework. VA has yet to develop a list of major acquisitions that would be subject to increased oversight within the framework because it lacks a mechanism to collect and monitor acquisition program costs. Without such a mechanism, VA will struggle to identify acquisition programs subject to increased oversight within the framework.
  • Assessing acquisition workforce needs. VA identified gaps in its acquisition workforce that affected programs’ ability to implement the existing framework. Since VA has yet to assess its current workforce to determine whether gaps still exist, it will not know if current staff levels and skillsets are adequate to effectively support the new framework.
  • Aligning the framework with other processes. VA planned to align its current framework with IT program and major construction project management processes, but it issued potentially confusing guidance as to which processes to follow. VA has yet to provide clear direction to integrate the new framework with its IT and other management processes, increasing the risk it will not be implemented effectively.
  • Ensuring framework compliance. VA identified that the lack of a mechanism to ensure that acquisition programs adopt its framework was a weakness when it implemented its current framework. But VA has yet to establish and communicate a mechanism to ensure program compliance with its new framework, risking a repeat of limited adoption.

VA’s current plans to implement the new framework in 2022 do not provide the department time to address these challenges. If VA does not take steps to address these challenges prior to implementing the new Acquisition Lifecycle Framework, then VA will face increased risks of another unsuccessful implementation that does not achieve meaningful improvements in management of its major acquisitions.

Why GAO Did This Study

Over the past 10 years, VA’s contract obligations nearly doubled in size to $38 billion in fiscal year 2021. The increase was driven in part by key program growth and efforts to modernize VA systems. GAO added VA’s acquisition management to its High-Risk List in 2019 due to numerous challenges to efficiently purchasing goods and services, including medical supplies.

GAO was asked to examine how VA manages major acquisitions, which Office of Management and Budget guidance identify as requiring special management attention. This report assesses the extent to which VA acquisitions are following the current acquisition management framework and the extent to which VA is positioned to implement its proposed acquisition framework, among other objectives. To conduct this assessment, GAO reviewed relevant VA policies and guidance; analyzed VA program documents for a mix of IT modernization efforts and service acquisitions; and interviewed VA officials.

Recommendations

GAO is making seven recommendations. These include establishing a mechanism to collect and monitor program costs, assessing workforce gaps, aligning the proposed framework with other agency processes, identifying a mechanism to ensure compliance, and ensuring these steps are taken before implementation of the new framework. VA agreed with GAO’s recommendations.

Recommendations for Executive Action

All seven recommendations are directed to the Department of Veterans Affairs and remain open; GAO will provide updated information when it confirms what actions the agency has taken in response.

  • The Secretary of Veterans Affairs should ensure that the Chief Acquisition Officer addresses challenges that pose risks to the Acquisition Lifecycle Framework’s success prior to its implementation. These risks include collecting cost data to enable identification of programs subject to increased oversight within the framework, addressing acquisition workforce needs, aligning the framework with other processes, and ensuring program compliance with the framework. (Recommendation 1)
  • The Secretary of Veterans Affairs should ensure that the Chief Acquisition Officer establishes a mechanism to collect, maintain, and monitor program costs and cost estimates necessary to identify programs subject to increased oversight within the Acquisition Lifecycle Framework. (Recommendation 2)
  • The Secretary of Veterans Affairs should ensure that the Chief Acquisition Officer and the Assistant Secretary for the Office of Human Resources and Administration conduct an enterprise-wide workforce assessment to identify any gaps that could limit effective adoption of the Acquisition Lifecycle Framework, such as in roles related to program management. (Recommendation 3)
  • The Secretary of Veterans Affairs should ensure that the Chief Acquisition Officer determines, documents, and communicates to programs how the Acquisition Lifecycle Framework should be used in conjunction with existing project management frameworks and processes, such as the Veteran-Focused Integration Process, VA Construction, and VA’s Project Management Framework. (Recommendation 4)
  • The Secretary of Veterans Affairs should ensure that the Chief Acquisition Officer identifies and documents a mechanism to monitor and ensure that applicable acquisition programs adopt and comply with the Acquisition Lifecycle Framework processes and decisions. (Recommendation 5)
  • The Secretary of Veterans Affairs should ensure that the Chief Acquisition Officer establishes key measures for the performance of the Acquisition Lifecycle Framework, collects and analyzes data related to these measures, and reports results. (Recommendation 6)
  • The Secretary of Veterans Affairs should ensure that the Chief Acquisition Officer establishes a documented lessons learned process to consistently collect, analyze, validate, archive, and share lessons learned related to implementation of the Acquisition Lifecycle Framework. (Recommendation 7)

Full Report

Highlights Page (1 page) | Full Report (48 pages) | Accessible PDF (55 pages)

Article link: https://www.gao.gov/products/gao-22-105195

VA in Need of New Acquisition Management Approach – Meritalk

Posted by timmreardon on 09/22/2022
Posted in: Uncategorized. Leave a comment

The Department of Veterans Affairs (VA) has faced acquisition management problems – placing it on the Government Accountability Office’s (GAO) High-Risk List – but agency officials on Sept. 20 said they are looking to chart a new course in VA acquisition, one that may require an enterprise-wide approach.

During a House Committee on Veterans’ Affairs hearing, Michael Parrish, chief acquisition officer and principal executive director at VA’s Office of Acquisition, Logistics, and Construction, said the agency has made “considerable progress” to address GAO’s concerns.

“I am personally involved in ensuring that we show GAO and others how VA is properly monitoring and demonstrating progress across the enterprise,” Parrish said. “In its most recent report regarding VA acquisition management, GAO noted substantial work has been done to develop our acquisition lifecycle framework.”

However, lawmakers and Shelby Oakley, director of contracting and national security acquisitions at GAO, said VA has a long way to go until VA’s acquisition management will be able to successfully meet the needs of veterans.

Lawmakers pointed to the VA’s troubled Electronic Health Records Modernization (EHRM) program, which the agency initially estimated would cost $10 billion over 10 years. The new cost estimate is closer to $56 billion, with $39 billion for implementation of the EHR system over 13 years, and then $17 billion to maintain the system over 15 years.

The committee also noted the VA’s struggles to acquire a modern medical supply chain management system, which the GAO has also reported on several times. As Rep. Chris Pappas, D-N.H., warned, “the supply chain modernization failures resulted in hundreds of millions of dollars in cost overruns and years of delays,” according to GAO’s findings.

Chairman Rep. Mark Takano, D-Calif., stressed that the committee has seen a “fundamental lack of planning, budgeting, and adherence to contracting best practices by VA and its contracting centers.”

Notably, Oakley said one of the biggest challenges GAO has seen regarding VA’s acquisition is “VA’s even ability to develop a credible cost estimate. You know, we have a lot of concerns about the capacity within the organization to do that.”

Oakley said that while she is encouraged by “the time and attention” Parrish has invested into improving acquisition management, the agency needs an enterprise-wide effort.

“The sustainable, repeatable progress that his office is seeking requires much change across the VA enterprise – not only in the organizations that he directly manages but also in other parts of the department, which manage substantial acquisition programs,” Oakley said. “Without a coordinated and persistent enterprise-wide effort, the improvements VA seeks to meet the needs of veterans will be elusive.”

Parrish acknowledged and agreed with GAO’s concerns and said he considers the watchdog “to be a force multiplier helping us to improve our processes.”

“Broader acquisition reform within VA is essential to our ability to survive and flourish in these dynamic markets such as healthcare and technology,” he said. “Systemic reforms and new tools such as other transactional authority will allow us to fuse acquisition with innovation to create that cradle-to-grave force multiplying effect.”

Article link: https://www.meritalk.com/articles/va-in-need-of-new-acquisition-management-approach/

VA Major Acquisitions Failures: In Search of Solutions

Posted by timmreardon on 09/22/2022
Posted in: Uncategorized. Leave a comment

U.S. government issues guidance for developers to secure the software supply chain: Key takeaways

Posted by timmreardon on 09/15/2022
Posted in: Uncategorized. Leave a comment

The U.S. NSA, CISA and ODNI created the Securing the Software Supply Chain guide to focus on the software development lifecycle.

By Chris Hughes

CSO

SEP 15, 2022 2:00 AM PT

Software supply chain attacks are on the rise, as cited in the Cloud Native Computing Foundation’s (CNCF’s) Catalog of Supply Chain Compromises. Industry leaders such as Google, the Linux Foundation, and OpenSSF, along with public sector organizations such as NIST, have provided guidance on the topic over the past year or so.

The U.S. National Security Agency (NSA) alongside the Cybersecurity and Infrastructure Security Agency (CISA), and the Office of the Director of National Intelligence (ODNI) now join that list with their publication Securing the Software Supply Chain: Recommended Practices Guide for Developers. The announcement of the publication emphasizes the role developers play in creating secure software and states the guide strives to help developers adopt government and industry recommendations on doing so. Subsequent releases from Enduring Security Framework (ESF) will focus on the supplier and the software consumer, given the unique role each plays in the broader software supply chain and its resilience.

At a high-level the document is organized into three parts:

  • Part 1: security guidance for software developers
  • Part 2: software supplier considerations
  • Part 3: software customer recommendations

The role of developers, software suppliers and customers

The guidance notes the unique role developers, suppliers and customers play in the broader software supply chain ecosystem.

Software providers and their development teams may find themselves caught between speed to market on one hand and secure, resilient software or software-enabled products on the other.

Each of the three roles has its own security activities it can and should be performing. These activities span the gamut from initial secure software development, composition and architecture all the way through security acceptance testing and integrity validation on the customers’ end.

Secure software begins with a secure software development lifecycle (SDLC) and the guidance cites many options teams can use, such as the U.S. National Institute of Standards and Technology’s (NIST’s) Secure Software Development Framework (SSDF), Carnegie Mellon University’s Secure Software Development Lifecycle Processes, and others such as the recently announced OpenSSF Secure Software Development Fundamentals courses.

How to develop secure software

The guidance stresses not just using secure software development processes but also producing tangible artifacts and attestations used for validation, so that both the software producer and the consumer have assurances about the security and resiliency of the software. These processes and activities include best practices such as threat modeling, SAST, DAST and pen testing, but also secure release activities such as digital signing; a notable example is the increased adoption of Sigstore, a standard for signing, verifying and protecting software. The adoption and use of Sigstore is also cited in the OpenSSF’s Open Source Security Mobilization Plan as a method to deliver enhanced trust in the software supply chain.

Threat modeling gets a significant mention, recognizing that during product development and delivery, teams should be examining the potential threat scenarios that can occur and what controls could be put in place to mitigate them. Teams should also have established security test plans and associated release readiness criteria to ensure unacceptable vulnerabilities are not making it to production environments or getting to customers.

Mature product teams have established support and vulnerability handling policies as well. This includes having a system where product vulnerabilities can be submitted and an associated incident response team that is in a position to respond and get engaged should an incident occur. Given the impact developers can have on producing secure or insecure products, formalized assessment and training must occur. Determine what training is required and who is required to take it at a specified frequency. The OpenSSF’s Open Source Software Security Mobilization Plan lists upskilling developers on secure software development as a key objective that is recognized as an industry-wide necessity. Training topics include secure software development, code reviews, verification testing and using vulnerability assessment tools during development to drive down vulnerabilities that make it into their end products.

The activities and practices discussed above, such as secure development training, threat modeling, security test plan and developed security policies and procedures, are mapped to activities in the previously mentioned NIST SSDF, which will be a requirement soon for software vendors to self-attest to when selling software products to the U.S. federal government.

Secure code development has many aspects, including selecting programming languages that can mitigate vulnerabilities from the outset. There is also the need for organizations to address insider threats, whether a compromised engineer or simply a poorly trained one. Organizations can mitigate these threats by having codified source control processes with appropriate authentication, running static and dynamic tests on code, and looking for exposed secrets.
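
As an illustration of the “looking for exposed secrets” step, the following Python sketch scans a source tree for a few telltale patterns; the patterns are deliberately simplified, and a production scanner would use a much larger rule set.

```python
# Minimal sketch of a pre-push secret scan: walk the source tree and flag lines
# matching a few well-known secret patterns. Patterns are illustrative only.
import re
from pathlib import Path

PATTERNS = {
    "AWS access key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "Private key block": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    "Hard-coded password": re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.I),
}

def scan(root: str = ".") -> list[tuple[str, int, str]]:
    findings = []
    for path in Path(root).rglob("*"):
        if not path.is_file() or path.suffix in {".png", ".jpg", ".zip"}:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), 1):
            for label, pattern in PATTERNS.items():
                if pattern.search(line):
                    findings.append((str(path), lineno, label))
    return findings

if __name__ == "__main__":
    for file, lineno, label in scan():
        print(f"{file}:{lineno}: possible {label}")
```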

Organizations should also implement nightly builds and security regression testing to recognize and address flaws and vulnerabilities. Development efforts shouldn’t be ad hoc and should be mapped to specific system requirements with associated security testing to avoid feature creep which can introduce risk.

Code reviews should be prioritized, especially for critical code, to ensure fundamentals such as cryptography are in place and that requirements for privilege escalation and access protection for resources are met. It isn’t just the code that must be secured but also the development environment. There have been notable incidents, such as SolarWinds, where a compromised development environment poisoned downstream consumers, so systems such as developer endpoints, source code repositories and CI/CD pipelines should be threat modeled and undergo vulnerability assessments.

Open-source software (OSS) introduces its own unique risk and the guidance recommends using dedicated systems to download, scan and perform recurring checks on OSS components that internal development teams can use. This concept is also advocated by NIST in its Improving the Nation’s Cybersecurity executive order guidance for Section 4 and has been dubbed as continuous packaging.

Another practice emphasized is securing the developer environment, using secure development build configurations and secure third-party software toolchains and libraries. Development systems should be hardened and only used for development purposes, without internet access, and only with pre-approved tooling and software. The guidance recommends vetting third-party modules for CVEs against the NIST National Vulnerability Database (NVD). Tooling and automation can help facilitate this process and can even be done as part of the integrated development environment (IDE) using security dependency analyzers and similar tooling to identify vulnerabilities.
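
A minimal version of that vetting step might look like the sketch below, which assumes the NVD’s public 2.0 REST endpoint and the `requests` library; unauthenticated queries are rate-limited, and a real pipeline would key searches to exact package identifiers rather than free-text keywords.

```python
# Sketch of checking a third-party component against the NVD, as the guidance
# recommends. Assumes the public NVD 2.0 REST API and the `requests` library.
import requests

NVD_URL = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def cves_for_keyword(keyword: str, limit: int = 5) -> list[dict]:
    resp = requests.get(NVD_URL,
                        params={"keywordSearch": keyword, "resultsPerPage": limit},
                        timeout=30)
    resp.raise_for_status()
    return [v["cve"] for v in resp.json().get("vulnerabilities", [])]

if __name__ == "__main__":
    for cve in cves_for_keyword("log4j"):
        print(cve["id"], cve["descriptions"][0]["value"][:80])
```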

Hardening the build environment is critical, including the developer network, enterprise network and internal build environments. This mitigates threats introduced from the internet and from external malicious actors, and supports integrity and validation measures that confirm malicious activity has not compromised products.

Software components should be sourced from known, trusted suppliers that meet organizational requirements and be validated through methods such as SPDX or CycloneDX SBOM formats, as well as by the supplier’s responsiveness to vulnerabilities and established methods for vulnerability reporting.

Securing the Software Supply Chain’s guidance goes beyond hardening the build environment to make recommendations such as using hermetic reproducible builds as well. This means fully declared build steps, immutable references, and no network access, as well as identical outputs and artifacts regardless of variable metadata changes to things such as timestamps.
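
The simplest observable consequence of a hermetic, reproducible build is that two independent builds of the same sources produce bit-identical artifacts. The sketch below is a minimal check of that property; the artifact paths are placeholders.

```python
# Minimal reproducibility check: artifacts from two independent builders of the
# same sources should have identical content hashes.
import hashlib

def sha256(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def artifacts_match(path_a: str, path_b: str) -> bool:
    return sha256(path_a) == sha256(path_b)

# Example (placeholder paths for artifacts from two independent builders):
# assert artifacts_match("build-a/app.tar.gz", "build-b/app.tar.gz")
```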

Software should be delivered securely, including an SBOM of final composition to the customers. As part of package validation customers can use binary analysis outputs to ensure only intended software components are in place. To address compromises of software packages and updates, both product and components can make use of hashes and digital signatures for product distribution, components and upgrades. Organizations should also take steps to mitigate compromises of the distribution system itself. This can include applying security measures to repositories and package managers as well as using secure transport layer mechanisms.
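
As one possible illustration of hashes and digital signatures protecting distribution (not a mechanism mandated by the guidance), the sketch below uses the Python `cryptography` package’s Ed25519 API: the producer signs a release artifact and the consumer verifies it before installation. In practice a framework such as Sigstore, mentioned earlier, automates this along with key management and transparency logging.

```python
# Sketch of signing and verifying a release artifact with the `cryptography`
# package's Ed25519 API (one possible choice, not a mandated mechanism).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

artifact = b"contents of the released package"   # placeholder payload

# Producer side: sign the artifact, then publish artifact + signature + public key.
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(artifact)
public_key = private_key.public_key()

# Consumer side: verify the signature before trusting the download.
try:
    public_key.verify(signature, artifact)
    print("signature OK, artifact unmodified")
except InvalidSignature:
    print("signature check failed, do not install")
```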

Other resources for securing the software supply chain

The guidance includes a crosswalk among various scenarios with developers, suppliers and customers to specific practices outlined in SSDF. It also includes a mapping of dependencies and artifacts that exist among the supplier, third-party suppliers and the end customer.

A mapping to the SLSA framework shows how specific recommendations in the guidance map to the various levels of SLSA, ranging from L1 to L4. Lastly, there’s a comprehensive list of artifacts and checklists to be used throughout the SDLC and a list of informative references such as the cyber executive order, DoD and NIST documentation as well as industry organizations such as OWASP.

This secure software supply chain guidance is a critical resource that will undoubtedly be adopted by the industry as a go-to reference for organizations looking to bolster their software supply chain practices for both software producers and consumers. With this document taking a developer-centric focus, the industry would be well advised to look for the subsequent guidance, which will focus on software suppliers and consumers.

Article link: https://www-csoonline-com.cdn.ampproject.org/c/s/www.csoonline.com/article/3673233/u-s-government-issues-guidance-for-developers-to-secure-the-software-supply-chain-key-takeaways.amp.html

The Power of Leaders Who Focus on Solving Problems – HBR

Posted by timmreardon on 09/11/2022
Posted in: Uncategorized. Leave a comment
  • Deborah Ancona
  • Hal Gregersen

April 16, 2018, Updated April 20, 2018

Summary. There’s a new kind of leadership taking hold in organizations. Strikingly, these new leaders don’t like to be called leaders, and none has any expectation that they will attract “followers” personally — by dint of their charisma, status in a hierarchy, or access to resources. Instead, their method is to get others excited about whatever problem they have identified as ripe for a novel solution. Having fallen in love with a problem, they step up to leadership — but only reluctantly and only as necessary to get it solved. Leadership becomes an intermittent activity as people with enthusiasm and expertise step up as needed, and readily step aside when, based on the needs of the project, another team member’s strengths are more central. Rather than being pure generalists, leaders pursue their own deep expertise, while gaining enough familiarity with other knowledge realms to make the necessary connections. They expect to be involved in a series of initiatives with contributors fluidly assembling and disassembling.

In front of a packed room of MIT students and alumni, Vivienne Ming is holding forth in a style all her own. “Embrace cyborgs,” she calls out, as she clicks to a slide that raises eyebrows even in this tech-smitten crowd. “Really. Fifteen to 25 years from now, cognitive neuroprosthetics will fundamentally change the definition of what it means to be human.”

She’s referring to the work that interests her most these days, as cofounder of machine learning company Socos and a visiting scholar at UC Berkeley’s Center for Theoretical Neuroscience. (“So — can I literally jam things in your brain and make you smarter? If you’re curious, the answer is unambiguously yes.”) But the talk has covered a lot more than this, as Ming has touched on many initiatives and startups she’s been involved with, all solving problems at the intersection of advanced technology, learning, and labor economics.

She’s an entrepreneur, a CEO, and a teacher — all leadership roles — but when we ask her about her leadership style, she demurs. “What I’ve learned about myself as a leader, as an executive, is — I’ll be blunt — I’m a pretty mediocre manager. I try to do the right things, but I’m much more focused on problems than I am on people, and that’s not always that healthy.” While she’s utterly confident in herself, she just doesn’t identify as top management. She’s happier to think of herself as a data scientist, a computer geek. She loves talking about hacks she’s pulled off — like the alterations she made to her diabetic son’s medical devices, so she could merge all their data to produce a predictive model. Now, she gets an alert an hour in advance if a spike or drop is coming in his blood glucose level. This is an unprecedented, and highly valuable, thing. “Turns out, it would have broken several federal laws if I had done this with patients as a medical device company,” she laughs.

Ming is a tech optimist, believing that all kinds of previously intractable problems will be able to be solved as the tool kit for addressing them is developed. And she’s decided her best way of contributing to that progress is to keep honing her individual-contributor skills. “For a long time, I tried to be the whole package. I put a lot of energy into making certain that I was shepherding everyone along, doing all the right things for my teams. Then I realized: You know what? If I can get some people that are really good at the things that I’m not, then I can focus on my strengths. And my strengths are in creative problem solving — all the way down to writing the code myself.”

The attitude she’s espousing doesn’t really map to the traditional image of the enterprise leader, or to what typically gets taught in leadership development programs. Yet there is no denying that truly awesome stuff gets done thanks to Ming’s abilities to see possibilities and assemble talent. There’s a reason she’s been invited by the MIT Leadership Center to speak. We think her approach to taking on big problems will resonate with this crowd.

Over the past year, as faculty director and executive director of the MIT Leadership Center, we have been trying to put a finer point on a distinctive style of leadership we keep seeing all around us. We weren’t sure if it was because we spent so much time with MIT-trained people, or if there was a much more widespread shift under way, but the people we saw driving impactful, world-changing initiatives just didn’t look like old-school leadership material — and didn’t seem to want to. Cautiously, we called it problem-led leadership, and launched into all the interviewing, case studying, and literature review that goes into a leadership research project.

To make a long story very short, we found several common threads in the work of problem-led leaders. Most striking is that none of these leaders has any expectation that they will attract “followers” personally — by dint of their charisma, status in a hierarchy, or access to resources. Instead, their method is to get others excited about whatever problem they have identified as ripe for a novel solution. Having fallen in love with a problem, they step up to leadership — but only reluctantly, and only as necessary to get it solved. As Ming says about her entrepreneurial ventures, “The only reason I do it is because it is an amazingly effective way to have an impact on the world.”

From this “problem-led” beginning, other differences follow. Leadership becomes an intermittent activity as people with enthusiasm and expertise step up as needed, and readily step aside when, based on the needs of the project, another team member’s strengths are more central. Rather than being pure generalists, leaders pursue their own deep expertise, while gaining enough familiarity with other knowledge realms to make the necessary connections. No one assumes that the life of a team, or even organization, will be prolonged for its own sake. They expect to be involved in a series of initiatives with contributors fluidly assembling and disassembling. It’s a key leadership talent, then, to know how to put together a team. To tackle a problem, they need to find the right talent and to convince others that their project offers the chance to be part of a breakthrough. (Talented people always have other options, after all.)

Back to Vivienne Ming. In an interview after her talk, she is pressed again to describe the role she plays in her ambitious projects and she pauses to reflect. “I lead by leading,” she begins, and then quickly worries that this “sounds a little self-aggrandizing.” She clarifies what she means. “I get out there, and I solve problems. And I hope that motivates my colleagues to do the same.”

It’s the leadership that dare not speak its name — we’ve heard this discomfort with the term expressed so many times that we now call people like Ming “anti-leadership leaders.” But while we started out thinking it was “MIT style,” we now believe it is a style that is generally trending in the world, at least in settings where a premium is placed on innovation. People like Vivienne Ming can be found in many places, getting focused on opportunities, getting others energized and organized, and getting problems solved. Call them what you want, but what they’re doing is leading.

Article link: https://hbr.org/2018/04/the-power-of-leaders-who-focus-on-solving-problems?

Editor’s note: We’ve updated this article to clarify that Vivienne Ming did not actually break any federal laws.

  • Deborah Ancona is the Seley Distinguished Professor of Management at MIT’s Sloan School of Management and the founder of the MIT Leadership Center.
  • Hal Gregersen is Executive Director of the MIT Leadership Center, a Senior Lecturer in Leadership and Innovation at the MIT Sloan School of Management, a Thinkers50 globally ranked management thinker, and the founder of the 4-24 Project. He is also the author of Questions Are the Answer: A Breakthrough Approach to Your Most Vexing Problems at Work and in Life and the coauthor of The Innovator’s DNA: Mastering the Five Skills of Disruptive Innovators.

White House Team Unveils New Recommendations for U.S. Semiconductor Industry Growth – Nextgov

Posted by timmreardon on 09/09/2022
Posted in: Uncategorized. Leave a comment

By ALEXANDRA KELLEY, SEPTEMBER 7, 2022

The President’s Council of Advisors on Science and Technology introduced new advice to help successfully implement the CHIPS Act and develop an advanced U.S. semiconductor market.

The President’s Council of Advisors on Science and Technology presented several recommendations for supporting and fostering a strong domestic semiconductor industry in the U.S., following the passage of the CHIPS and Science Act in August. 

Speaking in a public session, PCAST members presented findings intended to support a strong manufacturing base and spur both revenue growth and global leadership in the semiconductor market.

“We’ve been…really looking at the key points of the CHIPS Act, particularly as it relates to the research and development investment,” PCAST member Lisa Su said. “So the purpose of our report is to really focus on ‘how do we revitalize the semiconductor ecosystem in the face of significant global competition?’”

The ten recommendations released today are broadly aimed at implementing the National Semiconductor Technology Center (NSTC), as stipulated in the CHIPS Act, and at cultivating a steady pipeline of workforce talent for sustainable growth in the sector. 

The CHIPS and Science Act allocates $11 billion in federal funding for semiconductor research and development to help grow a sustainable domestic semiconductor manufacturing and research industry in the U.S.

Su introduced the first five recommendations, including building a coalition of public-private partnerships, developing a workforce specializing in microelectronics, reducing startup business barriers to entry, setting a national semiconductor research agenda, and supporting immigration related to semiconductor-based employment. 

Fellow PCAST working group member Bill Dally introduced the second half of the recommendations, which advocate for the creation of chiplet devices—tiny processing units that can be combined to form a more complex system—at the forthcoming NSTC. 

Dally said that the chiplet priority will help smaller businesses participate in the U.S.’s burgeoning semiconductor industry.

The recommendations also direct the Department of Commerce to allocate over $500 million in federal funding to the NSTC and to dedicate 30% to 50% of that money to research, incorporating research challenges into the NSTC’s initiatives. The NSTC would then have to publish investment figures and evaluate performance metrics.  

Some of the research areas the working group wants the NSTC to focus on are supercomputing, advanced logic and memory processing, and life sciences applications and use cases.

Su and Dally also emphasized the importance of public and private organizations collaborating on research and development projects. 

Su said that cultivating a strong accompanying workforce is critical to a sustainable semiconductor industry and a high priority within internal PCAST discussions. 

These recommendations follow a letter PCAST members sent to President Biden in August previewing the recommendations announced today. Su said the report adds detail to the initial bullet points included in that letter. 

“I think we’re very excited about the broad coalition for the NSTC,” she said. “I think the opportunity there is to have a very geographically diverse…experience across both the entire semiconductor ecosystem.”

Article link: https://www.nextgov.com/policy/2022/09/white-house-team-unveils-new-recommendations-us-semiconductor-industry-growth/376834/
