healthcarereimagined

Envisioning healthcare for the 21st century

Quantum Computing Debuts at Cleveland Clinic

Posted by timmreardon on 09/24/2023
Posted in: Uncategorized.

The world’s first quantum computer devoted to healthcare research is now operational at Cleveland Clinic, part of a landmark partnership with IBM called the Discovery Accelerator that will use advanced computing technologies to hasten biomedical innovations.

The IBM Quantum System One is installed in the Lerner Research Institute on Cleveland Clinic’s main campus. Cleveland Clinic and IBM researchers already are devising projects that will bring quantum computing and the Discovery Accelerator’s other components to bear on highly complex challenges in areas such as drug discovery and design, genomic analysis, molecular modeling and medical imaging.

“This technology holds tremendous promise in revolutionizing healthcare and expediting progress toward new cares, cures and solutions for patients,” says Tom Mihaljevic, MD, Cleveland Clinic CEO and President and Morton L. Mandel CEO Chair. “Quantum and other advanced computing technologies will help researchers tackle historic scientific bottlenecks and potentially find new treatments for patients with diseases like cancer, Alzheimer’s and diabetes.”

Quantum advantages

Quantum computing is a radically new and rapidly evolving computing technology that has the potential to dramatically change how data is analyzed and what can be discovered from it. It uses the principles of quantum mechanics — which describe the strange ways that nature works at the subatomic level — to solve problems too complicated or massive for classical computers.

Traditional digital computers operate with binary information-storing bits, which function in only one state at a time — off or on, zero or one. Although powerful, bit-based computers struggle with problems that have multiple variables interacting in complex ways, such as trying to predict what 3D shape a chain of protein molecules will form.

Quantum computers use units of information called quantum bits, or qubits, which exist in multiple states simultaneously (a condition known as superposition). That capability allows qubits to hold much more information than binary bits and interact in novel ways (called entanglement and interference) to process data. Using qubits, quantum computers can create multidimensional computing environments in which the patterns that link seemingly disparate individual data points emerge.

Quantum computers can operate substantially faster and at a larger scale than classical computers, meaning they will be superior for certain data- and time-intensive uses, such as probing complex biological processes and systems. For example, simulating the energy configuration of a single caffeine molecule would require 10⁴⁸ bits — an impossible processing task even for today’s most powerful supercomputers. A quantum computer could represent the molecular configuration of caffeine with 160 qubits.
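
As a quick sanity check on those numbers: the state of an n-qubit register is described by 2^n complex amplitudes, and 2^160 is on the order of 10⁴⁸, which is why fully simulating 160 qubits classically is out of reach. The short Python sketch below is my own arithmetic, not anything from IBM or Cleveland Clinic.

n_qubits = 160
amplitudes = 2 ** n_qubits                 # classical values needed to describe the full state
print(f"2^{n_qubits} ≈ {amplitudes:.2e}")  # ~1.46e+48, i.e. on the order of 10^48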

The Quantum System One in place at the Discovery Accelerator uses IBM’s 127-qubit Eagle processor. Within three years, the Discovery Accelerator will be the first client facility to host IBM’s next-generation quantum computer, the 1,000+ qubit Quantum System Two. The 1,000-qubit threshold is an inflection point at which problems will be solved more efficiently on quantum computers than on the world’s best supercomputers.

The need for speed

Speed and efficiency are vital at a time when researchers are awash in information that must be analyzed to be understood and acted on. The volume of healthcare data — such as information from clinical trials, disease registries, electronic health records and medical devices — is growing at a compound annual growth rate of 36%. At present, it takes an average of 17 years for a scientific discovery in a biomedical research lab to become a tangible therapy or diagnostic test available to patients.

“It still takes forever to go from a question to an answer,” says Lara Jehi, MD, Cleveland Clinic’s Chief Research Information Officer and Discovery Accelerator Executive Program Leader. “It’s frustrating because at the same time in this digital age, we have all the data. We should be able to come up with better answers more quickly, but we can’t because the data is too much for classical computing tools to handle.” 

Although other institutions may be able to access quantum computing via the cloud, they could encounter long wait times and limits on usage. “It’s hard to lead in that situation,” Dr. Jehi says. “Fortunately, thanks to our partnership with IBM, Cleveland Clinic has its own quantum computer, in the middle of one of the top healthcare systems in the world.”

More computing capabilities

In addition to quantum computing, the Discovery Accelerator’s suite of advanced computing technologies includes artificial intelligence (AI) and high-performance computing (HPC) through the hybrid cloud.

AI involves advanced computer algorithms and systems that learn from data, organize knowledge and make inferences and deductions. AI is important in areas such as natural language processing, image analysis and speech recognition, which are useful in biomedical research and clinical care. Recent advances in deep learning and foundation models have the potential to leverage complex data across the biomedical spectrum, from molecular and protein data to omics and electronic health records. AI surrogate models are accelerating and lowering the cost of computing simulations in tasks such as protein structure prediction. AI generative models are valuable for scaling hypothesis generation, which is key to drug and biomarker design and discovery.
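
To make the surrogate-model idea concrete, here is a minimal, hypothetical sketch (not Cleveland Clinic’s or IBM’s actual tooling): a cheap model is fitted to a handful of runs of an expensive simulation, then answers further queries at a fraction of the cost.

import numpy as np

def expensive_simulation(x):
    # Stand-in for a costly physics or chemistry calculation.
    return np.sin(3 * x) + 0.5 * x ** 2

train_x = np.linspace(0.0, 2.0, 12)            # a small budget of expensive runs
train_y = expensive_simulation(train_x)

surrogate = np.poly1d(np.polyfit(train_x, train_y, deg=5))  # cheap stand-in model

query = 1.37
print(f"simulation: {expensive_simulation(query):.4f}")
print(f"surrogate:  {surrogate(query):.4f}")   # nearly the same answer, far cheaper to evaluate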

HPC uses clusters of powerful processors, working in massively parallel fashion, to analyze huge datasets and solve complex problems at extremely high speeds — typically more than a million times faster than desktop or laptop computers and servers. HPC runs on clusters of high-speed computer servers networked together. In healthcare research, the power of HPC can be used for complicated processes and projects such as DNA sequencing, drug discovery and design, rapid cancer diagnosis, molecular modeling, and running artificial intelligence algorithms and simulations.
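
Here is a scaled-down sketch of that massively parallel pattern, using only the cores of a single machine (real HPC work spreads the same idea across networked nodes with job schedulers and message-passing libraries); the chunked workload is invented for illustration.

from multiprocessing import Pool

def score_chunk(chunk):
    # Stand-in for per-chunk work such as aligning DNA reads or scoring docking poses.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]
    with Pool() as pool:                                  # one worker process per CPU core
        partial_results = pool.map(score_chunk, chunks)   # chunks are processed in parallel
    print(sum(partial_results))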

Research focus and projects

To maximize the impact of its advanced computing resources, the Discovery Accelerator will focus on eight cross-cutting areas of research that bridge multiple biomedical domains, systems, organs and diseases. This integrative approach means that discoveries in one focus area will have a synergistic effect, accelerating progress across the enterprise. The priority areas are:

  • Drug discovery
  • Clinical translation
  • Bedside translation
  • Population health
  • Healthy behaviors
  • Omics
  • Physiology
  • Imaging

Cleveland Clinic and IBM already have begun a robust portfolio of research projects likely to produce high-impact results.

Projects include:

  • Developing a quantum computing method to explore protein-drug interactions, improving the process of screening and optimizing drugs targeted to specific proteins.
  • Developing AI-based predictive systems and other analytic tools to search databases of human gene sequences and the molecular targets of existing drugs, looking for matches that could indicate therapeutic potential in Alzheimer’s disease and dementia.
  • Creating an analytic environment in the hybrid cloud to support single-cell omics research, a field that involves simultaneously profiling the state and interactions of DNA, RNA, proteins and other components of a single cell to gain insights into cellular mechanisms and how specific types of cells and their genetic material affect, and are affected by, disease.

The Discovery Accelerator could hold the key to making drug discovery more expeditious and cost-effective, says Shaun Stauffer, PhD, Director of Cleveland Clinic’s Center for Therapeutics Discovery.

“It’s going to change the vector,” says Dr. Stauffer, whose team is working closely with IBM to leverage AI and molecular dynamics modeling to streamline the drug discovery process.

“A big part of drug discovery is risk mitigation and risk removal,” he says. “With computational science, we have an opportunity to make smarter decisions upfront and reduce the number of compounds we have to synthesize to identify a clinical candidate. These new tools will have embedded knowledge about what to avoid. We’ll receive a clear option that should allow us to get from point A to point B with a much higher level of confidence.”

Collaboration opportunities

Access to quantum computing and the Discovery Accelerator’s other advanced computing resources and expertise isn’t limited to Cleveland Clinic and IBM researchers.

The facility is seeking collaborators from diverse fields — pharmaceutical companies, academic medical centers, hospital systems, diagnostic testing laboratories, healthcare software and electronic health record system developers, research institutions, colleges and universities, government organizations, biotechnology firms, medical device companies and others.

Collaborators can import their own research project, team up with an existing Discovery Accelerator research project, or utilize the Discovery Accelerator’s computing assets via IBM’s hybrid cloud.

To further encourage collaboration and advancement in quantum computing, Cleveland Clinic is a founding partner of the Life Sciences and Healthcare Quantum Innovation Hub, an initiative led by the Washington, D.C.-based nonprofit development organization Connected DMV. Cleveland Clinic has been tapped to help define quantum computing’s role in the future of healthcare and to educate other health systems about the technology’s possibilities.

Educating a high-tech workforce

A pillar of the Discovery Accelerator’s mission is education — providing training to develop the healthcare research and technology workforce of the future and creating jobs to grow the economy.

The Cleveland Clinic-IBM partnership plans to provide innovative curricula in data science and quantum computing, offering training and certification programs designed for participants ranging from high school to the professional level, with access through grants, scholarships, fellowships and endowments. There will be research symposia and workshops intended for academia, industry, government and the public.

“We need to prepare a workforce that is able to make sense of the new technology,” says Neil Mehta, MD, a Cleveland Clinic internal medicine physician who oversees curriculum and technology as Associate Dean at Cleveland Clinic Lerner College of Medicine and who is helping lead the Discovery Accelerator education efforts along with Christine Moravec, PhD, and IBM colleagues.

“We’re already training people how to be good clinicians and good scientists,” says Dr. Moravec, Director of the Lerner Research Institute’s Research Education and Training Center. “They already speak the language of medicine and science. Now we need to train them how to speak the language of data and computation, too.”

Photo credit: Ryan Lavine for IBM

Article link: https://consultqd-clevelandclinic-org.cdn.ampproject.org/c/s/consultqd.clevelandclinic.org/quantum-computing-debuts-at-cleveland-clinic/amp/

VA and Oracle eye new EHR deployment schedule by next summer – Healthcare IT News

Posted by timmreardon on 09/24/2023
Posted in: Uncategorized.

But restarting the paused Electronic Health Record Modernization program will first require a successful deployment at Lovell Federal Health Care Center in Chicago, says Dr. Neil Evans of the EHRM Integration Office.

By Andrea Fox

September 14, 2023

10:24 AM

The Department of Veterans Affairs and leaders from Oracle provided an update on the paused electronic health record modernization project to the House Appropriations Committee this week, indicating progress and their expectations about when they will develop a new deployment schedule.

WHY IT MATTERS

Workflow reconfigurations will make the VA’s electronic health record “more intuitive” for providers, said Mike Sicilia, EVP of Oracle Global Industries, while testifying Wednesday before members of Congress.

Before open discussions for a restart happen, said Dr. Neil Evans, acting program executive director of the EHRM Integration Office, the VA needs to see sustained improvement. 

“We are now focusing on delivering the improvements needed for the current system users while also preparing the enterprise for future deployment success,” he reported.

On August 31, VA completed its first increment for the program reset – the work of “dedicated attention and positive improvements,” which addressed necessary system changes, improved technical stability, enhanced end-user support and ticketing, and more, said Evans.

More needs to be done to revisit deployment scheduling, according to the VA leaders and Oracle.

The company is working with the VA to make 270 workflow reconfigurations during the EHRM’s reset period, and is currently meeting all 22 service-level agreements. As part of joint health information exchange requirements, Sicilia said Oracle delivered a package of upgrades, including pharmacy and other patient safety enhancements, to 90% of the VA’s community hospitals.

Comparing the VA’s EHRM project to the Department of Defense rollout of MHS Genesis, he said it was “similarly challenged” in its first two years, “only completing four deployments and then taking a two-year pause to improve governance and fine-tune a standard enterprise baseline system.” 

DoD then had a repeatable deployment model and was able to roll out its EHR to all domestic facilities in under four years, he said.

The VA’s EHRM deployment at the Captain James A. Lovell Federal Health Care Center in Chicago is the only exception to the VA’s full stop of its previous rollout schedule. It is on track to go live in March 2024, according to Sicilia.

He said that the work for that deployment “will demonstrate that the system is scaled to function well and handle operations at complex facilities.”

THE LARGER TREND

Earlier this year, the VA halted its planned EHR roll-out and struck a new deal with Oracle.

“All in all, this is a much stronger contract, and I’m hopeful it will help VA ensure that Oracle Cerner gets this EHR program to work for Washington state providers and veterans,” U.S. Sen. Patty Murray, D-Wash., chair of the Senate Appropriations Committee and the VA Subcommittee, and a senior member of the Senate Committee on Veterans Affairs, said about the revised requirements.

On Tuesday, the U.S. Senate Veterans’ Affairs Committee announced it confirmed Tanya Bradsher to be the VA’s deputy secretary – the first woman named to the role. Bradsher, who most recently served as the VA’s chief of staff, is charged with running the agency’s day-to-day operations – including the roll-out of the EHRM.

ON THE RECORD

“Since implementing our engineering changes, the Oracle-owned outage-free time has been 100% in 11 of the last 12 months,” Sicilia testified before Congress.

Andrea Fox is senior editor of Healthcare IT News.
Email: afox@himss.org

Healthcare IT News is a HIMSS Media publication.

Article link: https://www.healthcareitnews.com/news/va-and-oracle-eye-new-ehr-deployment-schedule-next-summer

AlphaMissense Classifies Mutation Pathogenicity – Inside Precision Medicine

Posted by timmreardon on 09/24/2023
Posted in: Uncategorized.

September 19, 2023

A new, freely available AI catalog has classified the potential effects of millions of missense genetic mutations, which could help establish the cause of diseases such as cystic fibrosis, sickle-cell anemia, and cancer.

The AlphaMissense resource from Google DeepMind categorized 89% of all 71 million possible missense variants.

This compares with just 0.1% that have already been categorized by human experts.

The machine-learning algorithm predicted 57% of variants as likely benign and 32% as likely pathogenic, using a threshold that yielded 90% precision on a database of known disease variants.

“AlphaMissense achieves state-of-the-art predictions across a wide range of genetic and experimental benchmarks, all without explicitly training on such data,” Jun Cheng and colleagues at Google DeepMind assert in a blog accompanying their findings in the journal Science.

Their predictions are freely available to the research community, together with open-sourced model code for AlphaMissense.

Missense genetic mutations arise from a single letter substitution in DNA, resulting in an altered amino acid that can potentially affect the entire function of a protein.
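
As a worked illustration of that definition, the classic sickle-cell substitution in the beta-globin (HBB) gene changes one DNA letter, which changes one codon and therefore one amino acid. The tiny codon table in the sketch below covers only the codons needed for this example.

CODON_TABLE = {"GAG": "Glu", "GTG": "Val"}   # tiny excerpt of the genetic code

reference_codon = "GAG"   # glutamic acid in normal beta-globin
variant_codon = "GTG"     # valine in sickle hemoglobin (an A-to-T substitution)

changed_positions = sum(a != b for a, b in zip(reference_codon, variant_codon))
print(f"{changed_positions} DNA letter changed: "
      f"{CODON_TABLE[reference_codon]} -> {CODON_TABLE[variant_codon]}")   # 1 DNA letter changed: Glu -> Val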

The average person carries about 9,000 missense variants, most of which are benign, and it remains largely a mystery which ones give rise to disease.

In some cases, a disease may result from a single missense variant or just a few, while complex diseases such as Type 2 diabetes may result from a combination of many different types of genetic changes.

AI tools provide an alternative to expensive and laborious experiments, enabling researchers to gain a preview of results for thousands of proteins, allowing them to prioritize resources and fast-track more complex studies into their predicted effects.

The benefits could potentially cover fields ranging from molecular biology to clinical and statistical genetics.

The AlphaMissense resource is adapted from AlphaFold, a previously published breakthrough model that predicted the structures of almost all known proteins from their amino acid sequences.

The newly released variant effect predictor (VEP) algorithm predicts the pathogenicity of missense variants altering individual amino acids of proteins.

To train it, the team fine-tuned AlphaFold on labels that distinguished variants seen in humans and closely related primate populations from variants never seen in humans.

The catalog does not predict changes to protein structure from mutations or other effects on protein stability. Instead, it deploys databases of related protein sequences and the structural context of variants to score the likelihood of a variant being pathogenic on a continuous scale from zero to one.

This enables a threshold for classifying variants as pathogenic or benign to be chosen to match user accuracy requirements.
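
In practice that means a user picks cutoffs on the zero-to-one score and labels variants accordingly. The sketch below is a hypothetical illustration of that step; the scores and cutoffs are invented placeholders, not AlphaMissense’s published calibration.

def classify(score, pathogenic_cutoff=0.9, benign_cutoff=0.1):
    # Stricter cutoffs trade coverage for precision; looser ones do the reverse.
    if score >= pathogenic_cutoff:
        return "likely pathogenic"
    if score <= benign_cutoff:
        return "likely benign"
    return "uncertain"

example_scores = {"variant_A": 0.97, "variant_B": 0.03, "variant_C": 0.45}   # invented values
for variant, score in example_scores.items():
    print(f"{variant}: {score:.2f} -> {classify(score)}")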

Together with EMBL’s European Bioinformatics Institute, the creators are making the information more usable for researchers through the Ensembl Variant Effect Predictor.

In addition to a look-up table of missense mutations, they have shared expanded predictions for all 216 million possible single amino acid substitutions across more than 19,000 human proteins.

They have also included the average prediction for each gene, which is similar to measuring a gene’s evolutionary constraint—how essential the gene is for the organism’s survival.

“Our tool outperformed other computational methods when used to classify variants from ClinVar, a public archive of data on the relationship between human variants and disease,” the investigators note.

“Our model was also the most accurate method for predicting results from the lab, which shows it is consistent with different ways of measuring pathogenicity.”

Article link: https://www.insideprecisionmedicine.com/topics/precision-medicine/alphamissense-classifies-mutation-pathogenicity/

Paper-Based Government Forms Cost Federal Agencies $37.8B – Nextgov

Posted by timmreardon on 09/24/2023
Posted in: Uncategorized.

By ALEXANDRA KELLEY | OCTOBER 18, 2022 11:40 AM ET

A report from the U.S. Chamber of Commerce underscored the environmental and consumer need for digital modernization of government services.

Work optimization and waste reduction are the two key factors behind the federal government’s need to update its digital services, with a new report from the Chamber of Commerce outlining the financial and operational benefits that come with modernization. 

Published by the lobbying group’s Technology Engagement Center, the report, titled “Government Digitization: Transforming Government to Better Serve Americans,” debuted on Monday. It captures the volume of government forms federal agencies rely on to provide government services.

These government processes include driver’s license and passport applications and renewals, Social Security card applications, and health record access, among others.

“Increased government digitization doesn’t just mean saved time and money, it also means providing greater accountability and access to underserved communities when it comes to utilizing government services,” U.S. Chamber Technology Engagement Center Vice President Jordan Crenshaw said in prepared remarks. 

In total, there are 9,858 unique government forms used to access various services, and 106 billion such forms are processed annually, according to the report.

This adds up to about 10.5 billion hours that Americans spend filling out federal forms, printed on paper that consumes roughly 2.25 million trees. Collecting and processing common government services using the current paper methods costs the federal government over $38.7 billion, based on statistics from June 2022.
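
Dividing those totals out gives a sense of the per-form burden. This is my own back-of-the-envelope arithmetic on the figures quoted above, not a calculation from the Chamber’s report.

forms_per_year = 106e9     # form submissions processed annually
hours_per_year = 10.5e9    # hours the public spends filling them out
cost_per_year = 38.7e9     # annual collection and processing cost, in dollars

print(f"{hours_per_year / forms_per_year * 60:.1f} minutes per form")        # ~5.9 minutes
print(f"${cost_per_year / forms_per_year:.2f} processing cost per form")     # ~$0.37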

The Department of the Treasury stood out as having the highest number of unique forms at 1,838. Trailing the agency are the Department of Health and Human Services, the Department of Agriculture and the Department of Commerce.

“Digitization will enable government agencies to cut costs, increase efficiency and reduce waste every day,” the report said. 

The Chamber encouraged Congress to conduct oversight examinations to see what paper processes can be digitized. A part of this process would ideally include lawmakers analyzing where to find funding to support IT modernization efforts. 

Researchers at the Chamber told Nextgov that the digitization process could be supported by technologies like cloud computing, which enables better data streamlining and protection; blockchain software, which can track supply chain data; and artificial intelligence, which facilitates rapid data-based decision-making.

Experts within the Technology Engagement Center noted that this funding can potentially come from the Technology Modernization Fund, along with other avenues for similar modernization updates at the state level available via the American Rescue Plan.

Article link: https://www.nextgov.com/it-modernization/2022/10/paper-based-government-forms-cost-federal-agencies-378b/378566/

Federal CIO touts ’10-year-plan’ to build a truly digital federal government – Nextgov

Posted by timmreardon on 09/24/2023
Posted in: Uncategorized.

By NATALIE ALMS | SEPTEMBER 22, 2023 05:00 PM ET

The long-awaited guidance to support the implementation of the IDEA Act tasks agencies with ramping up their digital offerings while mandating consistent brand and design standards across the federal government.

The White House advanced an effort launched in Congress years ago to put the federal government on a digital footing. Late Friday, the Office of Management and Budget released guidance directing agencies to deliver a “digital-first public experience,” and giving agencies details and deadlines for the implementation of the 21st Century IDEA Act, which was signed into law four years ago.

“This policy guidance gives federal agencies the mandate and the momentum that we need to deliver federal government that meets today’s expectations,” Clare Martorana, federal chief information officer, told Nextgov/FCW. It “is going to transform the way the federal government interacts and delivers for Americans.”

There’s a lot of work to do.

Currently, just 2% of federal forms are usable as dynamic online forms (as opposed to a fillable online PDF), according to the White House. Nearly half of federal websites aren’t mobile friendly and 60% aren’t fully accessible for people who use assistive technology, according to Martorana. Additionally, 80% of federal websites don’t use the U.S. Web Design System, guidelines and code housed at the General Services Administration that are meant to standardize websites across agencies.

The new policy framework “is setting the standards for the federal government for digital transformation with over 100 actions and standards that are going to help all agencies design, develop and deliver modern websites and digital services that are trustworthy, accessible and easy to use,” Martorana said.

The guidance memo, from OMB Director Shalanda Young, gives agencies a to-do list focused on analytics, accessibility, branding, content, design, search and digitization. 

Agencies will be required to use web analytics — “you can’t manage what you don’t measure,” said Martorana — and new federal-wide branding guidelines to give government websites a distinct and consistent look. Agencies will also be required to use .gov or .mil domains and the U.S. Web Design System.

Agencies are also on notice to consolidate, remove or rewrite duplicate, outdated and confusing content on their websites. 

The new guidance also requires the use of an onsite search function for government websites. The government will also be developing better search engine optimization best practices, said Martorana, noting that the “majority of people start outside of the government to find services – they are on search engines.”

The guidance also pushes agencies to develop “new digital options to get government services,” the fact sheet states.

“The American people should know when they’re interacting with an official government website, get the best answer to their top questions in language they can understand, access government online services no matter what their ability is, use government websites that work on mobile and interact with our government in a way that best works for them,” said Martorana. “It will never be digital only, but it will be digital by default.”

Martorana said that part of the reason it took years after the passage of the IDEA Act to issue the guidance was the collaboration across government with GSA, the U.S. Digital Service, the CIO Council and agencies in crafting the policy.

“We wanted to make sure from the inception that we were aligned and are capable of delivering for the public,” Martorana said. 

“We are trying to do something different, which is use human-centered design on our policies, and I think that that is why this is going to be so important and impactful and is going to really drive this 10-year-plan and make it successful,” she said.

OMB will be watching to see that agencies meet deadlines, including designating digital experience leads and prioritizing services to digitize and improve. 

“This isn’t optional. This is guidance that we are going to manage, we are going to drive… Agencies are not going to be opting out of things like accessibility, of mobile-optimizing their web properties. This is something the administration has taken really seriously,” said Martorana. “This is our time to think big, to help agencies move faster [and] deliver a modern government to the American people.”

Article link: https://www.nextgov.com/modernization/2023/09/federal-cio-touts-10-year-plan-build-truly-digital-federal-government/390578/

Deputy Secretary of Defense Kathleen Hicks Announces $238M CHIPS and Science Act Award – DOD

Posted by timmreardon on 09/21/2023
Posted in: Uncategorized.

Sept. 20, 2023

Deputy Secretary of Defense Kathleen Hicks announced the award today of $238 million in “Creating Helpful Incentives to Produce Semiconductors (CHIPS) and Science Act” funding for the establishment of eight Microelectronics Commons (Commons) regional innovation hubs.

This is the largest award to date under President Biden’s CHIPS and Science Act.

“The Microelectronics Commons is focused on bridging and accelerating the lab-to-fab transition, that infamous valley of death between R&D and production,” said Deputy Secretary Hicks. “President Biden’s CHIPS Act will supercharge America’s ability to prototype, manufacture, and produce microelectronics at scale. CHIPS and Science made clear to America — and the world — that the U.S. government is committed to ensuring that our industrial and scientific powerhouses can deliver what we need to secure our future in this era of strategic competition.”

The eight awardees are:

1. Northeast Microelectronics Coalition (NEMC) Hub

Awardee (Hub Lead): The Massachusetts Technology Collaborative (MassTech)
Hub Lead State: Massachusetts
FY23 Award: $19.7M
90 Hub members

2. Silicon Crossroads Microelectronics Commons (SCMC) Hub

Awardee: The Applied Research Institute (ARI)
Hub Lead State: Indiana
FY23 Award: $32.9M
130 Hub members

3. California Defense Ready Electronics and Microdevices Superhub (California DREAMS) Hub

Awardee: The University of Southern California (USC)
Hub Lead State: California
FY23 Award: $26.9M
16 Hub members

4. Commercial Leap Ahead for Wide Bandgap Semiconductors (CLAWS) Hub

Awardee: North Carolina State University (NCSU)
Hub Lead State: North Carolina
FY23 Award: $39.4M
7 Hub members

5. Southwest Advanced Prototyping (SWAP) Hub

Awardee: Arizona Board of Regents on behalf of Arizona State University
Hub Lead State: Arizona
FY23 Award: $39.8M
27 Hub members

6. Midwest Microelectronics Consortium (MMEC) Hub

Awardee: MMEC
Hub Lead State: Ohio
FY23 Award: $24.3M
65 Hub members

7. Northeast Regional Defense Technology Hub (NORDTECH)

Awardee: The Research Foundation for the State University of New York (SUNY)
Hub Lead State: New York
FY23 Award: $40.0M
51 Hub members

8. California-Pacific-Northwest AI Hardware Hub (Northwest-AI Hub)

Awardee: The Board of Trustees of the Leland Stanford Junior University
Hub Lead State: California
FY23 Award: $15.3M
44 Hub members

In all, over 360 organizations from over 30 states will be participating in the Commons.

With $2 billion in funding for Fiscal Years 2023 through 2027, the Microelectronics Commons program aims to leverage these Hubs to accelerate domestic hardware prototyping and “lab-to-fab” transition of semiconductor technologies. This will help mitigate supply chain risks and ultimately expedite access to the most cutting-edge microchips for our troops.

Six technology areas critical to the DoD mission were selected as focus areas for the Commons. Each Hub will be advancing U.S. technology leadership in one or more of these areas:

  • Secure Edge/Internet of Things (IoT) Computing
  • 5G/6G
  • Artificial Intelligence (AI) Hardware
  • Quantum Technology
  • Electromagnetic Warfare
  • Commercial Leap Ahead Technologies

Hubs are expected to spur economic growth across their respective regions and the economy at large. Hubs are charged with developing the physical, digital, and human infrastructure needed to support future success in microelectronics research and development. This includes building education pipelines and retraining initiatives to ensure the United States has the talent pool needed to sustain these investments. Hubs are expected to become self-sufficient by the end of their initial five-year awards.

“Consistent with our warfighter-centric approach to innovation,” said Deputy Secretary Hicks, “these hubs will tackle many technical challenges relevant to DoD’s missions, to get the most cutting-edge microchips into systems our troops use every day: ships, planes, tanks, long-range munitions, communications gear, sensors, and much more… including the kinds of all-domain, attritable autonomous systems that we’ll be fielding through the Department’s recently-announced Replicator initiative.”

The Microelectronics Commons program has been spearheaded by the Office of the Under Secretary of Defense for Research and Engineering, in conjunction with the Naval Surface Warfare Center, Crane Division and the National Security Technology Accelerator. On 30 November 2022, the Request for Solutions was released, with a deadline of 28 February 2023. The DoD received over 80 submissions, with over 600 unique organizations included as prospective team members. The DoD pulled together an interagency team of technical experts, including representatives from the Commerce Department, to make selections.

The Microelectronics Commons program will soon move into the project stage, at which point organizations can work with the Hubs to tackle key challenges. This includes organizations which were not selected for Hubs today. More information on the program will be shared at the Microelectronics Commons Annual Meeting on 17-18 October 2023 in Washington DC. Learn more at https://microelectronicscommons.org/.

Article link: https://www.defense.gov/News/Releases/Release/Article/3531768/deputy-secretary-of-defense-kathleen-hicks-announces-238m-chips-and-science-act/

The Top Programming Languages 2023 – IEEE Spectrum

Posted by timmreardon on 09/12/2023
Posted in: Uncategorized.
Python and SQL are on top, but old languages shouldn’t be forgotten

STEPHEN CASS

29 AUG 2023

3 MIN READ

Welcome to IEEE Spectrum’s 10th annual rankings of the Top Programming Languages. While the way we put the TPL together has evolved over the past decade, the basics remain the same: to combine multiple metrics of popularity into a set of rankings that reflect the varying needs of different readers.

This year, Python doesn’t just remain No. 1 in our general “Spectrum” ranking—which is weighted to reflect the interests of the typical IEEE member—but it widens its lead. Python’s increased dominance appears to be largely at the expense of smaller, more specialized, languages. It has become the jack-of-all-trades language—and the master of some, such as AI, where powerful and extensive libraries make it ubiquitous. And although Moore’s Law is winding down for high-end computing, low-end microcontrollers are still benefiting from performance gains, which means there’s now enough computing power available on a US $0.70 CPU to make Python a contender in embedded development, despite the overhead of an interpreter. Python also looks to be solidifying its position for the long term: Many children and teens now program their first game or blink their first LED using Python. They can then move seamlessly into more advanced domains, and even get a job, with the same language.

But Python alone does not make a career. In our “Jobs” ranking, it is SQL that shines at No. 1. Ironically though, you’re very unlikely to get a job as a pure SQL programmer. Instead, employers love, love, love, seeing SQL skills in tandem with some other language such as Java or C++. With today’s distributed architectures, a lot of business-critical data live in SQL databases, whether it’s the list of magic spells a player knows in an online game or the amount of money in their real-life bank account. If you want to do anything with that information, you need to know how to get at it.
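
The point about pairing SQL with a host language is easy to picture: the data sits in a relational database, and the other language is what actually queries it. A minimal, self-contained sketch (Python with its built-in sqlite3 module, with an invented table) looks like this.

import sqlite3

conn = sqlite3.connect(":memory:")   # throwaway in-memory database
conn.execute("CREATE TABLE accounts (player TEXT, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("ada", 125.50), ("grace", 310.00)])

# The SQL does the filtering; the host language does everything else.
for player, balance in conn.execute(
        "SELECT player, balance FROM accounts WHERE balance > ?", (200,)):
    print(player, balance)           # grace 310.0
conn.close()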

But don’t let Python and SQL’s rankings fool you: Programming is still far from becoming a monoculture. Java and the various C-like languages outweigh Python in their combined popularity, especially for high-performance or resource-sensitive tasks where Python’s interpreter overhead is still too costly (although there are a number of attempts to make Python more competitive on that front). And there are software ecologies that are resistant to being absorbed into Python for other reasons.

For example, R, a language used for statistical analysis and visualization, came to prominence with the rise of big data several years ago. Although powerful, it’s not easy to learn, with enigmatic syntax and functions typically being performed on entire vectors, lists, and other high-level data structures. But although there are Python libraries that provide similar analytic and graphical functionality, R has remained popular, likely precisely because of its peculiarities. They make R scripts hard to port, a significant issue given the enormous body of statistical analysis and academic research built on R. Entire fields of researchers and analysts would have to learn a new language and rebuild their work. (Side note: We use R to crunch the numbers for the TPL.) 

This situation bears similarities to Fortran, where the value of the existing validated code for physics simulations and other scientific computations consistently outweighs the costs associated with using one of the oldest programming languages in existence. You can still get a job today as a Fortran programmer—although you’ll probably need to be able to get a security clearance, since those jobs are mostly at U.S. federal defense or energy labs like the Oak Ridge National Laboratory. 

If you can’t get a security clearance but still like languages with more than a few miles on them, Cobol is another possibility. This is for many of the same reasons we see with Fortran: There’s a large installed code base that does work where mistakes are expensive. Many large banks still need their Cobol programmers. Indeed, based on our review of hundreds of developer recruitment ads, it’s worth noting that we saw more fintech developer positions looking for chops in Cobol than in crypto. 

Veteran languages can also turn up in places you might not expect. Ladder Logic, created for industrial-control applications, is often associated with old-fashioned tech. Nonetheless, we spotted a posting from Blue Origin, one of the glamorous New Space outfits, looking for someone with Ladder Logic skills. Presumably that’s related to the clusters of ground equipment needed to fuel, energize, and test boosters and spacecraft, and which have much more in common with sprawling chemical refineries than soaring rockets.

Ultimately, the TPL represents Spectrum’s attempt to measure something that can never be measured exactly, informed by our ongoing coverage of computing. Our guiding principle is not to get bogged down in debates about how programming languages are formally classified, but instead ground it in the practicalities relevant to folks tapping away at their keyboards, creating the magic that makes the modern world run. (You can read more about the metrics and methods we use to construct the rankings in our accompanying note.) We hope you find it useful and informative, and here’s to the next 10 years!
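
For readers curious what “combining multiple metrics of popularity into a set of rankings” looks like mechanically, here is a toy weighted-scoring sketch. The metric names, values and weights are all invented; Spectrum’s actual inputs and weightings are described in its methodology note.

# Invented example data: each metric is already normalized to a 0-1 scale.
metrics = {
    "Python": {"search": 1.00, "jobs": 0.80, "repos": 0.95},
    "SQL":    {"search": 0.60, "jobs": 1.00, "repos": 0.40},
    "Java":   {"search": 0.70, "jobs": 0.90, "repos": 0.75},
}
weights = {"search": 0.4, "jobs": 0.3, "repos": 0.3}   # one hypothetical reader profile

scores = {lang: sum(weights[m] * value for m, value in vals.items())
          for lang, vals in metrics.items()}
for rank, (lang, score) in enumerate(sorted(scores.items(), key=lambda kv: -kv[1]), start=1):
    print(f"{rank}. {lang}: {score:.2f}")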

Article link: https://spectrum.ieee.org/the-top-programming-languages-2023

Military services’ zero trust plans to go under the Pentagon’s microscope – Breaking Defense

Posted by timmreardon on 09/10/2023
Posted in: Uncategorized.

By Jaspreet Gill on September 07, 2023 at 3:06 PM

WASHINGTON — Within weeks the Pentagon will begin reviewing zero trust plans from each of its components to make sure they align with the department’s vision of fielding a “targeted” level of zero trust by fiscal 2027, the department’s chief information officer (CIO) said today. 

“So next month, we’re getting their plans… [from] not only military services, but all the different components,” John Sherman said today at the Billington CyberSecurity Summit. “So I’d suspect each of the components — matter of fact I know they are — are taking a little bit different path to get there. So that’s a very important milestone coming out here next month to get these plans and start to assess them.”

Sherman said that the assessments will be led by Randy Resnick, director of the zero trust portfolio management office within the CIO. Resnick’s team will “be reviewing what these plans look like, consistent with what we’ve laid out with… the 91 capabilities to get to targeted zero trust by 2027,” Sherman added. 

DoD released its zero trust strategy last November outlining what it would take to achieve what it called a “targeted” level of zero trust, or a required minimal set of 91 activities DoD and its components need to achieve by FY27, to address threats, including those posed by cyberspace adversaries like China. An additional 61 activities outlined in the strategy will get the Pentagon to a more “advanced” level of zero trust later.
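
As a rough, hypothetical sketch of what assessing a component’s plan against those activity counts could look like (the component names and completion numbers are invented, and this is not DoD’s actual assessment tooling):

TARGET_ACTIVITIES = 91            # required for "targeted" zero trust by FY27
ADVANCED_ACTIVITIES = 91 + 61     # 152 activities in total for the "advanced" level

def maturity(completed_activities):
    if completed_activities >= ADVANCED_ACTIVITIES:
        return "advanced"
    if completed_activities >= TARGET_ACTIVITIES:
        return "targeted"
    return f"in progress ({completed_activities}/{TARGET_ACTIVITIES} toward targeted)"

# Invented component names and counts, purely for illustration.
for component, done in {"Component A": 64, "Component B": 91, "Component C": 152}.items():
    print(f"{component}: {maturity(done)}")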

The 29-page strategy painted a concerning picture for DoD’s information enterprise, which is “under wide-scale and persistent attack from known and unknown malicious actors,” from individuals to state-sponsored adversaries, specifically China, who “often” breach the Pentagon’s “defensive perimeter.”

“With zero trust we are assuming that a network is already compromised and through recurring user authentication and authentic authorization, we will thwart and frustrate an adversary from moving through a network and also quickly identify them and mitigate damage and the vulnerability they may have exploited,” Resnick told reporters ahead of the strategy’s release.

In June, Resnick said it was proving “hard to orchestrate” each military service’s individual zero trust efforts into something cohesive. As a result, DoD started doing weekly “huddles” and larger monthly meetings with the services and “communities of interest” in an effort to educate them on how to execute the department’s vision outlined in its zero trust strategy.

At the time, Resnick described the meetings as “deep dives into the technology and successes that some of our folks in the DoD have achieved up to this point.”

Article link: https://breakingdefense-com.cdn.ampproject.org/c/s/breakingdefense.com/2023/09/military-services-zero-trust-plans-to-go-under-the-pentagons-microscope/?

White House to bolster cybersecurity training for K-12 schools, with help from the FCC, Amazon Web Services and more – CNBC

Posted by timmreardon on 09/09/2023
Posted in: Uncategorized.

PUBLISHED TUE, AUG 8 2023 12:07 PM EDT | UPDATED TUE, AUG 8 2023 2:30 PM EDT

Lauren Feiner

@LAUREN_FEINER

KEY POINTS 

  • The Biden administration launched an effort to beef up cybersecurity safeguards and training for K-12 schools ahead of the school year.
  • Several companies joined in the initiative, including Amazon Web Services, which committed $20 million to fund a cyber grant program for school districts and state departments of education.
  • “We need to be taking the cyber attacks on school as seriously as we do the physical attacks on critical infrastructure,” Education Secretary Miguel Cardona said Tuesday.

The Biden administration is putting cybersecurity training on the back-to-school list with an initiative to beef up tech safeguards.

On Tuesday, the White House convened school administrators, educators and companies to explore how best to protect schools and students’ information from cyberattacks. At least eight K-12 school districts across the country experienced significant cyberattacks in the last academic year, the White House said, leading to disruptions in learning. Cyberattacks have led to anywhere from three days to three weeks of learning loss, a 2022 U.S. Government Accountability Office report found, with recovery of that learning loss taking two to nine months.

In remarks at the beginning of the summit, Education Secretary Miguel Cardona said the average number of “unique education technology tools accessed per school district was over 2,500.”

“So when schools face cyber attacks, the impacts can be huge,” Cardona said. “We need to be taking the cyber attacks on school as seriously as we do the physical attacks on critical infrastructure.”

The White House announced a series of actions from federal agencies and commitments from companies to help school districts secure their digital information.

On the government side, the Federal Communications Commission proposed a pilot program that would provide up to $200 million over three years for reinforcing cyber defenses in K-12 schools and libraries. The money would be allocated from the Universal Service Fund, which has been used in part to provide internet access for schools.

The Cybersecurity and Infrastructure Security Agency (CISA) plans to help train and assess cybersecurity practices at 300 new “K-12 entities” in the upcoming school year. And the Federal Bureau of Investigation and National Guard Bureau will release new resources explaining how to report cybersecurity incidents.

Several companies joined in the initiative as well. Amazon Web Services committed $20 million to fund a cyber grant program for school districts and state departments of education. To apply for the grants, districts or departments must describe the relevant project they seek to work on and their intended goals.

When CNBC asked how AWS would evaluate grant applications, the company said it would consider how many students are served by the public school district, education agency or department and the scope of the proposed solution.

It will also conduct free security reviews for U.S. education technology companies that provide “mission-critical applications” for K-12 schools. And if districts are attacked, AWS said it will provide cyber incident response assistance at no cost.

Other companies joining the initiative included Cloudflare, Google and more. Cloudflare committed to offer cybersecurity solutions for free to public school districts with fewer than 2,500 students; cloud-based education tech company PowerSchool said it would provide free and subsidized “security as a service” courses to schools and districts; and Google created a new “K-12 Cybersecurity Guidebook.”

Article link: https://www-cnbc-com.cdn.ampproject.org/c/s/www.cnbc.com/amp/2023/08/08/white-house-launches-effort-to-secure-k-12-schools-from-cyberattacks.html

Navy’s new Neptune office to take charge of cloud management for the sea services – DefenseScoop

Posted by timmreardon on 09/07/2023
Posted in: Uncategorized.

The Neptune Cloud Management Office is intended to help centralize and streamline the acquisition and delivery of cloud capabilities for the Navy and Marine Corps.

BY JON HARPER

SEPTEMBER 5, 2023

The Department of the Navy is standing up a new Neptune Cloud Management Office to help centralize and streamline the acquisition and delivery of cloud capabilities across the sea services.

A memo formally establishing the organization was signed in June by Ruth Youngs Lew, the program executive officer for digital and enterprise services. Neptune will be part of PEO Digital.

Within the department, the new cloud management office will have two components: one for the Navy and another for the Marine Corps. The Navy component is expected to start operations “at or around the start of” fiscal 2024, Louis Koplin, leader of the platform application services portfolio at PEO Digital, said in an email to DefenseScoop on Tuesday. The Marine Corps component is already operational.

Neptune is intended to serve as “the single point of entry” for the acquisition and delivery of cloud services across the Department of the Navy and facilitate the “digital transformation to cloud-native and zero-trust enterprise services,” according to Lew’s memo, which was obtained by DefenseScoop.

Notably, the office is expected to play a major role in the department’s accessing the Joint Warfighting Cloud Capability (JWCC) contract vehicle, the Pentagon’s $9 billion enterprise cloud effort that replaced the ill-fated Joint Enterprise Defense Infrastructure (JEDI) program. Google, Oracle, Amazon Web Services and Microsoft were all awarded under the contract last year and will each compete for task orders.

“Neptune cloud management office will guide and assist Marine Corps and Navy mission owners to appropriately leverage JWCC, to eventually include centralized and automated ordering from the cloud portal on the Naval Digital Marketplace,” officials involved in the effort said in a statement to DefenseScoop.

Justin Fanelli, acting chief technology officer of the Navy, told DefenseScoop: “If we do this very right with our new partners and existing strong partnerships within DOD, the best way will also be the easiest way. Our service to our warfighters will be measured by drastically reduced friction and improved mission outcomes.”

Neptune has been tasked with establishing enterprise capabilities in coordination with the sea services’ deputy CIOs, Marine Corps Forces Cyberspace Command and U.S. Fleet Cyber Command/Commander, U.S. 10th Fleet. It is also expected to improve the “customer service experience” for components seeking to consume cloud services, automate repetitive work and eliminate duplicative work, according to Lew’s memo.

The creation of the new office comes as the Defense Department is embracing the cloud as a key component of its IT modernization plans.

Policy guidance signed out in 2020 by the Navy CIO and assistant secretary for research, development and acquisition stated that the Department of the Navy “shall maintain its global strategic advantage by harnessing the power of data and information systems through cloud computing. Cloud computing is the primary approach to transforming how the DON delivers, protects, and manages access to data and applications across all mission areas. Cloud computing … shall be adopted and consumed in such a way as to maximize its inherent characteristics and advantages.”

Per Lew’s memo, the new Neptune office is tasked with maintaining the portfolio of available and authorized cloud service offerings on the Naval Digital Marketplace and managing the department’s consumption across that portfolio via an integrated cloud Financial Operations (FinOps) capability. It will also deliver a cloud solutions “guidebook” that tells people who buy and build information systems how to best employ the Navy’s cloud portfolio.

The organization is expected to “describe, automate, and enhance the customer journey for those seeking to consume cloud services from the DON Cloud Portfolio, following IT service management (ITSM) best practices” for the cloud when it comes to engaging, procuring, provisioning, migrating, operating, defending and decommissioning. It will also establish “additional plans and/or process(es) to execute those Cloud ITSM phases” as necessary, according to Lew’s memo.
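
The memo’s list of cloud ITSM phases reads naturally as a lifecycle, which a small data-model sketch can make concrete. The phase names below come from the memo as quoted above; the enum and helper function are invented here for illustration and are not part of any Navy system.

from enum import Enum
from typing import Optional

class CloudItsmPhase(Enum):
    ENGAGE = 1
    PROCURE = 2
    PROVISION = 3
    MIGRATE = 4
    OPERATE = 5
    DEFEND = 6
    DECOMMISSION = 7

def next_phase(current: CloudItsmPhase) -> Optional[CloudItsmPhase]:
    # Returns the following lifecycle phase, or None once a workload is decommissioned.
    phases = list(CloudItsmPhase)
    index = phases.index(current)
    return phases[index + 1] if index + 1 < len(phases) else None

print(next_phase(CloudItsmPhase.MIGRATE))   # CloudItsmPhase.OPERATE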

Article link: https://defensescoop.com/2023/09/05/navys-new-neptune-office-to-take-charge-of-cloud-management-for-the-sea-services/
