healthcarereimagined

Envisioning healthcare for the 21st century

Army lifts curtains on planned $1B software development contract

Posted by timmreardon on 05/29/2024
Posted in: Uncategorized.

By Ross Wilkers, May 28

The Army calls out specific modern practices it wants to incorporate and asks industry about others that could work here too.

The Army has started to develop a new software development support services contract with a touted ceiling value north of $1 billion over 10 years.

A new sources sought notice describes the New Modern Software Development IDIQ as a vehicle for hiring a group of contractors that can perform the work on rapidly awarded task orders as they come.

At this juncture, the Army plans to choose up to 10 companies in total, including awards reserved for small businesses.

The Army also envisions holding an on-ramp process to bring more contractors into the fold, along with the potential of off-ramps to move away from those with high rates of unsuccessful task order proposals or no bids.

Customization is a key element of the requirements, whether that be in the development of a new product or the modification of a current offering. Awardees will also be responsible for enabling software-as-a-service hosting and security.

The Army is emphasizing modern software development practices that include DevSecOps, Agile, lean and continuous integration/continuous delivery.

At the same time, the Army’s list of questions in the request for information also asks about other areas of software development it should include in the contract.

A second key question worth highlighting is whether interested parties can provide examples or recommendations of how to use experience in coding challenges to establish employee qualifications. The Army sees that approach as a potential alternative to certifications and proposals, depending on the answers, of course.
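The notice does not describe how such coding challenges would be scored; as one illustrative possibility (all names and the scoring rule are hypothetical, not from the RFI), a challenge could be auto-graded by running a candidate's submission against a fixed test suite and recording the pass rate as qualification evidence:

```python
# Hypothetical sketch: scoring a coding-challenge submission against a
# fixed test suite, as one possible alternative to certifications.

def score_submission(solution, test_cases):
    """Return the fraction of test cases the submitted function passes."""
    passed = 0
    for args, expected in test_cases:
        try:
            if solution(*args) == expected:
                passed += 1
        except Exception:
            pass  # a crashing submission simply fails that case
    return passed / len(test_cases)

# Example challenge: reverse a string.
tests = [(("army",), "ymra"), (("",), ""), (("abc",), "cba")]

def candidate(s):
    """A candidate's (correct) submission."""
    return s[::-1]

print(score_submission(candidate, tests))  # → 1.0
```

A real qualification scheme would of course add sandboxing, time limits, and hidden test cases; the point is only that challenge results yield an objective score where a certification yields a yes/no credential.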

A third significant question zeroes in on the two-phase evaluation approach the Army is mulling for this requirement.

For phase two, the Army is weighing technology challenges against demonstrations as the focus, and wants to know what respondents think about that.

Responses to the RFI are due June 10.

Article link: https://washingtontechnology.com/contracts/2024/05/army-lifts-curtains-planned-1b-software-development-contract/396919/?

Defense Innovation Inflection Point? – Forbes

Posted by timmreardon on 05/28/2024
Posted in: Uncategorized.

Mike Brown, Contributor

Michael Brown is a partner at the VC firm Shield Capital.

Almost a decade ago, in 2015, Secretary of Defense Ash Carter came to Stanford University to announce a bridge between Silicon Valley and the Pentagon: a new organization he termed the Defense Innovation Unit Experimental, or DIUx. Secretary Carter realized earlier than many that the technologies the military needed would come from commercial companies at the forefront of AI, autonomy, cyber and commercial space, not from government labs or defense primes.

Why? The Defense Department provides a much smaller proportion of global R&D than in the past (as other countries invested), and commercial businesses now invest more R&D dollars than the government. Sixty years ago, the Defense Department represented 36% of global R&D, versus 3% today. The five largest tech companies spend more than 10x what the five largest defense primes spend on R&D.

Inauspicious Beginnings

Changing how the Department views and incorporates commercial technology has not been easy. DIU was initially conceived as a defense embassy to Silicon Valley. What became apparent was that Silicon Valley was not interested in an embassy but instead wanted opportunities to compete for defense contracts. In the fail-fast mode of Silicon Valley, Secretary Carter restarted the operation with a Silicon Valley entrepreneur and F-16 pilot, Raj Shah. Raj had the insight that DIU should be a front door to Pentagon projects which offered revenue contracts for companies that could solve military problems. His team pioneered the use of a fast-track acquisition authority and a competitive process that mirrored how most companies do sourcing.

However, there was a struggle getting Congress to approve and sustain a budget for this startup organization of about $30 million in 2017, and another hiccup when DIU’s reporting relationship changed from the Secretary to an Undersecretary whose primary focus was defense-developed technology—not commercial technology. I succeeded Raj as DIU director in 2018. Despite inconsistent support from defense leadership for budget and manpower, DIU gained approval for its own contracting capability and subsequently scaled to add 100 new vendors who, in turn, attracted 10-20x in venture capital for every $1 awarded in a DIU prototype contract. DIU also increased its transition rate for production contracts to 50% of every prototype effort begun, which in turn attracted 43 potential suppliers for every prototyping effort. Some of the companies now most associated with defense tech like Anduril, C3.ai, Rhombus Power, Shield AI and Vannevar Labs all got a start with DIU prototype contracts but have gone on to develop much more significant businesses serving the Defense Department.

Commercial Technology In The DoD Today

The Defense Department has again realized that accelerating the incorporation of disruptive technology is most effective when reporting to the Secretary of Defense. The Department is now applying commercial technology to its most important problems, specifically, what the military commands of the Indo-Pacific and Europe face in deterring China and Russia. Now with nearly $1 billion from Congress, DIU can field new capabilities in one to two years to support those military commands without waiting for multi-year budget alignment or traditional acquisition processes which on average field new capabilities in 17 years.

In addition, venture capital has begun to support defense tech in an unprecedented way. New firms like ours, Shield Capital, have formed to support national security applications, while more established venture firms like Andreessen Horowitz, Lightspeed, and Bessemer Ventures have developed practices focused on defense within their firms. In aggregate, investment in defense tech is up 5x in the past six years and now represents $40 billion annually. With a larger amount of DoD's budget focused on commercial tech, this is a win-win for the nation: new paths for businesses to ramp revenues faster while modernizing capabilities for the warfighter.

A New Age Bridge From The Pentagon To Silicon Valley

The early days of DIU began to attract the best and brightest of the Valley through contracts, but now with increased resources, there can be a much stronger flywheel effect when more companies receive larger production contracts and, in turn, the companies invest more in their defense product lines and production capacity. Expanding the defense industrial base is critical to any future war effort since the base has consolidated to such a degree that it would be severely constrained in wartime as production constraints in supplying Ukraine have highlighted. As more dollars flow to commercial companies, investors will back more entrepreneurs creating solutions for warfighters. A virtuous circle forms when the defense industrial base grows, is more competitive, offers more choice for the Department in terms of cost/performance and warfighters gain access to leading technology. For too long, our soldiers, sailors and airmen have had access to more modern technology in their civilian consumer lives than while in uniform.

Change is hard and it has taken dedicated civilians, investors, men and women in uniform as well as Congressional leaders years to realize the opportunity and momentum in leveraging commercial technology for defense applications. There is an opportunity at this inflection point to provide the nation and our allies a hedge that can complement the exquisite defense platforms today such as the F-35 and Columbia-class subs.

Today, DIU has the reporting relationship and widespread support to provide the best of what Silicon Valley can offer to equip our warfighters. In August 2023, Deputy Secretary of Defense Kathleen Hicks announced that DIU would be a focal point of the Replicator initiative with the aim of delivering thousands of autonomous, low-cost drones to the Indo-Pacific Command within 18-24 months. There appears to be progress 8 months in with an announcement this week that “the delivery of Replicator systems to the warfighter began earlier this month.”

Article link: https://www.forbes.com/sites/mikebrown/2024/05/25/defense-innovation-inflection-point/?

The world needs a global AI observatory

Posted by timmreardon on 05/24/2024
Posted in: Uncategorized.

Here’s why it matters

So little is known about what’s happening in artificial intelligence and what might lie ahead. An international body could help fix that, experts argue. 

Across the globe, there is growing awareness of the risks of unchecked artificial intelligence research and development. Governments are moving fast in an attempt to address this, using existing legal frameworks or introducing new standards and assurance mechanisms. Recently, the White House proposed an AI Bill of Rights.

But the great paradox of a field founded on data is that so little is known about what’s happening in AI, and what might lie ahead.

This is why we believe the time is right for the creation of a global AI observatory — a GAIO — to better identify risks, opportunities, and developments and to predict AI’s possible global effects.

The world already has a model in the Intergovernmental Panel on Climate Change. Established in 1988 by the United Nations with member countries from around the world, the IPCC provides governments with scientific information they can use to develop climate policies. A comparable body for AI would provide a reliable basis of data, models, and interpretation to guide policy and broader decision-making about AI.

At present, numerous bodies collect valuable AI-related metrics. Nation-states track developments within their borders; private enterprises gather relevant industry data; and organizations like the OECD’s AI Policy Observatory focus on national AI policies and trends. While these initiatives are a crucial beginning, much about AI remains opaque, often deliberately. It is impossible to regulate what governments don’t understand. A GAIO could fill this gap through four main areas of activity.

1. Create a global, standardized incident-reporting database concentrating on critical interactions between AI systems and the real world. For example, in the domain of biorisk, where AI could aid in creating dangerous pathogens, a structured framework for documenting incidents related to such risks could help mitigate threats. A centralized database would record essential details about specific incidents involving AI applications and their consequences in diverse settings — examining factors such as the system’s purpose, use cases, and metadata about training and evaluation processes. Standardized incident reports could enable cross-border coordination.

2. Assemble a registry of crucial AI systems focused on AI applications with the largest social and economic impacts, as measured by the number of people affected, the person-hours of interaction, and the stakes of their effects, to track their potential consequences.

3. Bring together global knowledge about the impacts of AI on critical areas such as labor markets, education, media, and health care. Subgroups could orchestrate the gathering, interpretation, and forecasting of data. A GAIO would also include metrics for both positive and negative impacts of AI, such as the economic value created by AI products and the impact of AI-enabled social media on mental health and political polarization.

4. Orchestrate global debate through an annual report on the state of AI that analyzes key issues, patterns that arise, and choices governments and international organizations need to consider. This would involve rolling out a program of predictions and scenarios focused primarily on technologies likely to go live in the succeeding two to three years. The program could build on existing efforts, such as the AI Index produced by Stanford University.
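To make point 1 concrete, a standardized incident record might carry the fields the authors list (system purpose, use case, training and evaluation metadata, consequences). The sketch below is illustrative only; the field names are assumptions, not a proposed GAIO schema:

```python
# Illustrative sketch of a standardized AI incident record, following
# the fields named in the article. Not an actual GAIO standard.
from dataclasses import dataclass, field, asdict

@dataclass
class AIIncidentReport:
    incident_id: str
    system_purpose: str          # what the AI system was built to do
    use_case: str                # setting in which the incident occurred
    consequences: str            # observed real-world effects
    training_metadata: dict = field(default_factory=dict)
    evaluation_metadata: dict = field(default_factory=dict)
    reporting_country: str = ""  # enables cross-border aggregation

report = AIIncidentReport(
    incident_id="2024-0001",
    system_purpose="protein design assistant",
    use_case="screening synthesis requests",
    consequences="flagged sequence resembling a known pathogen",
    reporting_country="US",
)
print(asdict(report)["incident_id"])  # → 2024-0001
```

The value of such a schema is less in any one field than in its uniformity: identical records from different jurisdictions can be aggregated, which is what makes cross-border coordination possible.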

A focus on facts rather than prescriptions

A GAIO would also need to innovate. Crucially, it would use collective intelligence methods to bring together inputs from thousands of scientists and citizens, which is essential in tracking emergent capabilities in a fast-moving and complex field. In addition, a GAIO would introduce whistleblowing mechanisms similar to the U.S. government’s incentives for employees to report on harmful or illegal actions.

To succeed, a GAIO would need a comparable legitimacy to the IPCC. This can be achieved through its members including governments, scientific bodies, and universities, among others, and by ensuring a sharp focus on facts and analysis more than prescription, which would be left in the hands of governments.

Contributors to the work of a GAIO would be selected, as with the IPCC, on the basis of nominations by member organizations, to ensure depth of expertise, disciplinary diversity, and global representativeness. Their selection would also require maximum transparency, to minimize both real and perceived conflicts of interest.

The AI community and businesses using AI tend to be suspicious of government involvement, often viewing it as a purveyor of restrictions. But the age of self-governance is now over. We propose an organization that exists in part for governments, but with the primary work undertaken by scientists. All international initiatives related to AI would be welcomed.

In order to grow, a GAIO will need to convince key players from the U.S., China, the U.K., the European Union, and India, among others, that it will fill a vital gap. The fundamental case for its creation is that no country will benefit from out-of-control AI, just as no country benefits from out-of-control pathogens.

The greatest risk now is multiple unconnected efforts. Unmanaged artificial intelligence threatens important infrastructure and the information space we all need to think, act, and thrive. Before nation-states squeeze radically new technologies into old legal and policy boxes, the creation of a GAIO is the most feasible step.

This article is written by Sir Geoff Mulgan, a professor of collective intelligence, public policy, and social innovation at University College London; Thomas Malone, a professor of information management at MIT Sloan and the director of the MIT Center for Collective Intelligence; Divya Siddharth and Saffron Huang of the Collective Intelligence Project; Joshua Tan of the Metagovernance Project; and Lewis Hammond of the Cooperative AI Foundation.

Article link: https://mitsloan.mit.edu/ideas-made-to-matter/world-needs-a-global-ai-observatory-heres-why?

MITRE to Establish New AI Experimentation and Prototyping Capability for U.S. Government Agencies

Posted by timmreardon on 05/22/2024
Posted in: Uncategorized.

MAY 7, 2024

MITRE Federal AI Sandbox to be Powered by NVIDIA DGX SuperPOD 

McLean, Va., May 7, 2024 – MITRE is building a new capability intended to give its artificial intelligence (AI) researchers and developers access to a massive increase in computing power. The new capability, MITRE Federal AI Sandbox, will support experimentation with next-generation AI-enabled applications for the federal government. The Federal AI Sandbox is expected to be operational by year’s end and will be powered by an NVIDIA DGX SuperPOD™, which provides accelerated infrastructure scale and performance for enterprise AI and machine learning work.

As U.S. government agencies seek to apply AI across their operations, few have adequate access to supercomputers and the deep expertise required to operate the technology and test potential applications on secure infrastructure. 

“The recent executive order on AI encourages federal agencies to reduce barriers for AI adoptions, but agencies often lack the computing environment necessary for experimentation and prototyping,” says Charles Clancy, MITRE, senior vice president and chief technology officer. “Our new Federal AI Sandbox will help level the playing field, making the high-quality compute power needed to train and test custom AI solutions available to any agency.” 

MITRE will apply the Federal AI Sandbox to its work for federal agencies in areas including national security, healthcare, transportation, and climate. Agencies can gain access to the benefits of the Federal AI Sandbox through existing contracts with any of the six federally funded research and development centers MITRE operates.

The sandbox offers computing power to train cutting-edge AI applications for government use, including large language models (LLMs) and other generative AI tools. It can also train multimodal perception systems, which understand and process information from multiple types of data at once (such as images, audio, text, radar, and environmental or medical sensors), as well as reinforcement learning decision aids that learn by trial and error to help humans make better decisions.

“MITRE’s purchase of a DGX SuperPOD to assist the federal government in its development of AI initiatives will turbocharge the U.S. federal government’s efforts to leverage the power of AI,” says Anthony Robbins, vice president of public sector, NVIDIA. “AI has enormous potential to improve government services for citizens and solve big challenges, like transportation and cyber security.” 

The NVIDIA DGX SuperPOD powering the sandbox is capable of an exaFLOP of performance to train and deploy custom LLMs and other AI solutions at scale.

About MITRE

MITRE’s mission-driven teams are dedicated to solving problems for a safer world. Through our public-private partnerships and federally funded R&D centers, we work across government and in partnership with industry to tackle challenges to the safety, stability, and well-being of our nation.

Article link: https://www.mitre.org/news-insights/news-release/mitre-establish-new-ai-experimentation-and-prototyping-capability-us

U.S. Secretary of Commerce Gina Raimondo Releases Strategic Vision on AI Safety, Announces Plan for Global Cooperation Among AI Safety Institutes – NIST

Posted by timmreardon on 05/22/2024
Posted in: Uncategorized.

Raimondo announces plans for global network of AI Safety Institutes and future convening in the San Francisco area, where the U.S. AI Safety Institute recently established a presence.

May 21, 2024

Today, as the AI Seoul Summit begins, U.S. Secretary of Commerce Gina Raimondo released a strategic vision for the U.S. Artificial Intelligence Safety Institute (AISI), describing the department’s approach to AI safety under President Biden’s leadership. At President Biden’s direction, the National Institute of Standards and Technology (NIST) within the Department of Commerce launched the AISI, building on NIST’s long-standing work on AI. In addition to releasing a strategic vision, Raimondo also shared the department’s plans to work with a global scientific network for AI safety through meaningful engagement with AI Safety Institutes and other government-backed scientific offices, and to convene the institutes later this year in the San Francisco area, where the AISI recently established a presence.

COMMERCE DEPARTMENT AI SAFETY INSTITUTE STRATEGIC VISION

The strategic vision released today, available here, outlines the steps that the AISI plans to take to advance the science of AI safety and facilitate safe and responsible AI innovation. At the direction of President Biden, NIST established the AISI and has since built an executive leadership team that brings together some of the brightest minds in academia, industry and government.

The strategic vision describes the AISI’s philosophy, mission and strategic goals.  Rooted in two core principles — first, that beneficial AI depends on AI safety; and second, that AI safety depends on science — the AISI aims to address key challenges, including a lack of standardized metrics for frontier AI, underdeveloped testing and validation methods, limited national and global coordination on AI safety issues, and more. 

The AISI will focus on three key goals: 

  1. Advance the science of AI safety;
  2. Articulate, demonstrate, and disseminate the practices of AI safety; and
  3. Support institutions, communities, and coordination around AI safety.  

Read the full Department of Commerce release.

Article link: https://www.linkedin.com/posts/nist_ai-artificialintelligence-responsibleai-activity-7198709049062760448-6SMp?

DoD’s acquisition workforce is stretched thin – Federal News Network

Posted by timmreardon on 05/20/2024
Posted in: Uncategorized.

“If I worry about one workforce, it’s the contracting workforce,” said Doug Bush.


Anastasia Obis

May 16, 2024 7:03 am

While the Defense Department’s acquisition budgets have grown significantly in recent years, there are not enough contracting officers to manage their ever-increasing workload.

The Army, for example, has about 9,000 acquisition workers, but they have been stretched thin as their workload has doubled in the last couple of years.

Doug Bush, the Army’s acquisition chief, said the service is currently focused on giving its contracting officers better technology and tools to increase efficiency, but more people are needed to provide acquisition support to the Army.

“If I worry about one workforce, it’s the contracting workforce,” Bush said during a Senate Appropriations Subcommittee on Defense hearing on Wednesday.

“They doubled their workload, frankly. They did COVID and then they rolled straight into Ukraine. I think [we need] a little help in both realms. Efficiency investment and perhaps some more people would be warranted.”

Bush said they would not be able to beef up their acquisition workforce in the coming year with the current level of funding.

Undersecretary of Defense for Acquisition and Sustainment William LaPlante said that the department is in the midst of rebuilding some parts of its contracting workforce. 

Several factors contribute to the department’s acquisition worker shortage. The Defense Department’s acquisition workers are highly sought after by private companies and other agencies within the federal government, which creates a gap for the department. In addition, many contracting officers were deployed to active conflict zones in Iraq and Afghanistan, which led to burnout.

“I would say in pockets we still have work to do,” said LaPlante.

Nickolas Guertin, the assistant secretary of the Navy for research, development and acquisition, said he has long advocated for a better implementation of modular and open architectures in military equipment and systems, which increases competition and fosters innovation. It also increases the volume of contracts that need to be managed.

“Contract officers are a key component of our future. We need contract officers to grow,” said Guertin. “Honestly, the Navy grows great contract officers because people keep hiring them. That’s something we continually have to refresh.”

Guertin said while there is a plan in place to increase the number of its contracting officers, the service needs help from Congress.

The Air Force’s main area of need is software workforce expertise, said Andrew Hunter, the Air Force’s acquisition chief. But the service has been using its acquisition workforce development pilot for the last two decades to bring in acquisition talent.

“I personally believe it shouldn’t be permanent. But at a minimum, we need to extend it because that is a key way of how we keep our talent,” said Hunter.

A study from RAND found that the department’s acquisition workforce grew by 57,677 people, from 128,187 in 2006 to 185,864 in 2021. Almost all of the increase was in the civilian acquisition workforce.

The age distribution of the workforce also shifted, from 29% under age 40 in fiscal 2011 to 35% in 2021, indicating that the department has improved the generational profile of its acquisition workforce. And the majority of contracting officers meet or exceed the certification requirements needed for their positions.

Article link: https://federalnewsnetwork.com/defense-main/2024/05/dods-acquisition-workforce-is-stretched-thin/

Army changing the color of money used to modernize software – Federal News Network

Posted by timmreardon on 05/19/2024
Posted in: Uncategorized.

The Army will keep most software development efforts in ongoing development mode and not transition them to sustainment as part of its modernization efforts.

Jason Miller@jmillerWFED

May 14, 2024 11:58 

When it comes to software development, the Army is going to stop worrying about the color of money.

That’s because as part of its new approach to software modernization, the Army is rethinking what sustainment means.

Margaret Boatner, the deputy assistant secretary of the Army for strategy and acquisition reform, said one of the main tenets of the policy signed by Army Secretary Christine Wormuth in March is to reform several legacy processes that are keeping the service from adopting modern software development approaches.

“We are targeting a couple of really key processes like our test and evaluation processes, and importantly, our cybersecurity processes. We really are trying to modernize and streamline those as well as changing the way we think about sustainment because software is really never done. We really have to retrain ourselves to think about and to acknowledge the fact that software really needs to stay in development all the time,” Boatner said in an exclusive interview with Federal News Network. “Right now, our systems and our acquisition programs, once they’re done being developed, they go through a process that we call transition to sustainment, meaning they’ve been fully developed and are now going to live in our inventory for 10, 20, 30 years. We’re going to sustain them for a long period of time. When a system makes that transition, the financial management regulations dictate that they use a certain color of money, operations and maintenance dollars. With that color of money, we can really only do minor patches, fixes and bug updates. So that’s an example of a legacy process that, when you’re talking about a software system, really tied our arms behind our back. It really prevented us from doing true development over the long term with the software solutions.”

Boatner said under the new policy, software will no longer make the transition to sustainment. Instead, the program office will keep operating under research, development, test and evaluation (RDT&E) funding.

“It’s recognizing that a continuous integration/continuous delivery (CI/CD) model software is never done. That way, our program managers can plan to use the appropriate color of money, which in many cases might be RDT&E, which is the color money you need to do true development,” she said. “So, that will give our program managers a lot more flexibility to determine the appropriate color money based on what they want to do, such that our software systems can really continue to be developed over time.”

The Army has been on this software modernization path for several years, culminating in the March memo.

With lessons from the 11 software pathway programs, the testing of a new approach to a continuous authority to operate, and the broad adoption of the Adaptive Acquisition Framework, Boatner and Leo Garciga, the Army’s chief information officer, are clearing obstacles, modernizing policies and attempting to change the culture of how the Army buys, builds and manages software.

Army updating ATO policy

Garciga said by keeping programs under the RDT&E bucket, the Army is recognizing the other changes it needs to complete to make these efforts more successful.

“We need to relook at processes like interoperability. Historically, that was not a parallel process, but definitely a series process. How do we change the way we look at that to bring it into this model where we’re developing at speed and scale all the time?” he said. “I think we’re starting to see the beginnings of the second- and third-order effects of some of these decisions. The software directive really encapsulated some big rocks that need to move. We’re finding things in our processes that we’re going to have to quickly change to get to the end state we’re looking for.”

Since taking over the CIO role in July, Garciga has been on a mission to modernize IT policies that are standing in the way. The latest one is around a continuous ATO (C-ATO).

He said the new policy could be out later this summer.

“We’ve told folks to do DevSecOps and to bring agile into how they deliver software, so how do we accredit that? How do we certify that? What does that model look like? We’re hyper-focused on building out a framework that we can push out to the entire Army,” Garciga said. “Whether you’re at a program of record, or you’re sitting at an Army command, who has an enterprise capability, we will give some guidelines on how we do that, or at least an initial operational framework that says these are the basic steps you need to be certified to do DevSecOps, which really gets to the end state that we’re shooting for.”

He added the current approach to obtaining an ATO is too compliance focused and not risk based.

Pilot demonstrated what is possible

Garciga highlighted a recent example of the barriers to getting C-ATO.

“We started looking at some initial programs with a smart team and we found some interesting things. There was some things that were holding us back like a program that was ready to do CI/CD and actually could do releases every day, but because of interoperability testing and the nature of how we were implementing that in the Army, it was causing them to only release two times a year, which is insane,” he said. “We very quickly got together and rewickered the entire approach for how we were going to do interoperability testing inside the Army. We’re hoping that leads to the department also taking a look at that as we look at the joint force and joint interoperability and maybe they follow our lead, so we can break down some of those barriers.”

Additionally, the Army undertook a pilot to test out this new C-ATO approach.

Garciga said the test case proved a program could receive at least an initial C-ATO in less than 90 days by bringing in red and purple teams to review the code.

“I’d say about three months ago, we actually slimmed down the administrative portion and focused on what were the things that would allow us to protect our data, protect access to a system and make a system survivable. We really condensed down the entire risk management framework (RMF) process to six critical controls,” he said. “On top of that, we added a red team and a purple team to actually do penetration testing in real time against that system as it was deployed in production. What that did is it took our entire time from no ATO to having at least an ATO with conditions down to about less than 90 days. That was really our first pilot to see if we can we actually do this, and what are our challenges in doing that.”
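The gate logic Garciga describes, a condensed control set plus live penetration-test findings, can be sketched roughly as follows. The article does not enumerate the six critical controls, so the control names and the pass rule below are invented for illustration only:

```python
# Hedged sketch of a conditional-ATO gate loosely modeling the pilot:
# every critical control must pass AND red/purple-team testing must
# have no open critical findings. Control names are hypothetical.

CRITICAL_CONTROLS = [
    "data_encrypted_at_rest",
    "data_encrypted_in_transit",
    "access_strongly_authenticated",
    "audit_logging_enabled",
    "backups_and_recovery_tested",
    "vulnerability_scanning_in_pipeline",
]

def ato_with_conditions(control_status, open_critical_findings):
    """Grant a conditional ATO only if all critical controls pass and
    live penetration testing found no open critical issues."""
    failed = [c for c in CRITICAL_CONTROLS if not control_status.get(c)]
    return len(failed) == 0 and open_critical_findings == 0

status = {c: True for c in CRITICAL_CONTROLS}
print(ato_with_conditions(status, 0))  # → True
print(ato_with_conditions(status, 2))  # → False
```

The design point mirrors the pilot's shift from compliance to risk: instead of checking hundreds of documented controls on paper, the decision hinges on a handful of controls that actually protect data, access and survivability, evaluated against the system as deployed.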

Garciga said one of the big challenges that emerged was the need to train employees to take a more threat-based approach to ATOs. Another was that the Army had applied its on-premises ATO approach to the cloud, which Garciga said didn’t make much sense.

“We put some new policy out to really focus on what it means to accredit cloud services and to make that process a lot easier. One of our pilots, as we looked at how do we speed up the process and get someone to a viable CI/CD pipeline, we found things that were really in the way like interoperability testing and how do we get that out of the way and streamline that process,” he said. “In our pilots, the one part that we did find very interesting was this transition of our security control assessors from folks that have historically looked at some very specific paperwork to actually now getting on a system and looking at code, looking at triggers that have happened inside some of our CI/CD tools and making very difficult threshold decisions based on risk and risk that an authorizing official would take to make those decisions. We’re still very much working on what our training plan would be around that piece. That’ll be a big portion of how we’re going to certify CI/CD work and DevSecOps pipelines in the Army moving forward.”


Jason Miller

Jason Miller is executive editor of Federal News Network and directs news coverage on the people, policy and programs of the federal government.  


Article link: https://federalnewsnetwork.com/army/2024/05/army-changing-the-color-of-money-used-to-modernize-software/

The Future of Medicine Is in Your Poop – NIST

Posted by timmreardon on 05/15/2024
Posted in: Uncategorized.

In the fall of 2023, NIST’s scientists in Charleston, South Carolina, received a special shipment of containers packed with baggies full of frozen human feces.

Teams of scientists there and at an outside lab worked together to grind the material into fine dust and blend it with water until it had the consistency of a smoothie. It was then poured into 10,000 tubes and distributed among NIST’s staff in Charleston and Gaithersburg, Maryland.

Scientists in both cities have been rigorously analyzing and studying the waste-matter mixture ever since.

All this excretory experimentation is helping to lay the groundwork for a new generation of treatments and medicines derived from human feces.

The power of poop comes from the microbes it contains. They are a rich sampling of the trillions of microbes living inside our gut, all part of the gut microbiome. In the last decade, scientists have linked the gut microbiome to a raft of human diseases, including inflammatory bowel disease, bacterial infections, autoimmune disorders, obesity and even cancer and mental illness.

Isolating fecal microbes and then turning them into therapies may be a way to treat many of these diseases. In fact, the FDA has recently approved two drugs for treating recurring bacterial infections, both of which are derived from highly processed human stool samples.

“This isn’t just wishful thinking. It’s already happening,” said NIST molecular geneticist Scott Jackson. “We are at the beginning of a new era of medicine.”

Why Poop?

Scott Jackson has spearheaded NIST’s efforts to develop a fecal reference material. Credit: R. Wilson/NIST

While human feces also contain water, undigested food and assorted inorganic matter, anywhere from 30% to 50% is made up of bacteria, viruses, fungi and other organisms that once lived in our guts.

We could not survive without these fellow travelers. They play a critical role in metabolism, vitamin production and digestion. By regulating the immune system, they help ward off harmful bacteria and toxins.

Their activity also impacts the nervous system via the gut-brain connection, affecting mood and mental health and influencing many neurological conditions, including Alzheimer’s and autism.

We are only beginning to understand the relationship between microbes and diseases. We have significant gaps in our knowledge about how microbes affect other systems and processes in the body. And certainly, just because changes in the gut microbiome correlate with a particular disease doesn’t mean they cause it.

Still, it’s clear that the signals gut microbes send to each other and cells elsewhere in the body significantly impact our health.

Doctors could get a sample of your microbiome directly from your gut, but that means undergoing an invasive procedure like a colonoscopy or biopsy.

Getting a specimen of stool is (ironically) less messy.

“Fecal material is convenient,” Jackson said. “Everybody poops.”

But Really, Poop Medicine?

Everyone’s stool is different. The amount and types of microorganisms vary based on your genes, environment, health and diet. But scientists have discerned similarities in the poop of individuals with certain diseases. People with Parkinson’s disease, for example, show both higher and lower concentrations of certain bacterial species. For people with asthma, poop has reduced levels of microbial diversity.
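Microbial diversity of the kind mentioned above is commonly summarized with the Shannon index, computed from the relative abundances that DNA sequencing reports. A minimal sketch, assuming species counts have already been tallied from a sample:

```python
import math

def shannon_diversity(counts):
    """Shannon index H = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# An even community scores higher than a skewed one of the same size,
# which is the sense in which asthma patients' stool shows "reduced
# levels of microbial diversity":
even = shannon_diversity([25, 25, 25, 25])
skewed = shannon_diversity([97, 1, 1, 1])
```

The index rises both with the number of species present and with how evenly individuals are spread across them, so a gut dominated by one or two species scores low even if many species are technically detectable.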

These correlations, some quite clear and others still unclear, may make it possible to use stool samples to diagnose a wide range of illnesses and conditions. You’d send a fecal sample to a lab, which would then identify the microorganisms in it by decoding, or “sequencing,” their DNA.

Jackson said the results could be used to not only diagnose certain illnesses but also evaluate the risk of getting the illnesses in the future.

“We would analyze microbial DNA in stool for the same reason we test human DNA — to tell us about your risk of disease,” he said.

Beyond their use in diagnosing diseases, could feces-derived microbes be used to treat them?

This is actually already happening.

Fecal microbiota transplants (FMTs) are now used to treat recurrent Clostridioides difficile infection (CDI), a sometimes-deadly bacterial infection commonly picked up in hospitals. FMT is like any other kind of transplant, though in this case, it’s someone else’s fecal matter that’s transferred into the sick patient.

The microorganisms from the transplant help regenerate a healthy ecosystem in the gut microbiome, assisting the immune system in fighting off the infection. For recurrent CDI, the procedure has a success rate of 95%, a remarkable result for just about any therapy.

Research into other uses for FMT is exploding. According to clinicaltrials.gov, there are dozens of studies in the United States right now involving FMT, with clinicians and researchers testing it out on everything from cancer to colitis to alcoholic hepatitis.

Researchers are also exploring alternative approaches that involve genetically modifying fecal bacteria, creating disease-fighting microbes that would take root in your gut and help restore the microbiome to full health. There’s even the possibility of altering your own fecal microbes, a personalized medicine approach that customizes the therapy to the individual patient.

In addition to the gut microbiome, there are multitudes of microorganisms in the nose, skin, throat and vagina, all part of what’s known as the human microbiome.

Jackson said that the next generation of microbial medicines will be derived from all over the human microbiome. They will be much more scientifically proven and effective for treating diseases than today’s probiotics, which are bacteria derived from fermented foods and categorized as dietary supplements.

“If things keep going the way they are now, I think in 30 years, medical doctors will have an arsenal of new microbial therapies to treat a broad spectrum of diseases,” Jackson said.

What’s NIST’s Role in All This?

NIST researchers handling frozen stool samples in Charleston. Credit: T. Schock/NIST

NIST produces reference materials (RMs) that help laboratories and manufacturers calibrate their instruments.

For example, the agency sells peanut butter that comes with a detailed analysis and measurements of its compounds and chemicals. Food companies need to know how much fat is in their products. To ensure they are measuring the amount correctly, they can perform tests on NIST’s peanut butter. If they get the same result as NIST, they know their equipment is accurate.
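The calibration check described above is an agreement test: a lab's measurement should fall within the certified value's stated expanded uncertainty. A minimal sketch with hypothetical numbers (the values on NIST's actual certificates differ):

```python
def within_certified(measured, certified_value, expanded_uncertainty):
    """True if a lab measurement agrees with the certified reference
    value to within its expanded uncertainty."""
    return abs(measured - certified_value) <= expanded_uncertainty

# Hypothetical certified fat content of 50.2 +/- 0.8 g per 100 g:
ok = within_certified(50.7, 50.2, 0.8)       # instrument reads accurately
needs_work = within_certified(52.0, 50.2, 0.8)  # recalibration needed
```

A rigorous comparison would also fold in the lab's own measurement uncertainty, but the principle is the same: the reference material supplies the trusted value that every instrument is judged against.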

Having widely trusted and accepted reference materials, especially for complex materials, ensures quality control and accuracy across entire industries and research fields.

NIST is now developing an RM for human feces, officially called the Human Gut Microbiome (Whole Stool) Reference Material. NIST’s peanut butter lists its amount of calcium, copper, tetradecanoic acid and many other components. Similarly, the new RM, expected to be released this year, will itemize and describe the ingredients in feces.  

It will identify hundreds of species of microorganisms and detail the concentration of thousands of different metabolites in the gut, many of which are produced by microorganisms and help to convert nutrients into energy or synthesize molecules for cellular functions.

Other ingredients listed include many compounds you might not even have known were in feces: cholesterol (a type of metabolite), for example, and serotonin, most of which is found in the cells lining the gastrointestinal tract.

The RM aims to become the gold standard in human gut microbiome research and drug development. A single unit of the RM will consist of a tube holding a milliliter of fecal slurry, accompanied by a lengthy report that labs can use to check their measurements and fine-tune their instruments.

Many scientists believe a reference material for feces is desperately needed. Right now, “If you give two different laboratories the same stool sample for analysis, you’ll likely get strikingly different results,” Jackson said. Many discrepancies arise from the different protocols and tools the labs use. Others are the result of differing standards and definitions.

“NIST’s RM will help researchers develop, benchmark and harmonize their measurements,” Jackson said. “It’s the most detailed and comprehensive microbiological and biochemical breakdown ever produced for human feces.”

Article link: https://www.nist.gov/health/future-medicine-your-poop

What’s next in chips – MIT Technology Review

Posted by timmreardon on 05/15/2024
Posted in: Uncategorized.

How Big Tech, startups, AI devices, and trade wars will transform the way chips are made and the technologies they power.

By James O’Donnell

    May 13, 2024

    MIT Technology Review’s What’s Next series looks across industries, trends, and technologies to give you a first look at the future. You can read the rest of them here.

    Thanks to the boom in artificial intelligence, the world of chips is on the cusp of a huge tidal shift. There is heightened demand for chips that can train AI models faster and ping them from devices like smartphones and satellites, enabling us to use these models without disclosing private data. Governments, tech giants, and startups alike are racing to carve out their slices of the growing semiconductor pie. 

    Here are four trends to look for in the year ahead that will define what the chips of the future will look like, who will make them, and which new technologies they’ll unlock.

    CHIPS Acts around the world

    On the outskirts of Phoenix, two of the world’s largest chip manufacturers, TSMC and Intel, are racing to construct campuses in the desert that they hope will become the seats of American chipmaking prowess. One thing the efforts have in common is their funding: in March, President Joe Biden announced $8.5 billion in direct federal funds and $11 billion in loans for Intel’s expansions around the country. Weeks later, another $6.6 billion was announced for TSMC. 

    The awards are just a portion of the US subsidies pouring into the chips industry via the $280 billion CHIPS and Science Act signed in 2022. The money means that any company with a foot in the semiconductor ecosystem is analyzing how to restructure its supply chains to benefit from the cash. While much of the money aims to boost American chip manufacturing, there’s room for other players to apply, from equipment makers to niche materials startups.

    But the US is not the only country trying to onshore some of the chipmaking supply chain. Japan is spending $13 billion on its own equivalent to the CHIPS Act, Europe will be spending more than $47 billion, and earlier this year India announced a $15 billion effort to build local chip plants. The roots of this trend go all the way back to 2014, says Chris Miller, a professor at Tufts University and author of Chip War: The Fight for the World’s Most Critical Technology. That’s when China started offering massive subsidies to its chipmakers. 

    “This created a dynamic in which other governments concluded they had no choice but to offer incentives or see firms shift manufacturing to China,” he says. That threat, coupled with the surge in AI, has led Western governments to fund alternatives. In the next year, this might have a snowball effect, with even more countries starting their own programs for fear of being left behind.

    The money is unlikely to lead to brand-new chip competitors or fundamentally restructure who the biggest chip players are, Miller says. Instead, it will mostly incentivize dominant players like TSMC to establish roots in multiple countries. But funding alone won’t be enough to do that quickly—TSMC’s effort to build plants in Arizona has been mired in missed deadlines and labor disputes, and Intel has similarly failed to meet its promised deadlines. And it’s unclear whether, whenever the plants do come online, their equipment and labor force will be capable of the same level of advanced chipmaking that the companies maintain abroad.

    “The supply chain will only shift slowly, over years and decades,” Miller says. “But it is shifting.”

    More AI on the edge

    Currently, most of our interactions with AI models like ChatGPT are done via the cloud. That means that when you ask GPT to pick out an outfit (or to be your boyfriend), your request pings OpenAI’s servers, prompting the model housed there to process it and draw conclusions (known as “inference”) before a response is sent back to you. Relying on the cloud has some drawbacks: it requires internet access, for one, and it also means some of your data is shared with the model maker.  

    That’s why there’s been a lot of interest and investment in edge computing for AI, where the process of pinging the AI model happens directly on your device, like a laptop or smartphone. With the industry increasingly working toward a future in which AI models know a lot about us (Sam Altman described his killer AI app to me as one that knows “absolutely everything about my whole life, every email, every conversation I’ve ever had”), there’s a demand for faster “edge” chips that can run models without sharing private data. These chips face different constraints from the ones in data centers: they typically have to be smaller, cheaper, and more energy efficient. 

    The US Department of Defense is funding a lot of research into fast, private edge computing. In March, its research wing, the Defense Advanced Research Projects Agency (DARPA), announced a partnership with chipmaker EnCharge AI to create an ultra-powerful edge computing chip used for AI inference. EnCharge AI is working to make a chip that enables enhanced privacy but can also operate on very little power. This will make it suitable for military applications like satellites and off-grid surveillance equipment. The company expects to ship the chips in 2025.

    AI models will always rely on the cloud for some applications, but new investment and interest in improving edge computing could bring faster chips, and therefore more AI, to our everyday devices. If edge chips get small and cheap enough, we’re likely to see even more AI-driven “smart devices” in our homes and workplaces. Today, AI models are mostly constrained to data centers.

    “A lot of the challenges that we see in the data center will be overcome,” says EnCharge AI cofounder Naveen Verma. “I expect to see a big focus on the edge. I think it’s going to be critical to getting AI at scale.”

    Big Tech enters the chipmaking fray

    In industries ranging from fast fashion to lawn care, companies are paying exorbitant amounts in computing costs to create and train AI models for their businesses. Examples include models that employees can use to scan and summarize documents, as well as externally facing technologies like virtual agents that can walk you through how to repair your broken fridge. That means demand for cloud computing to train those models is through the roof. 

    The companies providing the bulk of that computing power are Amazon, Microsoft, and Google. For years these tech giants have dreamed of increasing their profit margins by making chips for their data centers in-house rather than buying from companies like Nvidia, a giant with a near monopoly on the most advanced AI training chips and a value larger than the GDP of 183 countries. 

    Amazon started its effort in 2015, acquiring startup Annapurna Labs. Google moved next in 2018 with its own chips called TPUs. Microsoft launched its first AI chips in November, and Meta unveiled a new version of its own AI training chips in April.

    That trend could tilt the scales away from Nvidia. But Nvidia doesn’t only play the role of rival in the eyes of Big Tech: regardless of their own in-house efforts, cloud giants still need its chips for their data centers. That’s partly because their own chipmaking efforts can’t fulfill all their needs, but it’s also because their customers expect to be able to use top-of-the-line Nvidia chips.

    “This is really about giving the customers the choice,” says Rani Borkar, who leads hardware efforts at Microsoft Azure. She says she can’t envision a future in which Microsoft supplies all chips for its cloud services: “We will continue our strong partnerships and deploy chips from all the silicon partners that we work with.”

    As cloud computing giants attempt to poach a bit of market share away from chipmakers, Nvidia is also attempting the converse. Last year the company started its own cloud service so customers can bypass Amazon, Google, or Microsoft and get computing time on Nvidia chips directly. As this dramatic struggle over market share unfolds, the coming year will be about whether customers see Big Tech’s chips as akin to Nvidia’s most advanced chips, or more like their little cousins. 

    Nvidia battles the startups 

    Despite Nvidia’s dominance, there is a wave of investment flowing toward startups that aim to outcompete it in certain slices of the chip market of the future. Those startups all promise faster AI training, but they have different ideas about which flashy computing technology will get them there, from quantum to photonics to reversible computation. 

    But Murat Onen, the 28-year-old founder of one such chip startup, Eva, which he spun out of his PhD work at MIT, is blunt about what it’s like to start a chip company right now.

    “The king of the hill is Nvidia, and that’s the world that we live in,” he says.

    Many of these companies, like SambaNova, Cerebras, and Graphcore, are trying to change the underlying architecture of chips. Imagine an AI accelerator chip as constantly having to shuffle data back and forth between different areas: a piece of information is stored in the memory zone but must move to the processing zone, where a calculation is made, and then be stored back to the memory zone for safekeeping. All that takes time and energy. 
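    The cost of that shuffling can be seen with a back-of-the-envelope, roofline-style model: a step's time is memory transfer plus compute, and at low arithmetic intensity the transfer term dominates. The bandwidth and throughput figures below are illustrative round numbers, not those of any real chip.

```python
# Toy model of one accelerator step: time spent moving data between the
# memory and processing zones, versus time spent actually computing.
# Figures are illustrative, not any real chip's specifications.
def step_time(bytes_moved, flops, bandwidth_gbs=100.0, throughput_tflops=100.0):
    transfer_s = bytes_moved / (bandwidth_gbs * 1e9)      # memory traffic
    compute_s = flops / (throughput_tflops * 1e12)        # arithmetic
    return transfer_s, compute_s

# At 1 FLOP per byte moved, the chip mostly waits on memory:
transfer, compute = step_time(bytes_moved=1e9, flops=1e9)
# transfer is ~10 ms while compute is ~0.01 ms
```

    This is why in-memory or near-memory designs such as the ones described below can pay off: shrinking the transfer term matters far more than adding raw arithmetic throughput.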


    Making that process more efficient would deliver faster and cheaper AI training to customers, but only if the chipmaker has good enough software to allow the AI training company to seamlessly transition to the new chip. If the software transition is too clunky, model makers such as OpenAI, Anthropic, and Mistral are likely to stick with big-name chipmakers. That means companies taking this approach, like SambaNova, are spending a lot of their time not just on chip design but on software design too.

    Onen is proposing changes one level deeper. Instead of traditional transistors, which have delivered greater efficiency over decades by getting smaller and smaller, he’s using a new component called a proton-gated transistor that he says Eva designed specifically for the mathematical needs of AI training. It allows devices to store and process data in the same place, saving time and computing energy. The idea of using such a component for AI inference dates back to the 1960s, but researchers could never figure out how to use it for AI training, in part because of a materials roadblock—it requires a material that can, among other qualities, precisely control conductivity at room temperature. 

    One day in the lab, “through optimizing these numbers, and getting very lucky, we got the material that we wanted,” Onen says. “All of a sudden, the device is not a science fair project.” That raised the possibility of using such a component at scale. After months of working to confirm that the data was correct, he founded Eva, and the work was published in Science.

    But in a sector where so many founders have promised—and failed—to topple the dominance of the leading chipmakers, Onen frankly admits that it will be years before he’ll know if the design works as intended and if manufacturers will agree to produce it. Leading a company through that uncertainty, he says, requires flexibility and an appetite for skepticism from others.

    “I think sometimes people feel too attached to their ideas, and then kind of feel insecure that if this goes away there won’t be anything next,” he says. “I don’t think I feel that way. I’m still looking for people to challenge us and say this is wrong.”

    Article link: https://www.technologyreview.com/2024/05/13/1092319/whats-next-in-chips/

    DISA unveils strategic plan for next five years

    Posted by timmreardon on 05/03/2024
    Posted in: Uncategorized.

    The document provides a series of strategic and operational imperatives as well as eight goals the agency seeks to achieve by 2030.

    By Mark Pomerleau

    MAY 1, 2024

    The Defense Information Systems Agency, charged with operating and maintaining the Department of Defense’s network, unveiled on Wednesday its strategic plan that articulates the organization’s goals over the next five years.

    Building upon the previous plan, released in 2022, the DISA Next Strategy, as it’s called, seeks to align the combat support agency with the 2022 National Defense Strategy and five-year budgeting process to help department leadership and industry partners make more informed decisions for allocating resources.

    The National Defense Strategy articulates a highly dynamic security environment with a multitude of simultaneous threats across the globe — from sophisticated nation-states seeking to undermine U.S. interests and power to non-state actors that still aim to cause disruption. Chief among those threats is China, referred to as the pacing threat.

    “As a combat support agency and the premier IT service provider for the Department of Defense, we will continue to provide world-class services. At the same time, we are changing. We are re-organizing, optimizing and transforming to deliver resilient, survivable and secure capabilities to enable department success and warfighter lethality,” Lt. Gen. Robert Skinner, DISA’s director, said in the foreword to the strategy.

    “I am confident the agency will succeed – we have no other choice. We are the combat support agency entrusted with connecting senior leaders and warfighters across the globe 24/7 – often during the most stressful and dangerous moments of their lives. We must continue to deliver while being challenged by great powers. From great power competition with the People’s Republic of China, to supporting operations in emerging geographic areas of national strategic importance, both home and abroad,” he added.

    Within this enhanced security environment, simplicity and speed will be paramount. Thus, priority number one in DISA’s new strategy is the need to simplify the network globally with large-scale adoption of command IT environments. The plan is to consolidate combatant commands and defense agencies and field activities into this environment, serving as a key first step in providing a DOD-wide warfighting information system, Skinner wrote.

    “Communicating our strategy enables a more focused effort across the DOD and ensures that we have addressed the unique challenges of the [combatant commands], their multitude of mission sets, their warfighting functions and account for any domain-specific equities. We seek to avoid unnecessary duplication of capabilities between the CCMDs, DAFAs and military departments,” the strategy states. “We must capitalize on opportunities to simplify the IT environment, experiment with emerging technologies and test our solutions in the environments in which the Joint and Coalition Forces operate. We seek to partner with industry and academia to shape IT innovation towards solving our information system challenges.”

    Furthermore, the plan aims to develop a fully functional enterprise cloud environment and integrate identity, credential and access management (ICAM) and zero-trust capabilities with this common IT and cloud environment.

    The document lists four strategic imperatives that are overarching and important functions the agency must perform, each consisting of more specific operational imperatives. They include:

    • Operate and secure the DISA portion of the DOD Information Network
    • Support strategic command, control and communications
    • Optimize the network
    • Operationalize data

    Additionally, the plan outlines eight goals that provide areas of transformation the agency is focused on over the next five years. They include:

    • The defense information system network: By 2030 DISA has a globally accessible, software-defined transport environment that is unconstrained by bandwidth and impervious to denial or disruption.
    • Hybrid cloud environment: By 2030 DISA is operating a resilient, globally accessible hybrid cloud environment.
    • National leadership command capabilities: By 2030 DISA has modernized its portion of the NLCC fabric to enable national leadership and strategic coordination between allies and partners.
    • Joint and coalition warfighting tools: By 2030 DISA has delivered the right suite of capabilities to enable joint and coalition warfighting and has produced data standards for interoperability of IT solutions.
    • Consolidated network: By 2030 DISA has consolidated DAFAs and CCMDs into a common IT environment that offers seamless access to information at all classification levels.
    • Zero-trust tools: By the fourth quarter of fiscal 2027 DISA’s portion of the DODIN complies with the ZT reference architecture.
    • Data management: By 2030 DISA has a modern data platform for its defensive cyber and network operations data and has implemented standards for data management.
    • Workforce: By 2030 DISA will continue to upskill its workforce to remain “lethal” in the IT environment.

    Article link: https://defensescoop.com/2024/05/01/disa-next-unveil-strategic-plan-2025-2029/

    Posts navigation

    ← Older Entries
    Newer Entries →
    • Search site

    • Follow healthcarereimagined on WordPress.com
    • Recent Posts

      • Introduction: Disinformation as a multiplier of existential threat – Bulletin of the Atomic Scientists 03/12/2026
      • AI is reinventing hiring — with the same old biases. Here’s how to avoid that trap – MIT Sloan 03/08/2026
      • Fiscal Year 2025 Year In Review – PEO DHMS 02/26/2026
      • “𝗦𝗼𝗰𝗶𝗮𝗹 𝗠𝗲𝗱𝗶𝗮 𝗠𝗮𝗻𝗶𝗽𝘂𝗹𝗮𝘁𝗶𝗼𝗻 𝗳𝗼𝗿 𝗦𝗮𝗹𝗲” – NATO Strategic Communications COE 02/26/2026
      • Claude Can Now Do 40 Hours of Work in Minutes. Anthropic Says Its Safety Systems Can’t Keep Up – AJ Green 02/19/2026
      • Agentic AI, explained – MIT Sloan 02/18/2026
      • Anthropic’s head of AI safety Mrinank Sharma resigns, says ‘world is in peril’ in resignation letter 02/10/2026
      • Moltbook was peak AI theater 02/09/2026
      • WHAT A QUBIT IS AND WHAT IT IS NOT. 01/25/2026
      • Governance Before Crisis We still have time to get this right. 01/21/2026
    • Categories

      • Accountable Care Organizations
      • ACOs
      • AHRQ
      • American Board of Internal Medicine
      • Big Data
      • Blue Button
      • Board Certification
      • Cancer Treatment
      • Data Science
      • Digital Services Playbook
      • DoD
      • EHR Interoperability
      • EHR Usability
      • Emergency Medicine
      • FDA
      • FDASIA
      • GAO Reports
      • Genetic Data
      • Genetic Research
      • Genomic Data
      • Global Standards
      • Health Care Costs
      • Health Care Economics
      • Health IT adoption
      • Health Outcomes
      • Healthcare Delivery
      • Healthcare Informatics
      • Healthcare Outcomes
      • Healthcare Security
      • Helathcare Delivery
      • HHS
      • HIPAA
      • ICD-10
      • Innovation
      • Integrated Electronic Health Records
      • IT Acquisition
      • JASONS
      • Lab Report Access
      • Military Health System Reform
      • Mobile Health
      • Mobile Healthcare
      • National Health IT System
      • NSF
      • ONC Reports to Congress
      • Oncology
      • Open Data
      • Patient Centered Medical Home
      • Patient Portals
      • PCMH
      • Precision Medicine
      • Primary Care
      • Public Health
      • Quadruple Aim
      • Quality Measures
      • Rehab Medicine
      • TechFAR Handbook
      • Triple Aim
      • U.S. Air Force Medicine
      • U.S. Army
      • U.S. Army Medicine
      • U.S. Navy Medicine
      • U.S. Surgeon General
      • Uncategorized
      • Value-based Care
      • Veterans Affairs
      • Warrior Transistion Units
      • XPRIZE