healthcarereimagined

Envisioning healthcare for the 21st century

Human operators must be held accountable for AI’s use in conflicts, Air Force secretary says – Nextgov

Posted by timmreardon on 12/07/2023
Posted in: Uncategorized.

By EDWARD GRAHAM

DECEMBER 4, 2023

The Pentagon needs “to find a way to hold people accountable” for what artificial intelligence technologies do in future conflicts, according to Air Force Secretary Frank Kendall.

Humans will ultimately be held responsible for the use or misuse of artificial intelligence technologies during military conflicts, a top Department of Defense official said during a panel discussion at the Reagan National Defense Forum on Saturday.

Air Force Secretary Frank Kendall dismissed the notion “of the rogue robot that goes out there and runs around and shoots everything in sight indiscriminately,” highlighting the fact that AI technologies — particularly those deployed on the battlefields of the future — will be governed by some level of human oversight.

“I care a lot about civil society and the rule of law, including laws of armed conflict,” he said. “Our policies are written around compliance with those laws. You don’t enforce laws against machines; you enforce them against people. And I think our challenge is not to somehow limit what we can do with AI, but it’s to find a way to hold people accountable for what the AI does.”

Even as the Pentagon continues to experiment with AI, the department has worked to establish safeguards around its use of the technologies. DOD updated its decades-old policy on autonomous weapons in February to clarify, in part, that weapons with AI-enabled capabilities need to follow the department’s AI guidelines. 

The Pentagon previously issued a series of ethical AI principles in 2020 governing its use of the technologies, and released a data, analytics and AI adoption strategy in November that positioned quality of data as key to the department’s implementation of the advanced tech.

The goal for now, Kendall said, is to build confidence and trust in the technology and then “get it into field capabilities as quickly as we can.” 

“The critical parameter on the battlefield is time,” he added. “And AI will be able to do much more complicated things much more accurately and much faster than human beings can.”

Kendall pointed to two specific mistakes that AI could make “in a lethal area”: failing to engage a target that it should have engaged, or engaging civilian targets, U.S. military assets or allies. These possibilities, he said, necessitate more clearly defined rules for holding operators responsible when such mistakes occur.

“We are still going to have to find ways to manage this technology, manage its application and hold human beings accountable for when it doesn’t comply with the rules that we already have,” he added. “I think that’s the approach we need to take.”

For the time being, however, the Pentagon’s uses of AI are largely focused on processing large amounts of data for more administrative-oriented tasks.

“There are enormous possibilities here, but it is not anywhere near general human intelligence equivalents,” Kendall said, citing pattern recognition and “deep data analytics to associate things from an intelligence perspective” as AI’s most effective applications.

During a discussion last month, Schuyler Moore — the chief technology officer for U.S. Central Command — cited AI’s uneven performance and said that during military conflicts, officials “will more frequently than not put it to the side or use it in very, very select contexts where we feel very certain of the risks associated.”

But concerns still remain about how these tools will ultimately be used to enhance future warfighting capabilities, and the specific policies that are needed to enforce safeguards.

Rep. Mike Gallagher, R-Wis. — who chairs the House Select Committee on the Chinese Communist Party and was a former co-chair of the Cyberspace Solarium Commission — said “we need to have a plan for whether and how we are going to quickly adopt [AI] across multiple battlefield domains and warfighting capabilities.” 

“I’m not sure we’ve thought through that,” Gallagher added.

Article link: https://www.nextgov.com/artificial-intelligence/2023/12/human-operators-must-be-held-accountable-ais-use-conflicts-air-force-secretary-says/392457/?

Bipartisan bill strives for ‘more nimble and meaningful’ federal contracting – Nextgov

Posted by timmreardon on 12/04/2023
Posted in: Uncategorized.

By EDWARD GRAHAM

JANUARY 22, 2024

Legislation from Sens. Gary Peters, D-Mich., and Joni Ernst, R-Iowa, would “streamline procedures” for both solicitation and awards by slimming down the procurement process.

A new bipartisan proposal seeks to simplify the federal contracting process — and potentially allow for more small businesses to work with the government — by reducing burdensome requirements and creating “a more nimble and meaningful bidding process and evaluation of proposals.”

The Conforming Procedures for Federal Task and Delivery Order Contracts Act was introduced by Sens. Gary Peters, D-Mich., and Joni Ernst, R-Iowa, on Jan. 19. 

The bill seeks “to streamline procedures for solicitation and the awarding of task and delivery order contracts for agencies” by shrinking “the procurement process for contractors bidding on work as well as for the government, ensuring necessary due diligence is done while allowing awards to be made faster and to a wider array of contractors, including small businesses.”

This includes reducing “duplication of documentation requirements for agencies” and applying some of the contracting measures that the Department of Defense “currently has in place to all federal agencies.”

Ernst — the ranking member of the Senate Small Business and Entrepreneurship Committee — said in a statement that “too much bureaucratic red tape stands in the way” when it comes to smaller companies effectively competing for federal contracts.

“By making the award process faster and wider, Iowa’s small businesses and entrepreneurs can better compete and succeed,” she added, referencing the benefits the bill would have for her Hawkeye State constituents. 

In a statement, Peters also said the legislation “streamlines the contracting process for federal government agencies, and as a result will boost small businesses trying to stay competitive and will increase efficiency for all government agencies, benefitting people across the nation.”

This isn’t the first time that Peters and Ernst have teamed up on legislation to improve the government’s procurement process, which is receiving renewed attention as lawmakers discuss the role that emerging technologies can play in bolstering the capabilities of federal services. 

The senators previously authored legislation, known as the PRICE Act, to “promote innovative acquisition techniques and procurement strategies” to improve the contracting process for small businesses. Their bill was signed into law in February 2022. 

Peters and Ernst also introduced legislation in July 2022 that would require the Office of Management and Budget and the General Services Administration “to streamline the ability of the federal government to purchase commercial technology and provide specific training for information and communications technology acquisition.”

Following a Jan. 10 Senate Homeland Security and Governmental Affairs Committee hearing on how artificial intelligence can be used to improve government services, Peters — who chairs the panel — also told Nextgov/FCW “how the federal government procures AI… is going to have a big impact on AI throughout the economy.”

“And I think that’s a very effective way for us to think about AI regulation, through the procurement process,” he said.

Article link: https://www.nextgov.com/acquisition/2024/01/bipartisan-bill-strives-more-nimble-and-meaningful-federal-contracting/393508/?

The hardware and software for the era of quantum utility is here – IBM

Posted by timmreardon on 12/04/2023
Posted in: Uncategorized.

Welcome to a new era of quantum computing. https://ibm.co/3sU23xh

It’s the first day of IBM Quantum Summit 2023, and we are thrilled to share a bevy of announcements and updates with you.

At today’s event, we’re presenting new capabilities we’re developing in order to support the next wave of quantum users: quantum computational scientists. In addition to unveiling an operational IBM Quantum System Two, we’re sharing our newest, highest-performing quantum processor yet, Heron—and demonstrating how we’re scaling quantum processors with the 1,121-qubit IBM Condor processor.

We’re also introducing Qiskit 1.0, Qiskit’s first stable release, alongside Qiskit Patterns—a framework for quantum computational scientists to do meaningful scalable work with quantum algorithms. With Qiskit Patterns, users can seamlessly create quantum algorithms and applications from a collection of foundational building blocks and execute those Patterns using heterogeneous computing infrastructure such as Quantum Serverless, now available as a beta release. We’re also deploying new execution modes so computational scientists can maximize performance from our hardware while they run utility-scale workloads.
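Qiskit Patterns, as described in IBM's materials, decomposes a quantum application into four stages: map the problem to circuits, optimize for the target hardware, execute, and post-process the results. The sketch below illustrates that staged structure in plain Python; the function names and data shapes are hypothetical placeholders, not the actual Qiskit API.

```python
# Illustrative sketch of the four-stage Patterns workflow described above.
# Function names here are hypothetical, not real Qiskit calls.

def map_problem(problem):
    """Stage 1: map a domain problem to an abstract circuit description."""
    return {"circuit": f"circuit_for_{problem}", "observables": ["Z0"]}

def optimize_for_hardware(job):
    """Stage 2: transpile/optimize the circuit for a target device."""
    job["circuit"] += "_optimized"
    return job

def execute(job):
    """Stage 3: run on hardware or a simulator; here, a stub result."""
    return {"expectation_values": {obs: 0.0 for obs in job["observables"]}}

def post_process(result):
    """Stage 4: turn raw results back into a domain-level answer."""
    return result["expectation_values"]

# Chain the four stages into one "pattern".
answer = post_process(execute(optimize_for_hardware(map_problem("h2_molecule"))))
print(answer)
```

The point of the decomposition is that each stage can be swapped or scaled independently, which is what lets a framework dispatch stages to heterogeneous infrastructure such as Quantum Serverless.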

And finally we’re sharing our new roadmap, laying out a vision for quantum computing all the way to 2033. This is the most exciting time in quantum computing to date, and we’re so proud to share with you. Head over to the IBM blog for more details.

Article link: https://www.linkedin.com/posts/ibm-quantum_ibm-quantum-system-two-activity-7137411949729869824-fRNI?

Federal Low Code SCOP Event 13 December – Federal CIO Council

Posted by timmreardon on 12/03/2023
Posted in: Uncategorized.

Calling all Federal Low Code practitioners and IT Modernization Champions!  I’m excited to share that, in partnership with GSA, we have formed the US Federal Low Code Subcommunity of Practice under the U.S. Federal Chief Information Officers (CIO) Council framework and our inaugural meeting will be held on 13 December 2023!

Key objectives for this group include:

  • Evaluating the low code landscape, including available platforms, tools, best practices, and success stories relevant to federal agencies
  • Developing guidelines, best practices, and educational resources for federal agencies to support the identification, selection, adoption and implementation of low code platforms, tools and services
  • Fostering collaboration among federal agencies, industry partners, and subject matter experts to share experiences, lessons learned, and success stories related to low code adoption
  • Identifying opportunities for pilot projects to demonstrate the value, challenges, and benefits of low code platforms in federal agencies
  • Providing inputs and recommendations to federal agencies and policymakers regarding policies, regulations, and standards that may impact the adoption and use of low code platforms
  • Engaging with low code platform and service vendors to understand their capabilities, roadmaps, and potential areas of collaboration to meet federal agency requirements

Together, our main goal is to promote the adoption and effective use of low code development methodologies and tools to accelerate digital transformation and improve citizen services across government.

Please join us on 13 December 2023 for our inaugural meeting where we will baseline current state, dive deeper into low code adoption challenges, begin to share best practices for low code platform adoption, and explore the low code product and services landscape.

Currently, we have representatives from every branch of government scheduled to attend, so we are excited about the early response to this initiative, to say the least! This is an event that you will not want to miss!

Keynote speaker Drew Myklegard, Deputy Federal Chief Information Officer at the Office of Management and Budget, will set the stage, and then we’ll jump right into our informative agenda. To attend either in person in DC or virtually, please send an email (from your government domain) requesting US Federal LC SCoP membership to LCNC-subscribe-request@listserv.gsa.gov and a follow-on registration link for this event will be provided.

Note: This kickoff event will be for Federal Government employees and badged contractors only, but we will open future events to industry partners and will announce those as they are planned.

#digitaltransformation #lowcode #lcnc #oneteam #itmodernization

Article link: https://www.linkedin.com/posts/activity-7132757702534987776-3bSm?

AWS Unveils Next Generation AWS-Designed Chips – Businesswire

Posted by timmreardon on 12/02/2023
Posted in: Uncategorized.

AWS Graviton4 is the most powerful and energy-efficient AWS processor to date for a broad range of cloud workloads

AWS Trainium2 will power the highest performance compute on AWS for training foundation models faster and at a lower cost, while using less energy

Anthropic, Databricks, Datadog, Epic, Honeycomb, and SAP among customers using new AWS-designed chips


November 28, 2023 11:25 AM Eastern Standard Time

LAS VEGAS–(BUSINESS WIRE)–At AWS re:Invent, Amazon Web Services, Inc. (AWS), an Amazon.com, Inc. company (NASDAQ: AMZN), today announced the next generation of two AWS-designed chip families—AWS Graviton4 and AWS Trainium2—delivering advancements in price performance and energy efficiency for a broad range of customer workloads, including machine learning (ML) training and generative artificial intelligence (AI) applications. Graviton4 and Trainium2 mark the latest innovations in chip design from AWS. With each successive generation of chip, AWS delivers better price performance and energy efficiency, giving customers even more options—in addition to chip/instance combinations featuring the latest chips from third parties like AMD, Intel, and NVIDIA—to run virtually any application or workload on Amazon Elastic Compute Cloud (Amazon EC2).

  • Graviton4 provides up to 30% better compute performance, 50% more cores, and 75% more memory bandwidth than current generation Graviton3 processors, delivering the best price performance and energy efficiency for a broad range of workloads running on Amazon EC2.
  • Trainium2 is designed to deliver up to 4x faster training than first generation Trainium chips and will be able to be deployed in EC2 UltraClusters of up to 100,000 chips, making it possible to train foundation models (FMs) and large language models (LLMs) in a fraction of the time, while improving energy efficiency up to 2x.

“Silicon underpins every customer workload, making it a critical area of innovation for AWS,” said David Brown, vice president of Compute and Networking at AWS. “By focusing our chip designs on real workloads that matter to customers, we’re able to deliver the most advanced cloud infrastructure to them. Graviton4 marks the fourth generation we’ve delivered in just five years, and is the most powerful and energy efficient chip we have ever built for a broad range of workloads. And with the surge of interest in generative AI, Trainium2 will help customers train their ML models faster, at a lower cost, and with better energy efficiency.”

Graviton4 raises the bar on price performance and energy efficiency for a broad range of workloads

Today, AWS offers more than 150 different Graviton-powered Amazon EC2 instance types globally at scale, has built more than 2 million Graviton processors, and has more than 50,000 customers—including the top 100 EC2 customers—using Graviton-based instances to achieve the best price performance for their applications. Customers including Datadog, DirecTV, Discovery, Formula 1 (F1), NextRoll, Nielsen, Pinterest, SAP, Snowflake, Sprinklr, Stripe, and Zendesk use Graviton-based instances to run a broad range of workloads, such as databases, analytics, web servers, batch processing, ad serving, application servers, and microservices. As customers bring larger in-memory databases and analytics workloads to the cloud, their compute, memory, storage, and networking requirements increase. As a result, they need even higher performance and larger instance sizes to run these demanding workloads, while managing costs. Furthermore, customers want more energy-efficient compute options for their workloads to reduce their impact on the environment. Graviton is supported by many AWS managed services, including Amazon Aurora, Amazon ElastiCache, Amazon EMR, Amazon MemoryDB, Amazon OpenSearch, Amazon Relational Database Service (Amazon RDS), AWS Fargate, and AWS Lambda, bringing Graviton’s price performance benefits to users of those services.

Graviton4 processors deliver up to 30% better compute performance, 50% more cores, and 75% more memory bandwidth than Graviton3. Graviton4 also raises the bar on security by fully encrypting all high-speed physical hardware interfaces. Graviton4 will be available in memory-optimized Amazon EC2 R8g instances, enabling customers to improve the execution of their high-performance databases, in-memory caches, and big data analytics workloads. R8g instances offer larger instance sizes with up to 3x more vCPUs and 3x more memory than current generation R7g instances. This allows customers to process larger amounts of data, scale their workloads, improve time-to-results, and lower their total cost of ownership. Graviton4-powered R8g instances are available today in preview, with general availability planned in the coming months. To learn more about Graviton4-based R8g instances, visit aws.amazon.com/ec2/instance-types/r8g.

EC2 UltraClusters of Trainium2 are designed to deliver the highest-performance, most energy-efficient AI model training infrastructure in the cloud

The FMs and LLMs behind today’s emerging generative AI applications are trained on massive datasets. These models make it possible for customers to completely reimagine user experiences through the creation of a variety of new content, including text, audio, images, video, and even software code. The most advanced FMs and LLMs today range from hundreds of billions to trillions of parameters, requiring reliable high-performance compute capacity capable of scaling across tens of thousands of ML chips. AWS already provides the broadest and deepest choice of Amazon EC2 instances featuring ML chips, including the latest NVIDIA GPUs, Trainium, and Inferentia2. Today, customers including Databricks, Helixon, Money Forward, and the Amazon Search team use Trainium to train large-scale deep learning models, taking advantage of Trainium’s high performance, scale, reliability, and low cost. But even with the fastest accelerated instances available today, customers want more performance and scale to train these increasingly sophisticated models faster, at a lower cost, while simultaneously reducing the amount of energy they use.

Trainium2 chips are purpose-built for high performance training of FMs and LLMs with up to trillions of parameters. Trainium2 is designed to deliver up to 4x faster training performance and 3x more memory capacity compared to first generation Trainium chips, while improving energy efficiency (performance/watt) up to 2x. Trainium2 will be available in Amazon EC2 Trn2 instances, containing 16 Trainium chips in a single instance. Trn2 instances are intended to enable customers to scale up to 100,000 Trainium2 chips in next generation EC2 UltraClusters, interconnected with AWS Elastic Fabric Adapter (EFA) petabit-scale networking, delivering up to 65 exaflops of compute and giving customers on-demand access to supercomputer-class performance. With this level of scale, customers can train a 300-billion parameter LLM in weeks versus months. By delivering the highest scale-out ML training performance at significantly lower costs, Trn2 instances can help customers unlock and accelerate the next wave of advances in generative AI. To learn more about Trainium, visit aws.amazon.com/machine-learning/trainium/.
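A quick back-of-envelope check puts the quoted cluster figures in per-chip and per-instance terms. This is illustrative arithmetic on the press release's own numbers; actual throughput depends on numeric precision, sparsity, and workload.

```python
# Sanity-check the quoted UltraCluster figures (illustrative only).
chips_per_cluster = 100_000   # max Trainium2 chips in an EC2 UltraCluster
chips_per_instance = 16       # Trainium2 chips per Trn2 instance
cluster_exaflops = 65         # quoted peak cluster compute

# Instances needed to reach a full cluster.
instances = chips_per_cluster // chips_per_instance

# Implied per-chip throughput: 1 exaflop = 1e6 teraflops.
per_chip_tflops = cluster_exaflops * 1e6 / chips_per_cluster

print(instances)        # Trn2 instances in a full cluster
print(per_chip_tflops)  # implied TFLOPs per chip
```

So a full 100,000-chip cluster corresponds to 6,250 Trn2 instances, with the 65-exaflop figure implying roughly 650 teraflops per chip at whatever precision AWS is quoting.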

A leading advocate for the responsible deployment of generative AI, Anthropic is an AI safety and research company that creates reliable, interpretable, and steerable AI systems. An AWS customer since 2021, Anthropic recently launched Claude, an AI assistant focused on being helpful, harmless, and honest. “Since launching on Amazon Bedrock, Claude has seen rapid adoption from AWS customers,” said Tom Brown, co-founder of Anthropic. “We are working closely with AWS to develop our future foundation models using Trainium chips. Trainium2 will help us build and train models at a very large scale, and we expect it to be at least 4x faster than first generation Trainium chips for some of our key workloads. Our collaboration with AWS will help organizations of all sizes unlock new possibilities, as they use Anthropic’s state-of-the-art AI systems together with AWS’s secure, reliable cloud technology.”

More than 10,000 organizations worldwide—including Comcast, Condé Nast, and over 50% of the Fortune 500—rely on Databricks to unify their data, analytics, and AI. “Thousands of customers have implemented Databricks on AWS, giving them the ability to use MosaicML to pre-train, finetune, and serve FMs for a variety of use cases,” said Naveen Rao, vice president of Generative AI at Databricks. “AWS Trainium gives us the scale and high performance needed to train our Mosaic MPT models, and at a low cost. As we train our next generation Mosaic MPT models, Trainium2 will make it possible to build models even faster, allowing us to provide our customers unprecedented scale and performance so they can bring their own generative AI applications to market more rapidly.”

Datadog is an observability and security platform that provides full visibility across organizations. “At Datadog, we run tens of thousands of nodes, so balancing performance and cost effectiveness is extremely important. That’s why we already run half of our Amazon EC2 fleet on Graviton,” said Laurent Bernaille, principal engineer at Datadog. “Integrating Graviton4-based instances into our environment was seamless, and gave us an immediate performance boost out of the box, and we’re looking forward to using Graviton4 when it becomes generally available.”

Epic is a leading interactive entertainment company and provider of 3D engine technology. Epic operates Fortnite, one of the world’s largest games with over 350 million accounts and 2.5 billion friend connections. “AWS Graviton4 instances are the fastest EC2 instances we’ve ever tested, and they are delivering outstanding performance across our most competitive and latency sensitive workloads,” said Roman Visintine, lead cloud engineer at Epic. “We look forward to using Graviton4 to improve player experience and expand what is possible within Fortnite.”

Honeycomb is the observability platform that enables engineering teams to find and solve problems they couldn’t before. “We are thrilled to have evaluated AWS Graviton4-based R8g instances,” said Liz Fong-Jones, Field CTO at Honeycomb. “In recent tests, our Go-based OpenTelemetry data ingestion workload required 25% fewer replicas on the Graviton4-based R8g instances compared to Graviton3-based C7g/M7g/R7g instances—and additionally achieved a 20% improvement in median latency and 10% improvement in 99th percentile latency. We look forward to leveraging Graviton4-based instances once they become generally available.”

SAP HANA Cloud, SAP’s cloud-native in-memory database, is the data management foundation of SAP Business Technology Platform (SAP BTP). “Customers rely on SAP HANA Cloud to run their mission-critical business processes and next-generation intelligent data applications in the cloud,” said Juergen Mueller, CTO and member of the Executive Board of SAP SE. “As part of the migration process of SAP HANA Cloud to AWS Graviton-based Amazon EC2 instances, we have already seen up to 35% better price performance for analytical workloads. In the coming months, we look forward to validating Graviton4, and the benefits it can bring to our joint customers.”

About Amazon Web Services

Since 2006, Amazon Web Services has been the world’s most comprehensive and broadly adopted cloud. AWS has been continually expanding its services to support virtually any workload, and it now has more than 240 fully featured services for compute, storage, databases, networking, analytics, machine learning and artificial intelligence (AI), Internet of Things (IoT), mobile, security, hybrid, virtual and augmented reality (VR and AR), media, and application development, deployment, and management from 102 Availability Zones within 32 geographic regions, with announced plans for 15 more Availability Zones and five more AWS Regions in Canada, Germany, Malaysia, New Zealand, and Thailand. Millions of customers—including the fastest-growing startups, largest enterprises, and leading government agencies—trust AWS to power their infrastructure, become more agile, and lower costs. To learn more about AWS, visit aws.amazon.com.

About Amazon

Amazon is guided by four principles: customer obsession rather than competitor focus, passion for invention, commitment to operational excellence, and long-term thinking. Amazon strives to be Earth’s Most Customer-Centric Company, Earth’s Best Employer, and Earth’s Safest Place to Work. Customer reviews, 1-Click shopping, personalized recommendations, Prime, Fulfillment by Amazon, AWS, Kindle Direct Publishing, Kindle, Career Choice, Fire tablets, Fire TV, Amazon Echo, Alexa, Just Walk Out technology, Amazon Studios, and The Climate Pledge are some of the things pioneered by Amazon. For more information, visit amazon.com/about and follow @AmazonNews.

Contacts

Amazon.com, Inc.
Media Hotline
Amazon-pr@amazon.com
www.amazon.com/pr

Article link: https://www.businesswire.com/news/home/20231128145465/en/AWS-Unveils-Next-Generation-AWS-Designed-Chips

Army moving away from compliance-based cybersecurity

Posted by timmreardon on 11/30/2023
Posted in: Uncategorized.

As the Army modernizes its network, it is looking at evolving the way it protects and defends critical IT and cyber terrain.

BY MARK POMERLEAU

NOVEMBER 30, 2023

As the Army modernizes its network, it is looking to emphasize cybersecurity operations as the next step in maturity, moving beyond compliance.

Officials have described becoming more proactive against cyber threats as opposed to a reactive posture, which involves enhancing the training and abilities of the signal corps, improving policies, and developing new concepts and capabilities such as the central delivery of services.

“We’ve been doing cybersecurity operations, but it’s been exceedingly compliance based. Meaning, fill out the checklist … you’re cyber secure. Against a thinking adversary, we know that won’t work,” Lt. Gen. John Morrison, deputy chief of staff, G6, said in an interview. “We’re really shifting from a compliance-based approach to really be active in cybersecurity operations. That is the big shift that I think you’re seeing not just inside the Army, but across the entire Department of Defense … I think the reason that we pound on cybersecurity operations is really making sure that folks know that we are transitioning from a compliance-based, very passive approach to cybersecurity and rapidly moving to something that’s much more active in the day to day.”

The Army has been on a multiyear journey to mature its network, consolidating the various instantiations from the tactical level and the enterprise to create what the service calls the unified network that soldiers can access all over the world regardless of theater or echelon.

As part of this push, the Army wants to better integrate the functions of cybersecurity and cybersecurity operations — which in some circles are thought of as defensive cyber ops that seek to be more proactive and hunt malicious activity on the network rather than being more reactive to threats.

“This thing that we called cybersecurity operations really does bleed over into what is defensive cyber operations. I think the big thing that it does is it starts focusing us on being less focused on the administrivia of the day and more focused on the technical risks,” Leonel Garciga, the Army’s chief information officer, told DefenseScoop. “I think that’s really what it boils down to and that’s the distinction. It’s how do we start moving in a direction where we’re more holistically focused on understanding the data that’s being delivered on the network, right, the unified network, and being able to react to that data, whether it be from a threat, or a status of our posture. That’s different.”

Bucketing these notions in this way allows the Army to begin to reduce complexity.

“It allows us to take a look at, okay, so for the basic, the threat agnostic, defensive of our network — read cybersecurity operations — then if we layer that across the unified network, we’re able to now layer in capabilities and force structure, right, so we can put complexity in the right spot,” Morrison said.

One of the major efforts associated with the unified network approach is moving complexity from lower echelons so they can focus on warfighting, not getting their communications or IT established.

Part of that is centralizing the delivery of services and capabilities like Unified Security Incident and Event Monitoring, which aims to provide end-to-end network visibility across all echelons, spanning the strategic enterprise level all the way to tactical formations.

“By moving towards this notion of unified net ops and defense capabilities, we’re now able to layer that on echelon that, quite frankly, we had not been able to see at any other time. We introduced new capabilities like Unified Security Incident and Event Monitoring, that now go across all echelons from strategic, operational, down to the tactical edge, where everybody can see the same thing and then the person with the time to act on it, can then act on it,” Garciga said. “It helps us from a budgetary perspective, it’s going to help us from how we actually organize our forces to conduct cybersecurity operations. And then it’s, quite frankly, going to take that complexity off the edge and give it to folks that actually have time to manage.”

Garciga noted that efforts to modernize the network — whether it’s Risk Management Framework 2.0, new software or new policies — change the nature of cybersecurity itself and thus change the skill sets that are needed.

“As we look moving forward and getting not reactive, but proactive against cyber threats and you’re starting to see that scale out to the traditional part of the Signal Corps and how we deliver services — and that’s an important distinction and change that’s happening as we move to the Army in 2030,” Garciga said. “In many ways as we are moving forward, right, taking this traditional approach to cybersecurity and operationalizing it, is, in effect, increasing the size of what we would call that defensive cyber operating force.”

Morrison described upskilling and reskilling efforts that the Army needs to look hard at for both the military and civilian side of the workforce.

“We’re just in the nascent stages of changing several of our specialties over to be data engineers to really start helping bring all that together at the strategic, operational and tactical spaces,” he said. “Training is really, really, really important. We got the depth we need right now, but I will tell you as this gets more and more inculcated across our Army, we need to build that technical depth across all of our formations.”

Article link: https://defensescoop.com/2023/11/30/army-moving-away-from-compliance-based-cybersecurity/?

Big Companies Find a Way to Identify A.I. Data They Can Trust – NYT

Posted by timmreardon on 11/30/2023
Posted in: Uncategorized.
Thi Montalvo, a data scientist at Transcarent, sees the potential for significant time savings from using the Data & Trust Alliance’s labeling standards in A.I. projects. Credit: Rachel Woolf for The New York Times

By Steve Lohr

Steve Lohr has covered data and software for more than 20 years.

Nov. 30, 2023, 6:00 a.m. ET

Data is the fuel of artificial intelligence. It is also a bottleneck for big businesses, because they are reluctant to fully embrace the technology without knowing more about the data used to build A.I. programs.

Now, a consortium of companies has developed standards for describing the origin, history and legal rights to data. The standards are essentially a labeling system for where, when and how data was collected and generated, as well as its intended use and restrictions.

The data provenance standards, announced on Thursday, have been developed by the Data & Trust Alliance, a nonprofit group made up of two dozen mainly large companies and organizations, including American Express, Humana, IBM, Pfizer, UPS and Walmart, as well as a few start-ups.

The alliance members believe the data-labeling system will be similar to the fundamental standards for food safety that require basic information like where food came from, who produced and grew it and who handled the food on its way to a grocery shelf.

Greater clarity and more information about the data used in A.I. models, executives say, will bolster corporate confidence in the technology. How widely the proposed standards will be used is uncertain, and much will depend on how easy the standards are to apply and automate. But standards have accelerated the use of every significant technology, from electricity to the internet.

“This is a step toward managing data as an asset, which is what everyone in industry is trying to do today,” said Ken Finnerty, president for information technology and data analytics at UPS. “To do that, you have to know where the data was created, under what circumstances, its intended purpose and where it’s legal to use or not.”

Surveys point to the need for greater confidence in data and for improved efficiency in data handling. In one poll of corporate chief executives, a majority cited “concerns about data lineage or provenance” as a key barrier to A.I. adoption. And a survey of data scientists found that they spent nearly 40 percent of their time on data preparation tasks.

The data initiative is mainly intended for business data that companies use to make their own A.I. programs or data they may selectively feed into A.I. systems from companies like Google, OpenAI, Microsoft and Anthropic. The more accurate and trustworthy the data, the more reliable the A.I.-generated answers.

For years, companies have been using A.I. in applications that range from tailoring product recommendations to predicting when jet engines will need maintenance.

But the rise in the past year of the so-called generative A.I. that powers chatbots like OpenAI’s ChatGPT has heightened concerns about the use and misuse of data. These systems can generate text and computer code with humanlike fluency, yet they often make things up — “hallucinate,” as researchers put it — depending on the data they access and assemble.

Companies do not typically allow their workers to freely use the consumer versions of the chatbots. But they are using their own data in pilot projects that use the generative capabilities of the A.I. systems to help write business reports, presentations and computer code. And that corporate data can come from many sources, including customers, suppliers, weather and location data.

“The secret sauce is not the model,” said Rob Thomas, IBM’s senior vice president of software. “It’s the data.”

In the new system, there are eight basic standards, including lineage, source, legal rights, data type and generation method. Then there are more detailed descriptions for most of the standards — such as noting that the data came from social media or industrial sensors, for example.

The data documentation can be done in a variety of widely used technical formats. Companies in the data consortium have been testing the standards to improve and refine them, and the plan is to make them available to the public early next year.
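As a rough illustration only (the alliance has not published a reference schema here, and every field name below is an assumption loosely based on the eight standards named in the article), a provenance label could be modeled as a small record:

```python
from dataclasses import dataclass, field

@dataclass
class ProvenanceLabel:
    """Hypothetical sketch of a data provenance label.

    Field names are illustrative, drawn from the standards the article
    names (lineage, source, legal rights, data type, generation method);
    the consortium's actual schema may differ.
    """
    lineage: str            # the data's history, e.g. "claims feed -> de-identified extract"
    source: str             # where it was collected, e.g. "industrial sensors" or "social media"
    legal_rights: str       # usage rights, e.g. "licensed for internal model training"
    data_type: str          # e.g. "time-series telemetry"
    generation_method: str  # e.g. "machine-generated" vs. "human-authored"
    restrictions: list[str] = field(default_factory=list)  # intended-use limits

# Example label for a hypothetical sensor dataset
label = ProvenanceLabel(
    lineage="insurer claims feed -> de-identified extract",
    source="industrial sensors",
    legal_rights="licensed for internal analytics",
    data_type="time-series telemetry",
    generation_method="machine-generated",
    restrictions=["no resale", "no re-identification"],
)
```

A record like this, attached to each dataset, is what would let a downstream team answer "where was this created, and where is it legal to use?" without re-auditing the pipeline.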

Labeling data by type, date and source has been done by individual companies and industries. But the consortium says these are the first detailed standards meant to be used across all industries.

“My whole life I’ve spent drowning in data and trying to figure out what I can use and what is accurate,” said Thi Montalvo, a data scientist and vice president of reporting and analytics at Transcarent.

Transcarent, a member of the data consortium, is a start-up that relies on data analysis and machine-learning models to personalize health care and speed payment to providers.

The benefit of the data standards, Ms. Montalvo said, comes from greater transparency for everyone in the data supply chain. That workflow often begins with negotiating contracts with insurers for access to claims data and continues with the start-up’s data scientists, statisticians and health economists who build predictive models to guide treatment for patients.

At each stage, knowing more about the data sooner should increase efficiency and eliminate repetitive work, potentially reducing the time spent on data projects by 15 to 20 percent, Ms. Montalvo estimates.

The data consortium says the A.I. market today needs the clarity the group’s data-labeling standards can provide. “This can help solve some of the problems in A.I. that everyone is talking about,” said Chris Hazard, a co-founder and the chief technology officer of Howso, a start-up that makes data-analysis tools and A.I. software.

Steve Lohr covers technology, economics and work force issues. He was part of the team awarded the Pulitzer Prize for explanatory reporting in 2013.

Article link: https://www.nytimes.com/2023/11/30/business/ai-data-standards.html

8 Data Provenance Standards to foster Trust in Data and AI

Posted by timmreardon on 11/30/2023
Posted in: Uncategorized.

Based on work from experts across nineteen leading enterprises including IBM, the Data & Trust Alliance announced eight proposed data provenance standards to help foster trust in data and #AI.

Learn how these cross-industry standards aim to bring transparency to the origin of #data, including data used to train AI models: https://ibm.co/3R4Yyfc

Article link: https://www.linkedin.com/posts/ibmdata_ai-data-activity-7135990653766823936-0wa6

Reorganizing government acquisition for the digital age – Government Executive

Posted by timmreardon on 11/30/2023
Posted in: Uncategorized.
The General Services Administration recently reorganized its Federal Acquisition Service, eliminating its regional structure, a move its commissioner Sonny Hashmi says is already yielding positive results.

NATALIE ALMS

NOVEMBER 29, 2023 02:10 PM ET

The self-described “procurement arm of the federal government” is undergoing a reorganization years in the making.

The Federal Acquisition Service, housed in the General Services Administration, supports over $87 billion in contracts across the government as part of its mission to help provide products and services to federal agencies.

In September, the agency announced that it would be implementing a reorganization in fiscal 2024 to simplify the experience for agencies using FAS by replacing a legacy, regional structure with something more centralized.

The old structure dates to the late 1990s, when much of the government operated regionally, given the need for paper-based and in-person interactions, FAS Commissioner Sonny Hashmi told Nextgov/FCW in an interview on the sidelines of ACT-IAC’s leadership conference in October.

As FAS digitized, the regional structure started creating an “artificial barrier, where a region that is maybe based and headquartered in Chicago is serving a customer who’s based in St. Louis using personnel that may be located in Fort Worth,” he explained. “So this whole concept around what is a region becomes increasingly difficult to explain.”

Now, FAS is organized by the customer agencies being served and their missions, meaning that “our customers now know that this is the one person I need to go to anytime I have any need for all of FAS,” said Hashmi. “Whether it’s fleet-related or technology-related, professional services or Technology Transformation Service, they have one person to call.”

The reorganization began three years ago with a question about the ethos of FAS, Hashmi said, and later grew into a team charged with leading a more customer-centric redesign of the service.

“It became very clear quickly that the way we were organized… was actually getting in the way of us serving our customers,” he said. “Often times, our customers have had to navigate our internal organization structure before they could get service, and that’s unacceptable.”

GSA will retain local offices and expertise, it says. As for the feds working in FAS, the changes will help them access new opportunities, since they won’t have to wait for an opportunity within their region anymore to move up, Hashmi argued, noting that in October alone over 40 new positions were opened. There is also the hope that the reorganization can help GSA hire acquisition professionals by widening the talent pool. 

Although GSA still has work to do “downstream,” said Hashmi — such as aligning contracts — the reorganization has also already helped FAS work with agencies to find common needs that might not otherwise be obvious to siloed organizations.

“Now we have dedicated teams serving our key customer segments, and over time, those teams will continue to learn more about their mission [and] develop specific solutions,” he said.

Article link: https://www.nextgov.com/acquisition/2023/11/reorganizing-government-acquisition-digital-age/392314/

A new chapter for Qiskit algorithms and applications

Posted by timmreardon on 11/29/2023
Posted in: Uncategorized.

In recent months, the Qiskit Ecosystem has undergone a series of changes, additions, migrations, and upgrades all driven by a common goal: to provide the community with a greater role in the development of algorithms and applications. To align with this goal, we have moved the applications repositories, relocated the algorithms in Qiskit into their own repository, and welcomed external partners as additional maintainers to the repositories. This blog post delves into the hows and whys of these changes, and introduces the new maintainers.

Algorithms as an independent package

If you have been following our release updates, you may already be familiar with this change. Since Qiskit 0.44.0, qiskit.algorithms has been migrated to a new standalone package, qiskit-algorithms. The new library can be found in the qiskit-community GitHub organization and on PyPI, and is the place to go for updated, primitive-based algorithm implementations.

This move is significant because it separates the circuit-building tools from the libraries built on top of them. This separation allows algorithm development to move at a different pace than that of the core library. While Qiskit itself aims to provide a stable foundation layer for quantum development, algorithms are still an evolving field of research, and will benefit from more flexible development cycles.
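For code that needs to run against both pre- and post-migration environments, one possible pattern (a sketch, not an official recommendation; it assumes the standard module layout of the qiskit-algorithms package) is to try the new import path first and fall back to the legacy one:

```python
def locate_vqe():
    """Return (VQE class or None, name of the package that provided it).

    Tries the standalone qiskit-algorithms package first (Qiskit >= 0.44),
    then the legacy in-tree location, and finally gives up gracefully so
    this sketch also runs where Qiskit is not installed at all.
    """
    try:
        from qiskit_algorithms.minimum_eigensolvers import VQE
        return VQE, "qiskit_algorithms"
    except ImportError:
        pass
    try:
        # Legacy location; scheduled for removal in Qiskit 1.0
        from qiskit.algorithms.minimum_eigensolvers import VQE
        return VQE, "qiskit.algorithms"
    except ImportError:
        return None, "not installed"

vqe_cls, provider = locate_vqe()
```

Once a codebase has fully moved to the standalone package, the shim can be dropped in favor of a plain qiskit_algorithms import.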

Community-oriented algorithms and applications

In the case of the Qiskit applications modules, the repository changes have been a bit more subtle. qiskit-nature, qiskit-machine-learning, qiskit-optimization, and qiskit-finance are now also part of the qiskit-community GitHub organization. The package installation path has remained unchanged, so the impact of this migration on end-users has been minimal. This move does, however, symbolize the strengthened community focus of the projects, which also involves the newly created qiskit-algorithms.

While these packages have always been open-source and welcomed external contributors, most feature development and maintenance efforts were sourced from within IBM Quantum. By joining forces with external partners, we enable the community to have a stronger impact on the direction of these libraries, bringing in new perspectives and areas of expertise. At the same time, we can focus more resources on improving the performance and stability of the core Qiskit package. The documentation of the applications can be found in their corresponding repositories as well as the Ecosystem page.

Welcoming new maintainers

The algorithms and applications libraries have onboarded new code-owners and maintainers from IBM Quantum partner institutions:

Algorithmiq (qiskit-nature): “Our mission is to revolutionise life sciences by exploiting the potential of quantum computing to solve currently inaccessible problems. Algorithmiq’s top quantum chemistry team and their knowledge of state-of-the-art quantum chemistry methods will, together with qiskit’s expert community, tackle some of the greatest quantum chemistry simulation challenges that lie ahead.”

STFC Hartree Centre (qiskit-machine-learning): “We help UK businesses and organisations of any size to explore and adopt supercomputing, data analytics, AI and emerging technologies for enhanced productivity, smarter innovation and economic growth. Our Qiskit work will be supported by the Hartree National Centre for Digital Innovation (HNCDI) — a collaboration with IBM Research that bridges the gap between academic research and the adoption of new technologies to solve industry challenge and transfer the skills needed to adopt digital solutions.”

Quantagonia (qiskit-optimization): “Quantagonia’s mission is to democratize quantum computing, making it accessible and manageable for businesses across sectors, enabling them to leverage this powerful technology for transformative growth and a competitive edge.”

These are domain experts in chemistry, machine learning and optimization, as well as active community contributors.

Cleaning up legacy dependencies

To facilitate the community’s greater role in the development of algorithms and applications, legacy dependencies in the applications have been cleaned up to simplify the code base and make the libraries more accessible to external contributors and new maintainers.

This means that the newly released versions of the application modules (qiskit-finance 0.4, qiskit-machine-learning 0.7, qiskit-optimization 0.6 and qiskit-nature 0.7) no longer depend on, or support, the now-deprecated Qiskit modules such as opflow and QuantumInstance. These modules will no longer be available in the upcoming Qiskit 1.0 release.

In addition to this, and in anticipation of the upcoming removal of qiskit.algorithms, which will also not be part of Qiskit 1.0, the applications modules have been updated to use only qiskit-algorithms instead.
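Concretely, a project tracking these releases might pin its dependencies along the following lines (a hypothetical requirements sketch; the version floors come from the release numbers named above, and any given project would only list the modules it actually uses):

```text
# Post-migration dependency sketch (illustrative)
qiskit-algorithms              # standalone successor to qiskit.algorithms
qiskit-nature>=0.7
qiskit-machine-learning>=0.7
qiskit-optimization>=0.6
qiskit-finance>=0.4
```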

Conclusion

In summary, the new algorithm and application libraries in the Qiskit Ecosystem have undergone a significant shift towards community-led development. Establishing an independent algorithms package, relocating applications, and simplifying legacy code were all steps designed to facilitate and enhance community engagement. As always, community contributions are most welcome, and — together with the new maintainers — we invite all of you to get involved via the GitHub repositories or the corresponding Qiskit Slack channels, such as #applications.

Article link: https://medium.com/qiskit/a-new-chapter-for-qiskit-algorithms-and-applications-5baff541e826
