An AI-produced video could show Donald Trump saying or doing something extremely outrageous and inflammatory. It would be only too believable, and in a worst-case scenario it might sway an election, trigger violence in the streets, or spark an international armed conflict.
Fortunately, a new digital forensics technique promises to protect President Trump, other world leaders, and celebrities against such deepfakes—for the time being, at least. The new method uses machine learning to analyze a specific individual’s style of speech and movement, what the researchers call a “softbiometric signature.”
The researchers, from UC Berkeley and the University of Southern California, used an existing tool to extract the face and head movements of individuals. They also created their own deepfakes for Donald Trump, Barack Obama, Bernie Sanders, Elizabeth Warren, and Hillary Clinton using generative adversarial networks.
The team then used machine learning to distinguish the head and face movements that characterize the real person. These subtle signals—the way Bernie Sanders nods while saying a particular word, perhaps, or the way Trump smirks after a comeback—are not currently modeled by deepfake algorithms.
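The researchers' own pipeline builds these features from facial action units and head-pose signals extracted with a face-tracking tool, then learns what is "normal" for a single person. The sketch below illustrates only that general one-class idea with synthetic stand-in features; every name and number here is hypothetical, not the paper's code:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

def motion_features(n_clips, shift=0.0):
    """Stand-in for per-clip head-pose / action-unit correlation features.
    A real system would compute these from video with a face tracker."""
    return rng.normal(loc=shift, scale=1.0, size=(n_clips, 16))

# Train only on authentic footage of one person (one-class setup):
real_train = motion_features(200)
model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(real_train)

# Score unseen clips: +1 = consistent with the person's style, -1 = anomalous.
real_test = motion_features(50)
fake_test = motion_features(50, shift=3.0)  # fakes drift from the learned style
real_acc = (model.predict(real_test) == 1).mean()
fake_acc = (model.predict(fake_test) == -1).mean()
print(real_acc, fake_acc)
```

The one-class setup is the point: the model trains only on authentic footage, so it needs no examples of fakes, and a new forgery technique still has to reproduce the person's learned motion statistics to slip past it.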
In experiments the technique was at least 92% accurate in spotting several variations of deepfakes, including face swaps and ones in which an impersonator is using a digital puppet. It was also able to deal with artifacts in the files that come from recompressing a video, which can confuse other detection techniques. The researchers plan to improve the technique by accounting for characteristics of a person’s speech as well. The research, which was presented at a computer vision conference in California this week, was funded by Google and DARPA, a research wing of the Pentagon. DARPA is funding a program to devise better detection techniques.
The problem facing world leaders (and everyone else) is that it has become ridiculously simple to generate video forgeries with artificial intelligence. False news reports, bogus social-media accounts, and doctored videos have already undermined political news coverage and discourse. Politicians are especially concerned that fake media could be used to sow misinformation during the 2020 presidential election.
Some tools for catching deepfake videos have been produced already, but forgers have quickly adapted. For example, for a while it was possible to spot a deepfake by tracking the speaker’s eye movements, which tended to be unnatural in deepfakes. Shortly after this method was identified, however, deepfake algorithms were tweaked to include better blinking.
“We are witnessing an arms race between digital manipulations and the ability to detect those, and the advancements of AI-based algorithms are catalyzing both sides,” says Hao Li, a professor at the University of Southern California who helped develop the new technique. For this reason, his team has not yet released the code behind the method.
Li says it will be particularly difficult for deepfake-makers to adapt to the new technique, but he concedes that they probably will eventually. “The next step to go around this form of detection would be to synthesize motions and behaviors based on prior observations of this particular person,” he says.
Li also says that as deepfakes get easier to use and more powerful, it may become necessary for everyone to consider protecting themselves. “Celebrities and political figures have been the main targets so far,” he says. “But I would not be surprised if in a year or two, artificial humans that look indistinguishable from real ones can be synthesized by any end user.”
By Brandi Vincent, Staff Correspondent
June 20, 2019
Though some fruits of quantum information science (think atomic clocks and CAT scan technology) are increasingly prevalent in Americans’ daily lives, there is still a great deal of progress to be made across the quantum computing space, scientific experts from government, industry and academia said in Washington Wednesday.
At the end of last year, President Trump signed the National Quantum Initiative into law, authorizing more than $1 billion in quantum research funding. The law followed a broad effort by the executive and legislative branches and the quantum research and development community, and it calls on the nation to substantially expand its already considerable quantum information science efforts.
“So we are in the process of going through that expansion, and part of that is through the president’s budget, part of that is by having individual agencies of the United States government take on expanded roles,” Jake Taylor, assistant director for quantum information science in the White House’s Office of Science and Technology Policy, said.
Taylor said the policy’s base is vast, but his team is expressly working to ensure that economic growth opportunities and opportunities for improving the world are baked into quantum policies and systems. He said they are also working with international collaborators to advance and govern the emerging technology.
“When you’re taking the scientific space and you’re trying to really push it forward, you’re seeking the best answers around the world, and so the way you do that is by building and making strong collaborations to keep that culture of discovery moving,” Taylor said.
The assistant director said his team participated in a “wonderful dialogue” with the European Union last month and will be engaging in a workshop this summer to look further into expanding its collaboration with the EU. He said the nations fundamentally support the same “science first” approach and understanding that policies should benefit individual citizens across diverse societies.
“What I can say is that by choosing to maintain a leadership role and to work with international collaborators and to cooperate across the world, we have the opportunity to realize that and it’s up to us to maintain that strength,” Taylor said. “This is what this bill is about and what we’re up to—come back to me in seven years and I’ll have more details.”
And as for the next decade, other officials said the U.S. can expect to witness a whirlwind of progress and quantum advancements.
“We are in the late ’40s and early 1950s of information technology, when machines were big and ugly and had lots of wires hanging off of them,” said Bob Wisnieff, chief technology officer of quantum computing at IBM Research. “At this point, we should expect rapid generations of machines, growing capabilities, and machines that, like the IBM System/360 in the 1960s, are better suited to machine rooms than to your wrist.”
But Wisnieff said as America moves toward the quantum advantage, experts will need to establish new fields of thought and put the right people in place to explore how the machines will evolve and influence society over time.
“We are now getting into the regime where we need to deal with a lot of the engineering challenges, because as you scale up not only will we need to continue to make scientific advances here, but we will need to develop an entire new field of engineering—how one builds these systems and constructs them and tests them and operates them on a daily basis to provide value,” he said.
While there’s a long way to go, insiders said the potential impacts of the technology will be monumental.
For example, Celia Merzbacher, associate director of the Quantum Economic Development Consortium, said she sees exciting possibilities at the convergence of quantum computing, artificial intelligence, 5G and other technologies connected to the internet of things.
“Some of the most exciting things that come to my mind are in the ability to optimize certain problems that will allow for driverless cars and management of these big systems of things that need to be interacting with each other,” she said. “So whether it’s on the energy grid space or transportation or the financial sector, there are applications I think that are going to blow all our minds.”
Look across the Potomac River toward Rosslyn, where the corporate logos of government contractors crown a parade of office towers that follows the river past the Pentagon. The skyline, like America’s defense industrial landscape, is changing. Soon, 25,000 Amazon employees will be climbing the Metro escalators to work in Crystal City each morning, along with tens of thousands of workers from military, intelligence, and defense-industry organizations.
The arrival of Amazon’s HQ2 in the cradle of U.S. government contracting comes at a portentous time for the Defense Department. Technology is altering what makes us strong, prosperous, and secure. The defense industrial base is becoming the strategic innovation base. Today’s leading digital companies have disrupted every industry they have touched, from publishing to automotive. Could Amazon and the rest of the “FAANG companies”—Facebook, Apple, Netflix, and Google—or one of a handful of pure-play artificial-intelligence companies, such as the authors’ SparkCognition, become fixtures of this new industrial base?
While that remains to be seen, the Pentagon supplier that can master robotics and AI will become the most essential of the firms that build America’s arsenal. Moreover, the Defense Department’s practices will increasingly resemble those of this new wave of strategically important companies because that is what the current revolution in warfare requires.
The world is on the doorstep of an artificial intelligence- and robotics-driven revolution in conflict that, after decades of looming just over the horizon, now is a near-term certainty. Just as industrial-age tanks and machine guns devastated World War I battlefields and the U.S. Air Force’s GPS-guided weapons headlined the 1991 Gulf War, social media algorithms and AI-equipped robotic swarms will decide conflicts. Data is not just the new oil, as the saying goes. Data is also the new ammunition.
The Pentagon is preparing accordingly, doing everything from standing up an Army Futures Command to engaging technology luminaries with the Defense Innovation Board to establishing a Joint Artificial Intelligence Center to reforming mid-tier acquisitions policy. But it needs to do more — and do it faster — if the U.S. military is to prevail in future machine-speed conflicts. Fortunately, the Pentagon and its suppliers can learn from the digital disruptors in areas such as robotics, acquiring groundbreaking capabilities, software ecosystems, data management, and symbiotic innovation strategies.
Taken together, today’s leading digital companies have many of the traits for a reimagined, expanded defense industrial base, one that reflects the social, political, and strategic power of companies such as Amazon, Google, and Facebook. Moreover, the most strategically important machine learning and robotics technologies will likely originate in non-defense firms based on their overall investment, market-driven innovation cycles, and talent acquisition. U.S. defense policy is shifting but the speed of technological advancement remains far faster. In recent House testimony, DOD Chief Information Officer Dana Deasy said acquisition changes will come from asking “how do we move to a more startup mentality when moving to technologies like AI?”
Well, here is how.
Be aggressively robotized. While autonomous robotic swarms will become a staple of future battlefields, the nations that can harness automation for logistics, supply, and maintenance will have a decisive operational – and economic – edge.
As of 2017, Amazon reportedly had more than 100,000 robots on the job. This is especially relevant to the Defense Department and the future strategic innovation base because the shift to process automation is driven by the speed-and-cost expectations of “divinely discontent” customers, as CEO Jeff Bezos called them in the company’s 2017 annual report. Such automation — though unlikely to go as viral as a bounding robot biped — is of extreme consequence in the business world, to politics, and to American society.
Be highly acquisitive. In 2012, Facebook purchased Instagram for $1 billion; two years later it bought WhatsApp for $19 billion and virtual-reality company Oculus for $2 billion. With deep corporate coffers, Facebook could have built and marketed its own competing platforms. But this approach allowed Facebook to establish itself immediately with a suite of technologies it could integrate into its central products, while still allowing these separate entities to develop in parallel into innovative platforms in their own right.
Be software-driven. As machines learn to code faster and more accurately than humans, smaller and smaller organizations will be able to develop their own software applications to best suit their mission requirements. For all the beauty of an iPhone or iMac, Apple’s products are only as good as the software that runs on them —which gets better every day as developers bring new ideas to market through the App Store.
Think data, data, data. The importance of being able to effectively wield data at every level of Defense Department operations – from recruiting at home to finding targets abroad – cannot be overstated. Precision is expected in American military operations, and it generally reduces reliance on indiscriminate, destructive kinetic force. Groups that can glean insight from an essentially limitless pool of data will tip the balance even further toward precision and away from destruction. To be sure, getting the right data at the right moment will be an enduring challenge during conflicts, but the ability to develop such insight from alternative data sets is an altogether novel paradigm. Whether it is creating the world’s most popular search engine or developing open-source machine learning algorithms, data is at the center of everything Google pursues. Currently, Google’s search engine processes tens of thousands of searches each second. Meanwhile, a broader and more decentralized array of devices, from smart TVs to mobile phones to fitness trackers, gather more and more data each day.
Pivot purposefully: A successful organization must be able to hold the virtual and physical in tension, while finding ways to develop both in a symbiotic manner underpinned by data analytics. In yet another sign of the sea-change in studio-developed entertainment, Netflix won 23 Emmy Awards this fall — as many as HBO. While Netflix will still send you DVDs by mail, the company moved beyond that revenue stream years ago as advances in servers and telecom bandwidth allowed it to supplant the physical with the virtual. Its entry into the capital-intensive and hands-on business of film and show production is backed by years of data about customer tastes and preferences.
These are just a few of the lessons from leading technology companies in what it will take for America to have a globally dominant strategic industrial base. Fortunately, with the varied examples of innovation U.S. digital disruptors have already pioneered and demonstrated, the Pentagon need not look much farther for inspiration than its own backyard.
June 17, 2019 05:23 PM ET
Shannon Sartin and her team are working to cut through convoluted tech and give patients easier access to their data.
For decades, healthcare providers have doubled down on technology to improve the way they treat patients, but Shannon Sartin worries the industry’s expansive IT ecosystem is getting in the way of better care.
As executive director of the Digital Service at the Health and Human Services Department, Sartin is on the front lines of the government’s efforts to revamp healthcare for the 21st century. In recent years, much of the battle has played out at the Centers for Medicare and Medicaid Services, which oversees the care of more than 100 million people across the country.
Like their counterparts at other U.S. Digital Service outposts, Sartin and her team work to bring tech industry best practices—like agile development, cloud services and user testing—to federal IT shops that are sometimes far behind the times. But while many digital services branches focus on modernizing processes unique to their respective agencies, Sartin sees the potential for her work to reach far beyond the borders of HHS, and even the government itself.
“We want to be in a place where our technology and the choices that we’re making are actually the ones that are impacting the way you and I receive healthcare,” Sartin said in a conversation with Nextgov. “That’s how much economic force CMS has. It could slow down or speed up the entire healthcare industry.”
HHS Digital Service’s efforts have been so successful that the Partnership for Public Service nominated Sartin and her team for one of its annual Service to America awards.
In Sartin’s eyes, technology can work both for and against patient care. If used correctly, IT can give people more control over their own health and allow doctors to focus more energy on treating patients. But incorporating too many different systems can overburden providers with administrative tasks, leading to impersonal, inefficient and expensive care.
Unfortunately, Sartin said, the modern healthcare industry seems to more closely resemble the second model than the first. But by overhauling its outdated infrastructure and making healthcare data more readily accessible, she said, CMS could free up more resources for treating patients and ultimately drive change across the broader industry.
Last year, Sartin and her team launched an application programming interface called Blue Button 2.0, which contained detailed health information on some 53 million Medicare recipients. The API, which originated from an earlier CMS program, allows patients to connect their medical records to the numerous health-centric applications and services that have popped up in recent years. They can also opt to donate their data to inform medical research efforts around the globe.
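Blue Button 2.0 exposes claims data through a web API that client apps query on a beneficiary's behalf (after OAuth consent) and then parse. The sketch below parses a fabricated, FHIR-style claims bundle; the sample data, field choices, and function names are illustrative assumptions, not real CMS responses:

```python
import json

# Fabricated, minimal FHIR-style searchset bundle (illustrative only).
sample_response = json.dumps({
    "resourceType": "Bundle",
    "type": "searchset",
    "entry": [
        {"resource": {"resourceType": "ExplanationOfBenefit",
                      "id": "carrier-claim-0001",
                      "billablePeriod": {"start": "2018-03-01"}}},
        {"resource": {"resourceType": "ExplanationOfBenefit",
                      "id": "carrier-claim-0002",
                      "billablePeriod": {"start": "2018-07-15"}}},
    ],
})

def claim_summaries(bundle_json):
    """Pull (claim id, service start date) pairs out of a claims bundle."""
    bundle = json.loads(bundle_json)
    return [(e["resource"]["id"], e["resource"]["billablePeriod"]["start"])
            for e in bundle.get("entry", [])
            if e["resource"]["resourceType"] == "ExplanationOfBenefit"]

print(claim_summaries(sample_response))
# A real client would first obtain the beneficiary's OAuth token, then fetch
# the claims resource from the CMS endpoint with that token before parsing.
```

The same structure is what lets third-party health apps and research projects consume the data: the API returns machine-readable records instead of the PDFs beneficiaries previously had to work from.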
So far, more than 1,800 developers have signed up for the platform and dozens of apps are already using the API to give patients a better handle on their healthcare.
Before Blue Button, Medicare claims data was only available to recipients in a PDF format, something “none of us really want” in the age of smartphones and wearable tech, Sartin said. Future, tech-savvier generations of beneficiaries will want to pay bills, access data and manage healthcare through their devices, and APIs like Blue Button are laying the groundwork for that reality, she said.
“I think it’s really about … empowering innovators and entrepreneurs with the data,” she said. “How do we make sure it’s available in a way that not only aligns with what industry is already doing but actually sets the standard” for where it goes in the future.
On the provider side of the equation, Sartin and her team also launched a tool that speeds up payments to doctors and offers rapid feedback on their treatment of patients. The platform came as part of a governmentwide effort to promote “value-based” care—the concept of paying doctors based on performance instead of the specific services they provide—and incentivizes doctors to take “a broader, holistic” approach to treating patients, according to Sartin.
When developing the tool, she and her team conducted extensive testing with doctors who would ultimately be using the system, a practice common in industry but not at CMS, Sartin said. Their feedback played a large role in shaping the platform, she said, and since its launch, the Quality Payment Program has been embraced throughout the medical community.
The system “ended up being one that doctors not only didn’t hate but were giving praise to, which is a rarity at CMS,” she said. “Our team really believes that partnership is how we achieve success.”
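As a toy illustration of the value-based idea — paying on performance rather than per service — consider a linear mapping from a 0–100 composite score to a payment adjustment. The thresholds and percentages below are invented for illustration and are not the Quality Payment Program's actual formula:

```python
def payment_adjustment(score, neutral=75.0, max_bonus=0.05, max_penalty=-0.05):
    """Toy linear mapping from a 0-100 composite performance score to a
    payment adjustment. All numbers are hypothetical, not QPP policy."""
    if not 0.0 <= score <= 100.0:
        raise ValueError("score must be in [0, 100]")
    if score >= neutral:
        return max_bonus * (score - neutral) / (100.0 - neutral)
    return max_penalty * (neutral - score) / neutral

# Above the neutral threshold earns a bonus, below it a penalty:
print(round(payment_adjustment(90.0), 4))  # 0.05 * 15/25 = 0.03
print(round(payment_adjustment(50.0), 4))  # -0.05 * 25/75, about -0.0167
```

The point of any such scheme is that the payment lever rewards outcomes across a clinician's whole patient panel rather than the volume of billable services.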
Looking ahead, Sartin said she expects her team to continue improving the platform and modernizing the backend payment system, which is still running on outdated COBOL software. While there won’t be “crazy, flashy launches” every few months, she said, incremental improvements to CMS’ payment processes will have significant impacts on both providers and patients.
Sartin said her team also plans to explore ways to make it easier to share patients’ data among HIPAA-covered entities like doctors and insurers.
In a recent report, GAO named the health IT system in use at HHS among the 10 federal legacy systems most in need of modernization to prevent security problems.
By Kate Monica
June 12, 2019 – The Government Accountability Office (GAO) urged HHS to modernize its 50 year-old health IT system in a report naming the ten federal legacy systems most in need of significant upgrades.
The health IT system, implemented in 1969, was ranked third on the list of ten outdated IT systems in need of revamping. The EHR system supports healthcare delivery for the Indian Health Service (IHS) and is considered highly susceptible to security threats.
IHS uses the clinical and patient administrative information system to gather, store, and display clinical, administrative, and financial information about patients visiting its clinics and hospitals. Due to the sensitive nature of this information, HHS officials stated that a modernization effort is “imperative.”
Specifically, HHS officials interviewed by GAO said the system’s technical architecture and infrastructure are outdated, obstructing the federal agency’s efforts to fulfill regulatory requirements.
The EHR system software includes 50 additional modules implemented over time to keep pace with the changing demands of the evolving healthcare system.
“The software is installed on hundreds of separate computers, which has led to variations in the configurations at each site,” wrote GAO in the report. “According to IHS, this type of add-on development becomes detrimental over time and eventually requires a complete redesign to improve database design efficiency, process efficiency, workflow integration, and graphical user interfaces.”
System variations negatively affect interoperability and health data exchange between IHS employees. Replacing or modernizing the EHR system would help to boost interoperability with other healthcare organizations and facilities and improve care coordination for patients within the IHS network.
IHS officials expressed an interest in modernizing the EHR system earlier this year, but officials have not yet developed concrete modernization plans.
In September 2018, HHS awarded a contract to conduct research related to modernizing its health IT infrastructure, applications, and capabilities.
“According to the department, the research will be conducted in several stages over the next year, and a substantial part of the research will be an evaluation of the current state of health IT across IHS’s health facilities,” wrote GAO.
“Once the research is conducted, in consultation with IHS and its stakeholders, the contractor will use the findings and recommendations to propose a prioritized roadmap for modernization.”
HHS officials stated the agency expects to complete its modernization initiative in the next five years, and anticipates launching an EHR implementation plan as early as 2020.
While the federal agency has taken some steps to initiate a modernization effort, HHS has not calculated a potential cost for the project.
“With regards to potential cost savings, HHS noted that the modernization will take significant capital investment to complete and it is unknown whether the modernization will lead to cost savings,” noted GAO.
Launching an EHR modernization project at IHS would help to promote care coordination and data sharing between IHS, VA, and the Department of Defense (DoD).
VA and DoD are currently in the process of replacing their legacy health IT systems with a Cerner EHR platform.
“HHS officials stated that this modernization could improve interoperability with its health care partners, the Department of Veterans Affairs and the Department of Defense, and significantly enhance direct patient care,” GAO officials wrote.
In 2017, IHS issued a request for information (RFI) to inform an EHR modernization effort. The RFI was part of the federal agency’s exploratory market research effort to assess health IT industry innovations and capabilities.
IHS provides healthcare services to American Indians and Alaska Natives under the relationship between the federal government and Indian tribes established in 1787.
By Nathan Eddy
June 13, 2019 11:50 AM
The Federal Electronic Health Record Modernization office will replace the current Interagency Program Office.
The Department of Defense and the Veterans Affairs department announced the creation of a special office to help centralize decision-making as the VA makes a multi-billion dollar electronic health records upgrade.
The Federal Electronic Health Record Modernization (FEHRM) office will replace the current Interagency Program Office, headed by Lauren Thompson.
WHY IT MATTERS
“This management model creates a centralized structure for interagency decisions related to EHR modernization, accountable to both the VA and the DoD Deputy Secretaries,” FCW reported Thompson as saying during her testimony at a June 12 hearing of the Subcommittee on Technology Modernization of the House Veterans Affairs Committee.
Ranking member of the subcommittee Rep. Jim Banks (R-Ind.) noted during the hearing that lack of interoperability and clear management had been a continuing struggle, and it remains unclear if the new office will help keep the transition on schedule.
Plans were also revealed to implement a pilot program that will see all veterans receive a unique ID number to help track individuals’ medical records after they leave service.
THE LARGER TREND
The announcement comes as the VA department wrangles with a $16 billion implementation of the Cerner electronic health records system, which is slated to go live across care sites by 2028, but the lack of interoperability between DoD and VA remains a major stumbling block.
While progress is being made, the sheer vastness of the task – 1,700 sites, training for 300,000 VA employees, and the aggregation of decades of clinical data – has been compounded by delays in decision-making and workflow difficulties.
The VA signed a contract with Cerner in 2018 to replace the department’s 40-year-old legacy Veterans Health Information Systems and Technology Architecture (VistA) health records technology over the next 10 years with the Cerner system, which is currently in the pilot phase at DoD.
However, an array of major challenges with the DoD’s EHR modernization came to light last spring, with many IT, training and workflow hiccups reportedly causing inaccurate prescriptions, misdirected patient referrals and other complaints that clinicians said could put patient safety at risk.
Further complicating matters was the first House VA Subcommittee on Technology Modernization hearing, held in September 2018, which revealed that officials and congressional members are not on the same page when it comes to governance and EHR interoperability.
In January, the Senate confirmed James Gfrerer, a former Marine and cybersecurity executive at Ernst & Young, to head the VA’s IT department — a position that had been without a permanent leader for the past two years.
The VA has also partnered with Microsoft with the aim of improving how veterans living in rural areas can access the VA’s online services and benefits.
Nathan Eddy is a healthcare and technology freelancer based in Berlin.
Email the writer: email@example.com