DARPA Announces Research Teams Selected to Semantic Forensics Program

Media manipulation capabilities are advancing at a rapid pace while also becoming increasingly accessible to everyone from the at-home photo editor to nation-state actors. As the technology evolves, so does the national security threat posed by compelling media manipulations. While the issue today may be deepfake videos, the ability to generate falsified multimodal assets – such as news stories with embedded photos and videos – from whole cloth may not be far off. To take on this growing threat, DARPA created the Semantic Forensics (SemaFor) program. SemaFor seeks to give analysts the upper hand in the fight between detectors and manipulators by developing technologies capable of automating the detection, attribution, and characterization of falsified media assets.

“From a defense standpoint, SemaFor is focused on exploiting a critical weakness in automated media generators,” said Dr. Matt Turek, the DARPA program manager leading SemaFor. “Currently, it is very difficult for an automated generation algorithm to get all of the semantics correct. Ensuring everything aligns from the text of a news story, to the accompanying image, to the elements within the image itself is a very tall order. Through this program we aim to explore the failure modes where current techniques for synthesizing media break down.”
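As a rough illustration of the kind of cross-modal consistency check Dr. Turek describes (a sketch, not the SemaFor teams' actual tooling), the example below uses a pretrained CLIP model from the Hugging Face transformers library to score how well a news photo agrees with its accompanying text. The model checkpoint, file path, and threshold are illustrative assumptions; a low similarity score would merely flag a candidate semantic mismatch for an analyst to review.

```python
# Illustrative sketch of a cross-modal semantic consistency check (not a SemaFor tool).
# Scores how well an image matches its accompanying text using a pretrained CLIP model.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

MODEL_NAME = "openai/clip-vit-base-patch32"  # assumed publicly available checkpoint
model = CLIPModel.from_pretrained(MODEL_NAME)
processor = CLIPProcessor.from_pretrained(MODEL_NAME)

def image_text_consistency(image_path: str, caption: str) -> float:
    """Return cosine similarity (-1 to 1) between image and text embeddings."""
    image = Image.open(image_path).convert("RGB")
    inputs = processor(text=[caption], images=image, return_tensors="pt", padding=True)
    with torch.no_grad():
        image_emb = model.get_image_features(pixel_values=inputs["pixel_values"])
        text_emb = model.get_text_features(
            input_ids=inputs["input_ids"], attention_mask=inputs["attention_mask"]
        )
    # Normalize embeddings so the dot product is a cosine similarity.
    image_emb = image_emb / image_emb.norm(dim=-1, keepdim=True)
    text_emb = text_emb / text_emb.norm(dim=-1, keepdim=True)
    return float((image_emb * text_emb).sum())

# Example usage: a low score flags a possible mismatch between story text and photo.
# score = image_text_consistency("story_photo.jpg", "Flood waters cover the downtown area")
# if score < 0.2:  # threshold is a placeholder; it would be tuned on labeled data
#     print("Possible semantic inconsistency between image and text")
```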

Today, DARPA announced the research teams selected to take on SemaFor’s research objectives. Teams from commercial companies and academic institutions will work to develop a suite of semantic analysis tools capable of automating the identification of falsified media. Arming human analysts with these technologies should make it difficult for manipulators to pass off altered media as authentic or truthful.
