Developing Virtual Partners to Assist Military Personnel

Increasing worker knowledge, productivity, and efficiency has been a seemingly never-ending quest for the military as well as commercial companies. Today, military personnel are expected to perform a growing number of complex tasks while interacting with increasingly sophisticated machines and platforms. Artificial intelligence (AI)-enabled assistants have the potential to aid users as they work to expand their skillsets and increase their productivity. However, today's virtual assistants are not designed to provide advanced levels of individual support or real-time knowledge sharing.

“In the not too distant future, you can envision military personnel having a number of sensors on them at any given time – a microphone, a head-mounted camera – and displays like augmented reality (AR) headsets,” said Dr. Bruce Draper, a program manager in DARPA’s Information Innovation Office (I2O). “These sensor platforms generate tons of data around what the user is seeing and hearing, while AR headsets provide feedback mechanisms to display and share information or instructions. What we need in the middle is an assistant that can recognize what you are doing as you start a task, has the prerequisite know-how to accomplish that task, can provide step-by-step guidance, and can alert you to any mistakes you’re making.”
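The "assistant in the middle" that Draper describes can be pictured as a perceive-compare-guide loop: sensor data comes in, the current step is recognized and checked against the procedure, and guidance or a warning goes out to the AR display. The sketch below is purely illustrative; the data types, the recognizer, and the procedure are hypothetical stand-ins, not components of the PTG program.

```python
from dataclasses import dataclass

# Hypothetical types for illustration only; PTG does not define these.
@dataclass
class Observation:
    video_frame: bytes   # from a head-mounted camera
    audio_chunk: bytes   # from a microphone

@dataclass
class Guidance:
    message: str         # text or visuals pushed to an AR headset

def recognize_step(obs: Observation) -> str:
    """Placeholder for a perception model that names the step the user is on."""
    return "tighten_bolt"

def assist(obs: Observation, procedure: list[str], completed: set[str]) -> Guidance:
    """One tick of the perceive-compare-guide loop."""
    current = recognize_step(obs)
    # The next step the procedure expects the user to perform.
    expected = next((s for s in procedure if s not in completed), None)
    if expected is None:
        return Guidance("Task complete.")
    if current == expected:
        completed.add(current)
        return Guidance(f"Step '{current}' done. Continue with the next step.")
    # The observed action does not match the expected step; alert the user.
    return Guidance(f"Warning: expected '{expected}', but observed '{current}'.")
```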

DARPA developed the Perceptually-enabled Task Guidance (PTG) program to explore the development of methods, techniques, and technology for AI assistants capable of helping users perform complex physical tasks. The goal is to develop virtual "task guidance" assistants that can provide just-in-time visual and audio feedback to help human users expand their skillsets and minimize their errors. To develop these technologies, PTG seeks to exploit recent advances in deep learning for video and speech analysis, automated reasoning for task and plan monitoring, and augmented reality for human-computer interfaces.
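One of those threads, automated reasoning for task and plan monitoring, can be illustrated with a small monitor that checks an observed sequence of actions against per-step prerequisites and flags anything done out of order. The procedure and action names below are made up for illustration; this is a minimal sketch of the kind of checking such an assistant would perform, not an implementation from the program.

```python
# Minimal plan monitor: each step lists the steps that must be finished first.
# The maintenance procedure shown here is hypothetical.
PREREQS: dict[str, set[str]] = {
    "power_off":      set(),
    "remove_panel":   {"power_off"},
    "replace_filter": {"remove_panel"},
    "reattach_panel": {"replace_filter"},
    "power_on":       {"reattach_panel"},
}

def monitor(actions: list[str]) -> list[str]:
    """Return a warning for every action attempted before its prerequisites."""
    done: set[str] = set()
    warnings: list[str] = []
    for act in actions:
        missing = PREREQS.get(act, set()) - done
        if missing:
            warnings.append(f"'{act}' attempted before: {sorted(missing)}")
        done.add(act)
    return warnings

# Example: the user swaps the filter without removing the panel first.
print(monitor(["power_off", "replace_filter"]))
# -> ["'replace_filter' attempted before: ['remove_panel']"]
```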
