Page 44 - Federal Computer Week, May/June 2019

FCW Q&A
Lt. Col. Philip Root (left) joined the Defense Advanced Research Projects Agency in 2017 and was previously director of the Center for Innovation and Engineering at the U.S. Military Academy at West Point. According to his DARPA bio, his research interests include unmanned vehicles, collaborative autonomy and machine learning algorithms.
to let everyone know in advance. But they can’t all just leave. We have to operate with noncombatants around and provide them every opportunity to remove themselves from the environment. Anyone left would then have hostile intent.
We might send a message via a drone, for instance, and say, “Today is not a good day to be outside. We recommend you go to the nearest building.”
So a drone would come down and start talking?
Could be. We just started so I don’t presume to know. It could come down and say, “U.S. forces are approaching. Not a good day to be outside.” Anyone who stays outside might have a really good reason to be outside; it doesn’t mean they’re hostile in any way.
Could be they didn’t hear us, are deaf, it’s noisy out — so we have to seek a different method. Maybe we put a laser on the ground to confirm they’re seeing it. Perhaps we play a popping sound, and combatants and noncombatants respond differently. But at no point is the autonomy doing this on its own.
We just want to collect as much information [as possible] so if someone with non-hostile intent wanders into a U.S. patrol, we can provide a folder of information before a soldier takes their finger out of the trigger well. Nobody wants to be in the situation where a soldier and a noncombatant come in contact and both are surprised.
There’s a personal and emotional component to this. Do you have a team of people working on it, including psychologists and behavioral specialists?
We have a team of behavioral psychologists and social science models of how people respond. But unfortunately, there’s not a whole lot of data on these types of drone interactions. Nobody’s tried this. We’re going to watch social science develop at the same time as AI and machine learning. I’m not convinced that it’s going to work, but I’m convinced someone should be trying so we can take these lessons learned and apply them to whatever comes next.
We have to be committed to this problem. We can’t [shrink] away from it because the outcome is far more perilous with the current problem — where soldiers and noncombatants are put in harm’s way.
One lesson that we’ve learned is that under a real interrogation, suspects who are angry are often those who are innocent because they’re so mad they’re caught up in this.
To your point, if someone is having a bad day and a drone gets in their face, they might throw a rock at it. We have to understand and factor that in. It might mean that we’re terrorizing the population. We
[Photo credit: David Kamm]