Page 32 - GCN, April/May 2018

                                AI
The strategic plan released by the Obama administration in 2016 has gone stale, said Josh Elliot, Booz Allen Hamilton’s director of machine intelligence and data science solutions. He is also the co-author of a report released earlier this month by the Center for Strategic and International Studies and underwritten by Booz Allen. It stresses the “hard power” implications of robotics and other forms of automation, particularly as the Defense Department and the Chinese People’s Liberation Army recognize that the next generation of military technologies will be driven by machine intelligence.
The report also seeks to address the often heated rhetoric about the promise and pitfalls of AI. Elliot said it focuses on the overarching concept of machine intelligence, defined as machines augmenting humans to accomplish a specific task.
“AI often connotes ‘killer robots,’” Elliot said in an interview, adding that the report seeks to tamp down the hyperbole around the technology.
BEYOND PREDICTIVE ANALYTICS
Among CSIS’ recommendations is a federal role in establishing ethical and safety frameworks for implementing machine intelligence. The report also focuses on the impact of machine intelligence on national security, in terms of its ability to affect economic competitiveness and transform the battlefield. Among the predicted outcomes is a seismic shift from today’s information warfare strategy to what experts call “algorithmic warfare.”
A case in point is Project Maven, the moniker for a fast-tracked Defense Department effort called the Algorithmic Warfare Cross-Functional Team. Project Maven was launched in April 2017 to accelerate DOD’s integration of big data and machine learning into its intelligence operations. The first algorithms for parsing full-motion video were released at the end of last year and can be updated almost daily, said Graham Gilmer, a member of Booz Allen’s machine intelligence team.
“We have analysts looking at full-motion video, staring at screens [six to 11] hours at a time,” said Lt. Gen. John Shanahan, DOD’s director of defense intelligence, at an industry conference last November. “They’re doing the same thing photographic interpreters were doing in World War II.”
Project Maven aims to “let the machines do what machines do well, and let humans do what only humans can do” — the cognitive, analytical portion of video interpretation, Shanahan added.
Gilmer said the early “indications and warnings” application of machine learning
Securing the internet of battlefield things
BY PATRICK MARSHALL
Even as developers are struggling with how to integrate the various pieces of the emerging internet of things, the federal government is investing millions of dollars into a five-year brainstorming project to explore how to develop and secure connected devices on the battlefield.
The Army Research Lab recently awarded $25 million to the Alliance for Internet of Battlefield Things Research on Evolving Intelligent Goal-driven Networks (IoBT REIGN) to develop new predictive battlefield analytics.
“It is a really big project,” said Tancrède Lepoint, a computer scientist and cryptographer at SRI International and a member of the IoBT REIGN team. In all, 20 researchers from six universities and SRI are involved in the project, along with an expected 10 researchers from the Army Research Lab.
According to an SRI announcement, “The IoBT will connect soldiers with smart technology in armor, radios, weapons and other objects to give troops ‘extra sensory’ perception, offer situational understanding, endow fighters with prediction powers, provide better risk assessment and develop shared intuitions.”
The project will also explore development of collaboration between autonomous agents and human warriors on the battlefield.
“One scenario could be to have smarter ammunition,” Lepoint said, which would allow commanders to configure the technology to target only, say, condemned buildings. “If you’re tempted to fire on something that is not in the category, it would not fire,” he added. Similarly, demolition charges might be configured not to go off if sensors detect that humans are nearby.
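The gating logic Lepoint describes could be sketched roughly as follows. This is purely illustrative — the category names, function, and policy are hypothetical, not any actual Army or IoBT REIGN system:

```python
# Hypothetical sketch of a commander-configured targeting constraint,
# per the "smarter ammunition" scenario described above.
# All names and categories here are illustrative assumptions.

ALLOWED_TARGET_CATEGORIES = {"condemned_building"}  # set by the commander

def fire_permitted(target_category: str, humans_detected: bool) -> bool:
    """Refuse to fire unless the target's category is allowed
    and no humans are detected near the target."""
    if humans_detected:
        return False  # demolition-charge safeguard from the same scenario
    return target_category in ALLOWED_TARGET_CATEGORIES
```

A target outside the configured category, or one with people nearby, is simply refused; the human still decides whether to fire at all.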
“It is not about [having devices] pulling the trigger instead of the human,” Lepoint said. “It is about letting the human have a better understanding of the full environment.”
His primary work with the group focuses on securing what is becoming an increasingly complex battlefield using a combination of data-driven inductive learning and deductive reasoning techniques.
“We need to think about adversaries from the design of the algorithm and from everything that we develop,” he said.
“We need to focus on how to design algorithms to resist malicious adversaries.”
In addition to encrypting communications, he said, the team wants to develop algorithms that can detect when malicious data is being inserted. “We need to account for the fact that there will be conflicting and deceptive data received by those algorithms, so let’s design them so that if you get enough data, you will still be able to carry out your mission,” Lepoint said.
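One classic way to make an algorithm tolerate deceptive inputs — offered here as an illustrative sketch, not the alliance’s actual method — is robust aggregation: with enough redundant readings, taking the median means a minority of poisoned values cannot move the result far.

```python
# Illustrative sketch: median aggregation over redundant sensor readings.
# As long as fewer than half the readings are malicious, the estimate
# stays anchored to the honest values.

def robust_estimate(readings: list[float]) -> float:
    """Median of redundant readings; tolerates a malicious minority."""
    ordered = sorted(readings)
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

# Seven honest sensors report ~20.0; two compromised ones report 900.0.
readings = [19.8, 20.1, 20.0, 19.9, 20.2, 900.0, 900.0, 20.0, 19.7]
# The median stays near the honest value despite the deceptive inputs.
```

The mean of those same readings would be pulled above 215 by the two poisoned values, which is exactly the failure mode robust design guards against.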
He acknowledged that the project is unusual in that it doesn’t have any specified deliverables. “This is a collaborative research alliance,” he said. “It’s brainstorming over five years.”
In addition to researchers from the Army Research Lab and SRI, the project includes the University of Illinois at Urbana-Champaign; Carnegie Mellon University; University of California, Berkeley; University of California, Los Angeles; University of Massachusetts; and University of Southern California.•