Page 47 - Federal Computer Week, May/June 2019

FCWIdeas
Unlocking the black box of AI reasoning
Research into the inner workings of generative adversarial networks reveals a process that closely matches how humans analyze images
BY PATRICK MARSHALL
Although artificial intelligence has proved effective at many tasks critical to government — such as protecting power grids against hacking — some agencies have been reluctant to use AI tools because their inner workings are unintelligible to humans. How can a solution be trusted if nobody knows how it works?
David Bau, a Ph.D. student at the Massachusetts Institute of Technology, believes generative adversarial networks could help show how AI algorithms reach their conclusions. He and others are testing GANs as tools not only for performing tasks such as pattern recognition, but for examining how neural networks make decisions.
GANs gained recent notoriety for being used to create the first AI painting sold at auction and, more disturbingly, for superimposing the faces of celebrities on porn stars in videos. According to Bau, GANs work differently from other AI algorithms.
“Most neural network training is done as a one-player game, where we set up...rules and the network learns to beat the game,” he said. By contrast, training a GAN involves a two-player model in which the goal is for the GAN to achieve better results than a neural network adversary in generating accurate images.
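The two-player setup Bau describes can be sketched as a toy adversarial loop. The example below is a hypothetical one-dimensional illustration of that idea — a linear generator against a logistic discriminator, with hand-derived gradients — not the MIT team's code or a real image GAN.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

real_mu, real_sigma = 4.0, 0.5   # the "real data" distribution
a, b = 1.0, 0.0                  # generator: g(z) = a*z + b
w, c = 0.1, 0.0                  # discriminator: D(x) = sigmoid(w*x + c)
lr, batch = 0.05, 32

for step in range(1000):
    # Discriminator turn: push D(real) toward 1 and D(fake) toward 0.
    x_real = rng.normal(real_mu, real_sigma, size=batch)
    x_fake = a * rng.normal(size=batch) + b
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    w -= lr * np.mean(-(1 - d_real) * x_real + d_fake * x_fake)
    c -= lr * np.mean(-(1 - d_real) + d_fake)

    # Generator turn: push D(fake) toward 1 -- fool the adversary.
    z = rng.normal(size=batch)
    x_fake = a * z + b
    d_fake = sigmoid(w * x_fake + c)
    grad_x = -(1 - d_fake) * w   # d(-log D(fake)) / d(x_fake)
    a -= lr * np.mean(grad_x * z)
    b -= lr * np.mean(grad_x)

# After training, generated samples should resemble the real data:
# the generator's output mean drifts toward real_mu.
fake_mean = float(np.mean(a * rng.normal(size=1000) + b))
```

Neither player is shown the "right answer"; each improves only by beating the other, which is what distinguishes this from the one-player training Bau contrasts it with.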
“Under normal training, we would train a neural network to mimic the human behavior,” Bau said. In this case, we “show a neural network a huge number of images, completely unlabeled by a person.” It’s up to the GAN to find any discernible structures in the images — airplanes, cars, gates, trees — without cues from humans.
Bau said there are only two ways an algorithm can accomplish the task. It can memorize the pixel-by-pixel characteristics of images and then compare them to freshly provided images. The other approach is one that humans use — a compositional strategy that involves analyzing an image and breaking it into its parts.
To see how the GAN made its choices, the team cracked it open partway through a job. “We stopped the computation halfway through, and we looked at the internal variables that the GAN produced,” Bau said.
The researchers wanted to see if the GAN was processing certain patterns of numbers that correlated with structured data in the image. For instance, did one sequence of numbers represent the concept of “tree”?
Bau acknowledged that analyzing the processing of a GAN would challenge even veteran programmers, so the team used another neural network to look for patterns and correlations in the GAN’s process.
The team found that, indeed, the GAN had isolatable number sequences, which researchers dubbed “neurons,” that related to objects in images. “We found that this neural network has neurons that correspond to trees, other ones that correspond to doors, other ones that correspond to rooftops,” Bau said. “The correlations are high enough that it is really suggestive that the network is breaking down the images in terms of objects that humans would call compositional, things that make up the scene.”
The team tested the finding by erasing selected neurons and launching the GAN again. Sure enough, when they erased the neurons that correlated with “tree,” the image eventually generated by the GAN was missing trees.
Perhaps even more surprising, “when we get rid of the trees, the neural network keeps on painting a reasonable picture,” Bau said. “It doesn’t just draw blotches all over the place. Instead of the tree, it will draw the building that was behind the tree.”
The researchers’ paper, which is posted online, includes an interactive photo-manipulation app that demonstrates that kind of object-level control. Each brush stroke activates a set of neurons in a GAN that has learned to draw scenes.
In short, the algorithm seems to be learning how to capture the structure of the world without human intervention. “That’s one of the holy grails of machine learning,” Bau said.
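The neuron-erasing experiment the article describes can be sketched in miniature. Below is a hypothetical stand-in — a tiny linear two-layer "generator" whose internal units each drive a few output "pixels" — not the MIT team's GAN or their dissection code; the unit indices and wiring are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

n_units, n_pixels = 8, 16
W1 = rng.normal(size=(n_units, 4))   # latent code -> internal units
W2 = np.zeros((n_pixels, n_units))   # internal units -> "pixels"
for p in range(n_pixels):            # each pixel wired to one unit, so
    W2[p, p % n_units] = 1.0         # erasing a unit has a local effect

def generate(z, ablate=()):
    units = W1 @ z                   # the internal variables one can stop
                                     # the computation halfway to inspect
    units = units.copy()
    units[list(ablate)] = 0.0        # erase the selected "neurons"
    return W2 @ units

z = rng.normal(size=4)
full = generate(z)
# Suppose dissection had found that units 2 and 5 correlate with "tree":
no_tree = generate(z, ablate=(2, 5))

# Only the pixels wired to the erased units change; the rest of the
# scene is untouched -- it "keeps on painting a reasonable picture."
changed = np.flatnonzero(~np.isclose(full, no_tree))
```

The point of the sketch is the locality: zeroing a concept-correlated unit removes that concept's contribution while everything downstream of the other units renders as before.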