Test Run: Deep Neural Network IO Using C#
By James McCaffrey
Many of the recent advances in machine learning (making predictions using data) have been realized using deep neural networks. Examples include speech recognition in Microsoft Cortana and Apple Siri, and the image recognition that helps enable self-driving automobiles.
The term deep neural network (DNN) is general and there are several specific variations, including recurrent neural networks (RNNs) and convolutional neural networks (CNNs). The most basic form of a DNN, which I explain in this article, doesn’t have a special name, so I’ll refer to it just as a DNN.
This article introduces DNNs and presents a concrete demo program you can experiment with, which will help you understand the literature on DNNs. I won't present code that can be used directly in a production system, but the code can be extended to create such a system, as I'll explain. Even if you never intend to implement a DNN, you might find the explanation of how they work interesting for its own sake.
A DNN is best explained visually. Take a look at Figure 1. The deep network has two input nodes, on the left, with values (1.0, 2.0). There are three output nodes on the right, with values (0.3269, 0.3333, 0.3398). You can think of a DNN as a complex math function that typically accepts two or more numeric input values and returns one or more numeric output values.
The DNN shown might correspond to a problem where the goal is to predict the political party affiliation (Democrat, Republican, Other) of a person based on age and income, where the input values are scaled in some way. If Democrat is encoded as (1,0,0) and Republican is encoded as (0,1,0) and Other is encoded as (0,0,1), then the DNN in Figure 1 predicts Other for someone with age = 1.0 and income = 2.0 because the last output value (0.3398) is the largest.
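To make the encoding-and-prediction idea concrete, here's a minimal, self-contained C# sketch (the names outputs, parties and maxIndex are mine for illustration, not from the demo program) that decodes the Figure 1 output values by finding the index of the largest value:

using System;

class PredictionSketch
{
  static void Main()
  {
    // Output values produced by the DNN in Figure 1.
    double[] outputs = new double[] { 0.3269, 0.3333, 0.3398 };
    string[] parties = new string[] { "Democrat", "Republican", "Other" };

    // The prediction is the party whose position in the encoding
    // holds the largest output value (index 2 here, so Other).
    int maxIndex = 0;
    for (int i = 1; i < outputs.Length; ++i)
      if (outputs[i] > outputs[maxIndex])
        maxIndex = i;

    Console.WriteLine("Predicted affiliation: " + parties[maxIndex]);
  }
}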
A regular neural network has a single hidden layer of processing nodes. A DNN has two or more hidden layers and can handle very difficult prediction problems. Specialized types of DNNs, such as RNNs and CNNs, also have multiple layers of processing nodes, but more complicated connection architectures, as well.
Code download available at msdn.com/magazine/0717magcode.
[Figure 1: A Basic Deep Neural Network. The diagram shows a 2-(4,2,2)-3 network with input values (1.0, 2.0), weight and bias constants labeled along the connections, intermediate hidden-node values, and output values (0.3269, 0.3333, 0.3398*), where the asterisk marks the largest output.]
The DNN in Figure 1 has three hidden layers of processing nodes. The first hidden layer has four nodes; the second and third hidden layers have two nodes each. Each long arrow pointing from left to right represents a numeric constant called a weight. If nodes are zero-based indexed with [0] at the top of the figure, then the weight connecting input[0] to hidden[0][0] (layer 0, node 0) has value 0.01, the weight connecting input[1] to hidden[0][3] (layer 0, node 3) has value 0.08, and so on. There are 26 node-to-node weight values.
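As a sketch of one plausible storage scheme (the array name ihWeights is my assumption, not necessarily what the demo uses), the input-to-hidden weights can be held in a jagged array indexed as [from-node][to-node]:

using System;

class WeightLayoutSketch
{
  static void Main()
  {
    // ihWeights[i][j] is the weight from input node i to
    // hidden-layer-0 node j, for the 2-(4,2,2)-3 network in Figure 1.
    double[][] ihWeights = new double[2][];
    for (int i = 0; i < 2; ++i)
      ihWeights[i] = new double[4];

    ihWeights[0][0] = 0.01;  // input[0] -> hidden[0][0]
    ihWeights[1][3] = 0.08;  // input[1] -> hidden[0][3]

    Console.WriteLine("input[0] -> hidden[0][0]: " + ihWeights[0][0]);
    Console.WriteLine("input[1] -> hidden[0][3]: " + ihWeights[1][3]);
  }
}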
Each of the eight hidden and three output nodes has a small arrow that represents a numeric constant called a bias. For example, hidden[2][0] has a bias value of 0.33 and output[1] has a bias value of 0.36. Not all of the weights and bias values are labeled in the diagram, but because the values are sequential between 0.01 and 0.37, you can easily determine the value of a non-labeled weight or bias.
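A quick arithmetic check confirms why the labeled values run from 0.01 to 0.37. The following minimal sketch (variable names are mine) counts the weights and biases for the 2-(4,2,2)-3 architecture: 26 weights plus 11 biases gives 37 values in all.

using System;

class CountCheckSketch
{
  static void Main()
  {
    // Layer sizes for the 2-(4,2,2)-3 network of Figure 1.
    int numInput = 2;
    int[] numHidden = new int[] { 4, 2, 2 };
    int numOutput = 3;

    // Node-to-node weights: input-to-first-hidden, then
    // hidden-to-hidden, then last-hidden-to-output.
    int numWeights = numInput * numHidden[0];
    for (int i = 0; i < numHidden.Length - 1; ++i)
      numWeights += numHidden[i] * numHidden[i + 1];
    numWeights += numHidden[numHidden.Length - 1] * numOutput;

    // One bias for each hidden node and each output node.
    int numBiases = numOutput;
    for (int i = 0; i < numHidden.Length; ++i)
      numBiases += numHidden[i];

    Console.WriteLine("Weights: " + numWeights);              // 26
    Console.WriteLine("Biases: " + numBiases);                // 11
    Console.WriteLine("Total: " + (numWeights + numBiases));  // 37
  }
}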
In the sections that follow, I explain how the DNN input-output mechanism works and show how to implement it. The demo program is coded using C#, but you shouldn't have too much trouble refactoring the code to another language, such as Python or JavaScript, if you wish to do so. The demo program is too long to present in its entirety in this article, but the complete program is available in the accompanying code download.
The Demo Program
A good way to see where this article is headed is to examine the screenshot of the demo program in Figure 2. The demo corresponds to the DNN shown in Figure 1.