MSDN Magazine, August 2017

Figure 2 Basic Deep Neural Network Demo Run
The demo program corresponds to the DNN shown in Figure 1 and illustrates the input-output mechanism by displaying the values of the 13 nodes in the network. The demo code that generated the output begins with the code shown in Figure 3.
Notice that the demo program uses only plain C# with no namespaces except for System. The DNN is created by passing the number of nodes in each layer to a DeepNet program-defined class constructor. The number of hidden layers, 3, is passed implicitly as the number of items in the numHidden array. An alternative design is to pass the number of hidden layers explicitly.
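That design choice can be sketched briefly. The following Python illustration (the demo itself is C#, and the names here mirror the article's DeepNet class only loosely) shows a constructor that infers the number of hidden layers from the length of the numHidden array rather than taking it as a separate parameter:

```python
class DeepNet:
    def __init__(self, num_input, num_hidden, num_output):
        # num_hidden is a list such as [4, 2, 2]; the number of hidden
        # layers is implied by its length, not passed explicitly.
        self.num_hidden_layers = len(num_hidden)
        self.layer_sizes = [num_input] + list(num_hidden) + [num_output]

dn = DeepNet(2, [4, 2, 2], 3)
print(dn.num_hidden_layers)  # 3
```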
The values of the 26 weights and the 11 biases are set like so:
int nw = DeepNet.NumWeights(numInput, numHidden, numOutput);
Console.WriteLine("Setting weights and biases to 0.01 to " +
  (nw/100.0).ToString("F2"));
double[] wts = new double[nw];
for (int i = 0; i < wts.Length; ++i)
  wts[i] = (i + 1) * 0.01;
dn.SetWeights(wts);
The total number of weights and biases is calculated using a static class method NumWeights. If you refer back to Figure 1, you can see that because each node is connected to all nodes in the layer to the right, the number of weights is (2*4) + (4*2) + (2*2) + (2*3) = 8 + 8 + 4 + 6 = 26. Because there's one bias for each hidden and output node, the total number of biases is 4 + 2 + 2 + 3 = 11.
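The counting logic generalizes to any number of hidden layers. Here is a quick sketch, in Python for illustration, of how a method like NumWeights might compute the total (the function name and layout are assumptions, not the demo's actual code):

```python
def num_weights(num_input, num_hidden, num_output):
    # Chain the layer sizes: input layer, hidden layers, output layer.
    sizes = [num_input] + list(num_hidden) + [num_output]
    # One weight per connection between each pair of adjacent layers.
    weights = sum(sizes[i] * sizes[i + 1] for i in range(len(sizes) - 1))
    # One bias per hidden node and per output node.
    biases = sum(sizes[1:])
    return weights + biases

print(num_weights(2, [4, 2, 2], 3))  # 26 weights + 11 biases = 37
```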
Figure 3 Beginning of Output-Generating Code
An array named wts is instantiated with 37 cells and then the values are set to 0.01 through 0.37. These values are inserted into the DeepNet object using the SetWeights method. In a realistic, non-demo DNN, the values of the weights and biases would be determined using a set of data that has known input values and known, correct output values. This is called training the network. The most common training algorithm is called back-propagation.
The Main method of the demo program concludes with:
...
Console.WriteLine("Computing output for [1.0, 2.0] ");
double[] xValues = new double[] { 1.0, 2.0 };
dn.ComputeOutputs(xValues);
dn.Dump(false);
Console.WriteLine("End demo");
Console.ReadLine();
} // Main
} // Class Program
Method ComputeOutputs accepts an array of input values and then uses the input-output mechanism, which I'll explain shortly, to calculate and store the values of the output nodes. The Dump helper method displays the values of the 13 nodes, and the "false" argument means to not display the values of the 37 weights and biases.
The Input-Output Mechanism
The input-output mechanism for a DNN is best explained with a concrete example. The first step is to use the values in the input nodes to calculate the values of the nodes in the first hidden layer. The value of the top-most hidden node in the first hidden layer is:
tanh( (1.0)(0.01) + (2.0)(0.05) + 0.27 ) =
tanh(0.38) = 0.3627
In words, "compute the sum of the products of each input node and its associated weight, add the bias value, then take the hyperbolic tangent of the sum." The hyperbolic tangent, abbreviated tanh, is called the activation function. The tanh function accepts any value from negative infinity to positive infinity, and returns a value between -1.0 and +1.0. Important alternative activation functions include the logistic sigmoid and rectified linear unit (ReLU) functions, which are outside the scope of this article.
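As a quick sanity check of that arithmetic (a Python sketch, not part of the C# demo):

```python
import math

# Input values and the demo's parameters for the top-most node of
# the first hidden layer: weights 0.01 and 0.05, bias 0.27.
x = [1.0, 2.0]
weights = [0.01, 0.05]
bias = 0.27

pre = sum(xi * wi for xi, wi in zip(x, weights)) + bias
print(round(pre, 2))             # 0.38
print(round(math.tanh(pre), 4))  # 0.3627
```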
The values of the nodes in the remaining hidden layers are calculated in exactly the same way. For example, hidden[1][0] is:
tanh( (0.3627)(0.09) + (0.3969)(0.11) + (0.4301)(0.13) + (0.4621)(0.15) + 0.31 ) =
tanh(0.5115) = 0.4711
And hidden[2][0] is:
tanh( (0.4711)(0.17) + (0.4915)(0.19) + 0.33 ) =
tanh(0.5035) = 0.4649
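Replaying the same sum-then-tanh step layer by layer reproduces those values. This Python sketch is illustration only (the demo itself is C#, and the weight layout is inferred from the arithmetic above):

```python
import math

def layer_tanh(inputs, weights, biases):
    # weights[i][j] is the weight from input node i to output node j.
    return [math.tanh(sum(x * w[j] for x, w in zip(inputs, weights)) + biases[j])
            for j in range(len(biases))]

# First hidden layer: pre-activation sums 0.38, 0.42, 0.46, 0.50
# follow from the demo weights 0.01..0.08 and biases 0.27..0.30.
h0 = [math.tanh(v) for v in (0.38, 0.42, 0.46, 0.50)]

# Demo weights 0.09..0.16 connect hidden layer 1 to layer 2; biases 0.31, 0.32.
h1 = layer_tanh(h0, [[0.09, 0.10], [0.11, 0.12], [0.13, 0.14], [0.15, 0.16]],
                [0.31, 0.32])

# Demo weights 0.17..0.20 connect hidden layer 2 to layer 3; biases 0.33, 0.34.
h2 = layer_tanh(h1, [[0.17, 0.18], [0.19, 0.20]], [0.33, 0.34])

print(round(h1[0], 4))  # 0.4711
print(round(h2[0], 4))  # ≈ 0.4649
```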
The values of the output nodes are calculated using a different activation function, called softmax. The preliminary, pre-activation sum-of-products plus bias step is the same:
pre-activation output[0] = (0.4649)(0.21) + (0.4801)(0.24) + 0.35 = 0.5628
pre-activation output[1] = (0.4649)(0.22) + (0.4801)(0.25) + 0.36 = 0.5823
pre-activation output[2] = (0.4649)(0.23) + (0.4801)(0.26) + 0.37 = 0.6017
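Softmax then converts each pre-activation sum by exponentiating it and dividing by the sum of all the exponentiated values, so the final outputs are positive and sum to 1.0. A Python sketch of that step (illustration only):

```python
import math

# Pre-activation output sums computed above.
pre = [0.5628, 0.5823, 0.6017]

exps = [math.exp(p) for p in pre]
total = sum(exps)
outputs = [e / total for e in exps]

print([round(o, 4) for o in outputs])  # ≈ [0.3269, 0.3333, 0.3398]
print(round(sum(outputs), 4))          # 1.0
```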
using System;
namespace DeepNetInputOutput {
  class DeepInputOutputProgram {
    static void Main(string[] args) {
      Console.WriteLine("Begin deep net IO demo");
      Console.WriteLine("Creating a 2-(4-2-2)-3 deep network");
      int numInput = 2;
      int[] numHidden = new int[] { 4, 2, 2 };
      int numOutput = 3;
      DeepNet dn = new DeepNet(numInput, numHidden, numOutput);
      ...
Test Run