
Neural Network & Fuzzy Systems

Introduction: The hidden layer allows an ANN to develop its own internal representation of the input-output mapping. This complex internal-representation capability allows the hierarchical network to learn any mapping, not just linearly separable ones.

Algorithm for Training the Network

• Basic algorithm loop structure:

Initialize the weights
Repeat
    For each training pattern
        Train on that pattern
    End
Until the error is acceptably low
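
In Python, this loop skeleton might look as follows; here train_on_pattern and total_error are hypothetical placeholders for the per-pattern weight update and the error measure developed in the steps below:

    def train(patterns, weights, train_on_pattern, total_error,
              tolerance=1e-3, max_epochs=10000):
        # Repeat ... Until the error is acceptably low
        for _ in range(max_epochs):
            # For each training pattern: "Train on that pattern"
            for pattern in patterns:
                weights = train_on_pattern(weights, pattern)
            if total_error(weights, patterns) <= tolerance:
                break
        return weights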

Back-Propagation Algorithm - Step-by-step procedure

Step 1: Normalize the inputs and outputs with respect to their maximum values.

For each training pair, assume that in normalized form there are

•    ℓ inputs given by { I }_I of size ℓ × 1, and

•    n outputs given by { O }_O of size n × 1.
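
As a sketch of Step 1 in Python with NumPy, scaling each variable (column) by its maximum value; the sample arrays are illustrative, and per-column scaling is one common reading of "with respect to their maximum values":

    import numpy as np

    # Illustrative training data: 4 pairs, 3 inputs and 2 outputs each.
    raw_inputs = np.array([[2.0, 4.0, 1.0],
                           [1.0, 3.0, 2.0],
                           [3.0, 1.0, 4.0],
                           [4.0, 2.0, 3.0]])
    raw_outputs = np.array([[10.0, 5.0],
                            [8.0, 4.0],
                            [6.0, 12.0],
                            [4.0, 9.0]])

    # Step 1: divide by column maxima so every value lies in [0, 1].
    inputs = raw_inputs / raw_inputs.max(axis=0)
    outputs = raw_outputs / raw_outputs.max(axis=0)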

Step 2: Assume that the number of neurons m in the hidden layer lies in the range 1 < m < 21.

Step 3: Let [ V ] represent the weights of the synapses connecting the input neurons to the hidden neurons, and let [ W ] represent the weights of the synapses connecting the hidden neurons to the output neurons. Initialize the weights to small random values, usually between -1 and 1:

[ V ]⁰ = [ random weights ]

[ W ]⁰ = [ random weights ]

[ ΔV ]⁰ = [ ΔW ]⁰ = [ 0 ]

For general problems, λ can be assumed as 1 and the threshold value as 0.
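
A minimal sketch of Steps 2 and 3 in Python with NumPy; the sizes ℓ = 3, m = 5, n = 2 are illustrative, and the uniform draw over [-1, 1] is one common choice of "small random values":

    import numpy as np

    ell, m, n = 3, 5, 2            # inputs, hidden neurons (1 < m < 21), outputs
    rng = np.random.default_rng(0)

    V = rng.uniform(-1.0, 1.0, size=(ell, m))   # input -> hidden synapse weights, [V]⁰
    W = rng.uniform(-1.0, 1.0, size=(m, n))     # hidden -> output synapse weights, [W]⁰
    dV = np.zeros_like(V)                       # [ΔV]⁰ = [0]
    dW = np.zeros_like(W)                       # [ΔW]⁰ = [0]

    lam, threshold = 1.0, 0.0                   # λ = 1 and threshold 0 for general problems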

Step 4: Present one set of inputs and outputs from the training data. Present the pattern to the input layer as inputs { I }_I. Since the input layer uses a linear activation function, the output of the input layer equals its input:

{ O }_I = { I }_I

Step 5: Compute the inputs to the hidden layer by multiplying the outputs of the input layer by the corresponding synapse weights:

{ I }_H = [ V ]ᵀ { O }_I

Step 6: Let the hidden layer units evaluate their outputs using the sigmoidal function:

O_Hi = 1 / (1 + e^(−λ I_Hi))
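
As a one-function sketch of the sigmoidal activation from Step 6, using the λ steepness parameter from Step 3 (λ = 1 by default):

    import numpy as np

    def sigmoid(x, lam=1.0):
        # Sigmoidal activation: O = 1 / (1 + e^(-λ·I))
        return 1.0 / (1.0 + np.exp(-lam * x))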

Step 7: Calculate the error for the jth training set, using the difference between the network output and the desired (target) output:

E = √( Σ_j ( T_j − O_Oj )² ) / n

where T_j is the desired output and O_Oj is the computed output at the output layer.
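
Putting Steps 4 through 7 together for one training pair, continuing from the weight-initialization and sigmoid sketches above; the pass through the output layer ({ I }_O = [ W ]ᵀ { O }_H, then the sigmoid) is assumed here so the error can be computed, since the excerpt stops before those steps:

    def forward_and_error(I_pattern, T_pattern, V, W, lam=1.0):
        O_I = I_pattern                  # Step 4: linear activation, {O}_I = {I}_I
        I_H = V.T @ O_I                  # Step 5: {I}_H = [V]ᵀ {O}_I
        O_H = sigmoid(I_H, lam)          # Step 6: sigmoidal outputs of hidden layer
        I_O = W.T @ O_H                  # assumed: inputs to the output layer
        O_O = sigmoid(I_O, lam)          # assumed: network output
        E = np.sqrt(np.sum((T_pattern - O_O) ** 2)) / len(T_pattern)  # Step 7 error
        return O_O, E

    # Example: evaluate the first normalized training pair.
    O_O, E = forward_and_error(inputs[0], outputs[0], V, W, lam)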