SKEDSOFT

Neural Network & Fuzzy Systems

Introduction: Neural networks are universal function approximators. They are "model-free estimators" in the sense that the form of the function need not be known in advance for it to be approximated. One difficulty, though, is how to choose the best neural network architecture, that is, the neural network model with the smallest approximation error. When multilayer perceptrons (MLPs) are used, this is the problem of finding the optimal number of hidden nodes.
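As a rough illustration of how the approximation error depends on the number of hidden nodes, the sketch below fits sin(x) with tanh hidden layers of different widths. To keep it short, the hidden weights are fixed at random values and only the output weights are solved by least squares, a simplified stand-in for full backpropagation training; `fit_error` is an illustrative helper, not a standard API.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

def fit_error(n_hidden):
    """Mean-squared error of an MLP-like model with n_hidden tanh units.

    Hidden weights are random; only the output layer is fitted (least
    squares), as a quick proxy for training the whole network.
    """
    W = rng.normal(size=(1, n_hidden))      # input-to-hidden weights
    b = rng.normal(size=n_hidden)           # hidden biases
    H = np.tanh(x @ W + b)                  # hidden activations
    w_out, *_ = np.linalg.lstsq(H, y, rcond=None)  # fit output weights
    return float(np.mean((H @ w_out - y) ** 2))

for n in (1, 2, 5, 20):
    print(f"{n:2d} hidden nodes -> MSE {fit_error(n):.2e}")
```

Wider hidden layers drive the error down, but the "right" width is not known in advance; that is exactly the architecture-selection problem the growing and pruning techniques below address.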

In addition to the heuristics given there, some other techniques are applicable, such as:

Growing neural networks. Training starts with a small number of hidden nodes and, depending on the error achieved, hidden nodes may be added during the training procedure.

Pruning. This technique is based on gradually removing the weak connections, and the neurons joined by them, from the network during the training procedure. After redundant connections and nodes are removed, the whole network continues to be trained, and the remaining connections "take over the functions which the pruned ones might have been doing." Pruning may be implemented through learning-with-forgetting methods, in which the weak connections gradually fade away and are eventually pruned.
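The sketch below illustrates one common concrete form of this idea, magnitude-based pruning, again using the random-feature simplification from the earlier sketches: an oversized hidden layer is fitted, the hidden nodes with the weakest output connections are removed, and the surviving connections are refitted so they "take over." The choice of 40 initial nodes and 10 survivors is arbitrary, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

# Deliberately oversized hidden layer; output weights fitted by least squares.
n_hidden = 40
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(x @ W + b)
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)

# Magnitude-based pruning: drop the hidden nodes whose output weights are
# weakest, then refit the surviving connections so they take over.
keep = np.argsort(-np.abs(w_out.ravel()))[:10]   # keep the 10 strongest nodes
Hp = H[:, keep]
w_pruned, *_ = np.linalg.lstsq(Hp, y, rcond=None)

full_mse = float(np.mean((H @ w_out - y) ** 2))
pruned_mse = float(np.mean((Hp @ w_pruned - y) ** 2))
print(f"full net MSE {full_mse:.2e}, pruned net MSE {pruned_mse:.2e}")
```

A learning-with-forgetting variant would instead add a weight-decay penalty during training, so weak connections shrink toward zero on their own before being cut.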

Growing and pruning are also applicable to input neurons, so the whole neural network can change dynamically according to the information present in the data set.