
Neural Network & Fuzzy Systems

Neural networks (NNs) are adaptive systems that change their structure based on external or internal information flowing through the network. Neural networks solve problems by self-learning and self-organizing.

A Back-Propagation Network (BPN) is a multi-layer neural network trained with the back-propagation algorithm; learning occurs during this training phase.

The steps involved are:

− The pattern of activation arriving at the output layer is compared with the correct output pattern to calculate an error signal.

− The error signal is then back-propagated from the output layer to the input layer to adjust the weights in each layer of the BPN.

− Back-propagation searches the error surface using the gradient-descent method to minimize the error E = (1/2) Σ_j (T_j − O_j)², where T_j is the target output and O_j is the output computed by the network. A minimal training sketch follows this list.
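The Python sketch below illustrates one such gradient-descent training loop. The single hidden layer, sigmoid activations, learning rate, and XOR data set are illustrative assumptions, not part of the original notes.

```python
import numpy as np

# Minimal BPN training sketch: forward pass, error signal, back-propagation,
# gradient-descent weight updates. Network size and data are assumptions.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input patterns
T = np.array([[0], [1], [1], [0]], dtype=float)              # target outputs

W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
eta = 0.5                                 # learning rate

for epoch in range(10000):
    # Forward pass: pattern of activation arriving at the output layer.
    H = sigmoid(X @ W1)
    O = sigmoid(H @ W2)

    # Error E = 1/2 * sum_j (T_j - O_j)^2.
    E = 0.5 * np.sum((T - O) ** 2)

    # Back-propagate the error signal from output to input.
    delta_out = (O - T) * O * (1 - O)             # error term at output layer
    delta_hid = (delta_out @ W2.T) * H * (1 - H)  # error term at hidden layer

    # Gradient-descent weight updates in each layer.
    W2 -= eta * H.T @ delta_out
    W1 -= eta * X.T @ delta_hid

print("final error:", E)
print("network outputs:", O.ravel().round(2))
```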

Limitations of BPN:

− A BPN can recognize patterns similar to those it has learnt, but it cannot recognize genuinely new patterns.

− A BPN must be trained enough to extract general features applicable to both seen and unseen data; over-training the network may have undesired effects.

Genetic Algorithms (GAs) are adaptive search and optimization algorithms that mimic the principles of natural evolution.

− GAs are different from traditional search and optimization methods.

− They exhibit simplicity, ease of operation, minimal requirements, and a global perspective (a minimal GA sketch is given below).
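The following sketch shows a minimal GA, assuming a real-valued chromosome, tournament selection, single-point crossover, and Gaussian mutation. The fitness function (minimizing the sphere function) and all parameter values are illustrative choices, not from the notes.

```python
import random

GENES, POP, GENERATIONS = 5, 40, 100

def fitness(ind):
    # Higher is better: negate the sphere function sum(x_i^2).
    return -sum(x * x for x in ind)

def tournament(pop, k=3):
    # Pick the fittest of k randomly chosen individuals.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # Single-point crossover of two parent chromosomes.
    cut = random.randint(1, GENES - 1)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.1, sigma=0.3):
    # Perturb each gene with a small probability.
    return [x + random.gauss(0, sigma) if random.random() < rate else x
            for x in ind]

population = [[random.uniform(-5, 5) for _ in range(GENES)] for _ in range(POP)]

for gen in range(GENERATIONS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP)]

best = max(population, key=fitness)
print("best fitness:", fitness(best))
```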

Hybridization of BPN and GAs

− The BPN determines its weights using a gradient-search technique and may therefore get trapped in a local minimum.

− GAs do not guarantee finding the global optimum, but they are good at quickly finding acceptable solutions.

− Therefore, hybridizing BPN and GAs is expected to provide advantages over using either technique alone; a sketch of such a hybrid is given below.
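The sketch below shows one common form of this hybrid, assuming the GA performs a global search for good initial weights and back-propagation (gradient descent) then refines the best individual locally. The 2-4-1 network size, GA parameters, and XOR data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
N_W1, N_W2 = 2 * 4, 4 * 1          # weight counts for a 2-4-1 network

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(w):
    # Unpack a flat weight vector into the two layers and run a forward pass.
    W1, W2 = w[:N_W1].reshape(2, 4), w[N_W1:].reshape(4, 1)
    H = sigmoid(X @ W1)
    return H, sigmoid(H @ W2), W1, W2

def error(w):                       # E = 1/2 * sum (T - O)^2
    return 0.5 * np.sum((T - forward(w)[1]) ** 2)

# Stage 1: GA global search over flattened weight vectors.
pop = rng.normal(size=(30, N_W1 + N_W2))
for _ in range(200):
    fit = np.array([error(w) for w in pop])
    parents = pop[np.argsort(fit)[:10]]              # keep the 10 best
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(a.size) < 0.5              # uniform crossover
        child = np.where(mask, a, b) + rng.normal(scale=0.1, size=a.size)
        children.append(child)
    pop = np.array(children)

w = pop[np.argmin([error(v) for v in pop])]          # best individual found

# Stage 2: back-propagation fine-tuning of the GA's best weights.
eta = 0.5
for _ in range(2000):
    H, O, W1, W2 = forward(w)
    d_out = (O - T) * O * (1 - O)
    d_hid = (d_out @ W2.T) * H * (1 - H)
    W2 -= eta * H.T @ d_out
    W1 -= eta * X.T @ d_hid
    w = np.concatenate([W1.ravel(), W2.ravel()])

print("final error after hybrid training:", error(w))
```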