Brian Carse, Tony Pipe, Terence C. Fogarty, Tim Hill
Most research to date using genetic algorithms to evolve neural networks has focused on the multi-layer perceptron. Alternative neural network approaches, such as the radial basis function network, and their representations appear to have received relatively little attention as grist for the GA mill. This is perhaps surprising since, for example, the radial basis function network has also been proved to be a universal function approximator. Here we focus on the evolution of radial basis function networks. While the multi-layer perceptron approximates functions through global interactions between network nodes, the radial basis function network uses local interactions between network nodes. It is suggested that this difference may be significant in terms of epistatic interactions in the encoded genomes of the two network types, which affects the ability of the genetic algorithm to evolve successful networks. A representation and attendant genetic operators for evolving radial basis function networks are proposed, drawing on recent work on evolutionary fuzzy logic systems. Experimental results are presented for a hybrid learning technique that uses a genetic algorithm to evolve the radial basis function hidden layer (the number of hidden nodes and the hidden node centres and widths) and supervised learning to tune the network connection weights.
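The hybrid scheme described above can be illustrated with a minimal sketch, assuming Gaussian basis functions, a simple truncation-selection GA with Gaussian mutation, and linear least squares for the supervised stage; the population size, mutation rates, and target function below are illustrative choices, not the paper's actual parameters. The GA searches over hidden-node centres and widths, while the output weights of each candidate network are fitted directly:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_design(X, centres, widths):
    # Gaussian hidden-layer activations: phi_ij = exp(-||x_i - c_j||^2 / (2 w_j^2))
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * widths ** 2))

def fit_weights(X, y, centres, widths):
    # Supervised stage: output weights by linear least squares
    Phi = rbf_design(X, centres, widths)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def fitness(ind, X, y):
    # Fitness of a hidden layer = negative training MSE after weight tuning
    centres, widths = ind
    w = fit_weights(X, y, centres, widths)
    pred = rbf_design(X, centres, widths) @ w
    return -np.mean((pred - y) ** 2)

# Illustrative 1-D target function to approximate
X = np.linspace(-1.0, 1.0, 60)[:, None]
y = np.sin(3.0 * X[:, 0])

def random_ind(n_hidden=8):
    # One individual encodes the hidden layer: centres and widths
    return (rng.uniform(-1.0, 1.0, (n_hidden, 1)),
            rng.uniform(0.1, 0.6, n_hidden))

def mutate(ind, sigma=0.05):
    # Gaussian perturbation of centres and widths (widths kept positive)
    c, w = ind
    return (c + rng.normal(0.0, sigma, c.shape),
            np.clip(w + rng.normal(0.0, sigma, w.shape), 0.05, None))

# Truncation-selection GA over hidden-layer configurations
pop = [random_ind() for _ in range(20)]
for gen in range(40):
    pop.sort(key=lambda ind: fitness(ind, X, y), reverse=True)
    pop = pop[:10] + [mutate(pop[i % 10]) for i in range(10)]

best = max(pop, key=lambda ind: fitness(ind, X, y))
print(-fitness(best, X, y))  # final training MSE of the best evolved network
```

This separation of concerns is the point of the hybrid approach: because the RBF output layer is linear in the weights, the supervised stage is a cheap closed-form solve, leaving the GA free to explore only the nonlinear hidden-layer parameters.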