The derivative matrix, or Jacobian matrix, of the output vector with respect to the input vector is derived for multilayer feedforward neural networks (MFNNs). This matrix quantifies the sensitivity of an MFNN's output to small perturbations of its input. The expression for the matrix characterizes aspects of MFNN performance such as generalization capability and error-correcting ability. The analysis shows how these aspects of performance depend on the weight matrices, the sigmoid activation functions, and the number of layers and nodes in the network. Suggestions are made for designing MFNNs with good generalization and error-correction properties.
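As an illustrative sketch (not the paper's derivation), the Jacobian of a fully sigmoidal MFNN can be accumulated layer by layer with the chain rule: each layer contributes a factor diag(s'(z)) W, where s' is the sigmoid derivative and W the layer's weight matrix. The helper names below (`mlp_forward`, `mlp_jacobian`) are hypothetical, and biases are omitted for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(x, weights):
    """Forward pass through a bias-free sigmoid MFNN.
    Returns the input followed by each layer's activation."""
    activations = [x]
    for W in weights:
        x = sigmoid(W @ x)
        activations.append(x)
    return activations

def mlp_jacobian(x, weights):
    """Jacobian of the network output w.r.t. the input,
    built by the chain rule: J = prod_l diag(s'_l) W_l,
    accumulated from the first layer to the last."""
    acts = mlp_forward(x, weights)
    J = np.eye(x.shape[0])
    for W, a in zip(weights, acts[1:]):
        # sigmoid'(z) = s(z) * (1 - s(z)), expressed via the activation a
        J = (a * (1.0 - a))[:, None] * (W @ J)
    return J
```

Because every factor contains the sigmoid derivative (bounded by 1/4), the entries of the Jacobian tend to shrink as layers are stacked, which is one mechanism by which the activation functions and weight magnitudes govern the network's input sensitivity.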