JOURNAL ARTICLE

Multilayer perceptron and neural networks

Marius-Constantin Popescu, Valentina Emilia Bălaş, Liliana Perescu-Popescu, Nikos E. Mastorakis

Year: 2009 · Journal: WSEAS Transactions on Circuits and Systems · Vol: 8 (7) · Pages: 579-588

Abstract

Attempts to solve linearly inseparable problems have led to variations in the number of neuron layers and in the activation functions used. The backpropagation algorithm is the best-known and most widely used supervised learning algorithm. It is also called the generalized delta algorithm because it extends the training rule of the ADALINE network: it minimizes the difference between the desired output and the actual output by gradient descent (the gradient indicates how a function varies in different directions). Training a multilayer perceptron is often quite slow, requiring thousands or tens of thousands of epochs for complex problems. The best-known methods for accelerating learning are the momentum method and the use of a variable learning rate. The paper presents the possibility of controlling an induction drive using neural systems.
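
To make the training procedure summarized in the abstract concrete, the following is a minimal sketch (not the authors' code) of a one-hidden-layer multilayer perceptron trained by backpropagation, i.e. gradient descent on the squared error with a momentum term. The 2-2-1 network size, learning rate, momentum coefficient and the XOR dataset are illustrative assumptions; XOR is the classic linearly inseparable problem a single perceptron cannot solve.

```python
# Minimal sketch of backpropagation with momentum for a one-hidden-layer MLP.
# Illustrative assumptions: network size, hyperparameters and the XOR dataset.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# XOR: a linearly inseparable problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights (with bias folded in as an extra row) for a 2-2-1 network.
W1 = rng.normal(scale=0.5, size=(3, 2))   # input -> hidden
W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden -> output
vW1 = np.zeros_like(W1)                    # momentum "velocity" terms
vW2 = np.zeros_like(W2)

eta, alpha = 0.5, 0.9                      # learning rate, momentum coefficient

for epoch in range(5000):
    # Forward pass (append a constant 1 to each layer's input for the bias).
    Xb = np.hstack([X, np.ones((len(X), 1))])
    h = sigmoid(Xb @ W1)
    hb = np.hstack([h, np.ones((len(h), 1))])
    out = sigmoid(hb @ W2)

    # Backward pass: generalized delta rule (gradient of the squared error).
    err = out - y
    delta_out = err * out * (1 - out)
    delta_hid = (delta_out @ W2[:-1].T) * h * (1 - h)

    grad_W2 = hb.T @ delta_out
    grad_W1 = Xb.T @ delta_hid

    # Gradient descent with momentum: v <- alpha*v - eta*grad; W <- W + v.
    vW2 = alpha * vW2 - eta * grad_W2
    vW1 = alpha * vW1 - eta * grad_W1
    W2 += vW2
    W1 += vW1

print(out.round(3))  # outputs should approach [[0], [1], [1], [0]]
```

Without the momentum term (alpha = 0) the same loop still converges on XOR, but typically needs noticeably more epochs, which is the acceleration effect the abstract refers to.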

Keywords:
Backpropagation, Artificial neural network, Activation function, Multilayer perceptron, Computer science, Perceptron, Variable (mathematics), Artificial intelligence, Function, Algorithm, Momentum, Machine learning, Mathematics

Metrics

Cited by: 590
FWCI (Field-Weighted Citation Impact): 1.91
References: 13
Citation Normalized Percentile: 0.91 (in the top 10%)

Topics

Neural Networks and Applications (Physical Sciences → Computer Science → Artificial Intelligence)
Fuzzy Logic and Control Systems (Physical Sciences → Computer Science → Artificial Intelligence)