JOURNAL ARTICLE

Magnetic Optimization Algorithm for training Multi Layer Perceptron

Abstract

Recently, feedforward neural networks (FNNs), and in particular the Multi-Layer Perceptron (MLP), have become some of the most widely used computational tools, with applications in many fields. Back-Propagation (BP) is the most common method for training an MLP. This learning algorithm is gradient-based, and it suffers from drawbacks such as entrapment in local minima and slow convergence; these weaknesses make MLPs unreliable for solving real-world problems. Using heuristic optimization algorithms is a popular approach to overcoming the drawbacks of BP. The Magnetic Optimization Algorithm (MOA) is a novel heuristic optimization algorithm inspired by magnetic field theory, and it has been shown to solve optimization problems quickly and accurately. In this paper, MOA is employed as a new training method for MLPs in order to alleviate the aforementioned shortcomings. The proposed learning method was compared with PSO- and GA-based learning algorithms on the 3-bit XOR and function-approximation benchmark problems. The results demonstrate the high performance of this new learning algorithm for large numbers of training samples.
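The abstract does not give MOA's update rules, but the general idea it describes (treating the MLP's flattened weight vector as a search agent and minimizing mean-squared error with a population-based optimizer instead of gradient descent) can be sketched minimally. The snippet below uses the 3-bit XOR (parity) benchmark mentioned in the abstract; the 3-7-1 topology and the simple best-agent perturbation search are illustrative assumptions, not the paper's MOA.

```python
import math
import random

random.seed(0)

# 3-bit XOR (parity) dataset: 8 input patterns, target is the parity bit.
DATA = [([a, b, c], a ^ b ^ c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

N_IN, N_HID = 3, 7                         # assumed 3-7-1 topology
DIM = N_HID * (N_IN + 1) + (N_HID + 1)     # all weights + biases, flattened

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, x):
    """Evaluate the 3-7-1 sigmoid MLP encoded by flat weight vector w on input x."""
    idx = 0
    hidden = []
    for _ in range(N_HID):
        s = sum(w[idx + j] * x[j] for j in range(N_IN)) + w[idx + N_IN]  # + bias
        hidden.append(sigmoid(s))
        idx += N_IN + 1
    s = sum(w[idx + j] * hidden[j] for j in range(N_HID)) + w[idx + N_HID]
    return sigmoid(s)

def mse(w):
    """Fitness: mean squared error of the network over the whole dataset."""
    return sum((forward(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def train(pop_size=30, iters=300, step=0.5):
    """Generic population-based search (a stand-in for MOA/PSO/GA):
    keep the best agent found so far and perturb a population around it."""
    pop = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(pop_size)]
    best = min(pop, key=mse)
    for _ in range(iters):
        pop = [[b + random.gauss(0, step) for b in best] for _ in range(pop_size)]
        cand = min(pop, key=mse)
        if mse(cand) < mse(best):          # greedy update: error never increases
            best = cand
    return best

w = train()
print(f"final MSE on 3-bit XOR: {mse(w):.4f}")
```

Because the fitness is the network's dataset error rather than a gradient, any population-based metaheuristic (MOA, PSO, GA) can be dropped into `train` without changing `forward` or `mse`; only the agent-update rule differs between them.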

Keywords:
Computer science; Algorithm; Perceptron; Maxima and minima; Artificial neural network; Benchmark; Backpropagation; Feedforward neural network; Heuristic; Multilayer perceptron; Convergence; Optimization problem; Artificial intelligence; Machine learning; Mathematics

Metrics

Cited By: 44
FWCI (Field Weighted Citation Impact): 1.96
Refs: 13
Citation Normalized Percentile: 0.89

Topics

Neural Networks and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence
Blind Source Separation Techniques
Physical Sciences →  Computer Science →  Signal Processing
Machine Learning and ELM
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Training multi-layer perceptron with artificial algae algorithm

Bahaeddin Türkoğlu, Ersin Kaya

Journal: Engineering Science and Technology, an International Journal   Year: 2020   Vol: 23 (6)   Pages: 1342-1350
JOURNAL ARTICLE

Training Multi-Layer Perceptron Using Harris Hawks Optimization

Erdal Eker, Murat Kayri, Serdar Ekinci, Davut İzci

Conference: 2020 International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA)   Year: 2020   Pages: 1-5
BOOK-CHAPTER

Multi-Layer Perceptron Training

Adrian J. Shepherd

Perspectives in neural computing Year: 1997 Pages: 1-22