Abstract

Optimizing the architecture of feed-forward neural networks is a complex and important task in supervised learning, because the architecture strongly affects the convergence of learning methods. In this paper, we propose a multi-objective mathematical formulation to determine the optimal number of hidden layers, the number of neurons in each layer, and good weight values. We solve this formulation with a hybridization of the well-known genetic algorithm and the backpropagation training algorithm. To evaluate our approach, we apply it to the Iris, Seeds, and Wine benchmark classification problems. The results obtained demonstrate the effectiveness of the proposed approach.
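The hybrid scheme the abstract describes can be sketched as follows: a genetic algorithm searches over hidden-layer configurations, each candidate is trained briefly with plain backpropagation, and the two objectives (training error and network size) are scalarized into one fitness value. This is a minimal illustration under stated assumptions, not the authors' implementation; the synthetic dataset, the size-penalty weight, and all function names are assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic 2-class dataset (a stand-in for Iris/Seeds/Wine).
X = rng.normal(size=(80, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

def init_net(sizes):
    """Random weights and biases for an MLP with layer widths `sizes`."""
    return [(rng.normal(scale=0.5, size=(a, b)), np.zeros(b))
            for a, b in zip(sizes[:-1], sizes[1:])]

def forward(net, X):
    """Return the activations of every layer (tanh units throughout)."""
    a, acts = X, [X]
    for W, b in net:
        a = np.tanh(a @ W + b)
        acts.append(a)
    return acts

def train(net, X, y, epochs=30, lr=0.1):
    """A few epochs of plain backpropagation on squared error."""
    for _ in range(epochs):
        acts = forward(net, X)
        delta = (acts[-1] - y) * (1 - acts[-1] ** 2)  # output-layer error
        for i in range(len(net) - 1, -1, -1):
            W, b = net[i]
            gW = acts[i].T @ delta / len(X)
            gb = delta.mean(axis=0)
            delta = (delta @ W.T) * (1 - acts[i] ** 2)  # propagate backwards
            net[i] = (W - lr * gW, b - lr * gb)
    return net

def fitness(hidden):
    """Scalarized multi-objective: training loss plus a network-size penalty."""
    net = train(init_net([4, *hidden, 1]), X, y)
    loss = np.mean((forward(net, X)[-1] - y) ** 2)
    return loss + 0.001 * sum(hidden)  # penalty weight is an assumption

# GA over architectures: a genome is a tuple of hidden-layer widths.
pop = [tuple(rng.integers(2, 10, size=rng.integers(1, 3))) for _ in range(8)]
for gen in range(5):
    pop.sort(key=fitness)          # selection: keep the fittest half
    parents, children = pop[:4], []
    for _ in range(4):
        p = list(parents[rng.integers(4)])
        i = rng.integers(len(p))
        p[i] = int(np.clip(p[i] + rng.integers(-2, 3), 2, 12))  # mutate a width
        children.append(tuple(p))
    pop = parents + children

best = min(pop, key=fitness)
```

Here the fitness retrains each candidate from random weights, so it is noisy; the paper's hybrid instead lets backpropagation refine the weights found by the genetic search, which the same skeleton accommodates by storing trained weights in the genome.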

Keywords:
Artificial neural network, Genetic algorithm, Backpropagation, Machine learning, Artificial intelligence, Supervised learning, Benchmark, Convergence

Metrics

Cited By: 61
FWCI (Field-Weighted Citation Impact): 3.95
Refs: 14
Citation Normalized Percentile: 0.97 (in top 1%)

Topics

Neural Networks and Applications (Physical Sciences → Computer Science → Artificial Intelligence)
Fuzzy Logic and Control Systems (Physical Sciences → Computer Science → Artificial Intelligence)
Fault Detection and Control Systems (Physical Sciences → Engineering → Control and Systems Engineering)
© 2026 ScienceGate Book Chapters — All rights reserved.