BOOK-CHAPTER

Pruning Fuzzy Neural Network Using Group Sparsity Regularization

Abstract

As an interpretable technique, the fuzzy neural network (FNN) can be embedded into deep models, but it suffers from the curse of dimensionality when used to design deep networks. To address this problem, in this paper we use a regularization method to reduce the influence of dimensionality on the FNN, and design two regularizer terms based on the L2,1 norm (Group Lasso). Using the gradient method, the FNN learns to evaluate the importance of features and rules, respectively, and thereby realizes feature selection (FS) and rule generation (RG). Simulation results on the benchmark classification problems Iris and Sonar verify the validity of the proposed fuzzy classifier: the structure of the fuzzy model can be simplified without decreasing classification performance. The regularized FNN can readily be used for interpretable deep model design.
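The paper's exact regularizers are not reproduced here, but the core idea of an L2,1 (Group Lasso) penalty can be sketched in a few lines: the weights attached to each input feature form a group, the penalty is the sum of the groups' L2 norms, and its subgradient is used in the gradient update so that entire groups (i.e., whole features) are driven toward zero. The matrix shape and function names below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def group_lasso_penalty(W):
    """L2,1 norm of W: sum over rows (groups) of each row's L2 norm.
    With rows indexed by input features, the penalty shrinks whole
    features at once, which is what enables feature selection."""
    return np.sum(np.sqrt(np.sum(W ** 2, axis=1)))

def group_lasso_subgradient(W, eps=1e-12):
    """Subgradient of the L2,1 norm w.r.t. W, added to the loss
    gradient during training (eps guards the all-zero groups)."""
    norms = np.sqrt(np.sum(W ** 2, axis=1, keepdims=True))
    return W / np.maximum(norms, eps)

# Toy example: 4 input features x 3 fuzzy rules (shapes are illustrative).
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
W[2] = 0.0  # a feature whose group has already been pruned to zero

penalty = group_lasso_penalty(W)             # scalar regularization term
importance = np.sqrt(np.sum(W ** 2, axis=1))  # per-feature importance scores
```

A feature whose group norm stays at (or decays to) zero can be removed from the model; the same construction applied to rule-wise groups yields the rule-generation variant described in the abstract.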

Keywords:
Curse of dimensionality, Artificial intelligence, Fuzzy logic, Artificial neural network, Regularization, Classifier, Computer science, Pattern recognition, Benchmark, Feature selection, Machine learning

Metrics

Cited by: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 19
Citation Normalized Percentile: 0.54

Topics

Neural Networks and Applications (Physical Sciences → Computer Science → Artificial Intelligence)
Fuzzy Logic and Control Systems (Physical Sciences → Computer Science → Artificial Intelligence)
Face and Expression Recognition (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)