JOURNAL ARTICLE

Mutual information based on Rényi's entropy feature selection

Abstract

The feature selection problem has become a focus of pattern classification research, and mutual information plays an increasingly important role in feature selection algorithms. We propose a normalized mutual information feature selection (NMIFS) method based on Rényi's quadratic entropy, which reduces computational complexity by relying on an efficient estimate of mutual information. We then combine NMIFS with wrappers into a two-stage feature selection algorithm, which helps identify a more characteristic feature subset. We perform experiments comparing efficiency and classification accuracy against other MI-based feature selection algorithms; the results show that our method yields a promising improvement in computational complexity.
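The abstract describes the method only at a high level. As an illustrative sketch (not the authors' implementation), the code below estimates Rényi's quadratic entropy from discrete samples and uses it in an NMIFS-style greedy filter stage: each step picks the feature with the highest relevance to the class labels minus its average normalized redundancy with already-selected features. The function names, the discrete histogram estimator, and the exact normalization are all assumptions for illustration.

```python
from collections import Counter
import math

def renyi_quadratic_entropy(values):
    """Estimate H2(X) = -log(sum_i p_i^2) from discrete samples."""
    n = len(values)
    counts = Counter(values)
    return -math.log(sum((c / n) ** 2 for c in counts.values()))

def quadratic_mi(xs, ys):
    """MI-style dependence score H2(X) + H2(Y) - H2(X, Y).

    Note: order-2 Renyi entropy is not subadditive in general, so this
    score can be slightly negative; it is used here only for ranking.
    """
    return (renyi_quadratic_entropy(xs)
            + renyi_quadratic_entropy(ys)
            - renyi_quadratic_entropy(list(zip(xs, ys))))

def nmifs_select(features, labels, k):
    """Greedy filter stage (NMIFS-style criterion; details assumed).

    features: list of columns (one list of discrete values per feature).
    Returns the indices of the k selected features, in selection order.
    """
    selected = []
    remaining = list(range(len(features)))
    while remaining and len(selected) < k:
        def score(i):
            relevance = quadratic_mi(features[i], labels)
            if not selected:
                return relevance
            # Average redundancy, normalized by the smaller marginal entropy.
            redundancy = sum(
                quadratic_mi(features[i], features[j])
                / max(min(renyi_quadratic_entropy(features[i]),
                          renyi_quadratic_entropy(features[j])), 1e-12)
                for j in selected) / len(selected)
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

For example, with a feature that duplicates the labels, a weakly informative feature, and an exact copy of the first feature, the filter picks the informative feature first and then prefers the weak but non-redundant one over the redundant copy. In the two-stage scheme sketched in the abstract, a wrapper (classifier-in-the-loop search) would then refine this filtered subset.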

Keywords:
Mutual information; Feature selection; Rényi entropy; Information theory; Computational complexity; Pattern recognition; Data mining

Metrics

Cited by: 11
FWCI (Field-Weighted Citation Impact): 1.24
References: 30
Citation Normalized Percentile: 0.84


Topics

Face and Expression Recognition
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Neural Networks and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence
Fuzzy Logic and Control Systems
Physical Sciences →  Computer Science →  Artificial Intelligence