JOURNAL ARTICLE

Feature selection with mutual information for regression problems

Abstract

Selecting relevant features for machine learning modeling improves the performance of the learning methods. Mutual information (MI) is known to be a relevant criterion for selecting feature subsets from an input dataset that has a nonlinear relationship to the predicted attribute. However, MI-based feature selection suffers from the following limitations: the MI estimator depends on smoothing parameters; greedy feature selection methods lack theoretically justified stopping criteria; and although MI can in theory be used for both classification and regression problems, in practice its formulation is more often limited to classification. This paper investigates proposed improvements to these three limitations through the use of resampling techniques and a formulation of mutual information based on differential entropy for regression problems.
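The exact differential-entropy formulation used in the paper is not given in this abstract, but on a standard reading it is the continuous-variable definition of MI:

\[
I(X;Y) \;=\; h(X) + h(Y) - h(X,Y),
\qquad
h(X) \;=\; -\int f_X(x)\,\log f_X(x)\,dx,
\]

where f_X is the density of X. Because differential entropy is defined directly for continuous variables, this form lets MI score features against a continuous target, which is what makes it applicable to regression rather than only to classification.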
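No algorithmic details appear in this abstract, so the sketch below only illustrates the three ingredients it names, under stated assumptions: a smoothing-dependent MI estimator (scikit-learn's kNN-based mutual_info_regression, whose n_neighbors parameter plays the role of the smoothing parameter), greedy forward selection, and a resampling-based stopping rule (a permutation test standing in for a theoretically justified criterion). The function greedy_mi_selection and all thresholds are illustrative assumptions, not the paper's method, and candidates are scored by marginal rather than conditional MI, so redundancy between features is ignored.

```python
# Minimal sketch (not the paper's exact method): greedy forward feature
# selection for regression driven by a kNN-based MI estimator, stopped by
# a permutation-resampling test instead of a fixed subset size.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import mutual_info_regression

def greedy_mi_selection(X, y, n_neighbors=3, n_permutations=100,
                        alpha=0.05, rng=None):
    """Select features one at a time by marginal MI with the target.

    Stops when the best remaining feature's MI does not exceed the
    (1 - alpha) quantile of MI scores computed against permuted targets,
    i.e. when it is indistinguishable from noise under resampling.
    """
    rng = np.random.default_rng(rng)
    remaining = list(range(X.shape[1]))
    selected = []
    while remaining:
        # MI of each remaining feature with y; n_neighbors acts as the
        # smoothing parameter of the estimator.
        scores = mutual_info_regression(X[:, remaining], y,
                                        n_neighbors=n_neighbors,
                                        random_state=0)
        best = int(np.argmax(scores))
        # Null distribution: MI of the best candidate with permuted y.
        null = np.array([
            mutual_info_regression(X[:, [remaining[best]]],
                                   rng.permutation(y),
                                   n_neighbors=n_neighbors,
                                   random_state=0)[0]
            for _ in range(n_permutations)
        ])
        if scores[best] <= np.quantile(null, 1 - alpha):
            break  # candidate MI is consistent with chance: stop here.
        selected.append(remaining.pop(best))
    return selected

# Toy usage on synthetic regression data with 3 informative features.
X, y = make_regression(n_samples=300, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)
print(greedy_mi_selection(X, y, rng=0))
```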

Keywords:
Mutual information; Feature selection; Conditional mutual information; Computer science; Feature; Artificial intelligence; Pointwise mutual information; Machine learning; Estimator; Resampling; Regression; Information theory; Smoothing; Data mining; Pattern recognition; Mathematics; Statistics

Metrics

Cited By: 10
FWCI (Field Weighted Citation Impact): 0.94
References: 23
Citation Normalized Percentile: 0.89

Topics

Neural Networks and Applications (Physical Sciences → Computer Science → Artificial Intelligence)
Evolutionary Algorithms and Applications (Physical Sciences → Computer Science → Artificial Intelligence)
Face and Expression Recognition (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)