Selecting relevant features for machine learning modeling improves the performance of learning methods. Mutual information (MI) is widely used as a relevance criterion for selecting feature subsets from an input dataset that has a nonlinear relationship to the target attribute. However, the mutual information estimator suffers from the following limitations: it depends on smoothing parameters; greedy feature selection methods based on it lack theoretically justified stopping criteria; and although in theory it can be used for both classification and regression problems, in practice its formulation is most often limited to classification. This paper investigates proposed improvements to these three limitations of the mutual information estimator through the use of resampling techniques and a formulation of mutual information based on differential entropy for regression problems.
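As a rough illustration of the resampling idea discussed above, the sketch below performs greedy forward feature selection for a regression target using scikit-learn's kNN-based `mutual_info_regression` estimator, with a permutation (resampling) test supplying a data-driven stopping criterion. The function name, significance level, and permutation count are illustrative assumptions, not the paper's implementation, and the score used is univariate relevance (redundancy between features is not accounted for).

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def greedy_mi_selection(X, y, n_permutations=50, alpha=0.02, random_state=0):
    """Greedy forward selection by mutual information (illustrative sketch).

    Stopping criterion: a candidate feature is accepted only if its MI with
    the target exceeds the (1 - alpha) quantile of MI scores obtained by
    resampling, i.e. recomputing MI against permuted copies of the target.
    """
    rng = np.random.default_rng(random_state)
    remaining = list(range(X.shape[1]))
    selected = []
    while remaining:
        # Univariate MI of each remaining feature with the target.
        scores = mutual_info_regression(X[:, remaining], y,
                                        random_state=random_state)
        best_idx = int(np.argmax(scores))
        best_feature = remaining[best_idx]
        # Null distribution: MI of the best candidate vs. shuffled targets.
        null = [
            mutual_info_regression(X[:, [best_feature]], rng.permutation(y),
                                   random_state=random_state)[0]
            for _ in range(n_permutations)
        ]
        if scores[best_idx] <= np.quantile(null, 1 - alpha):
            break  # no candidate beats chance level: stop
        selected.append(best_feature)
        remaining.remove(best_feature)
    return selected
```

On synthetic data where the target depends nonlinearly on some features and not at all on others, the permutation threshold stops the greedy search once only noise features remain, which is the role the paper assigns to resampling.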