Feature selection is an important data-preprocessing step in data mining: choosing an optimal subset of features can effectively reduce data dimensionality and the computational cost of learning algorithms. In this paper, we propose a Binary Particle Swarm Optimization algorithm based on conditional independence (CIBPSO) for feature selection, aiming to reduce the number of selected features while improving classification accuracy. CIBPSO adaptively adjusts particles toward local search to prevent premature convergence. In a comparative analysis against standard BPSO and two other BPSO-based feature selection algorithms at the same computational cost on 10 datasets of varying size, CIBPSO consistently reduced the number of selected features and significantly improved classification accuracy.
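To illustrate the general mechanism the abstract builds on, here is a minimal sketch of standard BPSO for feature selection. It is not the CIBPSO algorithm itself (the conditional-independence test and adaptive local search are not shown); the toy fitness function, the set of "relevant" features, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import math
import random

random.seed(0)

N_FEATURES = 12
RELEVANT = {0, 3, 5, 8}  # hypothetical ground-truth informative features (assumption)


def fitness(mask):
    """Toy objective: reward selecting relevant features, penalise subset size."""
    hits = sum(1 for i, bit in enumerate(mask) if bit and i in RELEVANT)
    return hits - 0.1 * sum(mask)


def sigmoid(v):
    """Transfer function mapping a real-valued velocity to a bit-flip probability."""
    return 1.0 / (1.0 + math.exp(-v))


def bpso(n_particles=20, n_iter=50, w=0.7, c1=1.5, c2=1.5):
    """Standard binary PSO: each particle is a 0/1 mask over the features."""
    pos = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(n_particles)]
    vel = [[0.0] * N_FEATURES for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]

    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(N_FEATURES):
                r1, r2 = random.random(), random.random()
                # velocity update pulled toward personal and global bests
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # stochastic binary position update via the sigmoid transfer
                pos[i][d] = 1 if random.random() < sigmoid(vel[i][d]) else 0
            f = fitness(pos[i])
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest, gbest_fit


best_mask, best_fit = bpso()
```

In practice the toy `fitness` would be replaced by the classification accuracy of a learner trained on the selected columns (plus a size penalty), which is the usual wrapper setup this family of algorithms assumes.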