JOURNAL ARTICLE

Efficient Feature Selection via $\ell_{2,0}$-Norm Constrained Sparse Regression

Tianji Pang, Feiping Nie, Junwei Han, Xuelong Li

Year: 2018   Journal: IEEE Transactions on Knowledge and Data Engineering   Vol: 31 (5)   Pages: 880-893   Publisher: IEEE Computer Society

Abstract

Sparse-regression-based feature selection has been extensively investigated in recent years. However, the underlying problem carries a non-convex constraint, the $\ell_{2,0}$-norm constraint, and is therefore very hard to solve. In this paper, unlike most existing methods, which solve only a relaxed version of the problem by forcing a sparsity regularizer into the objective function, we propose a novel framework that solves the original $\ell_{2,0}$-norm constrained sparse-regression feature selection problem. Using a new label coding method, we transform our objective function into Linear Discriminant Analysis (LDA), which lets our model compute the ratio of inter-class scatter to intra-class scatter for each feature, the most widely used metric for evaluating feature discrimination. Based on this ratio, features can be selected by simple sorting. A projected gradient descent algorithm, initialized with the solution obtained above, further improves performance; this initialization ensures the stability of the iterative algorithm. We prove that the proposed method attains the globally optimal solution of this non-convex problem when all features are statistically independent. For the general case of statistically dependent features, extensive experiments on six small-sample-size datasets and one large-scale dataset show that, with an SVM classifier, our algorithm achieves classification performance comparable to or better than eight other state-of-the-art feature selection methods. We also show that our algorithm attains a low objective value, meaning its solution comes very close to the true solution of this NP-hard problem. Moreover, because we solve the original $\ell_{2,0}$-norm constrained problem, we avoid the heavy work of tuning a regularization parameter: in our method its meaning is explicit, namely the number of selected features. Finally, we evaluate the stability of our algorithm experimentally from two perspectives, the objective function values and the selected features, and it shows satisfactory stability in both.

Keywords:
Feature selection, Computer science, Norm, Artificial intelligence, Regression, Pattern recognition, Selection, Mathematics, Statistics

Metrics

Cited by: 93
FWCI (Field-Weighted Citation Impact): 3.32
References: 61
Citation Normalized Percentile: 0.92 (in top 10%)

Topics

Face and Expression Recognition
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
Sparse and Compressive Sensing Techniques
Physical Sciences → Engineering → Computational Mechanics
Machine Learning and ELM
Physical Sciences → Computer Science → Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Unsupervised Discriminative Feature Selection With $\ell_{2,0}$-Norm Constrained Sparse Projection

Xia Dong, Feiping Nie, Lai Tian, Rong Wang, Xuelong Li

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence   Year: 2025   Vol: 47 (10)   Pages: 8321-8335
JOURNAL ARTICLE

Structured Sparse Non-negative Matrix Factorization with $\ell_{2,0}$-Norm

Wenwen Min, Taosheng Xu, Xiang Wan, Tsung-Hui Chang

Journal: IEEE Transactions on Knowledge and Data Engineering   Year: 2022   Pages: 1-13
JOURNAL ARTICLE

Robust $\ell_{2,0}$-penalized rank regression for high-dimensional group selection

Jing Lv, Chaohui Guo

Journal: Statistics and Computing   Year: 2025   Vol: 35 (3)