JOURNAL ARTICLE

Non-Negative Spectral Learning and Sparse Regression-Based Dual-Graph Regularized Feature Selection

Ronghua Shang, Wenbing Wang, Rustam Stolkin, Licheng Jiao

Year: 2017 | Journal: IEEE Transactions on Cybernetics | Vol: 48 (2) | Pages: 793-806 | Publisher: Institute of Electrical and Electronics Engineers

Abstract

Feature selection is an important approach for reducing the dimensionality of high-dimensional data. In recent years, many feature selection algorithms have been proposed, but most of them exploit information only from the data space. They often neglect useful information contained in the feature space and do not make full use of the characteristics of the data. To overcome this problem, this paper proposes a new unsupervised feature selection algorithm, called non-negative spectral learning and sparse regression-based dual-graph regularized feature selection (NSSRD). NSSRD is based on the feature selection framework of joint embedding learning and sparse regression, but extends this framework by introducing the feature graph. First, by using low-dimensional embedding learning in both the data space and the feature space, NSSRD simultaneously exploits the geometric information of both spaces. Second, the algorithm imposes non-negative constraints on the low-dimensional embedding matrices of both the feature space and the data space, ensuring that their elements are non-negative. Third, NSSRD unifies the embedding matrix of the feature space and the sparse transformation matrix. To ensure the sparsity of the feature array, the sparse transformation matrix is constrained using the l2,1-norm, so that feature selection can obtain accurate discriminative information from these matrices. Finally, NSSRD uses an iterative and alternating updating rule to optimize the objective function, enabling it to select representative features more quickly and efficiently. This paper presents the objective function, the iterative updating rules, and a proof of convergence. Experimental results show that NSSRD is significantly more effective than several other feature selection algorithms from the literature on a variety of test data.
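To make the l2,1-norm regularization and the resulting feature-ranking step concrete, here is a minimal illustrative sketch (not the authors' implementation; the matrix `W` and the selection count `k` are hypothetical). In sparse-regression feature selection, the transformation matrix W has one row per feature; the l2,1-norm is the sum of the l2-norms of its rows, and features are typically ranked by the l2-norms of their corresponding rows.

```python
import numpy as np

def l21_norm(W):
    """l2,1-norm of W: the sum of the l2-norms of its rows."""
    return np.sum(np.sqrt(np.sum(W ** 2, axis=1)))

def rank_features(W, k):
    """Indices of the k features (rows of W) with the largest row l2-norms."""
    row_norms = np.sqrt(np.sum(W ** 2, axis=1))
    return np.argsort(-row_norms)[:k]

# Toy example: 5 features mapped to a 2-dimensional embedding.
W = np.array([[0.90, 0.10],
              [0.00, 0.00],   # an all-zero row contributes nothing to the l2,1-norm
              [0.40, 0.40],
              [0.05, 0.02],
              [0.70, 0.60]])

print(l21_norm(W))          # sum of the five row norms
print(rank_features(W, 2))  # the two features with the largest row norms
```

Because the l2,1-norm penalizes whole rows rather than individual entries, minimizing it drives entire rows of W toward zero, which is what makes the row norms a meaningful importance score for discarding features.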

Keywords:
Feature selection, Artificial intelligence, Pattern recognition, Regression, Graph, Machine learning, Sparse approximation, Dictionary learning

Metrics

Cited By: 141
FWCI (Field Weighted Citation Impact): 6.74
Refs: 74
Citation Normalized Percentile: 0.97 (in top 1% and top 10%)

Topics

Face and Expression Recognition
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Machine Learning and ELM
Physical Sciences →  Computer Science →  Artificial Intelligence
Remote-Sensing Image Classification
Physical Sciences →  Engineering →  Media Technology