JOURNAL ARTICLE

Unsupervised feature selection by nonnegative sparsity adaptive subspace learning

Abstract

Given the high dimensionality of the original data, dimensionality reduction becomes a necessary step in data processing. In this study, a novel unsupervised feature selection model is proposed that casts unsupervised feature selection as nonnegative subspace learning. To learn a subspace that better indicates the selected features, a nonnegative sparsity-adaptive subspace learning framework is proposed, which adapts the sparsity through a weighted ℓ2,1 model whose weights are defined by multi-stage support detection. An approach is then provided to solve this weighted ℓ2,1-constrained non-convex problem, yielding the Non-negative Sparsity Adaptive Subspace Learning (NSASL) algorithm. Experiments conducted on real-world datasets verify the superiority of the proposed method over seven state-of-the-art unsupervised feature selection algorithms.
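To make the weighted ℓ2,1 regularizer concrete, the following is a minimal illustrative sketch (not the authors' implementation): the weighted ℓ2,1 norm sums weighted ℓ2 norms of the rows of the projection matrix, and a simplified one-stage stand-in for support detection assigns zero weight to rows already identified as support, so those features are no longer penalized. The threshold rule and function names here are assumptions for illustration only.

```python
import numpy as np

def weighted_l21_norm(W, weights):
    """Weighted l2,1 norm: sum_i weights[i] * ||W[i, :]||_2.

    Rows of W correspond to features; a small row norm means the
    feature contributes little and is a candidate for removal.
    """
    row_norms = np.linalg.norm(W, axis=1)
    return float(np.sum(weights * row_norms))

def support_detection_weights(W, threshold):
    """Hypothetical single-stage simplification of support detection:
    rows whose l2 norm exceeds `threshold` are treated as detected
    support and receive weight 0 (no longer penalized); all other
    rows keep weight 1. The paper's multi-stage scheme refines such
    weights iteratively."""
    row_norms = np.linalg.norm(W, axis=1)
    return np.where(row_norms > threshold, 0.0, 1.0)

# Toy example: one strong feature row and two weak ones.
W = np.array([[3.0, 4.0],   # row norm 5.0 -> detected support
              [0.1, 0.0],   # row norm 0.1
              [0.0, 0.2]])  # row norm 0.2
w = support_detection_weights(W, threshold=1.0)
print(w)                        # weights: [0. 1. 1.]
print(weighted_l21_norm(W, w))  # 0*5.0 + 1*0.1 + 1*0.2
```

In a full solver, these adaptive weights would be recomputed between optimization stages so that the penalty concentrates on rows still believed to be irrelevant.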

Keywords:
Subspace learning, Artificial intelligence, Pattern recognition, Computer science, Dimensionality reduction, Feature selection, Curse of dimensionality, Unsupervised learning, Machine learning, Mathematics

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 21
Citation Normalized Percentile: 0.11

Topics

Face and Expression Recognition
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Machine Learning and ELM
Physical Sciences →  Computer Science →  Artificial Intelligence
Text and Document Classification Technologies
Physical Sciences →  Computer Science →  Artificial Intelligence