JOURNAL ARTICLE

Multiple Manifold Regularized Sparse Coding for Multi-View Image Clustering

Abstract

Multi-view clustering has received increasing attention in many applications, where different views of objects can provide complementary information to each other. Existing approaches to multi-view clustering mainly focus on extending Non-negative Matrix Factorization (NMF) by enforcing consistency constraints over the coefficient matrices from different views in order to preserve their consensus. In this paper, we argue that it is more reasonable to exploit high-level manifold consensus rather than low-level coefficient-matrix consensus to better capture the underlying clustering structure of the data. Moreover, it is also more effective to adopt the sparse coding framework, instead of the NMF framework, to deal with the sparsity issue. To this end, we propose a novel approach, named Multiple Manifold Regularized Sparse Coding (MMRSC). Experimental results on two publicly available real-world image datasets demonstrate that our proposed approach significantly outperforms state-of-the-art approaches on the multi-view image clustering task.
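To make the idea concrete, a single view's manifold-regularized sparse coding objective typically combines a reconstruction term, an L1 sparsity term, and a graph-Laplacian smoothness term, f(S) = ||X − DS||²_F + α||S||₁ + β·tr(S L Sᵀ). The sketch below is a hypothetical illustration (not the authors' code); the function names, the k-NN graph construction, and the parameter values are all assumptions for exposition.

```python
import numpy as np

def knn_laplacian(X, k=2):
    """Unnormalized graph Laplacian L = D - W of a binary k-NN affinity
    graph over the samples (columns of X). Illustrative choice only."""
    n = X.shape[1]
    # Pairwise squared Euclidean distances between sample columns.
    d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]   # skip the point itself
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)                  # symmetrize the affinity
    return np.diag(W.sum(axis=1)) - W

def objective(X, D, S, L, alpha=0.1, beta=0.1):
    """Reconstruction + sparsity + manifold-smoothness objective for one view."""
    recon = np.linalg.norm(X - D @ S, 'fro') ** 2   # ||X - DS||_F^2
    sparse = alpha * np.abs(S).sum()                # alpha * ||S||_1
    smooth = beta * np.trace(S @ L @ S.T)           # beta * tr(S L S^T)
    return recon + sparse + smooth

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))   # one view: 5-dim features, 8 samples
D = rng.standard_normal((5, 4))   # dictionary with 4 atoms
S = rng.standard_normal((4, 8))   # sparse codes for the 8 samples
L = knn_laplacian(X)
print(objective(X, D, S, L))
```

In the multi-view setting the paper targets, one such Laplacian is built per view and the regularizers are combined so that all views share a consensus manifold structure; the single-view objective above is only the building block.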

Keywords:
Cluster analysis, Non-negative matrix factorization, Neural coding, Manifold (mathematics), Sparse matrix, Artificial intelligence, Sparse coding, Clustering high-dimensional data, Pattern recognition, Matrix decomposition, Data mining, Eigenvalues and eigenvectors

Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 0.00
References: 10
Citation Normalized Percentile: 0.12

Topics

Face and Expression Recognition (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Sparse and Compressive Sensing Techniques (Physical Sciences → Engineering → Computational Mechanics)
Remote-Sensing Image Classification (Physical Sciences → Engineering → Media Technology)