Wen-Jue He, Zheng Zhang, Xiaofeng Zhu
Efficiently learning informative yet compact representations from heterogeneous data remains challenging in incomplete multi-view clustering (IMC). Prevalent resource-efficient IMC models excel at constructing small sets of anchors for fast similarity learning and data partitioning. However, existing anchor-based methods still share two deficiencies: 1) unstable and less informative anchor generation caused by random anchor selection or unguided learning, and 2) imbalanced coherence and versatility of the learned anchors across different views. To mitigate these issues, we propose a novel dual-correlation-guided anchor learning (DCGA) method for scalable IMC, which learns informative anchor spaces that simultaneously incorporate both intra-view and inter-view correlations. Specifically, the intra-view anchor space is constructed and stabilized by compressing the view-specific data under the guidance of the conceived anchors-as-a-bottleneck (A3B) strategy, supported by a rigorous theoretical analysis. Notably, we build, for the first time, an unsupervised anchor learning scheme for incomplete multi-view data guided by the bottleneck of the information flow under the well-defined information bottleneck (IB) principle. As such, our model can simultaneously eliminate information redundancy and preserve the versatile knowledge derived from each view. Moreover, to ensure the coherence of the learned anchors, an informative anchor constraint (IAC) is imposed to align the anchor spaces across different views. Extensive experiments on seven datasets against 11 state-of-the-art IMC methods validate the effectiveness and efficiency of our method. Code is available at https://github.com/DarrenZZhang/TNNLS25-DCGA.
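To make the anchor idea concrete, the sketch below shows a generic anchor-graph pipeline for incomplete multi-view data: each observed sample is softly assigned to a small set of per-view anchors, missing samples are masked out, and the concatenated anchor graphs are embedded via a truncated SVD for downstream k-means. This is a minimal illustration of anchor-based scalable IMC in general, not the authors' DCGA method; the function names, the Gaussian-kernel assignment, and the SVD embedding step are all assumptions chosen for clarity.

```python
import numpy as np

def anchor_graph(X, anchors, sigma=1.0):
    """Soft assignment Z (n_obs x m): row-stochastic similarity between
    observed samples X and a small anchor set (Gaussian kernel)."""
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)
    logits = -d2 / (2.0 * sigma ** 2)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    Z = np.exp(logits)
    return Z / Z.sum(axis=1, keepdims=True)

def incomplete_multiview_embedding(views, masks, anchors_per_view, k):
    """Build a masked anchor graph per view, stack them, and return a
    rank-k SVD embedding. views[v] holds only the observed rows of view v;
    masks[v] is a boolean vector over all n samples marking those rows.
    Samples unobserved in a view contribute all-zero rows for that view."""
    blocks = []
    for X, mask, A in zip(views, masks, anchors_per_view):
        Z = np.zeros((mask.shape[0], A.shape[0]))
        Z[mask] = anchor_graph(X, A)
        blocks.append(Z)
    Z_all = np.hstack(blocks)          # n x (sum of anchor counts)
    U, s, _ = np.linalg.svd(Z_all, full_matrices=False)
    return U[:, :k]                    # feed to k-means for clustering
```

Because each view is summarized by only m anchors (m << n), the stacked graph stays thin and the SVD costs O(n * (sum m_v)^2), which is what makes anchor-based IMC scale to large n.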
Ao Li, Xiangmin Xu, Tianyu Gao, Dehua Miao, Fengwei Gu, Xinwang Liu
Luyan Cui, Huibing Wang, Yawei Chen, Mingze Yao, Xianping Fu, Jiqing Zhang
Xingfeng Li, Yinghui Sun, Quansen Sun, Zhenwen Ren, Yuan Sun
Peng Song, Jinshuai Mu, Yuanbo Cheng, Zhao-Hu Liu, Wenming Zheng
Jun Yin, Runcheng Cai, Shiliang Sun