Multiple kernel learning (MKL) aims to find an optimal consistent kernel function. In the hierarchical multiple kernel clustering (HMKC) algorithm, sample features are extracted layer by layer from a high-dimensional space so as to maximize the retention of effective information, but the information interaction between layers is ignored. In this model, only the corresponding nodes in adjacent layers exchange information; all other nodes remain isolated, and if full connectivity is adopted instead, the diversity of the final consistent matrix is reduced. Therefore, this paper proposes a hierarchical multiple kernel K-Means algorithm based on sparse connectivity (SCHMKKM), which controls the assignment matrix through a sparsity rate to achieve sparse connections, thereby locally fusing the features obtained by the distillation of information between layers. Finally, we perform cluster analysis on multiple data sets and experimentally compare the algorithm with the fully connected hierarchical multiple kernel K-Means (FCHMKKM) algorithm. The results show that fusing more discriminative information is beneficial for learning a better consistent partition matrix, and that the sparse-connection fusion strategy outperforms the full-connection strategy.
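The abstract describes controlling an assignment matrix with a sparsity rate so that each node fuses information from only a few nodes in the previous layer. As a minimal illustrative sketch (not the paper's exact rule), the function below keeps only the largest-magnitude entries in each column of a hypothetical assignment matrix `H`, zeroing the rest according to the given sparsity rate:

```python
import numpy as np

def sparsify_assignment(H, sparsity_rate):
    """Illustrative sparse-connection sketch: for each column of the
    assignment matrix H, keep only the top (1 - sparsity_rate) fraction
    of entries by magnitude and zero out the rest, so each next-layer
    node fuses information from a sparse subset of previous-layer nodes.
    This is an assumed mechanism for illustration, not the paper's
    published update rule."""
    n = H.shape[0]
    # number of connections kept per column (at least one)
    k = max(1, int(round(n * (1.0 - sparsity_rate))))
    S = np.zeros_like(H)
    for j in range(H.shape[1]):
        # indices of the k largest-magnitude entries in column j
        idx = np.argsort(np.abs(H[:, j]))[-k:]
        S[idx, j] = H[idx, j]
    return S

rng = np.random.default_rng(0)
H = rng.random((6, 3))          # toy 6-node -> 3-node assignment matrix
S = sparsify_assignment(H, sparsity_rate=0.5)
print((S != 0).sum(axis=0))     # connections kept per column
```

With a sparsity rate of 0.5 on six previous-layer nodes, each column retains three nonzero connections; a rate of 0 recovers the fully connected (FCHMKKM-style) case.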