Graph Convolutional Networks (GCNs) have recently received considerable attention owing to their ability to handle graph-structured data. To improve the expressive power of GCNs, several recent studies have concentrated on stacking multiple layers, as in convolutional neural networks. However, simply stacking multiple GCN layers leads to over-fitting and over-smoothing. To integrate deeper information and address these problems, this paper proposes Multi-Hop Diffusion-Based Graph Convolutional Networks (MD-GCNs), a method that aggregates multi-hop neighbors of varying orders into a single layer, allowing each GCN layer to capture long-distance interactions between remote nodes. To compute the weights between multi-hop neighbor nodes within the same layer, the Multi-Hop Diffusion (MD) mechanism uses graph diffusion to propagate the weights, thereby enlarging the receptive field of each GCN layer. On this basis, we introduce the MD-GCN architecture, which can be stacked over multiple layers while retaining expressive power. Experimental results on node classification tasks in both transductive and inductive learning settings demonstrate the superiority of the proposed method.
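The following is a minimal, illustrative sketch of the idea described above, not the paper's exact formulation: it approximates a generalized graph diffusion matrix S = Σ_k θ_k T^k (here with PPR-style coefficients θ_k = α(1−α)^k, an assumption), where T is the symmetrically normalized adjacency with self-loops, and uses S to aggregate multi-hop neighbors within a single layer.

```python
import numpy as np

def normalized_adjacency(A):
    """T = D^{-1/2} (A + I) D^{-1/2}, the normalized adjacency with self-loops."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def multi_hop_diffusion(A, K=3, alpha=0.15):
    """Truncated diffusion S = sum_{k=0}^{K} theta_k T^k.

    PPR-style coefficients theta_k = alpha * (1 - alpha)^k are an
    illustrative assumption; MD-GCNs define their own weighting.
    """
    T = normalized_adjacency(A)
    S = np.zeros_like(T)
    Tk = np.eye(A.shape[0])
    for k in range(K + 1):
        S += alpha * (1 - alpha) ** k * Tk  # accumulate theta_k * T^k
        Tk = Tk @ T
    return S

def md_gcn_layer(A, X, W, K=3):
    """One diffusion-based layer: multi-hop aggregation, linear map, ReLU."""
    S = multi_hop_diffusion(A, K)
    return np.maximum(S @ X @ W, 0.0)

# Toy 4-node path graph: each layer now mixes information from up to
# K hops away, instead of only immediate neighbors.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 5)   # node features
W = np.random.randn(5, 2)   # layer weights
H = md_gcn_layer(A, X, W)
print(H.shape)  # (4, 2)
```

Because the multi-hop mixing happens inside one layer, a shallow stack of such layers can reach distant nodes without the deep stacking that causes over-smoothing.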