The Transformer has achieved tremendous success in computer vision, natural language processing, and graph representation learning. However, the Transformer cannot effectively encode the topological structure of a graph into the model, whereas this is precisely the strength of the graph convolutional network (GCN). We therefore propose GTGC, a model that combines the Transformer and GCN for graph classification. To this end, we take the output of the multi-head self-attention and feed-forward blocks applied to the graph data and feed it into a graph convolution module. In addition, we incorporate each node's neighbor count into its feature matrix, so that nodes with more neighbors carry greater weight in the attention mechanism. We validate the effectiveness of the model on multiple datasets, including social network and bioinformatics datasets. Experimental results demonstrate that our model achieves competitive accuracy.
Yuquan Gan, Siyu Wu, Chang Su, Nan Xiang, Zhijie Xu, Yushan Pan
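As a rough illustration of the pipeline described in the abstract, the following is a minimal PyTorch sketch of one Transformer-plus-GCN block: node features pass through a standard multi-head self-attention and feed-forward encoder layer, and the resulting features are then propagated with a standard GCN rule over the self-loop-augmented, symmetrically normalized adjacency matrix. All class names, layer sizes, and the mean-pooling readout are illustrative assumptions rather than details taken from the paper, and the degree-based attention weighting mentioned in the abstract is omitted.

```python
import torch
import torch.nn as nn


class GTGCBlock(nn.Module):
    """Hypothetical sketch: Transformer encoder layer followed by a GCN step.

    Names and hyperparameters are illustrative, not taken from the paper.
    """

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        # Multi-head self-attention + feed-forward sub-layers.
        self.encoder = nn.TransformerEncoderLayer(
            d_model=dim, nhead=num_heads, batch_first=True
        )
        # Learnable weight of the graph convolution.
        self.gcn_weight = nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim) node features; adj: (num_nodes, num_nodes) adjacency.
        h = self.encoder(x.unsqueeze(0)).squeeze(0)  # attention + feed-forward

        # Standard GCN propagation: A_hat = D^{-1/2} (A + I) D^{-1/2}.
        a_hat = adj + torch.eye(adj.size(0), device=adj.device)
        deg_inv_sqrt = a_hat.sum(dim=-1).pow(-0.5)
        a_norm = deg_inv_sqrt.unsqueeze(-1) * a_hat * deg_inv_sqrt.unsqueeze(0)
        return torch.relu(a_norm @ self.gcn_weight(h))


if __name__ == "__main__":
    num_nodes, dim = 6, 16
    x = torch.randn(num_nodes, dim)
    adj = (torch.rand(num_nodes, num_nodes) > 0.5).float()
    adj = ((adj + adj.t()) > 0).float()   # symmetrize the random adjacency

    block = GTGCBlock(dim)
    node_emb = block(x, adj)
    graph_emb = node_emb.mean(dim=0)      # mean-pooling readout for graph classification
    print(graph_emb.shape)                # torch.Size([16])
```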