Knowledge graphs (KGs) represent information in the form of graphs. Because of their heterogeneous and collaborative construction, KGs remain incomplete, with missing links between entity pairs. The KG embedding paradigm attempts to reduce this incompleteness by representing graphs in a vector space. Attention-based learning provides a natural way to learn KG representations by deriving, for each central entity, attention weights over its neighbors. The attention mechanism decides which parts to attend to more strongly in order to collect the most important information. This paper analyzes attention-based models used to learn the latent structure of KGs and to predict missing links for KG completion; many of these are neural-network-based methods. Experimental results show that DisenKGAT and HolE outperform the remaining methods on standard evaluation metrics on the two benchmark datasets FB15k-237 and WN18RR, respectively.
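The neighborhood-attention idea described above can be sketched as follows. This is a minimal illustrative example in the style of graph attention (not the exact formulation of any model surveyed here); the function name, the LeakyReLU slope, and the random inputs are all assumptions made for the sketch.

```python
import numpy as np

def attention_aggregate(h_center, h_neighbors, W, a):
    """Aggregate neighbor embeddings into a new central-entity
    representation using softmax attention (GAT-style sketch).

    h_center:    (d,)   central entity embedding
    h_neighbors: (n, d) neighbor embeddings
    W:           (d, d) shared projection matrix
    a:           (2d,)  attention parameter vector
    """
    zc = W @ h_center                # project the central entity
    zn = h_neighbors @ W.T           # project each neighbor
    # unnormalized score per neighbor: LeakyReLU(a^T [z_c || z_i])
    scores = np.array([np.concatenate([zc, z]) @ a for z in zn])
    scores = np.where(scores > 0, scores, 0.2 * scores)   # LeakyReLU
    alpha = np.exp(scores - scores.max())
    alpha = alpha / alpha.sum()      # softmax over the neighborhood
    # weighted sum of projected neighbors = new central representation
    return alpha, alpha @ zn

rng = np.random.default_rng(0)
d, n = 4, 3
alpha, h_new = attention_aggregate(rng.normal(size=d),
                                   rng.normal(size=(n, d)),
                                   rng.normal(size=(d, d)),
                                   rng.normal(size=2 * d))
print(alpha)        # attention weights over the 3 neighbors (sum to 1)
print(h_new.shape)  # aggregated embedding has dimension d
```

The softmax ensures the weights form a distribution over the neighborhood, so neighbors carrying more relevant information for the central entity contribute proportionally more to its updated representation.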