Knowledge graphs (KGs) are extremely useful resources for a variety of applications. However, as modern KGs grow large and continue to expand, knowledge graph embeddings (KGE), which represent the entities and relations of a KG as 32-bit floating-point vectors, become increasingly expensive in terms of memory. To this end, in this paper, we propose a general framework to compress the embeddings from real-valued vectors to binary ones while preserving the inherent information of KGs. Specifically, the proposed framework utilizes relational graph auto-encoders as well as the Gumbel-Softmax trick to obtain the compressed representations. Our framework can be applied to a number of existing KGE models; in particular, we extend the state-of-the-art models TransE, DistMult, and ConvE in this paper. Finally, extensive experiments show that the proposed method reduces the memory size of the embeddings by 92% while incurring a loss of no more than 5% on the knowledge graph completion task.
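The Gumbel-Softmax trick referenced above makes discrete (e.g. binary) choices differentiable by adding Gumbel noise to logits and applying a temperature-scaled softmax. The sketch below is an illustrative NumPy version, not the paper's implementation; the two-logits-per-bit setup and all function names are assumptions made for the example.

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Sample from the Gumbel-Softmax (Concrete) distribution.

    Adds Gumbel(0, 1) noise to the logits and applies a softmax with
    temperature tau; as tau -> 0 the samples approach one-hot vectors,
    which is how a coordinate can be driven toward a binary value while
    remaining differentiable during training.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise via the inverse-CDF trick: -log(-log(U)).
    g = -np.log(-np.log(rng.uniform(1e-10, 1.0, size=logits.shape)))
    y = (logits + g) / tau
    y -= y.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)

# Illustrative setup: each bit of the compressed code is modeled as a
# 2-way (0/1) choice, i.e. one pair of logits per bit.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 2))                 # 4 bits, 2 classes each
soft = gumbel_softmax(logits, tau=5.0, rng=rng)  # high tau: smooth samples
hard = gumbel_softmax(logits, tau=0.1, rng=rng)  # low tau: near one-hot
bits = hard.argmax(axis=-1)                      # the resulting binary code
```

At a high temperature the samples stay smooth and gradients flow easily; annealing the temperature toward zero pushes each coordinate toward a hard 0/1 decision, yielding the binary embedding at the end of training.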