Siyuan Cheng, Ningyu Zhang, Bozhong Tian, Xi Chen, Qingbin Liu, Huajun Chen
Recent decades have witnessed the empirical success of framing Knowledge Graph (KG) embeddings via language models. However, language model-based KG embeddings are usually deployed as static artifacts, making them difficult to modify after deployment without re-training. To address this issue, we propose a new task of editing language model-based KG embeddings. This task is designed to facilitate rapid, data-efficient updates to KG embeddings without compromising performance on other aspects. We build four new datasets: E-FB15k237, A-FB15k237, E-WN18RR, and A-WN18RR, and evaluate several knowledge editing baselines, demonstrating the limited ability of previous models to handle this challenging task. We further propose a simple yet strong baseline, dubbed KGEditor, which utilizes additional parametric layers of a hypernetwork to edit/add facts. Our comprehensive experimental results reveal that KGEditor excels at updating specific facts without impacting overall performance, even with limited training resources. Code and datasets will be available at https://github.com/AnonymousForPapers/DeltaKG.
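To make the hypernetwork idea concrete, below is a minimal, hypothetical sketch of hypernetwork-based fact editing; it is not the paper's actual KGEditor architecture. The assumptions are: fact embeddings of dimension `d`, a small ReLU encoder over the concatenated (head, relation, tail) embeddings, and a low-rank weight delta predicted for an extra adapter layer while the base KG-embedding weights stay frozen. All names (`predict_delta`, `W_enc`, `W_u`, `W_v`) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, hidden, rank = 16, 32, 4  # illustrative sizes, not from the paper

# Randomly initialised hypernetwork parameters (illustrative only).
W_enc = rng.normal(scale=0.1, size=(3 * d, hidden))
W_u = rng.normal(scale=0.1, size=(hidden, d * rank))
W_v = rng.normal(scale=0.1, size=(hidden, rank * d))

def predict_delta(head, rel, tail):
    """Map one fact to a low-rank delta for an adapter weight matrix."""
    x = np.concatenate([head, rel, tail])   # (3d,) fact representation
    h = np.maximum(W_enc.T @ x, 0.0)        # ReLU encoder, (hidden,)
    U = (W_u.T @ h).reshape(d, rank)        # left low-rank factor
    V = (W_v.T @ h).reshape(rank, d)        # right low-rank factor
    return U @ V                            # (d, d) rank-limited update

head, rel, tail = (rng.normal(size=d) for _ in range(3))
delta = predict_delta(head, rel, tail)
print(delta.shape)  # (16, 16)
```

The low-rank parameterisation keeps the edit data-efficient: only the small hypernetwork is trained per edit, and the predicted delta touches a single adapter layer rather than the full embedding model.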