Ying Zhang, Yangpeng Shen, Gang Xiao, JingHui Peng
The completeness of knowledge graphs is critical to their effectiveness across a wide range of applications. However, existing knowledge graph completion methods face challenges such as difficulty adapting to new entity information, parameter explosion, and limited generalization. To address these issues, this paper proposes a knowledge graph completion framework that integrates large language models with case-based reasoning (CBR-LLM). By combining non-parametric case-based reasoning with the semantic understanding of large language models, the framework not only improves completion accuracy but also significantly enhances generalization across scenarios with missing data. Experimental results demonstrate that CBR-LLM excels at complex reasoning tasks and large-scale missing-data scenarios, providing an efficient and scalable solution for knowledge graph completion.
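The abstract describes a two-stage idea: a non-parametric retrieval step that finds similar known cases, followed by an LLM that completes the missing triple. A minimal sketch of that pipeline, under stated assumptions (the case base, the Jaccard-based similarity, and the prompt format are illustrative inventions, not the paper's actual design):

```python
# Hypothetical case base of complete (head, relation, tail) triples;
# the real CBR-LLM case base and retrieval metric are not specified
# in the abstract -- this is an illustrative sketch only.
CASE_BASE = [
    ("Paris", "capital_of", "France"),
    ("Berlin", "capital_of", "Germany"),
    ("Einstein", "born_in", "Ulm"),
]

def jaccard(a, b):
    """Token-level Jaccard similarity between two relation strings."""
    sa, sb = set(a.lower().split("_")), set(b.lower().split("_"))
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def retrieve_cases(query_relation, k=2):
    """Non-parametric step: rank stored cases by relation similarity."""
    scored = sorted(
        CASE_BASE,
        key=lambda triple: jaccard(triple[1], query_relation),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query_head, query_relation, cases):
    """Assemble a completion prompt from retrieved cases; in the full
    framework this prompt would be sent to a large language model."""
    examples = "\n".join(f"({h}, {r}, {t})" for h, r, t in cases)
    return (
        f"Known facts:\n{examples}\n"
        f"Complete the triple: ({query_head}, {query_relation}, ?)"
    )

cases = retrieve_cases("capital_of")
prompt = build_prompt("Rome", "capital_of", cases)
print(prompt)
```

Because the retrieval step is non-parametric, new entities can be supported by simply appending their triples to the case base, with no retraining of model parameters, which is the generalization benefit the abstract highlights.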
Tiezheng Guo, Qingwen Yang, Chen Wang, Yanyi Liu, Li Pan, Jia-Wei Tang, Dapeng Li, Yingyou Wen
Wenbin Guo, Xin Wang, Jiaoyan Chen, Zhao Li, Zirui Chen
Haohua Zhang, Xinyu Zhu, Xiaoming Zhang
Liang Yao, Jiazhen Peng, Chengsheng Mao, Yuan Luo
Teodoro Baldazzi, Luigi Bellomarini, Emanuel Sallinger