Temporal Knowledge Graph (TKG) reasoning aims to infer missing facts by leveraging abundant historical information, modeling the dynamic interactions between entities and relations over time. However, existing methods often overlook potential correlations among relations and struggle to predict previously unseen events. To address these challenges, we propose a novel framework, Time-aware Fact Diffusion with Contrastive Learning for Temporal Knowledge Reasoning (TFDCL), to improve TKG completion. Specifically, TFDCL incorporates a relation-guided filtering mechanism to enhance structural modeling when capturing both short-term and long-term historical features. Moreover, a Time-aware Fact Diffusion module is introduced, which injects noise into fact-level representations and progressively denoises them, thereby improving the model's generalization to unseen events. Additionally, a contrastive learning objective is employed to align short-term and long-term representations, encouraging semantically similar events to lie closer in the embedding space and better capturing the dynamic evolution of knowledge graphs. Extensive experiments on four benchmark datasets demonstrate that TFDCL consistently outperforms state-of-the-art baselines across multiple evaluation metrics, confirming its effectiveness and robustness.
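The two core mechanisms named in the abstract, a diffusion-style noising of fact-level representations and a contrastive objective aligning short-term and long-term views, can be sketched in generic form. The following is a minimal illustrative sketch, not the paper's actual implementation: the DDPM-style forward noising and the InfoNCE loss are standard formulations assumed here, and all tensor shapes and the `alpha_bar` schedule value are arbitrary toy choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_noise(x0, alpha_bar):
    # Standard DDPM-style forward process (assumed form, not from the paper):
    # x_t = sqrt(alpha_bar) * x_0 + sqrt(1 - alpha_bar) * eps,  eps ~ N(0, I)
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps

def info_nce(short_emb, long_emb, tau=0.1):
    # Contrastive alignment of two views: row i of each matrix is the
    # short-/long-term representation of the same event (the positive pair);
    # all other rows in the batch serve as negatives.
    s = short_emb / np.linalg.norm(short_emb, axis=1, keepdims=True)
    l = long_emb / np.linalg.norm(long_emb, axis=1, keepdims=True)
    logits = s @ l.T / tau                       # pairwise cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))           # positives sit on the diagonal

# Toy usage: noise a batch of fact embeddings, then score the alignment
# between the clean (long-term) and noised (short-term) views.
facts = rng.standard_normal((8, 16))
noisy = forward_noise(facts, alpha_bar=0.7)
loss = info_nce(facts, noisy)
```

In a trained model, a learned denoiser would recover the clean representation from `noisy`, and minimizing the contrastive loss would pull each event's two views together while pushing apart different events in the batch.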
Renning Pang, Yao Liu, Yanglei Gan, Tingting Dai, Yashen Wang, Xiaojun Shi, Tian Lan, Qiao Liu