Abstract

Large-scale pre-trained neural language models have helped achieve state-of-the-art performance on Dialogue State Tracking (DST) tasks. A line of existing work models the semantic correlation between the dialogue context and each (domain, slot) pair encoded by BERT and makes predictions from it. Despite their effectiveness, these approaches ignore the fact that there is no perfect semantic correspondence between a (domain, slot) pair and the dialogue context. In this paper, we propose a domain-slot aware contrastive learning framework to address this problem: it introduces three methods to bridge the semantic gap between the dialogue context and the (domain, slot) pair by constructing training sample pairs used to fine-tune the BERT model, which then serves as the encoder of the base DST model. Experiments demonstrate that our method improves the baseline model's performance on the MultiWOZ 2.1 and MultiWOZ 2.4 datasets, yielding competitive results.
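The abstract describes pulling dialogue-context embeddings toward their matching (domain, slot) embeddings via contrastive training pairs. The paper itself does not give the loss; as a minimal sketch, an InfoNCE-style batch contrastive objective over the two sets of embeddings (a common choice for this setup, not necessarily the authors' exact formulation) could look like the following, with `info_nce_loss` a hypothetical helper name:

```python
import numpy as np

def info_nce_loss(context_emb, slot_emb, temperature=0.1):
    """InfoNCE-style contrastive loss: context embedding i is treated as a
    positive match for (domain, slot) embedding i, and the remaining
    in-batch pairs act as negatives."""
    # L2-normalise so dot products become cosine similarities
    c = context_emb / np.linalg.norm(context_emb, axis=1, keepdims=True)
    s = slot_emb / np.linalg.norm(slot_emb, axis=1, keepdims=True)
    logits = c @ s.T / temperature               # (batch, batch) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability for softmax
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives sit on the diagonal: row i should assign mass to column i
    return -np.mean(np.diag(log_probs))
```

Minimising this loss pushes each dialogue context closer to its own (domain, slot) representation than to the other pairs in the batch, which is one standard way of narrowing the semantic gap the abstract refers to.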

Keywords:
Computer science, Context, Domain, Artificial intelligence, Tracking, State, Natural language processing, Machine learning, Speech recognition, Algorithm

Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 1.28
References: 24
Citation Normalized Percentile: 0.75

Topics

Speech and dialogue systems (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Context-Aware Activity Recognition Systems (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)