JOURNAL ARTICLE

Label-Aware Auxiliary Learning for Dialogue State Tracking

Abstract

Dialogue State Tracking (DST) is an essential component of task-oriented dialogue systems. Many existing methods utilize external dialogue datasets to improve the performance of DST models. Unlike these methods, in this paper we propose Label-Aware Auxiliary Learning for DST (LAL-DST), which instead exploits the abundant internal information of the target DST dataset. We design label-aware auxiliary tasks in which we apply noising functions to either the dialogue history or the belief state label and take their concatenation as input; the goal of each task is to restore the corrupted context. During training, we first further train a large pre-trained language model on the auxiliary tasks, then fine-tune it on DST. Experimental results empirically demonstrate the effect of LAL-DST through the performance improvements it brings on MultiWOZ2.0 and WOZ.
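The auxiliary-task construction described in the abstract can be sketched in code. This is a minimal illustration, not the authors' implementation: the token-level masking noise, the `[SEP]` separator, and the flat slot-value rendering of the belief state are all assumptions made here for concreteness; the paper only specifies that one side of the (history, belief state) pair is corrupted and the model is trained to restore it.

```python
import random

MASK = "[MASK]"  # hypothetical mask symbol; the actual noising function may differ

def corrupt(tokens, mask_prob=0.15, rng=None):
    """One possible noising function: randomly replace tokens with a mask symbol."""
    rng = rng or random.Random(0)
    return [MASK if rng.random() < mask_prob else t for t in tokens]

def make_auxiliary_example(history_tokens, belief_tokens, corrupt_history=True):
    """Build one label-aware auxiliary example.

    The model's input is the corrupted side concatenated with the clean side,
    and its restoration target is the original (uncorrupted) side.
    """
    if corrupt_history:
        inp = corrupt(history_tokens) + ["[SEP]"] + belief_tokens
        target = history_tokens
    else:
        inp = history_tokens + ["[SEP]"] + corrupt(belief_tokens)
        target = belief_tokens
    return inp, target

# Toy dialogue turn with an illustrative belief-state rendering.
history = "user : book a cheap hotel in the north".split()
belief = "hotel - price = cheap ; hotel - area = north".split()
inp, target = make_auxiliary_example(history, belief, corrupt_history=False)
```

In this sketch the belief state label is corrupted while the clean dialogue history is kept as conditioning context, so restoring the label forces the model to ground slot values in the dialogue; flipping `corrupt_history` gives the symmetric task.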


Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 1.28
References: 24
Citation Normalized Percentile: 0.75

Topics

Speech and dialogue systems (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Speech Recognition and Synthesis (Physical Sciences → Computer Science → Artificial Intelligence)