JOURNAL ARTICLE

Task-Wise Prompt Query Function for Rehearsal-Free Continual Learning

Abstract

Continual learning (CL) aims to enable a model to retain knowledge of old tasks while learning new ones. One effective approach to CL is data rehearsal. However, rehearsal increases data-storage costs and cannot be used when data from old tasks is unavailable. Recently, with the emergence of large-scale pre-trained transformer models, prompt-based methods have become an alternative to data rehearsal. These methods rely on a query mechanism to select prompts and have demonstrated resistance to forgetting in rehearsal-free CL scenarios. However, they generate prompts in a task-wise way while querying samples in an instance-wise way, and they usually use the frozen pre-trained model directly as the encoding function for generating queries. This can lead to retrieval errors and failure to match the correct prompts. In contrast, we propose a new task-wise prompt query function that continues to learn as tasks progress, thereby avoiding the pre-trained model's inability to match appropriate sample-prompt pairs. Our approach improves on current state-of-the-art methods, as verified by our experimental results on a series of datasets.
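The query mechanism described above can be sketched in a few lines: a frozen encoder maps a sample to a query vector, which is matched against learnable per-task prompt keys by cosine similarity; the paper's proposal replaces the frozen query path with a function that keeps learning per task. This is a minimal NumPy illustration of the idea only; all names (`W_frozen`, `task_adapters`, etc.) are hypothetical and the random projection merely stands in for a pre-trained transformer, not the authors' actual architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
D_FEAT, D_KEY, N_TASKS = 8, 4, 3

# Frozen "pre-trained encoder": a fixed random projection standing in
# for a pre-trained transformer's feature extractor (hypothetical).
W_frozen = rng.standard_normal((D_FEAT, D_KEY))

# One learnable key per task-specific prompt, as in key-query matching
# schemes such as L2P (names here are illustrative).
prompt_keys = rng.standard_normal((N_TASKS, D_KEY))

def cosine(a, b):
    # Cosine similarity with a small epsilon for numerical safety.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

def instance_wise_query(x):
    """Baseline: the query comes straight from the frozen encoder."""
    q = x @ W_frozen
    sims = np.array([cosine(q, k) for k in prompt_keys])
    return int(np.argmax(sims))  # index of the selected prompt

# Task-wise query function: a per-task adapter on top of the frozen
# encoder, updated as each new task arrives. Initialized to identity,
# so before any task-specific training it matches the baseline.
task_adapters = [np.eye(D_KEY) for _ in range(N_TASKS)]

def task_wise_query(x, task_id):
    """Proposed idea (sketch): query through a learnable, task-wise map."""
    q = (x @ W_frozen) @ task_adapters[task_id]
    sims = np.array([cosine(q, k) for k in prompt_keys])
    return int(np.argmax(sims))
```

With identity adapters the two functions select the same prompt; in training, each `task_adapters[t]` would be optimized so queries for task `t` land on that task's key, which is the mismatch the paper argues a purely frozen encoder cannot correct.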

Keywords:
Computer science; Forgetting; Task; Artificial intelligence; Machine learning; Transformer; Function

Metrics

Cited by: 1
FWCI (Field-Weighted Citation Impact): 0.64
References: 27
Citation Normalized Percentile: 0.63

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition