JOURNAL ARTICLE

Prompting to Prompt for Rehearsal-Free Class Incremental Learning

Abstract

Class incremental learning (CIL) aims to address catastrophic forgetting while continually learning new tasks. Recently, prompt tuning techniques based on vision transformers (ViT) have achieved promising results in rehearsal-free CIL. To alleviate forgetting, representative methods use a query-key mechanism to generate prompts and attach them to a frozen pre-trained ViT. However, these methods neglect the effect of the query, and the learning capacity of the model is limited by unsuitable prompts. In this paper, we propose a new approach called Prompting to Prompt (P2P). Instead of using a task-independent query function, we learn sample queries together with prompts in response to the shift of data distribution in CIL. P2P better separates classes across tasks because the generated prompts are effective and more discriminative sample features can be extracted. Moreover, the whole training process is end-to-end, and queries are determined by the prompts themselves, which avoids additional parameters. P2P improves the plasticity of the model while maintaining strong resistance to forgetting over long task sequences. Experiments show that our approach achieves state-of-the-art results with even fewer parameters.
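To make the query-key mechanism mentioned above concrete, the following is a minimal sketch of how prior prompt-pool methods select prompts: a query vector (in prior work, a feature from the frozen pre-trained ViT) is matched against learnable keys by cosine similarity, and the top-k prompts are retrieved. All names (`prompt_keys`, `prompt_pool`, `select_prompts`) and sizes are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
embed_dim, pool_size, top_k = 8, 5, 2

# Learnable parameters in a prompt-pool method (random here for illustration):
# one key per prompt, and the prompt pool itself.
prompt_keys = rng.normal(size=(pool_size, embed_dim))
prompt_pool = rng.normal(size=(pool_size, embed_dim))

def select_prompts(query: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Return the top-k prompts whose keys best match the query (cosine sim)."""
    q = query / np.linalg.norm(query)
    k = prompt_keys / np.linalg.norm(prompt_keys, axis=1, keepdims=True)
    sims = k @ q                              # cosine similarity per key
    idx = np.argsort(sims)[::-1][:top_k]      # indices of the k best keys
    return prompt_pool[idx], idx

# In prior methods the query comes from a frozen, task-independent encoder;
# P2P instead learns the query end-to-end together with the prompts.
query = rng.normal(size=embed_dim)
prompts, idx = select_prompts(query)
print(prompts.shape)
```

The selected prompts would then be prepended to the ViT's input tokens; the abstract's criticism is that a frozen, task-independent query function can retrieve unsuitable prompts as the data distribution shifts.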

Keywords:
Forgetting, computer science, discriminative model, task, artificial intelligence, machine learning, process (computing), class, sample, cognitive psychology

Metrics

Cited by: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 25
Citation normalized percentile: 0.03

Topics

Domain Adaptation and Few-Shot Learning (Physical Sciences → Computer Science → Artificial Intelligence)
Multimodal Machine Learning Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Human Pose and Action Recognition (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)