JOURNAL ARTICLE

Residual Prompt Tuning: improving prompt tuning with residual reparameterization

Abstract

Prompt tuning is one of the most successful approaches for parameter-efficient tuning of pre-trained language models. Despite being arguably the most parameter-efficient (tuned soft prompts constitute <0.1% of total parameters), it typically performs worse than other efficient tuning methods and is quite sensitive to hyperparameters. In this work, we introduce Residual Prompt Tuning, a simple and efficient method that significantly improves the performance and stability of prompt tuning. We propose to reparameterize soft prompt embeddings using a shallow network with a residual connection. Our experiments show that Residual Prompt Tuning significantly outperforms prompt tuning across T5-Large, T5-Base and BERT-Base models. Notably, our method achieves a +7-point improvement over prompt tuning on the SuperGLUE benchmark with the T5-Base model and allows the prompt length to be reduced by a factor of 10 without hurting performance. In addition, we show that our approach is robust to the choice of learning rate and prompt initialization, and is effective in few-shot settings.
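The residual reparameterization described in the abstract can be sketched in a few lines of PyTorch. This is a minimal illustration, not the authors' implementation: the two-layer bottleneck MLP, the ReLU activation, and the dimensions (prompt length 10, embedding size 768, bottleneck 128) are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class ResidualPromptEncoder(nn.Module):
    """Sketch of residual reparameterization for soft prompts.

    A trainable prompt matrix P is passed through a shallow network phi,
    and the output is combined with P via a residual connection:
        P' = phi(P) + P
    The architecture of phi here (bottleneck MLP with ReLU) is an
    illustrative assumption, not taken from the paper.
    """

    def __init__(self, prompt_len=10, embed_dim=768, bottleneck=128):
        super().__init__()
        # Trainable soft prompt embeddings P: (prompt_len, embed_dim)
        self.prompt = nn.Parameter(torch.randn(prompt_len, embed_dim) * 0.02)
        # Shallow reparameterization network phi
        self.mlp = nn.Sequential(
            nn.Linear(embed_dim, bottleneck),
            nn.ReLU(),
            nn.Linear(bottleneck, embed_dim),
        )

    def forward(self):
        # Residual connection: reparameterized prompts P' = phi(P) + P
        return self.mlp(self.prompt) + self.prompt

encoder = ResidualPromptEncoder()
prompts = encoder()  # shape (10, 768); prepended to the input embeddings
```

The residual connection lets gradients flow directly to the prompt embeddings even when the shallow network is poorly conditioned, which is consistent with the stability gains the abstract reports.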

Keywords:
Residual, Computer science, Initialization, Benchmark, Fine-tuning, Algorithm

Metrics

Cited By: 17
FWCI (Field Weighted Citation Impact): 4.34
Refs: 34
Citation Normalized Percentile: 0.93
Is in top 1%; Is in top 10%


Topics

Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Speech Recognition and Synthesis (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

LIPT: Improving Prompt Tuning with Late Inception Reparameterization

Yawen He, Ao Feng, Zhengjie Gao, Xinyu Song

Journal: Electronics, Year: 2024, Vol: 13 (23), Pages: 4741
JOURNAL ARTICLE

Prefix Tuning Using Residual Reparameterization

Youngjun Jung, Hyunsun Hwang, Changki Lee

Journal: IEEE Access, Year: 2025, Vol: 13, Pages: 54866-54872
BOOK-CHAPTER

Visual Prompt Tuning

Menglin Jia, Luming Tang, Bor-Chun Chen, Claire Cardie, Serge Belongie, Bharath H. Aithal, Ser-Nam Lim

Lecture Notes in Computer Science, Year: 2022, Pages: 709-727