JOURNAL ARTICLE

Exploring Universal Intrinsic Task Subspace for Few-Shot Learning via Prompt Tuning

Yujia Qin, Xiaozhi Wang, Yusheng Su, Yankai Lin, Ning Ding, Jing Yi, Weize Chen, Zhiyuan Liu, Juanzi Li, Lei Hou, Peng Li, Maosong Sun, Jie Zhou

Year: 2024  Journal: IEEE/ACM Transactions on Audio, Speech, and Language Processing  Vol: 32  Pages: 3631-3643  Publisher: Institute of Electrical and Electronics Engineers

Abstract

Why can pre-trained language models (PLMs) learn universal representations and effectively adapt to broad NLP tasks differing a lot superficially? In this work, we empirically find evidence indicating that the adaptations of PLMs to various few-shot tasks can be reparameterized as optimizing only a few free parameters in a unified low-dimensional intrinsic task subspace, which may help us understand why PLMs could easily adapt to various NLP tasks with small-scale data. To find such a subspace and examine its universality, we propose an analysis pipeline called intrinsic prompt tuning (IPT). Specifically, we resort to the recent success of prompt tuning and decompose the soft prompts of multiple NLP tasks into the same low-dimensional nonlinear subspace, then we learn to adapt the PLM to unseen data or tasks by only tuning parameters in this subspace. In the experiments, we study diverse few-shot NLP tasks and surprisingly find that in a 250-dimensional subspace found with 100 tasks, by only tuning 250 free parameters, we can recover 97% and 83% of the full prompt tuning performance for 100 seen tasks (using different training data) and 20 unseen tasks, respectively, showing great generalization ability of the found intrinsic task subspace. Besides being an analysis tool, IPT could further help us improve the prompt tuning stability.
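
The abstract describes a two-stage pipeline: first a nonlinear projection from a low-dimensional intrinsic vector to full soft prompts is learned across many tasks, then adaptation to new data or tasks tunes only the intrinsic vector. The following is a minimal PyTorch sketch of that core idea; the 250-dimensional subspace comes from the abstract, but the module names, MLP architecture, and other dimensions are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of intrinsic prompt tuning (IPT), assuming a PyTorch setup.
# Names and architecture choices here are hypothetical illustrations.
import torch
import torch.nn as nn

class IntrinsicPromptProjector(nn.Module):
    """Nonlinear map from a low-dimensional intrinsic vector to soft prompts.

    Stage 1 (multi-task subspace finding): train this projector across many
    tasks so each task's soft prompt is reachable from some intrinsic vector.
    Stage 2: freeze it and adapt to a new task by tuning only the intrinsic
    vector, i.e. d_intrinsic free parameters.
    """

    def __init__(self, d_intrinsic=250, n_prompt_tokens=100, d_model=768):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(d_intrinsic, d_model),
            nn.Tanh(),
            nn.Linear(d_model, n_prompt_tokens * d_model),
        )
        self.n_prompt_tokens = n_prompt_tokens
        self.d_model = d_model

    def forward(self, z):                       # z: (d_intrinsic,)
        prompt = self.proj(z)                   # (n_prompt_tokens * d_model,)
        return prompt.view(self.n_prompt_tokens, self.d_model)

# Stage 2: adapt to an unseen task by tuning only z (250 parameters here).
projector = IntrinsicPromptProjector()
for p in projector.parameters():                # found in stage 1, now frozen
    p.requires_grad_(False)

z = nn.Parameter(torch.zeros(250))              # the only trainable parameters
optimizer = torch.optim.Adam([z], lr=1e-2)

soft_prompt = projector(z)                      # prepend to the frozen PLM's
                                                # input embeddings, then train z
                                                # with the task loss as usual
```

Because only `z` receives gradients, the per-task optimization problem matches the abstract's claim of tuning 250 free parameters rather than the full prompt (here 100 x 768 values).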

Keywords:
Subspace topology, Computer science, Artificial intelligence, Task (project management), Generalization, Machine learning, Natural language processing, Mathematics

Metrics

Cited By: 7
FWCI (Field Weighted Citation Impact): 4.47
Refs: 77
Citation Normalized Percentile: 0.92

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences → Computer Science → Artificial Intelligence
Topic Modeling
Physical Sciences → Computer Science → Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition

Related Documents

CONFERENCE PAPER

Ontology-enhanced Prompt-tuning for Few-shot Learning

Hongbin Ye, Ningyu Zhang, Shumin Deng, Xiang Chen, Hui Chen, Feiyu Xiong, Xi Chen, Huajun Chen

Published in: Proceedings of the ACM Web Conference 2022  Year: 2022  Pages: 778-787
CONFERENCE PAPER

PPT: Pre-trained Prompt Tuning for Few-shot Learning

Yuxian Gu, Xu Han, Zhiyuan Liu, Minlie Huang

Published in: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)  Year: 2022  Pages: 8410-8423