CONFERENCE PAPER

Vision-and-Language Pretrained Models: A Survey

Siqu Long, Feiqi Cao, Soyeon Caren Han, Haiqin Yang

Year: 2022 Venue: Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence Pages: 5530-5537

Abstract

Pretrained models have produced great success in both Computer Vision (CV) and Natural Language Processing (NLP). This progress has led to learning joint representations of vision and language by feeding visual and linguistic content into a multi-layer transformer, producing Visual-Language Pretrained Models (VLPMs). In this paper, we present an overview of the major advances achieved in VLPMs for producing joint representations of vision and language. As preliminaries, we briefly describe the general task definition and generic architecture of VLPMs. We first discuss the language and vision data encoding methods and then present the mainstream VLPM structure as the core content. We further summarise several essential pretraining and fine-tuning strategies. Finally, we highlight three future directions to provide insightful guidance for both CV and NLP researchers.
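The abstract's core idea, feeding visual and linguistic content into one multi-layer transformer to learn joint representations, can be sketched in a few lines. The PyTorch module below is an illustrative single-stream encoder, not code from the survey or any particular VLPM; the class name TinyVLPEncoder and all dimensions, layer counts, and region counts are assumptions made for the sketch.

```python
# Minimal single-stream vision-language encoder sketch (illustrative only).
# Text token embeddings and projected visual features share one transformer,
# which returns a joint representation for every text token and visual region.
import torch
import torch.nn as nn


class TinyVLPEncoder(nn.Module):
    def __init__(self, vocab_size=30522, visual_dim=2048, hidden=256,
                 layers=4, heads=8, max_text_len=64, max_regions=36):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, hidden)      # word-piece tokens
        self.vis_proj = nn.Linear(visual_dim, hidden)        # region/patch features
        self.pos_emb = nn.Embedding(max_text_len + max_regions, hidden)
        self.type_emb = nn.Embedding(2, hidden)               # 0 = text, 1 = vision
        enc_layer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)

    def forward(self, token_ids, visual_feats):
        # token_ids: (B, T) int64; visual_feats: (B, R, visual_dim) float32
        B, T = token_ids.shape
        R = visual_feats.shape[1]
        txt = self.tok_emb(token_ids) + self.type_emb(
            torch.zeros(B, T, dtype=torch.long))
        vis = self.vis_proj(visual_feats) + self.type_emb(
            torch.ones(B, R, dtype=torch.long))
        seq = torch.cat([txt, vis], dim=1)                    # single stream
        pos = torch.arange(T + R).unsqueeze(0).expand(B, -1)
        seq = seq + self.pos_emb(pos)
        return self.encoder(seq)                              # (B, T+R, hidden)


# Toy usage: 2 captions of 16 tokens, 36 detected regions with 2048-d features.
model = TinyVLPEncoder()
joint = model(torch.randint(0, 30522, (2, 16)), torch.randn(2, 36, 2048))
print(joint.shape)  # torch.Size([2, 52, 256])
```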

Keywords:
Computer science; Artificial intelligence; Transformer; Natural language processing; Architecture; Question answering; Natural language; Language understanding; Encoding (memory); Mainstream; Task (project management); Language acquisition; Joint (building); Linguistics

Metrics

Cited By: 32
FWCI (Field Weighted Citation Impact): 2.21
References: 84
Citation Normalized Percentile: 0.91


Topics

Multimodal Machine Learning Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Domain Adaptation and Few-Shot Learning (Physical Sciences → Computer Science → Artificial Intelligence)
Advanced Image and Video Retrieval Techniques (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)

Related Documents

BOOK-CHAPTER

A Survey of Pretrained Language Models

Kaili Sun, Xudong Luo, Michael Y. Luo

Lecture Notes in Computer Science Year: 2022 Pages: 442-456

JOURNAL ARTICLE

Survey of Applications of Pretrained Language Models

Kaili Sun, Xudong Luo, Michael Y. Luo

Journal: DOAJ (Directory of Open Access Journals) Year: 2023

BOOK-CHAPTER

Pretrained language models

Chenguang Zhu

Elsevier eBooks Year: 2021 Pages: 113-133