JOURNAL ARTICLE

Layerwised multimodal knowledge distillation for vision-language pretrained model

Jin Wang, Dawei Liao, You Zhang, Dan Xu, Xuejie Zhang

Year: 2024  Journal: Neural Networks  Vol: 175  Pages: 106272  Publisher: Elsevier BV
Keywords:
Computer science, Distillation, Modalities, Artificial intelligence, Modality (human–computer interaction), Overfitting, Machine learning, Language model, Transformer, Artificial neural network

Metrics

Cited By: 11
FWCI (Field Weighted Citation Impact): 5.83
Refs: 64
Citation Normalized Percentile: 0.93

Citation History

Topics

Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Enabling Multimodal Generation on CLIP via Vision-Language Knowledge Distillation

Wenliang Dai, Lu Hou, Lifeng Shang, Xin Jiang, Qun Liu, Pascale Fung

Journal: Findings of the Association for Computational Linguistics: ACL 2022  Year: 2022  Pages: 2383-2395
JOURNAL ARTICLE

Multimodal Dialog Systems with Dual Knowledge-enhanced Generative Pretrained Language Model

Xiaolin Chen, Xuemeng Song, Liqiang Jing, Shuo Li, Linmei Hu, Liqiang Nie

Journal: ACM Transactions on Information Systems  Year: 2023  Vol: 42 (2)  Pages: 1-25
JOURNAL ARTICLE

Parameter-efficient online knowledge distillation for pretrained language models

Yukun Wang, Jin Wang, Xuejie Zhang

Journal: Expert Systems with Applications  Year: 2024  Vol: 265  Pages: 126040