JOURNAL ARTICLE

CLIP Driven Few-Shot Panoptic Segmentation

Pengfei Xian, Lai-Man Po, Yuzhi Zhao, Wing-Yin Yu, Kwok-Wai Cheung

Year: 2023 | Journal: IEEE Access | Vol: 11 | Pages: 72295-72305 | Publisher: Institute of Electrical and Electronics Engineers

Abstract

This paper presents CLIP Driven Few-shot Panoptic Segmentation (CLIP-FPS), a novel few-shot panoptic segmentation model that leverages the knowledge of the Contrastive Language-Image Pre-training (CLIP) model. The proposed method builds upon a center indexing attention mechanism to facilitate knowledge transfer, which represents objects in an image as centers together with their pixel offsets. The model comprises a decoder responsible for generating object center-offset groups and a self-attention module tasked with producing a feature attention map. The object centers then index this map to retrieve the corresponding embeddings, which are matched against text embeddings via matrix multiplication and a Softmax operation to compute the final panoptic segmentation masks. Quantitative evaluation on the COCO and Cityscapes datasets shows that the method outperforms existing panoptic segmentation techniques in terms of the Panoptic Quality (PQ) metric.
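The matching step described in the abstract — indexing the feature attention map at each object center, then comparing the gathered embeddings against class text embeddings with matrix multiplication and a Softmax — can be sketched as follows. This is a minimal NumPy illustration, not the authors' exact formulation: the array shapes, the integer (row, col) center representation, and the cosine-style normalization before the similarity product are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def classify_centers(attention_map, centers, text_embeddings):
    """Index the feature attention map at each object center and match
    the gathered embeddings against per-class text embeddings.

    attention_map:   (H, W, C) per-pixel feature attention map
    centers:         (N, 2) integer (row, col) object centers  [assumed layout]
    text_embeddings: (K, C) text embeddings, one per class (e.g. from CLIP)
    returns:         (N, K) class probabilities for each object center
    """
    # center indexing: gather the embedding at each predicted center
    center_embeddings = attention_map[centers[:, 0], centers[:, 1]]      # (N, C)
    # normalize both sides, then match by matrix multiplication
    ce = center_embeddings / np.linalg.norm(center_embeddings, axis=1, keepdims=True)
    te = text_embeddings / np.linalg.norm(text_embeddings, axis=1, keepdims=True)
    logits = ce @ te.T                                                   # (N, K)
    return softmax(logits, axis=1)
```

In the full model, each classified center would then be combined with its predicted pixel offsets to group pixels into the final panoptic masks; this sketch covers only the embedding-matching stage.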

Keywords:
Computer science, Artificial intelligence, Segmentation, Computer vision, Softmax function, Pixel, Image segmentation, Feature (linguistics), Feature extraction, Embedding, Object detection, Search engine indexing, Panopticon, Pattern recognition (psychology), Deep learning

Metrics

Cited By: 1
FWCI (Field Weighted Citation Impact): 0.18
Refs: 58
Citation Normalized Percentile: 0.40


Topics

Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Image and Video Retrieval Techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition