CONFERENCE PAPER

An Attention-Guided Two-Stream Convolutional Neural Network for Few-Shot Learning

Hui Zhu, Xiaofang Zhao

Year: 2022
Venue: 2022 IEEE International Conference on Multimedia and Expo (ICME)
Pages: 1-6

Abstract

Classifying unlabeled samples from unseen categories given only limited labeled data is a challenging problem. Existing few-shot learning methods often fail to produce satisfactory feature representations because they treat informative features and interfering background information without distinction. In this paper, we propose an attention-guided two-stream convolutional neural network (AGTSNet) that highlights the salient, discriminative features of the main object while suppressing background interference, thereby addressing this indiscriminate treatment. Comprehensive experiments on few-shot image classification across four standard benchmark datasets demonstrate the effectiveness of our method.
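The attention-guided feature weighting the abstract describes can be sketched, in very rough form, as a per-channel gate applied to each stream's feature map before fusion. The paper itself defines the actual AGTSNet modules, so everything below — the squeeze-and-excitation-style two-layer MLP gate, the ReLU, and the additive fusion — is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat, w1, w2):
    """Gate a (C, H, W) feature map by per-channel attention weights.

    Squeezes the map with global average pooling, then a tiny two-layer
    MLP (weights w1, w2 — hypothetical, not from the paper) produces a
    gate in (0, 1) for each channel.
    """
    squeezed = feat.mean(axis=(1, 2))                      # (C,)
    gates = sigmoid(w2 @ np.maximum(w1 @ squeezed, 0.0))   # (C,)
    return feat * gates[:, None, None]

def fuse_two_streams(feat_a, feat_b, w1, w2):
    # Attend each stream, then fuse by element-wise addition — an
    # assumed fusion operator; the paper may combine streams differently.
    return channel_attention(feat_a, w1, w2) + channel_attention(feat_b, w1, w2)
```

Because the gates lie in (0, 1), attended channels are only ever down-weighted relative to the raw features — which is the intuition behind suppressing background-dominated channels while preserving object-salient ones.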

Keywords:
Discriminative model, Computer science, Artificial intelligence, Benchmark (surveying), Convolutional neural network, Salient, Feature (linguistics), Machine learning, Pattern recognition (psychology), One shot, Feature learning, Deep learning, Feature extraction, Shot (pellet), Interference (communication), Contextual image classification, Image (mathematics), Channel (broadcasting), Engineering

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
Refs: 45
Citation Normalized Percentile: 0.09

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Machine Learning and ELM
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Neural Network Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition