CONFERENCE PAPER

Multiple Recurrent Attention Convolutional Neural Network for Fine-Grained Image Recognition

Xiaotong Zhu, Hengwei Bian

Year: 2022 | Venue: 2022 International Conference on Image Processing, Computer Vision and Machine Learning (ICICML) | Pages: 44-48

Abstract

Classifying fine-grained categories is a popular task, since it applies to many everyday problems such as helping people distinguish different species of animals or different models of vehicles. Several approaches exist for this task, but few exploit multiple attention regions within an image to better recognize discriminative details. In this paper, we propose a novel network, the Multiple Recurrent Attention Convolutional Neural Network (MRA-CNN), which uses a Multiple Attention Proposal Network (MAPN) to localize multiple key features and classify subcategories according to them. Localizing attention regions and classifying each sub-image are mutually reinforcing processes. MAPN is trained with a specialized loss function composed of two terms, which drives the network to generate attention regions that differ from one another and each carry key information. We conduct our experiments mainly on the CUB-Birds dataset (CUB-200-2011), where our model achieves a competitive overall accuracy of 85.6%.
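The abstract describes MAPN's objective as the sum of two terms: one pushing the proposed attention regions apart, and one pulling each region toward key information. The paper's exact formulation is not given here, so the sketch below is only an illustrative stand-in: `diversity_loss` (mean pairwise cosine similarity between attention maps) and `information_loss` (negative saliency mass under each map) are assumed names and assumed forms, combined with a hypothetical weight `lam`.

```python
import numpy as np

def diversity_loss(att_maps):
    """Mean pairwise cosine similarity between K attention maps
    (shape K x H x W). Lower is better: minimizing it pushes the
    proposed regions apart. Illustrative form only; the paper's
    actual loss may differ."""
    k = att_maps.shape[0]
    flat = att_maps.reshape(k, -1).astype(float)
    flat /= np.linalg.norm(flat, axis=1, keepdims=True) + 1e-8
    sims = [flat[i] @ flat[j] for i in range(k) for j in range(i + 1, k)]
    return float(np.mean(sims))

def information_loss(att_map, saliency):
    """Encourage one attention map to cover high-saliency pixels:
    negative saliency mass under the normalized map. A hypothetical
    stand-in for the paper's 'key information' term."""
    w = att_map / (att_map.sum() + 1e-8)
    return float(-(w * saliency).sum())

def mapn_loss(att_maps, saliency, lam=1.0):
    """Two-part MAPN-style objective: region diversity plus
    per-region informativeness, weighted by an assumed factor lam."""
    info = np.mean([information_loss(m, saliency) for m in att_maps])
    return diversity_loss(att_maps) + lam * float(info)
```

For example, two attention maps peaked on the same pixel incur the maximum diversity penalty (cosine similarity 1), while maps peaked on disjoint pixels incur none, which is the behavior the "different from each other" requirement in the abstract calls for.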

Keywords:
Computer science, Convolutional neural network, Artificial intelligence, Attention network, Machine learning, Image recognition, Pattern recognition

Metrics

Cited By: 2
FWCI (Field-Weighted Citation Impact): 0.14
References: 13
Citation Normalized Percentile: 0.47

Topics

Advanced Neural Network Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Advanced Image and Video Retrieval Techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence