JOURNAL ARTICLE

Contextual Attention Refinement Network for Real-Time Semantic Segmentation

Shijie Hao, Yuan Zhou, Youming Zhang, Yanrong Guo

Year: 2020 | Journal: IEEE Access | Vol: 8 | Pages: 55230-55240 | Publisher: Institute of Electrical and Electronics Engineers

Abstract

Recently, significant progress has been made in pixel-level semantic segmentation using deep neural networks. However, current semantic segmentation methods still struggle to balance segmentation accuracy against computational cost. To address this issue, we propose the Contextual Attention Refinement Network (CARNet). In this method, we construct the Contextual Attention Refinement Module (CARModule), which learns an attention vector to guide the fusion of low-level and high-level features, yielding higher segmentation accuracy. The CARModule is lightweight and can be directly plugged into different types of network structures. To better optimize the network, we additionally consider the semantic information and introduce the Semantic Context Loss (SCLoss) into the overall loss function. In the experiments, we validate the effectiveness and efficiency of our method on several public semantic segmentation datasets. The results show that our method achieves a good balance between accuracy and computational cost.
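The abstract does not specify the CARModule's internal design, but its core idea, learning an attention vector that guides the fusion of low-level and high-level features, can be illustrated with a minimal NumPy sketch. The `car_fusion` function and the projection matrix `w` below are hypothetical stand-ins, assuming a common pattern: globally pool the high-level features into a channel descriptor, map it to a per-channel attention vector, and use that vector to re-weight the low-level features before fusing.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def car_fusion(low, high, w):
    """Hypothetical sketch of attention-vector-guided feature fusion.

    low, high : feature maps of shape (C, H, W), same resolution
    w         : learned (C, C) projection producing the attention vector
    """
    # Global average pooling over spatial dims -> channel descriptor (C,)
    desc = high.mean(axis=(1, 2))
    # Per-channel attention vector, squashed into (0, 1)
    attn = sigmoid(w @ desc)
    # Re-weight low-level features channel-wise, then fuse with high-level
    return attn[:, None, None] * low + high

# Example usage with random features
rng = np.random.default_rng(0)
low = rng.standard_normal((4, 8, 8))
high = rng.standard_normal((4, 8, 8))
w = rng.standard_normal((4, 4))
fused = car_fusion(low, high, w)   # shape (4, 8, 8)
```

In a real network, `w` would be learned end-to-end and the pooling/projection might be deeper, but the sketch shows why such a module stays lightweight: the attention path operates on a pooled C-dimensional vector rather than the full feature map.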

Keywords:
Computer science; Semantic segmentation; Artificial intelligence; Machine learning; Image segmentation; Pattern recognition

Metrics

Cited By: 21
FWCI (Field Weighted Citation Impact): 1.68
Refs: 54
Citation Normalized Percentile: 0.85

Topics

Advanced Neural Network Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Visual Attention and Saliency Detection
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition