JOURNAL ARTICLE

Dual Attention and Edge Refinement Network for Camouflaged Object Detection

Abstract

Camouflaged object detection (COD) aims to identify and segment objects that are visually similar to their background. However, existing methods often fail to accurately identify or segment camouflaged objects. In this paper, we propose a novel Dual Attention and Edge Refinement Network (DAERNet) to boost the performance of COD. Specifically, a Dual Attention Mechanism (DAM) is proposed to capture the scale diversity of camouflaged objects; it mainly comprises a Spotlight Attention Module (SAM) and a Modulation Attention Module (MAM). The SAM obtains multi-scale refined features that contain discriminative semantic information, while the MAM obtains refined features with specific semantic information. We then propose a Boundary Extraction Module (BEM) to obtain edge information. Finally, a Feature Aggregation Module (FAM) is designed to fuse the refined multi-scale features produced by the DAM, guided by the edge information, to achieve accurate camouflaged object prediction. Extensive experiments on four datasets demonstrate that the proposed DAERNet performs comparably with other state-of-the-art (SOTA) methods.
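The pipeline described in the abstract (dual attention refinement, boundary extraction, edge-guided fusion) can be sketched in numpy as a toy data flow. This is a minimal illustration, not the paper's implementation: every module body below (sigmoid channel weighting for SAM, a spatial map for MAM, gradient magnitude for BEM, edge-weighted averaging for FAM) is a hypothetical stand-in for the unspecified operations, and all function names and shapes are assumptions.

```python
import numpy as np

def spotlight_attention(feat):
    """Stand-in for SAM: channel attention via sigmoid of global average
    pooling, highlighting discriminative channels. (Illustrative only.)"""
    w = 1.0 / (1.0 + np.exp(-feat.mean(axis=(1, 2), keepdims=True)))
    return feat * w

def modulation_attention(feat):
    """Stand-in for MAM: a spatial attention map modulating each channel.
    (Illustrative only.)"""
    s = 1.0 / (1.0 + np.exp(-feat.mean(axis=0, keepdims=True)))
    return feat * s

def boundary_extraction(feat):
    """Stand-in for BEM: gradient-magnitude edge map of the channel-averaged
    feature. (Illustrative only.)"""
    gy, gx = np.gradient(feat.mean(axis=0))
    return np.sqrt(gx ** 2 + gy ** 2)

def feature_aggregation(feats, edge):
    """Stand-in for FAM: average the refined features across scales, then
    emphasize edge regions with the BEM map. (Illustrative only.)"""
    fused = np.mean([f.mean(axis=0) for f in feats], axis=0)
    return fused * (1.0 + edge)

# Toy multi-scale backbone features of shape (C, H, W); a real backbone
# would produce different spatial sizes per scale.
rng = np.random.default_rng(0)
feats = [rng.standard_normal((8, 16, 16)) for _ in range(3)]

refined = [modulation_attention(spotlight_attention(f)) for f in feats]
edge = boundary_extraction(refined[-1])
pred = feature_aggregation(refined, edge)
print(pred.shape)  # prints (16, 16)
```

The sketch only conveys the ordering SAM → MAM → BEM → FAM and the role of the edge map as a fusion guide; the actual learned modules in DAERNet are defined in the paper itself.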

Keywords:
Camouflaged object detection; Dual attention; Edge refinement; Feature fusion; Feature extraction; Backbone network; Discriminative model; Object detection; Pattern recognition; Computer vision

Metrics

- Cited by: 2
- FWCI (Field-Weighted Citation Impact): 0.36
- References: 48
- Citation Normalized Percentile: 0.54

Topics

(Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
- Visual Attention and Saliency Detection
- Image Enhancement Techniques
- Advanced Image and Video Retrieval Techniques