JOURNAL ARTICLE

Boundary Guided Context Aggregation for Semantic Segmentation

Abstract

Recent studies on semantic segmentation have begun to recognize the significance of boundary information, with most approaches treating boundaries as a supplement to semantic details. However, simply combining boundaries with the mainstream features cannot ensure a holistic improvement of semantic modeling. In contrast to previous studies, we exploit the boundary as significant guidance for context aggregation to promote the overall semantic understanding of an image. To this end, we propose a Boundary guided Context Aggregation Network (BCANet), in which a Multi-Scale Boundary extractor (MSB) that borrows backbone features at multiple scales is specifically designed for accurate boundary detection. On this basis, a Boundary guided Context Aggregation module (BCA), improved from the non-local network, is further proposed to capture long-range dependencies between pixels in the boundary regions and those inside the objects. By aggregating context information along the boundaries, inner pixels of the same category achieve mutual gains, and intra-class consistency is therefore enhanced. We conduct extensive experiments on the Cityscapes and ADE20K datasets and achieve results comparable with state-of-the-art methods, clearly demonstrating the effectiveness of the proposed approach.
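
The core idea of the BCA module described above can be illustrated with a minimal sketch: every pixel attends (non-local style) only to pixels lying in the boundary region, so inner pixels of the same object aggregate context routed through its boundary. This is not the authors' implementation; the function name, tensor shapes, and residual fusion are illustrative assumptions based on standard non-local blocks.

```python
import numpy as np

def boundary_guided_aggregation(features, boundary_mask):
    """Sketch of boundary-guided non-local context aggregation.

    features: (N, C) flattened per-pixel features.
    boundary_mask: (N,) boolean mask marking boundary-region pixels.
    Each pixel (query) attends only to boundary pixels (keys/values),
    so long-range dependencies are captured along object boundaries.
    """
    keys = features[boundary_mask]                 # (M, C) boundary features
    # scaled dot-product affinity between all pixels and boundary pixels
    logits = features @ keys.T / np.sqrt(features.shape[1])  # (N, M)
    # softmax over the boundary pixels
    attn = np.exp(logits - logits.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)
    context = attn @ keys                          # (N, C) aggregated context
    return features + context                      # residual fusion, as in non-local blocks
```

For example, with 16 pixels of 8-dimensional features where the first 4 pixels form the boundary region, the output keeps the (16, 8) shape while every pixel's feature has been enriched with boundary context.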

Keywords:
Computer science, Semantic segmentation, Boundary, Context aggregation, Pixel, Intra-class consistency, Semantics, Artificial intelligence

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
Refs: 0
Citation Normalized Percentile: 0.50
