JOURNAL ARTICLE

RGB-D DSO: Direct Sparse Odometry With RGB-D Cameras for Indoor Scenes

Zikang Yuan, Ken Cheng, Jinhui Tang, Xin Yang

Year: 2021   Journal: IEEE Transactions on Multimedia   Vol: 24   Pages: 4092-4101   Publisher: Institute of Electrical and Electronics Engineers

Abstract

Visual odometry (VO) is a fundamental technique for many robotics and augmented reality (AR) applications. However, most existing RGB-D VO systems suffer from large performance degradation when large occlusions are present and/or a large portion of depth values are invalid due to the limited range of an RGB-D camera, prohibiting the usage of most systems in practical applications. To address the above two problems, we present RGB-D DSO, an RGB-D direct sparse odometry whose core is sliding-window optimization with occlusion removal and a depth refinement module. Occlusion removal excludes the negative effects arising from occluded objects when minimizing the final energy function for camera pose tracking. Depth refinement ensures that sufficient valid depth values are uniformly distributed across the depth map of a keyframe. Experimental results on three public datasets demonstrate that our method achieves smaller tracking error than most existing state-of-the-art methods. Meanwhile, our system takes only 21.93 ms to track a frame, which is faster than most existing methods.

Keywords:
Artificial intelligence, Computer science, RGB color model, Computer vision, Visual odometry, Odometry, Augmented reality, Robotics, Simultaneous localization and mapping, Robot, Mobile robot

Metrics

Cited By: 16
FWCI (Field Weighted Citation Impact): 3.07
References: 21
Citation Normalized Percentile: 0.92


Topics

Robotics and Sensor-Based Localization
Physical Sciences → Engineering → Aerospace Engineering
Advanced Vision and Imaging
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
3D Surveying and Cultural Heritage
Physical Sciences → Earth and Planetary Sciences → Geology