JOURNAL ARTICLE

Absolute Stereo SFM without Stereo Correspondence for Vision Based SLAM

Abstract

This paper presents a vision-based SLAM method using a stereo SFM technique. The proposed method builds on the stereo SFM presented in our former paper. The method does not need stereo correspondence but can nevertheless determine the absolute scale factor, which is important for long-term navigation and SLAM. The method basically uses only motion correspondence, which is easier to solve than stereo correspondence because the SFM algorithm can use a sequence of images taken at short time intervals. However, we infer the stereo correspondence inversely from the absolute estimates of the structure and motion parameters and use this information to improve the performance of our method. Consequently, the method remains robust to stereo correspondence ambiguity and can avoid the degenerate configuration reported in the former paper. We also propose a simple initialization technique for the proposed method based on the extended Kalman filter, which is a critical issue for methods using bearing-only measurements. The experimental results demonstrate the effectiveness of the algorithm.
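To illustrate the kind of bearing-only estimation the abstract refers to, the following is a minimal, generic sketch of an extended Kalman filter that triangulates a static 2-D landmark from bearing measurements taken by a camera at known positions. This is not the authors' algorithm; the state layout, camera trajectory, and noise values are illustrative assumptions.

```python
import numpy as np

def ekf_bearing_update(x, P, cam, z, R=1e-4):
    """One EKF update of a static 2-D landmark from a bearing measurement.

    x: (2,) landmark estimate, P: 2x2 covariance,
    cam: (2,) known camera position, z: measured bearing in radians.
    (Illustrative sketch only, not the method proposed in the paper.)
    """
    dx, dy = x[0] - cam[0], x[1] - cam[1]
    r2 = dx * dx + dy * dy
    h = np.arctan2(dy, dx)                       # predicted bearing
    H = np.array([[-dy / r2, dx / r2]])          # Jacobian of atan2 wrt landmark
    S = H @ P @ H.T + R                          # innovation covariance (1x1)
    K = P @ H.T / S                              # Kalman gain
    innov = np.arctan2(np.sin(z - h), np.cos(z - h))  # wrap angle to [-pi, pi]
    x = x + (K * innov).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Hypothetical setup: true landmark at (5, 3), camera slides along the x-axis,
# giving the baseline needed to recover depth from bearings alone.
true_lm = np.array([5.0, 3.0])
x, P = np.array([2.0, 1.0]), np.eye(2) * 10.0    # crude initial guess
for cx in np.linspace(0.0, 4.0, 20):
    cam = np.array([cx, 0.0])
    z = np.arctan2(true_lm[1] - cam[1], true_lm[0] - cam[0])
    x, P = ekf_bearing_update(x, P, cam, z)
```

A single bearing constrains the landmark only along a ray, which is why initialization is a critical issue for bearing-only methods: the filter needs sufficient camera motion (baseline) before depth becomes observable.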

Keywords:
Computer vision, Artificial intelligence, Computer stereo vision, Stereo cameras, Structure from motion, Epipolar geometry, Correspondence problem, Stereopsis, Extended Kalman filter, Kalman filter, Motion estimation, Quaternion, Initialization, Robustness

Metrics

Cited by: 5
FWCI (Field-Weighted Citation Impact): 1.18
References: 18
Citation Normalized Percentile: 0.89

Topics

Robotics and Sensor-Based Localization
Physical Sciences →  Engineering →  Aerospace Engineering
Advanced Vision and Imaging
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Robotic Path Planning Algorithms
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
