Simultaneous localization and mapping (SLAM) is a fundamental problem for autonomous mobile robots, which are widely used in automated warehouses, factory material-transfer systems, flexible assembly systems, and other intelligent transportation systems. Compared to 2D lidar, visual-inertial odometry (VIO), which combines a low-cost camera with a MEMS inertial measurement unit (IMU), is a popular approach to 6-DOF state estimation for robots. However, VIO is prone to drift, especially in weak-texture scenes, and cannot provide robust, high-precision localization. In this paper, we propose a tightly coupled fiducial-marker visual-inertial algorithm framework based on sliding-window optimization. The framework comprises preprocessing of the fiducial markers, camera, and IMU; sliding-window optimization; keyframe selection; marginalization; and loop-closure detection and optimization. Because of corner-detection noise in a single tag frame, pose estimation by the PnP (Perspective-n-Point) method has large errors and suffers from pose ambiguity. We therefore propose to incorporate tag observations into the sliding-window optimization, which effectively suppresses pose ambiguity and improves global localization accuracy by exploiting reprojection constraints over multiple frames and multiple tags. Since non-observability, large initial errors, and false tag detections can destabilize the sliding-window optimization, improved optimization strategies are proposed. Experimental results show that the proposed framework is effective and robust, and achieves high-precision localization under weak-texture conditions.
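As a minimal illustration of the kind of tag reprojection constraint the abstract describes, the sketch below stacks the 2-D reprojection errors of a tag's four corners into one residual vector; summing its squared norm over all frames and tags in the window would form the visual part of a sliding-window cost. The function names, the simple pinhole model, and the toy pose are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def project(K, R_cw, t_cw, p_w):
    """Project a 3-D world point into the image with a pinhole camera.

    K    : (3, 3) camera intrinsic matrix
    R_cw : (3, 3) rotation from world to camera frame
    t_cw : (3,)   translation from world to camera frame
    """
    p_c = R_cw @ p_w + t_cw          # transform point into the camera frame
    return (K @ (p_c / p_c[2]))[:2]  # perspective division -> pixel coords

def tag_reprojection_residual(K, R_cw, t_cw, corners_w, corners_px):
    """Stack the 2-D reprojection errors of the four tag corners.

    corners_w  : (4, 3) known tag-corner positions in the world frame
    corners_px : (4, 2) detected corner pixels in this camera frame
    Returns an (8,) residual vector for this (frame, tag) pair.
    """
    res = [project(K, R_cw, t_cw, p) - z
           for p, z in zip(corners_w, corners_px)]
    return np.concatenate(res)

# Toy check: evaluated at the true pose, the residual is numerically zero.
K = np.array([[400.0, 0.0, 320.0],
              [0.0, 400.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)
corners_w = np.array([[-0.1, -0.1, 1.0], [0.1, -0.1, 1.0],
                      [0.1,  0.1, 1.0], [-0.1, 0.1, 1.0]])
corners_px = np.array([project(K, R, t, p) for p in corners_w])
r = tag_reprojection_residual(K, R, t, corners_w, corners_px)
print(r.shape, np.allclose(r, 0.0))
```

In a real system, residuals like this for every tag observation in the window would be combined with IMU preintegration factors and minimized jointly over the window's poses, which is what allows multi-frame, multi-tag observations to disambiguate the single-frame PnP solution.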