JOURNAL ARTICLE

FAST-LIVO2: Fast, Direct LiDAR–Inertial–Visual Odometry

Abstract

This paper proposes FAST-LIVO2, a fast, direct LiDAR-inertial-visual odometry framework that achieves accurate and robust state estimation in Simultaneous Localization and Mapping (SLAM) tasks and shows great potential for real-time, onboard robotic applications. FAST-LIVO2 efficiently fuses IMU, LiDAR, and image measurements through an error-state iterated Kalman filter (ESIKF). To address the dimension mismatch between the heterogeneous LiDAR and image measurements, we use a sequential update strategy in the Kalman filter. To enhance efficiency, we use direct methods for both the visual and LiDAR fusion: the LiDAR module registers raw points without extracting edge or plane features, and the visual module minimizes direct photometric errors without extracting ORB or FAST corner features. Both visual and LiDAR measurements are fused into a single unified voxel map, where the LiDAR module constructs the geometric structure for registering new LiDAR scans and the visual module attaches image patches to the LiDAR points (i.e., visual map points) to enable new image alignment. To enhance the accuracy of image alignment, we use plane priors from the LiDAR points in the voxel map (and even refine the plane prior during the alignment process) and update the reference patch dynamically after new images are aligned. Furthermore, to enhance the robustness of image alignment, FAST-LIVO2 employs an on-demand raycast operation and estimates the image exposure time in real time. Extensive experiments on both benchmark and private datasets demonstrate that our proposed system significantly outperforms other state-of-the-art odometry systems in terms of accuracy, robustness, and computational efficiency. The effectiveness of the system's key modules is also validated.
Lastly, we detail three applications of FAST-LIVO2: UAV onboard navigation, demonstrating the system's computational efficiency for real-time onboard navigation; airborne mapping, showcasing the system's mapping accuracy; and 3D model rendering (mesh-based and NeRF-based), underscoring the suitability of our reconstructed dense map for subsequent rendering tasks. We open-source the code, dataset, and applications of this work on GitHub to benefit the robotics community.
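The sequential update strategy described in the abstract can be illustrated with a minimal Kalman-filter sketch: the LiDAR residuals correct the state first, and the visual (photometric) residuals then refine the already-corrected estimate, so the two heterogeneous measurement types never need to be stacked into one vector. This is a toy illustration only, not FAST-LIVO2's actual implementation: the 6-D state, the random Jacobians `H_l`/`H_v`, and the noise covariances are placeholders, and the real system iterates each update on the error state.

```python
import numpy as np

def kalman_update(x, P, z, h, H, R):
    """One measurement update of a Kalman filter.

    x: state estimate, P: state covariance, z: measurement,
    h: predicted measurement h(x), H: measurement Jacobian, R: noise covariance.
    """
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - h)                 # correct the state with the residual
    P = (np.eye(len(x)) - K @ H) @ P    # shrink the covariance
    return x, P

rng = np.random.default_rng(0)
x = np.zeros(6)                         # toy 6-D state
P = np.eye(6)

# Step 1: hypothetical LiDAR point-to-plane residuals (3 constraints).
H_l = rng.standard_normal((3, 6))
z_l = rng.standard_normal(3)
x, P = kalman_update(x, P, z_l, H_l @ x, H_l, 0.01 * np.eye(3))

# Step 2: hypothetical photometric residuals, applied to the
# LiDAR-corrected posterior rather than jointly with the LiDAR data.
H_v = rng.standard_normal((2, 6))
z_v = rng.standard_normal(2)
x, P = kalman_update(x, P, z_v, H_v @ x, H_v, 0.05 * np.eye(2))
```

Because each modality is absorbed in its own update, the filter never forms a stacked measurement whose LiDAR and image blocks have mismatched dimensions, which is the point of the sequential strategy.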

Keywords:
Odometry, LiDAR, Artificial intelligence, Computer vision, Computer science, Visual odometry, Remote sensing, Inertial frame of reference, Mobile robot, Robot, Geology, Physics

Metrics

Cited By: 62
FWCI (Field-Weighted Citation Impact): 81.78
Refs: 61
Citation Normalized Percentile: 1.00 (in top 1% and top 10%)


Topics

Robotics and Sensor-Based Localization (Physical Sciences → Engineering → Aerospace Engineering)
Advanced Vision and Imaging (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Satellite Image Processing and Photogrammetry (Physical Sciences → Engineering → Ocean Engineering)

Related Documents

JOURNAL ARTICLE

FAST-LIO2: Fast Direct LiDAR-Inertial Odometry

Wei Xu, Yixi Cai, Dongjiao He, Jiarong Lin, Fu Zhang

Journal: IEEE Transactions on Robotics, Year: 2022, Vol: 38 (4), Pages: 2053-2073
JOURNAL ARTICLE

FAST-LIO2: Fast Direct LiDAR-Inertial Odometry

Wei Xu, Jiarong Lin, Dongjiao He, Fu Zhang

Journal: SSRN Electronic Journal, Year: 2024
JOURNAL ARTICLE

FAST-LIVO: Fast and Tightly-coupled Sparse-Direct LiDAR-Inertial-Visual Odometry

Chunran Zheng, Qingyan Zhu, Wei Xu, Xiyuan Liu, Qizhi Guo, Fu Zhang

Journal: 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Year: 2022, Pages: 4003-4009
JOURNAL ARTICLE

FAST-LIEO: Fast and Real-Time LiDAR-Inertial-Event-Visual Odometry

Zirui Wang, Yangtao Ge, Kewei Dong, I-Ming Chen, Jing Wu

Journal: IEEE Robotics and Automation Letters, Year: 2024, Vol: 10 (2), Pages: 1680-1687