JOURNAL ARTICLE

FARFusion: A Practical Roadside Radar-Camera Fusion System for Far-Range Perception

Yao Li, Yingjie Wang, Chengzhen Meng, Yifan Duan, Jianmin Ji, Yu Zhang, Yanyong Zhang

Year: 2024 | Journal: IEEE Robotics and Automation Letters | Vol: 9 (6) | Pages: 5433-5440 | Publisher: Institute of Electrical and Electronics Engineers

Abstract

Far-range perception through roadside sensors is crucial to the effectiveness of intelligent transportation systems. The main challenge of far-range perception lies in performing accurate object detection and tracking at far distances (e.g., > 150 m) at low cost. To cope with this challenge, deploying both millimeter-wave Radars and high-definition (HD) cameras, and fusing their data for joint perception, has become common practice. The key to this solution, however, is the precise association between the two types of data, which are captured from different perspectives and carry different degrees of measurement noise. Toward this goal, the first question is on which plane to conduct the association, i.e., the 2D image plane or the BEV plane. We argue that the former is more suitable because the magnitude of the location errors of the perspective projection points is smaller at far distances on the 2D plane, which leads to more accurate association. Thus, we first project the Radar-based target locations (on the BEV plane) to the 2D plane and then associate them with the camera-based object locations, each modeled as a point on the object. Subsequently, we map the camera-based object locations to the BEV plane through inverse projection mapping (IPM), using the corresponding depth information from the Radar data. Finally, we engage a BEV tracking module to generate target trajectories for traffic monitoring. Since our approach involves transformations between the 2D plane and the BEV plane, we also devise an approach for refining the transformation parameters based on a depth-scaling technique, utilizing the above fusion process without requiring any additional devices such as GPS. We have deployed an actual testbed on an urban expressway and conducted extensive experiments to evaluate the effectiveness of our system. The results show that our system improves BEV AP by 32% and reduces the location error by 0.56 m.
Our system achieves an average location accuracy of 1.3 m when the detection range is extended up to 500 m. We thus believe that the proposed method offers a viable approach to efficient roadside far-range perception.
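The abstract's core geometric step is a round trip between the BEV plane and the image plane: Radar targets on the BEV plane are projected into the image for association, and the associated camera detections are then lifted back to the BEV plane via inverse projection mapping (IPM) using the Radar's depth. The sketch below illustrates this round trip with a pinhole camera model; the intrinsics `K`, rotation `R`, and translation `t` are hypothetical placeholder values, not the paper's calibration, and the paper's actual refinement of these parameters is not reproduced here.

```python
import numpy as np

# Hypothetical camera intrinsics (placeholder focal length and principal point).
K = np.array([[1200.0,    0.0, 960.0],
              [   0.0, 1200.0, 540.0],
              [   0.0,    0.0,   1.0]])

# Hypothetical extrinsics mapping world (BEV) coordinates into the camera
# frame: identity rotation and a roadside camera mounted 5 m above the road.
R = np.eye(3)
t = np.array([0.0, -5.0, 0.0])

def bev_to_pixel(p_world):
    """Project a 3D world point (e.g., a Radar target on the BEV plane)
    onto the 2D image plane; returns pixel coordinates and camera depth."""
    p_cam = R @ p_world + t
    uvw = K @ p_cam
    return uvw[:2] / uvw[2], p_cam[2]

def pixel_to_bev(pixel, depth):
    """Inverse projection mapping (IPM): lift a pixel back to world
    coordinates using a known depth (here, supplied by the Radar)."""
    uv1 = np.array([pixel[0], pixel[1], 1.0])
    p_cam = depth * (np.linalg.inv(K) @ uv1)
    return R.T @ (p_cam - t)

# Round trip: a far-range target 200 m ahead projects into the image,
# and the Radar depth recovers its BEV location exactly.
target = np.array([2.0, 0.0, 200.0])
pixel, depth = bev_to_pixel(target)
recovered = pixel_to_bev(pixel, depth)
```

With noise-free inputs the round trip is exact; the paper's point is that when both measurements are noisy, association errors on the image plane shrink with distance, whereas BEV-plane errors do not.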

Keywords:
Radar, Remote sensing, Range (aeronautics), Radar systems, Perception, Computer science, Computer vision, Geography, Artificial intelligence, Environmental science, Telecommunications, Engineering, Psychology, Aerospace engineering

Metrics

Cited by: 14
FWCI (Field-Weighted Citation Impact): 18.47
References: 28
Citation Normalized Percentile: 0.99 (in top 1% and top 10%)
Topics

Robotics and Sensor-Based Localization
Physical Sciences →  Engineering →  Aerospace Engineering
Infrared Target Detection Methodologies
Physical Sciences →  Engineering →  Aerospace Engineering
Remote Sensing and LiDAR Applications
Physical Sciences →  Environmental Science →  Environmental Engineering