JOURNAL ARTICLE

Survey on Monocular Metric Depth Estimation

Jiuling Zhang, Yurong Wu, Hua Jiang

Year: 2025   Journal: Computers   Vol: 14 (11)   Pages: 502   Publisher: Multidisciplinary Digital Publishing Institute

Abstract

Monocular metric depth estimation (MMDE) aims to generate depth maps with an absolute metric scale from a single RGB image, which enables accurate spatial understanding, 3D reconstruction, and autonomous navigation. Unlike conventional monocular depth estimation that predicts only relative depth, MMDE maintains geometric consistency across frames and supports reliable integration with visual SLAM, high-precision 3D modeling, and novel view synthesis. This survey provides a comprehensive review of MMDE, tracing its evolution from geometry-based formulations to modern learning-based frameworks. The discussion emphasizes the importance of datasets, distinguishing metric datasets that supply absolute ground-truth depth from relative datasets that facilitate ordinal or normalized depth learning. Representative datasets, including KITTI, NYU-Depth, ApolloScape, and TartanAir, are analyzed with respect to scene composition, sensor modality, and intended application domain. Methodological progress is examined across several dimensions, including model architecture design, domain generalization, structural detail preservation, and the integration of synthetic data that complements real-world captures. Recent advances in patch-based inference, generative modeling, and loss design are compared to reveal their respective advantages and limitations. By summarizing the current landscape and outlining open research challenges, this work establishes a clear reference framework that supports future studies and facilitates the deployment of MMDE in real-world vision systems requiring precise and robust metric depth estimation.

Keywords: Metrics

