Abstract

Light field imaging has emerged as a new modality that captures both the angular and spatial information of a scene. This additional angular information can be used to estimate the depth of a 3-D scene. The continuum of virtual viewpoints in light field data handles occlusion efficiently and provides robust depth estimates at short distances. However, the narrow baseline of a light field camera limits depth estimation at larger distances. To handle occlusion efficiently while extending the operating distance, we propose a novel disparity-based stereo light field depth estimation method. First, segments are obtained in the central sub-aperture image of the left view, and the disparity vector of each segment is estimated from the left-camera sub-aperture images, which handles occlusion efficiently. The stereo disparity at the boundaries of these segments is then computed by exploiting the epipolar geometry inherent in light field data. Finally, this boundary stereo disparity is propagated to the remaining pixels and normalized. We also provide a synthetic stereo light field dataset that preserves the inherent characteristics of a light field. We have tested our approach on a variety of real-world scenes captured with a Lytro Illum camera, as well as on synthetic images. The proposed method outperforms several state-of-the-art algorithms.
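
To make the first stage of the pipeline concrete, the sketch below shows one way a per-segment disparity could be searched over a horizontal stack of sub-aperture views from the left light-field camera. This is not the authors' implementation: the function name, the brute-force cost search, the integer-pixel warping via np.roll, and the toy input data are all assumptions made purely for illustration.

```python
import numpy as np

def segment_disparities(views, segments, max_disp=2.0, steps=21):
    """Per-segment disparity search over a 1-D stack of sub-aperture views.

    views    : (V, H, W) grayscale sub-aperture images of one light-field
               camera, ordered along the horizontal angular axis
    segments : (H, W) integer label map of the central view
    Returns a dict {segment_label: disparity in pixels per view step}.
    """
    center = views.shape[0] // 2
    ref = views[center]
    result = {}
    for label in np.unique(segments):
        mask = segments == label
        best_d, best_cost = 0.0, np.inf
        for d in np.linspace(-max_disp, max_disp, steps):
            cost = 0.0
            for v in range(views.shape[0]):
                # Warp each view back by its expected shift for disparity d
                # and compare against the central view inside the segment.
                shift = int(round(d * (v - center)))
                warped = np.roll(views[v], -shift, axis=1)
                cost += np.mean(np.abs(warped[mask] - ref[mask]))
            if cost < best_cost:
                best_cost, best_d = cost, d
        result[int(label)] = best_d
    return result

# Toy usage: random data stands in for real sub-aperture images, and the
# central view is split into four quadrant segments.
rng = np.random.default_rng(0)
views = rng.random((5, 32, 32)).astype(np.float32)
segments = np.zeros((32, 32), dtype=int)
segments[16:, :] = 1
segments[:, 16:] += 2
print(segment_disparities(views, segments, max_disp=1.0, steps=9))
```

A practical version would use sub-pixel interpolation instead of integer rolls and handle the wrap-around at image borders; the boundary stereo-disparity and propagation steps of the method are not shown here.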

Keywords:
Epipolar geometry, Computer vision, Artificial intelligence, Light field, Computer science, Pixel, Stereo camera, View synthesis, Stereopsis, Computer stereo vision, Image (mathematics), Rendering (computer graphics)

Metrics

Cited by: 1
FWCI (Field-Weighted Citation Impact): 0.12
References: 25
Citation Normalized Percentile: 0.43


Topics

Advanced Vision and Imaging (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Advanced Image Processing Techniques (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Image Processing Techniques and Applications (Physical Sciences → Engineering → Media Technology)