JOURNAL ARTICLE

Real-time per-pixel focusing method for light field rendering

Tomáš Chlubna, Tomáš Milet, Pavel Zemčík

Year: 2021 | Journal: Computational Visual Media | Vol: 7(3) | Pages: 319–333 | Publisher: Springer Nature

Abstract

Light field rendering is an image-based rendering method that uses only images of the scene, rather than 3D models, as input to render new views. A light field approximation, represented as a set of images, suffers from so-called refocusing artifacts due to the different depth values of the pixels in the scene. Without depth information about the scene, correct focusing of a light field scene is limited to a single focusing distance. This work addresses the focusing problem and proposes a real-time solution for focusing light field scenes, based on a statistical analysis of the pixel values contributing to the final image. Unlike existing techniques, the method needs no precomputed or acquired depth information. Memory requirements and streaming bandwidth are reduced, and real-time rendering with visually satisfactory results is possible even for high-resolution light field data. An experimental evaluation of the proposed method, implemented on a GPU, is presented in this paper.
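The abstract describes refocusing a light field by statistically analyzing the pixel values that the individual views contribute to each output pixel. As a rough illustration of that idea (not the paper's GPU implementation), the sketch below performs classic shift-and-sum refocusing over a set of candidate focus values and, per pixel, keeps the candidate at which the contributing samples agree most (lowest cross-view variance). All function and parameter names here are illustrative assumptions, and views are assumed to come from a regular camera grid with known offsets.

```python
import numpy as np

def refocus_per_pixel(views, offsets, focus_candidates):
    """Per-pixel refocusing sketch.

    views: (N, H, W) grayscale images from a camera grid.
    offsets: (N, 2) camera (dy, dx) positions relative to the reference view.
    focus_candidates: candidate focus values d; each view is shifted by
    d * offset before averaging (shift-and-sum refocusing).

    For every pixel, the candidate with the lowest variance across views
    is kept, since contributing samples agree when the pixel is in focus.
    """
    N, H, W = views.shape
    best_var = np.full((H, W), np.inf)   # best agreement seen so far
    result = np.zeros((H, W))            # refocused output image
    ys, xs = np.mgrid[0:H, 0:W]          # pixel coordinate grids
    for d in focus_candidates:
        samples = np.empty((N, H, W))
        for i, (dy, dx) in enumerate(offsets):
            # Shift each view according to the candidate focus value,
            # clamping at the image border.
            sy = np.clip(ys + int(round(d * dy)), 0, H - 1)
            sx = np.clip(xs + int(round(d * dx)), 0, W - 1)
            samples[i] = views[i][sy, sx]
        var = samples.var(axis=0)
        mean = samples.mean(axis=0)
        better = var < best_var          # pixels better focused at this d
        best_var[better] = var[better]
        result[better] = mean[better]
    return result
```

Scanning a per-pixel focus measure like this avoids any precomputed depth map, matching the spirit of the abstract; a real-time version would evaluate the candidates in a single GPU shader pass per pixel.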

Keywords:
Rendering (computer graphics); Light field; Computer science; Image-based modeling and rendering; Pixel; Computer vision; Artificial intelligence; Real-time rendering; Computer graphics (images); Software rendering; Global illumination; Computer graphics; 3D computer graphics

Metrics

Cited by: 9
FWCI (Field-Weighted Citation Impact): 0.92
References: 45
Citation Normalized Percentile: 0.75


Topics

Advanced Vision and Imaging
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Advanced Image Processing Techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Image Processing Techniques and Applications
Physical Sciences →  Engineering →  Media Technology

