JOURNAL ARTICLE

FoV-NeRF: Foveated Neural Radiance Fields for Virtual Reality

Nianchen Deng, Zhenyi He, Jiannan Ye, Budmonde Duinkharjav, Praneeth Chakravarthula, Xubo Yang, Qi Sun

Year: 2022   Journal: IEEE Transactions on Visualization and Computer Graphics   Vol: 28 (11)   Pages: 3854-3864   Publisher: Institute of Electrical and Electronics Engineers

Abstract

Virtual Reality (VR) is becoming ubiquitous with the rise of consumer displays and commercial VR platforms. Such displays require low latency and high quality rendering of synthetic imagery with reduced compute overheads. Recent advances in neural rendering showed promise of unlocking new possibilities in 3D computer graphics via image-based representations of virtual or physical environments. Specifically, the neural radiance fields (NeRF) demonstrated that photo-realistic quality and continuous view changes of 3D scenes can be achieved without loss of view-dependent effects. While NeRF can significantly benefit rendering for VR applications, it faces unique challenges posed by high field-of-view, high resolution, and stereoscopic/egocentric viewing, typically causing low quality and high latency of the rendered images. In VR, this not only harms the interaction experience but may also cause sickness. To tackle these problems toward six-degrees-of-freedom, egocentric, and stereo NeRF in VR, we present the first gaze-contingent 3D neural representation and view synthesis method. We incorporate the human psychophysics of visual- and stereo-acuity into an egocentric neural representation of 3D scenery. We then jointly optimize the latency/performance and visual quality while mutually bridging human perception and neural scene synthesis to achieve perceptually high-quality immersive interaction. We conducted both objective analysis and subjective studies to evaluate the effectiveness of our approach. We find that our method significantly reduces latency (up to 99% time reduction compared with NeRF) without loss of high-fidelity rendering (perceptually identical to full-resolution ground truth). The presented approach may serve as the first step toward future VR/AR systems that capture, teleport, and visualize remote environments in real-time.

Keywords:
Computer science, Rendering (computer graphics), Virtual reality, Computer vision, Stereoscopy, Artificial intelligence, Perception, Ground truth, Radiance, Visualization, Image quality, Computer graphics (images), Image (mathematics)

Metrics

Cited By: 121
FWCI (Field Weighted Citation Impact): 14.98
Refs: 69
Citation Normalized Percentile: 0.99 (is in top 1%; is in top 10%)

Topics

Advanced Vision and Imaging
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Computer Graphics and Visualization Techniques
Physical Sciences →  Computer Science →  Computer Graphics and Computer-Aided Design
3D Shape Modeling and Analysis
Physical Sciences →  Engineering →  Computational Mechanics

Related Documents

JOURNAL ARTICLE

Scene-Aware Foveated Neural Radiance Fields

Xuehuai Shi, Lili Wang, Xinda Liu, Wu Jian, Zhiwen Shao

Journal: IEEE Transactions on Visualization and Computer Graphics   Year: 2024   Vol: 31 (9)   Pages: 5039-5054
JOURNAL ARTICLE

Point-NeRF: Point-based Neural Radiance Fields

Qiangeng Xu, Zexiang Xu, Julien Philip, Sai Bi, Zhixin Shu, Kalyan Sunkavalli, Ulrich Neumann

Journal: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)   Year: 2022   Pages: 5428-5438
JOURNAL ARTICLE

MS-NeRF: Multi-Space Neural Radiance Fields

Ze-Xin Yin, Pengyi Jiao, Jiaxiong Qiu, Ming‐Ming Cheng, Bo Ren

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence   Year: 2025   Vol: 47 (5)   Pages: 3766-3783