JOURNAL ARTICLE

DE-NeRF: DEcoupled Neural Radiance Fields for View-Consistent Appearance Editing and High-Frequency Environmental Relighting

Abstract

Neural Radiance Fields (NeRF) have shown promising results in novel view synthesis. While achieving state-of-the-art rendering quality, NeRF typically encodes all geometry- and appearance-related properties of the scene together in several Multi-Layer Perceptron (MLP) networks, which hinders downstream manipulation of geometry, appearance, and illumination. Recently, researchers have attempted to edit the geometry, appearance, and lighting of NeRF scenes. However, these methods fail to render view-consistent results after editing the appearance of the input scene. Moreover, high-frequency environmental relighting is also beyond their capability, as lighting is modeled with Spherical Gaussian (SG) or Spherical Harmonic (SH) functions or a low-resolution environment map. To solve these problems, we propose DE-NeRF, which decouples view-independent and view-dependent appearance in the scene with a hybrid lighting representation. Specifically, we first train a signed distance function to reconstruct an explicit mesh for the input scene. A decoupled NeRF then learns to attach view-independent appearance to the reconstructed mesh by defining learnable, disentangled features representing geometry and view-independent appearance on its vertices. Lighting is approximated with an explicit learnable environment map and an implicit lighting network, supporting both low-frequency and high-frequency relighting. Because edits modify the view-independent appearance, rendered results remain consistent across viewpoints. Our method also supports high-frequency environmental relighting: the explicit environment map is replaced with a novel one, and the implicit lighting network is fitted to the novel map. Experiments show that our method achieves better editing and relighting performance, both quantitatively and qualitatively, than previous methods.
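The hybrid lighting representation described above can be illustrated with a minimal sketch. This is not the authors' code: the map resolution, the equirectangular parameterization, and the tiny random "network" standing in for the implicit lighting MLP are all placeholder assumptions. The idea shown is only the structure: an explicit, learnable environment map provides a directly editable base lighting term, and an implicit network adds a direction-dependent correction that can capture higher-frequency effects.

```python
import numpy as np

# Hedged illustrative sketch of a hybrid lighting query: explicit
# environment-map lookup plus an implicit per-direction residual.
# All names and shapes below are assumptions for illustration.

H, W = 16, 32                             # low-resolution explicit env map
rng = np.random.default_rng(0)
env_map = rng.uniform(size=(H, W, 3))     # learnable RGB texels (random here)

# Placeholder "implicit lighting network": a single random linear map from
# a unit direction to an RGB residual; a real model would be a trained MLP.
W1 = rng.normal(scale=0.1, size=(3, 3))

def dir_to_uv(d):
    """Map a unit direction to equirectangular (u, v) coordinates in [0, 1]."""
    d = d / np.linalg.norm(d)
    u = (np.arctan2(d[1], d[0]) / (2 * np.pi)) % 1.0
    v = np.arccos(np.clip(d[2], -1.0, 1.0)) / np.pi
    return u, v

def query_lighting(d):
    """Hybrid lighting: explicit env-map texel + implicit residual term."""
    u, v = dir_to_uv(d)
    x = min(int(u * W), W - 1)
    y = min(int(v * (H - 1)), H - 1)
    explicit = env_map[y, x]              # editable low-frequency base
    implicit = d @ W1                     # high-frequency correction (placeholder)
    return explicit + implicit

radiance = query_lighting(np.array([0.0, 0.0, 1.0]))
print(radiance.shape)                     # RGB triple for that direction
```

Under this structure, relighting to a novel environment amounts to swapping `env_map` for the new map and re-fitting the implicit term, which matches the procedure the abstract describes at a high level.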

Keywords:
Computer science; Computer vision; Artificial intelligence; Radiance; Rendering (computer graphics); Artificial neural network; Image-based lighting; Computer graphics (images); Image-based modeling and rendering; Optics; Physics

Metrics

Cited By: 26
FWCI (Field Weighted Citation Impact): 17.21
References: 33
Citation Normalized Percentile: 0.99 (top 1%)

Topics

Computer Graphics and Visualization Techniques
Physical Sciences →  Computer Science →  Computer Graphics and Computer-Aided Design
3D Shape Modeling and Analysis
Physical Sciences →  Engineering →  Computational Mechanics
Advanced Vision and Imaging
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE

VD-NeRF: Visibility-Aware Decoupled Neural Radiance Fields for View-Consistent Editing and High-Frequency Relighting

Tong Wu, Jia-Mu Sun, Yu-Kun Lai, Lin Gao

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence, Year: 2025, Vol: 47 (5), Pages: 3344-3357
JOURNAL ARTICLE

Ref-NeRF: Structured View-Dependent Appearance for Neural Radiance Fields

Dor Verbin, Peter Hedman, Ben Mildenhall, Todd Zickler, Jonathan T. Barron, Pratul P. Srinivasan

Proceedings: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Year: 2022, Pages: 5481-5490
JOURNAL ARTICLE

Ref-NeRF: Structured View-Dependent Appearance for Neural Radiance Fields

Dor Verbin, Peter Hedman, Ben Mildenhall, Todd Zickler, Jonathan T. Barron, Pratul P. Srinivasan

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence, Year: 2024, Vol: 47 (11), Pages: 9426-9437
JOURNAL ARTICLE

NeRF-Editing: Geometry Editing of Neural Radiance Fields

Yu-Jie Yuan, Yang-Tian Sun, Yu-Kun Lai, Yuewen Ma, Rongfei Jia, Lin Gao

Proceedings: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Year: 2022, Pages: 18332-18343
BOOK-CHAPTER

CaSE-NeRF: Camera Settings Editing of Neural Radiance Fields

Ciliang Sun, Yuqi Li, Jiabao Li, Chong Wang, Xinmiao Dai

Book series: Lecture Notes in Computer Science, Year: 2023, Pages: 95-107