JOURNAL ARTICLE

Texture-Consistent 3D Scene Style Transfer via Transformer-Guided Neural Radiance Fields

Wudi Chen, Zhiyuan Zha, Shigang Wang, Liaqat Ali, Bihan Wen, Xin Yuan, Jiantao Zhou, Ce Zhu

Year: 2025 | Journal: IEEE Transactions on Image Processing | Vol: 34 | Pages: 7193-7208 | Publisher: Institute of Electrical and Electronics Engineers

Abstract

Recent advancements have suggested that neural radiance fields (NeRFs) show great potential in 3D style transfer. However, most existing NeRF-based style transfer methods still face considerable challenges in generating stylized images that simultaneously preserve clear scene textures and maintain strong cross-view consistency. To address these limitations, in this paper, we propose a novel transformer-guided approach for 3D scene style transfer. Specifically, we first design a transformer-based style transfer network to capture long-range dependencies and generate 2D stylized images with initial consistency, which serve as supervision for the 3D stylized generation. To enable fine-grained control over style, we propose a latent style vector as a conditional feature and design a style network that projects this style information into the 3D space. We further develop a merge network that integrates style features with scene geometry to render 3D stylized images that are both visually coherent and stylistically consistent. In addition, we propose a texture consistency loss to preserve scene structure and enhance texture fidelity across views. Extensive quantitative and qualitative experimental results demonstrate that our proposed approach outperforms many state-of-the-art methods in terms of visual perception, image quality and multi-view consistency. Our code and more results are available at: https://github.com/PaiDii/TGTC-Style.git.
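The abstract mentions a texture consistency loss that preserves scene structure across views. The paper's exact formulation is not reproduced on this page; as a minimal sketch, assuming correspondences between views are obtained by warping one stylized view into another's frame (e.g. via depth or flow) with a validity mask, such a cross-view penalty might look like the following. The function name, the L1 choice, and the masking scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def texture_consistency_loss(stylized_a, warped_b, valid_mask):
    """Hypothetical cross-view consistency penalty (not the paper's code).

    stylized_a : (H, W, 3) float array, stylized rendering of view A.
    warped_b   : (H, W, 3) float array, stylized view B warped into A's frame.
    valid_mask : (H, W) bool array, True where the warp found a valid pixel.
    Returns the mean absolute texture mismatch over valid RGB values.
    """
    diff = np.abs(stylized_a - warped_b)      # per-pixel texture mismatch
    n = max(int(valid_mask.sum()), 1)         # guard against an empty mask
    return diff[valid_mask].sum() / (3 * n)   # mean over valid RGB values

# Usage: identical views under a full mask incur zero penalty.
a = np.random.rand(4, 4, 3)
mask = np.ones((4, 4), dtype=bool)
loss = texture_consistency_loss(a, a, mask)
```

A masked mean like this keeps occluded or out-of-frame pixels from dominating the loss, which is the usual motivation for the validity mask in warp-based consistency terms.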

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
References: 66
