NieR: Normal-Based Lighting Scene Rendering

Hongsheng Wang1,2, Yang Wang2, Yalan Liu2, Fayuan Hu2, Shengyu Zhang†1, Fei Wu1 and Feng Lin2


1 Zhejiang University, China      2 Zhejiang Lab, China

Abstract

In real-world road scenes, diverse material properties lead to complex light reflection phenomena, making accurate color reproduction crucial for enhancing the realism and safety of simulated driving environments. However, existing methods often struggle to capture the full spectrum of lighting effects, particularly in dynamic scenarios where viewpoint changes induce significant material color variations. To address this challenge, we introduce NieR (Normal-Based Lighting Scene Rendering), a novel framework that accounts for the nuances of light reflection on diverse material surfaces, leading to more precise rendering. To simulate the lighting synthesis process, we present the LD (Light Decomposition) module, which captures the lighting reflection characteristics on surfaces. Furthermore, to address dynamic lighting scenes, we propose the HNGD (Hierarchical Normal Gradient Densification) module to overcome the limitations of sparse Gaussian representation. Specifically, we dynamically adjust the Gaussian density based on normal gradients. Experimental evaluations demonstrate that our method outperforms state-of-the-art (SOTA) methods in terms of visual quality and exhibits significant advantages in performance indicators. Code is available in the appendix.
Figure: The pipeline of our method, with the LD module on the left and the HNGD module on the right.
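The HNGD module adjusts Gaussian density based on normal gradients. As an illustration only, the sketch below approximates a per-Gaussian normal-gradient score from the angular deviation to neighbouring normals and clones high-score Gaussians near their originals. All names (`densify_by_normal_gradient`, `grad_threshold`, `noise_scale`) and the specific scoring rule are our assumptions, not the paper's implementation.

```python
import numpy as np

def densify_by_normal_gradient(positions, normals,
                               grad_threshold=0.5, noise_scale=0.01, seed=0):
    """Clone Gaussians in regions of high normal variation (illustrative sketch).

    positions: (N, 3) Gaussian centers; normals: (N, 3) unit normals.
    Returns the densified positions/normals and the boolean clone mask.
    """
    rng = np.random.default_rng(seed)
    n = len(positions)
    # Brute-force k-nearest neighbours by Euclidean distance.
    d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    k = min(4, n - 1)
    nbrs = np.argsort(d, axis=1)[:, :k]            # (N, k) neighbour indices
    # Normal-gradient proxy: 1 - mean cosine similarity with neighbour normals.
    cos = np.einsum('ij,ikj->ik', normals, normals[nbrs])
    score = 1.0 - cos.mean(axis=1)
    mask = score > grad_threshold
    # Clone flagged Gaussians with a small positional perturbation.
    new_pos = positions[mask] + rng.normal(scale=noise_scale,
                                           size=(int(mask.sum()), 3))
    new_nrm = normals[mask]
    return (np.concatenate([positions, new_pos]),
            np.concatenate([normals, new_nrm]),
            mask)
```

In a real pipeline the score would come from rendered normal-map gradients and the clone would also split covariance and opacity; this sketch only shows the density-control decision itself.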

Evaluation

To assess NieR's ability to render scenes with realistic lighting and fine details, we compare it against Gaussian Splatting as the baseline method. We employ established quantitative metrics, including Peak Signal-to-Noise Ratio (PSNR), Learned Perceptual Image Patch Similarity (LPIPS), and Structural Similarity Index (SSIM). These metrics evaluate the similarity between rendered images and ground truth data, quantifying visual fidelity and detail preservation. We compare our method with Gaussian Splatting, Mip-NeRF360, InstantNGP, and Plenoxels on seven scenes from the Mip-NeRF360 dataset and two scenes from the Tanks and Temples dataset [2017]. For the full evaluation, please check the paper and the supplemental.
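Of the three metrics above, PSNR is the simplest to reproduce; a minimal NumPy version is sketched below (the function name and `max_val` parameter are ours). LPIPS and SSIM are normally computed with the `lpips` and `scikit-image` packages, since they involve learned features and windowed statistics rather than a single closed-form expression.

```python
import numpy as np

def psnr(rendered, reference, max_val=1.0):
    """Peak Signal-to-Noise Ratio between two images with values in [0, max_val].

    PSNR = 10 * log10(max_val^2 / MSE); higher is better,
    and identical images give +inf.
    """
    mse = np.mean((rendered.astype(np.float64) -
                   reference.astype(np.float64)) ** 2)
    if mse == 0.0:
        return float('inf')
    return 10.0 * np.log10(max_val ** 2 / mse)
```

For example, two images that differ by a constant 0.1 everywhere (on a [0, 1] scale) have MSE = 0.01 and therefore a PSNR of 20 dB.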

Visual Comparisons

Side-by-side comparisons of our renderings ("Ours") against gaussian-splatting, Mip-NeRF360, Instant-NGP [Müller 2022], and the ground truth.