PBR-NeRF

Inverse Rendering with Physics-Based
Neural Fields

Sean Wu1     Shamik Basu2     Tim Broedermann1     Luc Van Gool1,3     Christos Sakaridis1

CVPR 2025

1ETH Zurich     2University of Bologna     3INSAIT, Sofia University St. Kliment Ohridski

PBR-NeRF uses physics-based losses to fix baked-in specular highlights (highlighted areas) and recover realistic materials and lighting.

1. Enforces energy conservation by penalizing BRDFs that reflect more light than they receive.

2. Reduces baked-in specular highlights (highlighted areas) by promoting separate diffuse and specular lobes.

Predicted novel views (RGB), normals, roughness, metallicness, and albedo from 49 input views and a point cloud.

Summary

  • PBR-NeRF uses physics-informed neural fields to solve the inverse rendering problem: jointly estimating geometry, materials, and lighting from posed images.
  • Standard NeRF and 3DGS ignore light transport physics, treating scenes as black boxes that memorize view-dependent effects like reflections.
  • Inverse rendering enables realistic and editable scenes, but is highly ill-posed: infinitely many combinations of materials, lighting, and geometry can produce the same image.
  • Our contribution: Physics-based priors make inverse rendering more accurate and tractable by enforcing (1) energy conservation in materials, and (2) separate diffuse and specular lobes to reduce baked-in highlights.
  • Results: State-of-the-art material estimation on the NeILF++ and DTU benchmarks with equal or better novel view synthesis quality.

Architecture

Jointly trained neural fields for geometry (NeRF SDF), materials (BRDF field), and lighting (NeILF)
⇒ expressive differentiable renderer
Conservation of Energy Loss \( \mathcal{L}_{\text{cons}} \) and Specular Loss \( \mathcal{L}_{\text{spec}} \)
⇒ enforce physically correct rendering
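
Below is a minimal PyTorch sketch of how the three jointly trained fields could be parameterized. The class names, MLP sizes, and the absence of positional encodings are illustrative assumptions for exposition, not the paper's exact architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

def make_mlp(in_dim, out_dim, hidden=256, depth=4):
    # Plain MLP shared by all three fields in this sketch; the real networks
    # use positional encodings and deeper architectures.
    dims = [in_dim] + [hidden] * (depth - 1) + [out_dim]
    layers = []
    for i in range(len(dims) - 1):
        layers.append(nn.Linear(dims[i], dims[i + 1]))
        if i < len(dims) - 2:
            layers.append(nn.ReLU())
    return nn.Sequential(*layers)

class GeometryField(nn.Module):
    """Maps a 3D point x to a signed distance (SDF-based geometry)."""
    def __init__(self):
        super().__init__()
        self.net = make_mlp(3, 1)
    def forward(self, x):
        return self.net(x)

class BRDFField(nn.Module):
    """Maps a surface point x to Disney BRDF parameters:
    albedo (3), roughness (1), metallicness (1), all in [0, 1]."""
    def __init__(self):
        super().__init__()
        self.net = make_mlp(3, 5)
    def forward(self, x):
        out = torch.sigmoid(self.net(x))
        return out[..., :3], out[..., 3:4], out[..., 4:5]

class IncidentLightField(nn.Module):
    """Maps a surface point x and incoming direction wi to RGB incident
    radiance (NeILF-style spatially-varying lighting)."""
    def __init__(self):
        super().__init__()
        self.net = make_mlp(6, 3)
    def forward(self, x, wi):
        return F.softplus(self.net(torch.cat([x, wi], dim=-1)))

All three fields are queried at surface points obtained from the SDF and optimized jointly with the photometric and physics-based losses.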

Physics-based Losses for Inverse Rendering

Our physics-based losses constrain the material BRDF (and indirectly the lighting):
1. Conservation of Energy Loss \( \mathcal{L}_{\text{cons}} \) supervises the full BRDF \( f_r = f_s + f_d \) (dotted envelope).
2. Specular Loss \( \mathcal{L}_{\text{spec}} \) adjusts relative magnitudes of the specular \( f_s \) and diffuse \( f_d \) lobes.
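
The following PyTorch sketch illustrates the intent of the two losses under simplifying assumptions: the energy conservation term penalizes a Monte Carlo estimate of the hemispherical reflectance exceeding 1, and the specular term penalizes the diffuse lobe dominating near the mirror direction. The exact formulations in the paper differ; the function names and the threshold tau are illustrative.

import torch

def energy_conservation_loss(f_r, cos_theta_i, pdf):
    """Hinge penalty on the estimated directional-hemispherical reflectance.

    f_r:         (N, S, 3) full BRDF f_r = f_s + f_d at S sampled incoming directions
    cos_theta_i: (N, S, 1) cosine of the incident angle for each sample
    pdf:         (N, S, 1) sampling density of each sampled direction

    Monte Carlo estimate of the hemisphere integral of f_r * cos(theta_i);
    values above 1 mean the material reflects more energy than it receives.
    """
    reflectance = (f_r * cos_theta_i / pdf.clamp_min(1e-6)).mean(dim=1)  # (N, 3)
    return torch.relu(reflectance - 1.0).mean()

def specular_separation_loss(f_s, f_d, cos_mirror, tau=0.95):
    """Illustrative diffuse/specular separation penalty (not the paper's exact form).

    Near the mirror direction (cos_mirror > tau) the view-dependent highlight
    should be explained by the specular lobe, so penalize the diffuse lobe
    exceeding the specular lobe there.
    """
    near_mirror = (cos_mirror > tau).float()
    return (near_mirror * torch.relu(f_d - f_s)).mean()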

Inverse Rendering Results

Material Estimation

The BRDF field predicts albedo, metallicness, and roughness using the Disney BRDF. Predicted materials can be extracted, edited, or re-rendered under different lighting.
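
As a reference for the material parameterization, here is a minimal sketch of a simplified Disney (metallic-roughness) BRDF with a GGX specular lobe; this is a common approximation in NeILF-style inverse rendering and may differ in detail from the exact BRDF used in the paper.

import math
import torch
import torch.nn.functional as F

def simplified_disney_brdf(albedo, roughness, metallic, n, wi, wo, eps=1e-6):
    """Simplified Disney (metallic-roughness) BRDF.
    albedo (.., 3), roughness (.., 1), metallic (.., 1)
    n, wi, wo: unit normal, incoming and outgoing directions (.., 3)
    Returns the diffuse and specular lobes separately."""
    h = F.normalize(wi + wo, dim=-1)
    n_dot_i = (n * wi).sum(-1, keepdim=True).clamp_min(eps)
    n_dot_o = (n * wo).sum(-1, keepdim=True).clamp_min(eps)
    n_dot_h = (n * h).sum(-1, keepdim=True).clamp_min(eps)
    h_dot_o = (h * wo).sum(-1, keepdim=True).clamp_min(eps)

    # Diffuse lobe: Lambertian, suppressed for metallic materials.
    f_d = (1.0 - metallic) * albedo / math.pi

    # Specular lobe: GGX normal distribution, Schlick Fresnel, Smith visibility.
    a = roughness ** 2
    d = a**2 / (math.pi * ((n_dot_h**2) * (a**2 - 1.0) + 1.0) ** 2 + eps)
    f0 = 0.04 * (1.0 - metallic) + albedo * metallic
    fresnel = f0 + (1.0 - f0) * (1.0 - h_dot_o) ** 5
    k = (roughness + 1.0) ** 2 / 8.0
    g = (n_dot_i / (n_dot_i * (1 - k) + k)) * (n_dot_o / (n_dot_o * (1 - k) + k))
    f_s = d * fresnel * g / (4.0 * n_dot_i * n_dot_o + eps)
    return f_d, f_s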

Lighting Estimation

The Neural Incident Light Field (NeILF) learns spatially-varying lighting and handles occlusion, ambient lighting, and multi-bounce reflections. Our physics-based losses improve estimated lighting quality with more concentrated light sources and fewer artifacts.
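
A short sketch of how such a light field might be probed: querying the (assumed) light_field(x, wi) module from the architecture sketch above over a grid of directions yields a per-point environment map, in which occlusion and indirect bounces appear implicitly in the predicted radiance.

import math
import torch

def incident_light_at_point(light_field, x, res=16):
    """Query a NeILF-style incident light field at one surface point x (shape (3,))
    over a lat-long grid of directions, giving a per-point 'environment map'.
    Occlusion and multi-bounce effects are baked into the returned radiance
    rather than being modeled explicitly."""
    theta = torch.linspace(0.0, math.pi, res)          # polar angle
    phi = torch.linspace(0.0, 2.0 * math.pi, 2 * res)  # azimuth
    t, p = torch.meshgrid(theta, phi, indexing="ij")
    wi = torch.stack([torch.sin(t) * torch.cos(p),
                      torch.sin(t) * torch.sin(p),
                      torch.cos(t)], dim=-1)           # (res, 2*res, 3) directions
    x_rep = x.expand(res, 2 * res, 3)
    with torch.no_grad():
        radiance = light_field(x_rep, wi)              # (res, 2*res, 3) RGB
    return radiance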

Geometry Estimation

The Signed Distance Function (NeRF SDF) represents smooth surface geometry, which is crucial for physics-based rendering (PBR). Here we visualize the predicted normals.
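
A minimal sketch of how normals can be obtained from an SDF via automatic differentiation, assuming the geometry field from the sketch above returns one signed distance per point:

import torch

def sdf_normals(sdf_field, x):
    """Surface normals as the normalized gradient of the signed distance field."""
    x = x.clone().requires_grad_(True)
    d = sdf_field(x)                                   # (N, 1) signed distances
    grad, = torch.autograd.grad(d, x, grad_outputs=torch.ones_like(d),
                                create_graph=True)     # (N, 3) SDF gradient
    return torch.nn.functional.normalize(grad, dim=-1)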

Novel View Synthesis (RGB) Results

We render novel views by evaluating the Rendering Equation with our predicted geometry (NeRF SDF), materials (BRDF field), and lighting (NeILF). Our physics-based inverse rendering produces RGB predictions that match or exceed NeILF++ in quality.
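
A minimal Monte Carlo sketch of this rendering step, reusing the field and BRDF sketches above; sample_dirs_fn is a hypothetical helper (e.g. cosine-weighted hemisphere sampling) returning incoming directions and their pdf, and the estimator is a plain single-bounce approximation rather than the paper's exact renderer.

import torch

def render_outgoing_radiance(x, n, wo, brdf_field, light_field,
                             sample_dirs_fn, num_samples=128):
    """Monte Carlo estimate of the rendering equation:
    L_o(x, wo) = integral over the hemisphere of f_r(x, wi, wo) * L_i(x, wi) * (n . wi) dwi
    x, n, wo: (N, 3) surface points, normals, outgoing (view) directions."""
    albedo, roughness, metallic = brdf_field(x)             # materials at x
    wi, pdf = sample_dirs_fn(n, num_samples)                # (N, S, 3), (N, S, 1)
    li = light_field(x.unsqueeze(1).expand_as(wi), wi)      # incident radiance (N, S, 3)
    f_d, f_s = simplified_disney_brdf(albedo.unsqueeze(1), roughness.unsqueeze(1),
                                      metallic.unsqueeze(1), n.unsqueeze(1),
                                      wi, wo.unsqueeze(1))  # BRDF sketch above
    cos_i = (n.unsqueeze(1) * wi).sum(-1, keepdim=True).clamp_min(0.0)
    return ((f_d + f_s) * li * cos_i / pdf.clamp_min(1e-6)).mean(dim=1)  # (N, 3) RGB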

SOTA Material Estimation

Albedo

Our predicted albedo is more consistent, with fewer artifacts from shadows or baked-in specular highlights.

Metallicness

Our predicted metallicness is more robust to shadows and fringing artifacts.

Roughness

We correctly predict a lower (darker) roughness than NeILF++, reducing overestimated diffuse reflection.

BibTeX

@inproceedings{wu2025pbrnerf,
      title     = {{PBR-NeRF}: Inverse Rendering with Physics-Based Neural Fields},
      author    = {Wu, Sean and Basu, Shamik and Broedermann, Tim and Van Gool, Luc and Sakaridis, Christos},
      booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
      year      = {2025}
}
    

This webpage template is from Nerfies. The video comparison with sliding bar is from Ref-NeRF. The image comparison with sliding bar is from Neuralangelo.