Path sampling methods for differentiable rendering

Tanli Su, Ioannis Gkioulekas

EGSR 2024

[Teaser figure]
We introduce differential sampling and adaptive sampling as new path sampling methods for differentiable rendering. Compared to BRDF sampling, our methods produce less noisy gradients and lead to better inverse rendering optimization.

Abstract

We introduce a suite of path sampling methods for differentiable rendering of scene parameters that do not induce visibility-driven discontinuities, such as BRDF parameters. We begin by deriving a path integral formulation for differentiable rendering of such parameters, which we then use to derive methods that importance sample paths according to this formulation. Our methods are analogous to path tracing and path tracing with next event estimation for primal rendering, have linear complexity, and can be implemented efficiently using path replay backpropagation. Our methods readily benefit from differential BRDF sampling routines, and can be further enhanced using multiple importance sampling and a loss-aware pixel-space adaptive sampling procedure tailored to our path integral formulation. We show experimentally that our methods reduce variance in rendered gradients by potentially orders of magnitude, and thus help accelerate inverse rendering optimization of BRDF parameters.
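To build intuition for differential sampling, below is a minimal 1D sketch (a toy construction of our own, not the paper's path-space estimators). It estimates d/dtheta of the integral of f(x; theta) = x^theta over [0, 1], comparing importance sampling proportional to f (the analogue of BRDF sampling) against sampling proportional to |df/dtheta| (the analogue of differential sampling); in this particular toy, the latter happens to yield a zero-variance gradient estimator.

import numpy as np

rng = np.random.default_rng(0)
theta, n = 2.0, 100_000

# Exact gradient: d/dtheta [1 / (theta + 1)] = -1 / (theta + 1)^2.
exact = -1.0 / (theta + 1.0) ** 2

# "Primal" sampling (analogue of BRDF sampling): x ~ p(x) = (theta + 1) x^theta.
x_p = rng.random(n) ** (1.0 / (theta + 1.0))
grad_primal = np.log(x_p) / (theta + 1.0)           # (df/dtheta) / p(x)

# Differential sampling: x ~ q(x) = (theta + 1)^2 x^theta (-ln x),
# which can be sampled exactly as (U1 * U2)^(1 / (theta + 1)).
x_q = (rng.random(n) * rng.random(n)) ** (1.0 / (theta + 1.0))
dfdtheta = x_q ** theta * np.log(x_q)               # integrand of the gradient
q = (theta + 1.0) ** 2 * x_q ** theta * -np.log(x_q)
grad_diff = dfdtheta / q                            # constant, equal to the exact gradient

print("exact        :", exact)
print("primal       : mean %.5f, var %.2e" % (grad_primal.mean(), grad_primal.var()))
print("differential : mean %.5f, var %.2e" % (grad_diff.mean(), grad_diff.var()))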

Variance of image gradients

[Figure: per-pixel variance of image gradients]
Our differential sampling method yields significant variance reduction for gradient estimation. We visualize the per-pixel variance of image gradients with and without differential sampling, multiple importance sampling (MIS), and next event estimation (NEE). Numbers are ratios of the mean variance of our method to that of BRDF sampling; lower is better.
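The two strategies from the 1D toy example above can also be combined with multiple importance sampling. The sketch below (again our own toy, not the paper's MIS estimator over paths) draws one sample from each pdf and weights the contributions with the balance heuristic.

import numpy as np

rng = np.random.default_rng(0)
theta, n = 2.0, 100_000

def p_primal(x):    # pdf proportional to f(x; theta) = x^theta
    return (theta + 1.0) * x ** theta

def p_diff(x):      # pdf proportional to |df/dtheta| = x^theta (-ln x)
    return (theta + 1.0) ** 2 * x ** theta * -np.log(x)

def dfdtheta(x):    # integrand of the gradient
    return x ** theta * np.log(x)

x1 = rng.random(n) ** (1.0 / (theta + 1.0))                     # sample from p_primal
x2 = (rng.random(n) * rng.random(n)) ** (1.0 / (theta + 1.0))   # sample from p_diff

w1 = p_primal(x1) / (p_primal(x1) + p_diff(x1))   # balance heuristic weights
w2 = p_diff(x2) / (p_primal(x2) + p_diff(x2))
grad_mis = w1 * dfdtheta(x1) / p_primal(x1) + w2 * dfdtheta(x2) / p_diff(x2)

print("exact:", -1.0 / (theta + 1.0) ** 2)
print("MIS  : mean %.5f, var %.2e" % (grad_mis.mean(), grad_mis.var()))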

Inverse rendering optimization

Our combined differential and adaptive sampling method leads to improved inverse rendering performance. Below, we compare optimization results from standard BRDF sampling and our method (combined differential and adaptive sampling).

[Comparison images for three scenes: Initialization, Standard, Ours, Target]

Resources

Paper: Our paper is available here.

Code: Our code is available on GitHub.

Citation

@InProceedings{Su:2024:Sampling,
	author    = {Su, Tanli and Gkioulekas, Ioannis},
	title     = {Path Sampling Methods for Differentiable Rendering},
	booktitle = {Eurographics Symposium on Rendering},
	month     = {July},
	year      = {2024},
}

Acknowledgments

We thank Yash Belhe for giving us early access to code for the differential BRDF sampling methods in Belhe et al. [2024]. The Pans and Sphere scenes are by Zhang et al. [2021], the Dragon scene is by Nicolet et al. [2023], and the Vases scene is by user Wig42 on Blend Swap. This work was supported by NSF awards 1900849 and 2008123, an NSF Graduate Research Fellowship (DGE2140739), and a Sloan Research Fellowship.