Path sampling methods for differentiable rendering

Tanli Su, Ioannis Gkioulekas

EGSR 2024

[Teaser figure]
We introduce differential sampling and adaptive sampling as new path sampling methods for differentiable rendering. Compared to BRDF sampling, our methods produce less noisy gradients and lead to better inverse rendering optimization.

Abstract

We introduce a suite of path sampling methods for differentiable rendering of scene parameters that do not induce visibility-driven discontinuities, such as BRDF parameters. We begin by deriving a path integral formulation for differentiable rendering of such parameters, which we then use to derive methods that importance sample paths according to this formulation. Our methods are analogous to path tracing and path tracing with next event estimation for primal rendering, have linear complexity, and can be implemented efficiently using path replay backpropagation. Our methods readily benefit from differential BRDF sampling routines, and can be further enhanced using multiple importance sampling and a loss-aware pixel-space adaptive sampling procedure tailored to our path integral formulation. We show experimentally that our methods reduce variance in rendered gradients by potentially orders of magnitude, and thus help accelerate inverse rendering optimization of BRDF parameters.
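For intuition about why sampling paths in proportion to the differential (rather than primal) integrand reduces gradient variance, the minimal 1D NumPy toy below compares the two strategies on a simple analytic integrand. It is a sketch of the general importance-sampling principle only, not of the path-space estimators derived in the paper.

```python
# Toy 1D illustration (not the paper's renderer): importance sampling the
# derivative integrand directly ("differential sampling") versus reusing the
# primal sampling density (the analogue of BRDF sampling) to estimate a gradient.
#
# Integrand: f_theta(x) = exp(-theta * x) on [0, inf), so
#   I(theta)  = 1 / theta
#   dI/dtheta = -1 / theta**2
#   df/dtheta = -x * exp(-theta * x)
import numpy as np

rng = np.random.default_rng(0)
theta, n = 2.0, 10_000

# Strategy 1: sample x proportional to the primal integrand f_theta
# (exponential with rate theta), then weight by (df/dtheta) / pdf.
x = rng.exponential(scale=1.0 / theta, size=n)
pdf_primal = theta * np.exp(-theta * x)
est_primal = (-x * np.exp(-theta * x)) / pdf_primal

# Strategy 2: sample x proportional to |df/dtheta| = x * exp(-theta * x)
# (a Gamma(2, theta) density), then weight by (df/dtheta) / pdf.
x = rng.gamma(shape=2.0, scale=1.0 / theta, size=n)
pdf_diff = theta**2 * x * np.exp(-theta * x)
est_diff = (-x * np.exp(-theta * x)) / pdf_diff

print("analytic gradient:     %.5f" % (-1.0 / theta**2))
print("primal-style sampling: mean %.5f  var %.2e" % (est_primal.mean(), est_primal.var()))
print("differential sampling: mean %.5f  var %.2e" % (est_diff.mean(), est_diff.var()))
```

Because the derivative integrand in this toy is single-signed, sampling proportionally to its magnitude gives a (nearly) zero-variance estimator, while reusing the primal density leaves substantial residual variance.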

Variance of image gradients

[Figure: per-pixel variance of image gradients]
Our differential sampling method yields significant variance reduction for gradient estimation. We visualize the per-pixel variance of image gradients with and without differential sampling, multiple importance sampling (MIS), and next event estimation (NEE). Numbers are ratios of mean variance between our method and BRDF sampling; lower is better.
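The MIS columns above combine estimates from multiple sampling strategies. As a hedged sketch (using the generic balance heuristic, not necessarily the exact weights used in the paper), the snippet below combines the two sampling densities from the earlier toy example for the same derivative integrand.

```python
# Sketch of balance-heuristic MIS for a derivative integrand, reusing the toy
# integrand f_theta(x) = exp(-theta * x) from the previous example. This is a
# generic illustration, not the paper's MIS weighting scheme.
import numpy as np

theta = 2.0
d_f = lambda x: -x * np.exp(-theta * x)                 # df/dtheta
pdf_primal = lambda x: theta * np.exp(-theta * x)       # "BRDF-style" density
pdf_diff = lambda x: theta**2 * x * np.exp(-theta * x)  # "differential" density

def mis_gradient(n, rng):
    # Draw one batch from each strategy and combine with the balance heuristic
    # (equal sample counts, so the weights reduce to pdf ratios).
    xa = rng.exponential(scale=1.0 / theta, size=n)
    xb = rng.gamma(shape=2.0, scale=1.0 / theta, size=n)
    wa = pdf_primal(xa) / (pdf_primal(xa) + pdf_diff(xa))
    wb = pdf_diff(xb) / (pdf_primal(xb) + pdf_diff(xb))
    return (wa * d_f(xa) / pdf_primal(xa)).mean() + (wb * d_f(xb) / pdf_diff(xb)).mean()

rng = np.random.default_rng(1)
print("MIS estimate:", mis_gradient(10_000, rng), " analytic:", -1.0 / theta**2)
```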

Inverse rendering optimization

Our combined differential and adaptive sampling method leads to improved inverse rendering performance. Below, we compare optimization results from standard BRDF sampling and our method (combined differential and adaptive sampling).

[Image comparisons for three scenes, columns: Initialization | Standard | Ours | Target]
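As a rough illustration of what a loss-aware, pixel-space adaptive sampling step can look like (a simplified sketch, not the exact procedure from the paper; the function and parameter names are hypothetical), the snippet below distributes a fixed sample budget across pixels in proportion to a pilot estimate of each pixel's loss derivative.

```python
# Illustrative sketch only: allocate a per-pixel sample budget using a cheap
# pilot estimate of the per-pixel loss derivative, so pixels that contribute
# more to the loss gradient receive more samples.
import numpy as np

def allocate_samples(loss_grad_pilot, total_budget, min_spp=1):
    """loss_grad_pilot: per-pixel |dLoss/dI| from a low-sample pilot pass."""
    weights = np.abs(loss_grad_pilot).astype(np.float64)
    weights += 1e-8                     # keep every pixel selectable
    weights /= weights.sum()
    # Flooring plus the min_spp clamp means the total may deviate slightly
    # from total_budget; a real allocator would redistribute the remainder.
    spp = np.maximum(min_spp, np.floor(weights * total_budget)).astype(int)
    return spp

# Example: 4x4 image where the pilot says the top-left region dominates the loss.
pilot = np.zeros((4, 4)); pilot[:2, :2] = 1.0
print(allocate_samples(pilot, total_budget=256))
```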

Video

Resources

Paper: Our paper is available on the Eurographics Digital Library and locally.

Presentation: Our presentation slides are available here.

Code: Our code is available on GitHub.

Citation

@inproceedings{Su:2024:Sampling,
	booktitle = {Eurographics Symposium on Rendering},
	editor = {Haines, Eric and Garces, Elena},
	title = {{Path Sampling Methods for Differentiable Rendering}},
	author = {Su, Tanli and Gkioulekas, Ioannis},
	year = {2024},
	publisher = {The Eurographics Association},
	ISSN = {1727-3463},
	ISBN = {978-3-03868-262-2},
	DOI = {10.2312/sr.20241148}
}

Acknowledgments

We thank Yash Belhe for giving us early access to code for the differential BRDF sampling methods in Belhe et al. [2024]. The Pans and Sphere scenes are by Zhang et al. [2021], the Dragon scene is by Nicolet et al. [2023], and the Vases scene is by user Wig42 on Blend Swap. This work was supported by NSF awards 1900849 and 2008123, NSF Graduate Research Fellowship DGE2140739, and a Sloan Research Fellowship.