3D reconstruction with fast dipole sums

Hanyu Chen, Bailey Miller, Ioannis Gkioulekas

ACM Transactions on Graphics (SIGGRAPH Asia) 2024

The regularized dipole sum is a point-based representation that can model both implicit geometry and radiance fields using per-point attributes, and it supports efficient ray tracing and differentiable rendering, facilitating optimization from multi-view images. We initialize our regularized dipole sum representation with the dense point cloud output by a structure-from-motion pipeline (COLMAP). Bootstrapping from this initialization, we use inverse rendering to optimize the per-point attributes (visualized in the insets as varying point radii), resulting in a higher-quality surface reconstruction. Images are from the “Komainu / Kobe / Ikuta-jinja” dataset by Open Heritage 3D.
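
For intuition, the regularized dipole sum builds on the point-based winding number, a sum of dipole (Poisson kernel) terms over the oriented points. The following is a sketch, with an illustrative epsilon-style smoothing standing in for the exact regularized kernel defined in the paper:

	w(q) = \sum_i a_i \frac{(p_i - q) \cdot n_i}{4\pi \left( \lVert p_i - q \rVert^2 + \epsilon^2 \right)^{3/2}}

Here p_i and n_i are the point positions and oriented normals, a_i are per-point area weights, and \epsilon > 0 tempers the kernel's singularity so that noisy or outlier points do not dominate nearby queries. Replacing the scalar weights with learned per-point attributes turns the same sum into an attribute interpolant, which is how a single point cloud can carry both implicit geometry and radiance fields.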

Abstract

We introduce a method for high-quality 3D reconstruction from multi-view images. Our method uses a new point-based representation, the regularized dipole sum, which generalizes the winding number to allow for interpolation of per-point attributes in point clouds with noisy or outlier points. Using regularized dipole sums, we represent implicit geometry and radiance fields as per-point attributes of a dense point cloud, which we initialize from structure from motion. We additionally derive Barnes-Hut fast summation schemes for accelerated forward and adjoint dipole sum queries. These queries facilitate the use of ray tracing to efficiently and differentiably render images with our point-based representations, and thus update their point attributes to optimize scene geometry and appearance. We evaluate our method in inverse rendering applications against state-of-the-art alternatives, based on ray tracing of neural representations or rasterization of Gaussian point-based representations. Our method significantly improves 3D reconstruction quality and robustness at equal runtimes, while also supporting more general rendering methods such as shadow rays for direct illumination.
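
To make the fast-summation queries concrete, here is a minimal Barnes-Hut sketch of a forward dipole sum query in Python/NumPy. This is a sketch, not the implementation in our released code: the epsilon-regularized kernel, opening-angle threshold, leaf size, and all names below are illustrative assumptions. Each octree cell caches an aggregate attribute-weighted dipole, and sufficiently distant cells are evaluated with that single dipole instead of their individual points, reducing per-query cost from linear to roughly logarithmic in the point count.

	# Minimal Barnes-Hut sketch of a forward dipole sum query
	# (illustrative; not the paper's released implementation).
	import numpy as np

	EPS = 1e-2    # kernel regularization strength (hypothetical choice)
	THETA = 0.5   # Barnes-Hut opening-angle threshold

	class Cell:
	    def __init__(self, center, half, idx):
	        self.center, self.half, self.idx = center, half, idx
	        self.children = []
	        self.centroid = None   # mean position of the cell's points
	        self.dipole = None     # aggregate attribute-weighted normal

	def build(points, normals, attrs, idx, center, half, leaf_size=8):
	    cell = Cell(center, half, idx)
	    cell.centroid = points[idx].mean(axis=0)
	    cell.dipole = (attrs[idx, None] * normals[idx]).sum(axis=0)
	    if len(idx) > leaf_size:
	        # split the points among the 8 octants of this cell
	        code = ((points[idx] > center).astype(int) * 2 ** np.arange(3)).sum(axis=1)
	        for o in range(8):
	            sub = idx[code == o]
	            if len(sub) == 0:
	                continue
	            offs = (np.array([(o >> k) & 1 for k in range(3)]) - 0.5) * half
	            cell.children.append(
	                build(points, normals, attrs, sub, center + offs, half / 2, leaf_size))
	    return cell

	def kernel(q, p, d):
	    # regularized dipole kernel: (p - q) . d / (4 pi (|p - q|^2 + eps^2)^(3/2))
	    r = p - q
	    return (r @ d) / (4.0 * np.pi * (r @ r + EPS ** 2) ** 1.5)

	def query(cell, points, normals, attrs, q):
	    if not cell.children:   # leaf: exact summation over its points
	        return sum(kernel(q, points[i], attrs[i] * normals[i]) for i in cell.idx)
	    dist = np.linalg.norm(cell.centroid - q)
	    if 2.0 * cell.half < THETA * dist:   # far field: one aggregate dipole
	        return kernel(q, cell.centroid, cell.dipole)
	    return sum(query(c, points, normals, attrs, q) for c in cell.children)

	# Example: with uniform weights, the sum behaves like a regularized
	# winding-number field for the point cloud.
	rng = np.random.default_rng(0)
	P = rng.uniform(-1.0, 1.0, (2000, 3))
	N = rng.normal(size=(2000, 3))
	N /= np.linalg.norm(N, axis=1, keepdims=True)
	A = np.ones(len(P))
	root = build(P, N, A, np.arange(len(P)), np.zeros(3), 1.0)
	print(query(root, P, N, A, np.array([0.1, 0.2, 0.3])))

The corresponding adjoint query, which backpropagates a query point's sensitivity to the per-point attributes, can reuse the same tree traversal; the paper derives these adjoints alongside the forward scheme.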

Video summary

Visualization

A visualization of all our 3D reconstruction results is available on the interactive supplemental website.

Interactive comparison: Reference, Ours, NeuS2, Surfels.

Resources

Paper: Our paper and supplement are available on the ACM Digital Library, on arXiv, and locally.

Presentation: Our presentation slides are available here.

Code: Our code is available on GitHub.

Data: The data to reproduce our experiments is available on Amazon S3 for Blended MVS (train and test data, point clouds) and DTU (train and test data, point clouds).

Citation

@article{Chen:Dipoles:2024,
	author = {Chen, Hanyu and Miller, Bailey and Gkioulekas, Ioannis},
	title = {3D Reconstruction with Fast Dipole Sums},
	year = {2024},
	issue_date = {December 2024},
	publisher = {Association for Computing Machinery},
	address = {New York, NY, USA},
	volume = {43},
	number = {6},
	issn = {0730-0301},
	url = {https://doi.org/10.1145/3687914},
	doi = {10.1145/3687914},
	journal = {ACM Trans. Graph.},
	month = nov,
	articleno = {192},
	numpages = {19},
}

Acknowledgments

We thank Keenan Crane, Rohan Sawhney, and Nicole Feng for many helpful discussions, and the authors of Dai et al. [2024]; Wang et al. [2023]; Li et al. [2023] for help running experimental comparisons. This work was supported by NSF award 1900849, NSF Graduate Research Fellowship DGE2140739, an NVIDIA Graduate Fellowship for Miller, and a Sloan Research Fellowship for Gkioulekas.