Abstract
We introduce a light steering technology that operates at megahertz frequencies, has no moving parts, and costs less than a hundred dollars. Our technology can benefit many projector and imaging systems that critically rely on high-speed, reliable, low-cost, and wavelength-independent light steering, including laser scanning projectors, LiDAR sensors, and fluorescence microscopes. Our technology uses ultrasound waves to generate a spatiotemporally varying refractive index field inside a compressible medium, such as water, turning the medium into a dynamic traveling lens. By controlling the electrical input of the ultrasound transducers that generate the waves, we can change the lens, and thus steer light, at the speed of sound (1.5 km/s in water). We build a physical prototype of this technology, use it to realize different scanning techniques at megahertz rates (three orders of magnitude faster than commercial alternatives such as galvo mirror scanners), and demonstrate proof-of-concept projector and LiDAR applications. To encourage further innovation towards this new technology, we derive theory for its fundamental limits and develop a physically accurate simulator for virtual design. Our technology offers a promising solution for achieving high-speed and low-cost light steering in a variety of applications.
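The mechanism above can be sketched numerically: an ultrasound wave traveling at the speed of sound modulates the refractive index of water sinusoidally, and the resulting index gradient deflects rays passing through the medium. The minimal paraxial estimate below is an illustration only, not the authors' simulator; the modulation amplitude `dn` and interaction length `L` are hypothetical values chosen for the sketch.

```python
import numpy as np

# Illustrative sketch (not the paper's physically accurate simulator):
# an ultrasound wave of frequency f traveling at the speed of sound c in
# water modulates the refractive index as n(x, t) = n0 + dn*sin(k*x - w*t),
# turning the medium into a traveling gradient-index lens.
n0 = 1.33     # baseline refractive index of water
dn = 1e-4     # modulation amplitude (hypothetical value)
c = 1500.0    # speed of sound in water, m/s (1.5 km/s, as in the abstract)
f = 1e6       # ultrasound drive frequency, Hz (megahertz regime)
lam = c / f   # acoustic wavelength, m
k = 2.0 * np.pi / lam

# Paraxial estimate: a ray crossing a slab of thickness L is deflected by
# theta ~ (L / n0) * dn/dx; the steepest index gradient gives the largest
# steering angle.
L = 0.01                       # interaction length, m (assumed)
max_grad = dn * k              # max of d/dx [dn * sin(k*x)]
theta_max = L * max_grad / n0  # radians

print(f"acoustic wavelength: {lam * 1e3:.2f} mm")
print(f"max deflection angle: {np.degrees(theta_max):.3f} deg")
```

Because the lens pattern travels with the acoustic wave, the steering pattern refreshes at the drive frequency `f`, which is what enables megahertz-rate scanning without any moving parts.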
Key idea
Hardware implementation
Video
Resources
Paper: Our paper and supplement are available on CVF Open Access and locally (paper, supplement).
Poster: Our poster is available here.
Presentation: Our presentation slides are available here.
Code: Our code is available on GitHub.
Citation
@InProceedings{Pediredla:2023:Megahertz,
    author    = {Pediredla, Adithya and Narasimhan, Srinivasa G. and Chamanzar, Maysamreza and Gkioulekas, Ioannis},
    title     = {Megahertz Light Steering Without Moving Parts},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {1-12}
}
Acknowledgments
We thank Hossein Baktash and Lloyd Lobo for help with the hardware prototype, and Andre Nascimento and Amy Lee for help with the physics-based renderer. This work was supported by NSF awards 1730147, 1900821, 1900849, 1935849, gifts from AWS Cloud Credits for Research and the Sybiel Berkman Foundation, and a Sloan Research Fellowship for Ioannis Gkioulekas.