Megahertz light steering without moving parts

Adithya Pediredla, Srinivasa Narasimhan, Maysamreza Chamanzar, Ioannis Gkioulekas

CVPR 2023

(a) Light steering devices in LiDAR systems and laser projectors have moving mechanical components, limiting them to kHz scanning rates. (b) Our technology uses the acousto-optic effect to enable MHz light steering without moving parts. The insets show that, before galvo mirrors could scan even a few points, our prototype scanned a thousand points to project the letter "A" on the wall.


We introduce a light steering technology that operates at megahertz frequencies, has no moving parts, and costs less than a hundred dollars. Our technology can benefit many projector and imaging systems that critically rely on high-speed, reliable, low-cost, and wavelength-independent light steering, including laser scanning projectors, LiDAR sensors, and fluorescence microscopes. Our technology uses ultrasound waves to generate a spatiotemporally-varying refractive index field inside a compressible medium, such as water, turning the medium into a dynamic traveling lens. By controlling the electrical input of the ultrasound transducers that generate the waves, we can change the lens, and thus steer light, at the speed of sound (1.5 km/s in water). We build a physical prototype of this technology, use it to realize different scanning techniques at megahertz rates (three orders of magnitude faster than commercial alternatives such as galvo mirror scanners), and demonstrate proof-of-concept projector and LiDAR applications. To encourage further innovation towards this new technology, we derive theory for its fundamental limits and develop a physically accurate simulator for virtual design. Our technology offers a promising solution for achieving high-speed and low-cost light steering in a variety of applications.

Key idea

Our technology uses the acousto-optic effect to turn a transparent medium, such as water, into a programmable optic that steers an incident light beam. Sound is a pressure wave that travels inside a medium by compressing and rarefying it, spatiotemporally changing the medium density. In turn, this changes the refractive index of the medium, which is proportional to the density. We design the pressure profile of the sound wave so that, at any time instant, the spatially-varying refractive index makes the medium behave as a periodic set of virtual gradient-index (GRIN) lenses, each with an aperture equal to the sound wavelength. The GRIN lenses bend light beams incident on the medium, with the GRIN profile determining the beam trajectory. These lenses travel at the speed of sound (1.5 km/s in water) and are reconfigurable at MHz frequencies, allowing us to steer light faster than mechanical devices. To enable flexible steering patterns, we combine this optic with a pulsed laser with a programmable pulse rate. By synchronizing the laser source with the sound waveform, and modulating the phase of the sound waveform, we control both the speed of beam steering and the location of the beam.
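As a back-of-the-envelope illustration of this idea, the sketch below models the traveling sinusoidal refractive-index field, estimates the focal length of one virtual GRIN lens using the standard parabolic-profile GRIN formula, and shows how delaying the laser pulse shifts the lens (and hence the beam) by the speed of sound times the delay. All numerical parameters (index modulation, interaction length, pulse delay) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative parameters (assumed, not from the paper).
n0 = 1.333        # refractive index of water
dn = 1e-4         # peak index modulation induced by the ultrasound pressure wave
f_us = 1e6        # ultrasound frequency: 1 MHz
v_s = 1500.0      # speed of sound in water (m/s)
wavelength = v_s / f_us  # acoustic wavelength = aperture of each virtual lens (1.5 mm)

def index_field(x, t):
    """Traveling sinusoidal refractive-index field n(x, t).

    Each spatial period acts as a virtual GRIN lens whose aperture equals
    the acoustic wavelength and whose center moves at the speed of sound.
    """
    return n0 + dn * np.cos(2 * np.pi * (x - v_s * t) / wavelength)

# Near each index maximum, the cosine is approximately parabolic:
#   n(x) ≈ (n0 + dn) - 0.5 * dn * k**2 * x**2,  with k = 2*pi/wavelength.
# Matching the standard parabolic GRIN profile n(x) = n_max * (1 - 0.5*g**2*x**2)
# gives the gradient constant g ≈ k * sqrt(dn / n0).
k = 2 * np.pi / wavelength
g = k * np.sqrt(dn / n0)

# Standard thin-GRIN-rod focal length for an interaction length L:
#   f ≈ 1 / (n0 * g * sin(g * L))
L = 25e-3  # interaction length along the optical axis (assumed 25 mm)
f = 1.0 / (n0 * g * np.sin(g * L))
print(f"lens aperture = {wavelength * 1e3:.2f} mm, focal length ~= {f * 1e3:.1f} mm")

# Steering by synchronization: delaying the laser pulse by dt catches the
# traveling lens displaced by v_s * dt, shifting where the beam lands.
dt = 100e-9  # 100 ns pulse delay (assumed)
print(f"lens displacement per 100 ns delay: {v_s * dt * 1e6:.0f} um")
```

Because the lens pattern repeats every acoustic period (1 µs at 1 MHz), delays only need to span one period to address any position within a lens aperture, which is what makes MHz-rate addressing possible in this scheme.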

Hardware implementation

We show (a) the schematic and (b) the prototype we built for proof-of-concept demonstrations and comparisons with a galvo mirror system. The laser beam is first diverged, then focused by the ultrasonically-sculpted gradient-index lens, and finally passes through the galvo mirrors onto the scene. Depending on the experiment, we steer the beam with either the ultrasonically-sculpted lens or the galvo mirrors, but never both. Light reflected from the object returns along the same path to the sensor. The SPAD sensor, colocated with the laser, has no optics in front of it other than the ultrasonically-sculpted lens. This setup lets us compare the scanning speed of our system against galvo mirrors while keeping the aperture the same.



Paper: Our paper and supplement are available on CVF open access, and locally (paper, supplement).

Poster: Our poster is available here.

Presentation: Our presentation slides are available here.

Code: Our code is available on GitHub.


@InProceedings{Pediredla_2023_CVPR,
	author    = {Pediredla, Adithya and Narasimhan, Srinivasa G. and Chamanzar, Maysamreza and Gkioulekas, Ioannis},
	title     = {Megahertz Light Steering Without Moving Parts},
	booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
	month     = {June},
	year      = {2023},
	pages     = {1-12}
}


We thank Hossein Baktash and Lloyd Lobo for help with the hardware prototype, and Ande Nascimento and Amy Lee for help with the physics-based renderer. This work was supported by NSF awards 1730147, 1900821, 1900849, 1935849, gifts from AWS Cloud Credits for Research and the Sybiel Berkman Foundation, and a Sloan Research Fellowship for Ioannis Gkioulekas.