Modern imaging systems consist of cascades of bulky spherical optics that form images with minimal aberrations. While these systems provide high-quality images, their performance comes at the cost of increased size and weight. One route to reducing a system's complexity is computational imaging, in which much of the aberration correction and functionality of the optics is shifted to post-processing in software. Alternatively, a designer can miniaturize the optics by replacing them with diffractive optical elements, which mimic the functionality of refractive systems in a far more compact form factor. Metasurfaces are an extreme example of such diffractive elements, in which quasiperiodic arrays of resonant subwavelength optical antennas impart spatially varying phase shifts on an incident wavefront. While computational imaging and metasurfaces are each promising avenues toward simpler optical systems, a synergistic combination of the two can further enhance system performance and enable advanced capabilities. In this talk, I will present a method that combines these two techniques to realize ultrathin optics for full-color and varifocal imaging across the entire visible spectrum, as well as high-precision depth sensing. I will also discuss the use of computational techniques to design metasurfaces with exotic behaviors that defy intuition-driven design, and to perform computation directly on incident light, with applications in optical information processing, sensing, and computing.