April 22 - 26, 2024
Seattle, Washington
May 7 - 9, 2024 (Virtual)
2024 MRS Spring Meeting
EL08.12.01

Monocular Metasurface Camera for Multidimensional Light Field Sensing

When and Where

Apr 25, 2024
8:45am - 9:15am
Room 340/341, Level 3, Summit

Presenter(s)

Yuanmu Yang, Tsinghua University

Abstract

Conventional camera systems detect only light intensity, discarding important information about the target scene, including its depth, polarization, and spectrum. Capturing this multidimensional light-field information typically requires bulky and expensive instruments. A metasurface is composed of an array of optical antennas that can manipulate the amplitude, phase, polarization, and spectrum of light at the subwavelength scale. By replacing conventional diffractive or refractive elements with metasurfaces in imaging systems, one may be able to build optical sensors for high-performance multidimensional light sensing with low size, weight, power, and cost. Here, I will present our group’s recent effort to replace conventional camera lenses with metalenses. By leveraging the unique capability of metasurfaces to tailor the vectorial field of light, in combination with an advanced image retrieval algorithm, we aim to build compact camera systems that can capture multidimensional light-field information of a target scene in a single shot under ambient illumination conditions. Specifically, I will show how we use a polarization-multiplexed metalens for imaging through turbid water (Laser & Photonics Reviews 15, 2100097 (2021)). Recently, we further extended this concept to build a monocular camera that captures a 4D image, including the 2D all-in-focus intensity, depth, and polarization of a target scene, in a single shot (Nature Communications 14, 1035 (2023)). I will also present our effort to commercialize flat-optics-based passive monocular 3D cameras, which differ drastically from existing 3D imaging hardware such as LiDAR and binocular cameras. The miniaturized 3D camera module may be seamlessly integrated with smartphones, AR/VR headsets, and robots for a variety of applications, including face and gesture recognition, spatial localization and mapping, and collision avoidance.

Keywords

spectroscopy

Symposium Organizers

Yao-Wei Huang, National Yang Ming Chiao Tung University
Min Seok Jang, Korea Advanced Institute of Science and Technology
Ho Wai (Howard) Lee, University of California, Irvine
Pin Chieh Wu, National Cheng Kung University

Symposium Support

Bronze
APL Quantum
Kao Duen Technology Corporation
Nanophotonics Journal

Session Chairs

Matthew Sheldon
Pin Chieh Wu
