April 22 - 26, 2024
Seattle, Washington
May 7 - 9, 2024 (Virtual)
2024 MRS Spring Meeting
SB03.10.02

Artificial Thermomechanical Receptor for In-Sensor Multimodal Fusion

When and Where

Apr 25, 2024
11:00am - 11:15am
Room 436, Level 4, Summit

Presenter(s)

Co-Author(s)

Junhyuk Bang1, Kyun Kyu Kim2, Seung Hwan Ko1

Seoul National University1, Stanford University2

Abstract

As wearable devices evolve in complexity, demand for advanced multimodal sensing technologies is surging. The acquisition of thermomechanical information is especially essential, as it enables human-like perception, sensor calibration, and hazard detection. While previous studies have succeeded in selectively capturing temperature and mechanical signals, these methods typically required an individual measuring unit for each target stimulus, which increases structural complexity and necessitates extra computational processing to match the signals.

To address these limitations of conventional multimodal sensors, we introduce a neuromorphic sensor for in-sensor multimodal fusion. The design is inspired by the sensory nervous system's efficient computing architecture, which represents multimodal information by interleaving signals across time. To replicate this neural computing architecture, we use a stretchable memristive nanowire network (Ag@Cu₂O core-shell nanowires) as the sensing material. This random network, with numerous memristive junctions in a metal-metal oxide-metal configuration, undergoes a binary phase transition, governed by the material and network geometry, during memristive switching. In the high-resistance state, the conduction pathway runs primarily through the dense network of Cu₂O shells; in the low-resistance state, it runs predominantly through a sparse network of conductive filaments. This unique feature enables active alteration of the sensing capability, allowing thermal and mechanical signals to be selectively extracted through a single resistance measurement unit. Furthermore, by carefully designing the measurement voltage pulse scheme, thermomechanical information is integrated into a single interleaved response profile, enabling spatiotemporal synchronization of the thermal and mechanical information without any external computing unit.

Owing to its design simplicity, our neuromorphic multimodal sensor is far more miniaturized than previous devices, achieving a sensing channel size under 0.1 mm² and a device thickness below 40 μm. This advancement not only improves conformability on curved surfaces but also ensures a rapid response to external stimuli. To demonstrate the practical realization of human-like perception, we integrated the sensor with a deep neural network and recognized random objects. In-sensor multimodal fusion, which captures complex physical characteristics, achieves superior recognition accuracy compared with datasets acquired from a single modality (thermal or mechanical information alone). These findings open new opportunities for wearable electronics with improved wearability and functionality and pave the way toward intelligent multimodal sensor systems.
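To make the interleaved readout concrete, the sketch below simulates it in Python. It is a minimal illustration under stated assumptions, not the authors' implementation: it assumes the network is toggled between its high-resistance state (HRS, Cu₂O-shell-dominated and temperature-sensitive) and low-resistance state (LRS, filament-dominated and strain-sensitive) by programming pulses between reads, and every waveform, baseline resistance, thermal coefficient, and gauge factor is a hypothetical placeholder.

```python
import numpy as np

# Sketch of time-division readout of a single memristive channel.
# Even-indexed read slots follow a RESET pulse (HRS -> thermally
# dominant response); odd-indexed slots follow a SET pulse (LRS ->
# mechanically dominant response). Both modalities end up interleaved
# in one resistance trace read by a single measurement unit.
rng = np.random.default_rng(0)
n_slots = 200
temperature = 25 + 5 * np.sin(np.linspace(0, 4 * np.pi, n_slots))  # degC
strain = 0.02 * (1 + np.sin(np.linspace(0, 2 * np.pi, n_slots)))   # unitless

R_HRS0, alpha_T = 1e6, 0.01  # placeholder HRS baseline and thermal coefficient
R_LRS0, gauge = 1e3, 50.0    # placeholder LRS baseline and strain gauge factor

trace = np.empty(n_slots)
for k in range(n_slots):
    if k % 2 == 0:  # HRS slot: semiconducting Cu2O shells dominate
        trace[k] = R_HRS0 * (1 - alpha_T * (temperature[k] - 25))
    else:           # LRS slot: conductive filaments dominate
        trace[k] = R_LRS0 * (1 + gauge * strain[k])
trace *= 1 + 0.002 * rng.standard_normal(n_slots)  # measurement noise

# De-interleave the single response profile into the two modalities.
thermal_channel = trace[0::2]
mechanical_channel = trace[1::2]
T_est = 25 + (1 - thermal_channel / R_HRS0) / alpha_T
eps_est = (mechanical_channel / R_LRS0 - 1) / gauge
print(f"recovered T range: {T_est.min():.1f} to {T_est.max():.1f} degC")
print(f"recovered strain range: {eps_est.min():.3f} to {eps_est.max():.3f}")
```

The key point of the sketch is that a single resistance trace carries both modalities, so de-interleaving by slot index recovers temperature and strain signals that are already time-synchronized, with no external signal-matching step.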
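The recognition comparison can be sketched in the same spirit. The snippet below runs on synthetic data, not the authors' dataset, and uses a small scikit-learn MLP as a stand-in for their deep neural network; the per-class (temperature, stiffness) signatures are hypothetical, chosen only to show why fused thermomechanical features can separate object classes that either modality alone cannot.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_per_class, n_classes = 200, 3
# Hypothetical object signatures: classes 0 and 1 share a temperature,
# classes 1 and 2 share a stiffness, so only the fused pair separates all.
centers_T = np.array([25.0, 25.0, 40.0])  # degC
centers_E = np.array([0.5, 2.0, 2.0])     # arbitrary stiffness units

X, y = [], []
for c in range(n_classes):
    T = centers_T[c] + 4.0 * rng.standard_normal(n_per_class)
    E = centers_E[c] + 0.4 * rng.standard_normal(n_per_class)
    X.append(np.column_stack([T, E]))
    y.append(np.full(n_per_class, c))
X, y = np.vstack(X), np.concatenate(y)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
for name, cols in [("thermal only", [0]), ("mechanical only", [1]),
                   ("fused", [0, 1])]:
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(Xtr[:, cols], ytr)
    print(f"{name:16s} accuracy: {clf.score(Xte[:, cols], yte):.2f}")
```

In this toy setup, thermal features alone confuse the classes sharing a temperature signature and mechanical features alone confuse those sharing a stiffness signature, while the fused features separate all three; the abstract's accuracy comparison follows the same logic on real sensor data.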

Symposium Organizers

Dimitra Georgiadou, University of Southampton
Paschalis Gkoupidenis, Max Planck Institute
Francesca Santoro, Forschungszentrum Jülich/RWTH Aachen University
Yoeri van de Burgt, Technische Universiteit Eindhoven

Session Chairs

Paschalis Gkoupidenis
Jonathan Rivnay
