April 22 - 26, 2024
Seattle, Washington
May 7 - 9, 2024 (Virtual)
2024 MRS Spring Meeting & Exhibit
MT03.06.08

Advancements in Multimodal Emotion Sensing for Enhanced Human-Machine Interaction

When and Where

Apr 25, 2024
11:30am - 11:45am
Room 322, Level 3, Summit

Presenter(s)

Co-Author(s)

Damien Thuau3,Anand Babu1,Isabelle Dufour2,Dipankar Mandal1

Institute of Nano Science and Technology1,Université de Bordeaux2,Bordeaux INP3

Abstract

Achieving effective human-machine interaction hinges on machines recognizing and responding to human emotions. However, integrating emotions into robotics is a formidable challenge, primarily due to the intricacy and subjectivity of human emotional experience. Developing cognitive robots capable of simultaneously perceiving and responding to human emotions requires significant advances in emotion sensors driven by artificial intelligence (AI) and machine learning algorithms that can accurately interpret and respond to emotional cues. In this study, we introduce a novel approach to multimodal emotion sensing that harnesses non-invasive collection of physiological indicators, including heart-rate pulse, breathing patterns, and voice signatures. We utilize wearable printed organic piezoelectric sensors that offer rapid response times and high sensitivity across a wide spectrum of force waveforms, frequencies, and pulse-width modulations. The responses obtained from these wearable sensors are fed into a long short-term memory (LSTM) neural network, achieving remarkable classification accuracy in distinguishing various emotional states.

To further demonstrate the feasibility of transferring emotions to robotics, we have implemented Q-learning, demonstrating the effective training of a robotic model. By harnessing these multifaceted signals, our AI-driven emotion sensors aim to redefine the landscape of human-machine interaction. We envision a future in which machines can discern and dynamically adapt to the diverse spectrum of human emotional states, fostering more seamless and responsive interaction between humans and machines.
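The sensing pipeline described above (windowed physiological signals fed through an LSTM to an emotion class) can be sketched as a single forward pass. This is a minimal NumPy illustration, not the authors' trained model: the three sensor channels, hidden size, four emotion classes, and random weights are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step. W, U, b pack the input, forget,
    candidate, and output gates, stacked along the first axis."""
    n = h.shape[0]
    z = W @ x + U @ h + b                       # shape (4*n,)
    i = 1 / (1 + np.exp(-z[0:n]))               # input gate
    f = 1 / (1 + np.exp(-z[n:2*n]))             # forget gate
    g = np.tanh(z[2*n:3*n])                     # candidate cell state
    o = 1 / (1 + np.exp(-z[3*n:4*n]))           # output gate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# Hypothetical dimensions: 3 channels (pulse, breathing, voice
# envelope), hidden size 8, 4 emotion classes, 50 time steps.
D, H, K, T = 3, 8, 4, 50
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
Wout = rng.standard_normal((K, H)) * 0.1        # readout to classes

signal = rng.standard_normal((T, D))            # one windowed recording
h, c = np.zeros(H), np.zeros(H)
for x in signal:                                # unroll over time
    h, c = lstm_step(x, h, c, W, U, b)

logits = Wout @ h
probs = np.exp(logits) / np.exp(logits).sum()   # softmax over emotions
print(probs.shape)                              # (4,)
```

In practice the gate weights would be learned by backpropagation through time on labeled sensor recordings; the sketch only shows how a multichannel window collapses to a class distribution.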
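The Q-learning step can likewise be sketched on a toy problem. The emotion states, robot actions, and reward scheme below are hypothetical stand-ins for the paper's robotic model, chosen only to show the tabular update rule; none of these names come from the study.

```python
import random

random.seed(0)

# Illustrative stand-in: states are detected emotions, actions are
# candidate robot responses, reward is +1 when the response matches.
EMOTIONS = ["calm", "happy", "sad", "stressed"]
ACTIONS = ["idle", "cheer", "console", "soothe"]
BEST = {"calm": "idle", "happy": "cheer",
        "sad": "console", "stressed": "soothe"}

alpha, gamma, eps = 0.5, 0.9, 0.2               # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in EMOTIONS for a in ACTIONS}

for episode in range(5000):
    s = random.choice(EMOTIONS)                 # emotion from the sensor stack
    if random.random() < eps:                   # epsilon-greedy exploration
        a = random.choice(ACTIONS)
    else:
        a = max(ACTIONS, key=lambda act: Q[(s, act)])
    r = 1.0 if a == BEST[s] else 0.0            # did the robot respond well?
    s2 = random.choice(EMOTIONS)                # next perceived emotion
    # Standard Q-learning update: Q(s,a) += alpha * (TD target - Q(s,a))
    target = r + gamma * max(Q[(s2, a2)] for a2 in ACTIONS)
    Q[(s, a)] += alpha * (target - Q[(s, a)])

policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in EMOTIONS}
print(policy)
```

After training, the greedy policy reads the learned response for each emotion straight out of the Q-table; in the actual system, the state would come from the LSTM's emotion classification rather than a random draw.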

Symposium Organizers

Keith Butler, University College London
Kedar Hippalgaonkar, Nanyang Technological University
Shijing Sun, University of Washington
Jie Xu, Argonne National Laboratory

Symposium Support

Bronze
APL Machine Learning
SCIPRIOS GmbH

Session Chairs

Shijing Sun
Steven Torrisi
