December 1 - 6, 2024
Boston, Massachusetts
2024 MRS Fall Meeting & Exhibit
SB10.06.07

TouchpadAnyWear—Textile-Integrated Tactile Sensors for Multimodal High Spatial-Resolution Touch Inputs with Motion Artifacts Tolerance

When and Where

Dec 4, 2024
4:30pm - 4:45pm
Hynes, Level 3, Room 302

Presenter(s)

Co-Author(s)

Junyi Zhao1,2, Chuan Wang1

Washington University in St. Louis1, Facebook Reality Labs2

Abstract

Touch sensing is essential in human-computer interaction (HCI), the Internet of Things (IoT), and augmented and virtual reality (AR/VR). The rapid proliferation of wearable devices, such as smart glasses and head-mounted displays, underscores the need for mobile touch input interfaces. In this study, we introduce TouchpadAnyWear, a novel series of textile-integrated force sensors capable of multimodal touch input for detecting micro-gestures, two-dimensional (2D) continuous input, and force-sensitive strokes. This thin, conformal device (less than 1 mm thick) provides high-spatial-resolution sensing and exhibits strong motion-artifact tolerance owing to its unique capacitive sensor architecture.

Compared with conventional gesture-sensing techniques (e.g., IMU, magnetic, optical, and camera-based), the e-textile tactile sensor offers several advantages, including reduced bulk, low power consumption, light weight, skin conformability, and robustness against environmental interference. The sensor is composed of a knitted-fabric dielectric core sandwiched between printed silver electrodes and shielded by conductive fabrics on both sides. Featuring a high-density sensor pixel array (up to 49 pixels/cm²), TouchpadAnyWear can detect the size, location, and pressure of touch inputs with sub-millimeter spatial resolution and accommodate a wide range of force inputs (0.05 N to 20 N). Inspired by mechanoreceptors (specifically the SA-1 type found in human skin), micro-structured silicone domes are patterned onto the knitted fabrics to locally stiffen the sensing units, thereby reducing motion artifacts during deformation. These polymer domes also provide passive tactile feedback, enabling users to locate the active sensing pixels eyes-free. The silicone domes and silver electrodes were patterned and fabricated via programmable direct ink writing (DIW).

Two distinct configurations have been developed: an 8-by-8 grid flexible sensor, demonstrated as a miniature high-resolution touchpad that can be attached to everyday objects; and a T-shaped wearable sensor designed as a finger sleeve for thumb-to-finger micro-gesture input. A coin-sized flexible printed circuit board (FPCB) has been designed and fabricated for direct connection to the sensor, enabling portable multichannel data acquisition and communication. Furthermore, fast and highly accurate gesture recognition (over 95%) has been achieved using machine-learning-assisted algorithms. User evaluations with 35 participants validated the effectiveness and usability of TouchpadAnyWear in everyday human-computer interaction tasks, such as tapping, swiping, multi-touch, 2D cursor control, text input, and 2D stroke-based gestures. This study also explores potential applications of TouchpadAnyWear in wearable smart devices, gaming, and AR/VR devices.
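The sub-pixel localization described above can be illustrated with a minimal sketch (not the authors' code): a capacitance-weighted centroid over one frame of an 8-by-8 sensor grid. The function name, the normalized-reading assumption, the detection threshold, and the 1 mm pixel pitch are all hypothetical choices for illustration only.

```python
import numpy as np

def locate_touch(cap_frame, threshold=0.1, pitch_mm=1.0):
    """Estimate touch position and a force proxy from one frame of an
    8x8 capacitance grid (readings assumed normalized baseline deltas).

    Returns (x_mm, y_mm, force_proxy), or None if no pixel is active.
    All parameter values here are illustrative assumptions.
    """
    # Suppress pixels below the noise threshold
    active = np.where(cap_frame > threshold, cap_frame, 0.0)
    total = active.sum()
    if total == 0:
        return None
    rows, cols = np.indices(cap_frame.shape)
    # Capacitance-weighted centroid yields sub-pixel (hence potentially
    # sub-millimeter) resolution from a discrete pixel array
    y_mm = (rows * active).sum() / total * pitch_mm
    x_mm = (cols * active).sum() / total * pitch_mm
    return x_mm, y_mm, total
```

For example, a touch spanning two adjacent pixels in one row is localized halfway between them, finer than the physical pixel pitch.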

Keywords

additive manufacturing | biomimetic

Symposium Organizers

Madhu Bhaskaran, RMIT University
Hyun-Joong Chung, University of Alberta
Ingrid Graz, Johannes Kepler University
Edwin Jager, Linköping University

Symposium Support

Bronze
Institute of Physics Publishing

Session Chairs

Ingrid Graz
Nils-Krister Persson
