Peter Ballentine1, Eric Chang1, Matei Ciocarlie1, Ioannis Kymissis1
Columbia University1
The types of tasks robots are capable of are limited in part by the lack of an adequate robot "sense of touch." The human sense of touch takes in pressure, vibration, texture, curvature, and temperature via the skin's biological transduction modes and synthesizes detailed information about the contact. Combining these modalities on a signal-dense tactile robot finger to detect the stimuli that make up the human sense of touch remains an open problem. Currently, many state-of-the-art robotic tactile sensors are unimodal, relying on a single transduction mode (e.g., optical, capacitive, or resistive) to take in stimuli. We hypothesize that multimodal sensing will enable richer and more meaningful tactile feedback for robotic applications.

In this work, we will present a bimodal polyvinylidene fluoride (PVDF)-based sensor and its subsequent integration into a larger multimodal sensing system for a tactile robot finger. The PVDF-based sensor consists of a thin sheet of PVDF with metal photolithographically patterned on both sides, and supports both piezoelectric and projected-capacitance sensing modes for measuring relative movement (i.e., vibration) and proximity, respectively. We will integrate the PVDF sensor into a sensing system comprising 10 MEMS microphones, an accelerometer, and 8 temperature sensors on a PCB under PDMS; 9 capacitive force sensors suspended within the PDMS; and the bimodal PVDF sensor suspended in the PDMS above the other sensors. Within this setup, we plan to optimize the thickness and stiffness of the PDMS between each layer as they relate to signal transmission and sensor durability, as well as the chemical functionalization of the PDMS surface to improve its durability and surface texture.
We will then collect the overlapping signals from each sensor type and apply machine learning techniques to determine how best to use the sensor data for force regression, touch localization, texture classification, and radius-of-curvature detection. The eventual aim of the project is to incorporate the complete sensing system into a robotic finger and exploit its multimodality to enable more dexterous robotic manipulation.
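As a rough illustration of how overlapping multimodal signals might be combined before feeding a learning model, the sketch below shows early fusion: per-modality feature vectors are concatenated into a single input, with each modality down-weighted by its dimensionality so that the high-channel-count modalities (e.g., the 10 microphones) do not dominate a distance-based classifier. All names (`fuse_features`, `fit_centroids`) and the nearest-centroid texture classifier are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

def fuse_features(mic, accel, pvdf):
    """Early fusion: concatenate per-modality feature vectors.

    Each modality is scaled by 1/sqrt(dim) so that no single modality
    dominates Euclidean distances purely by having more channels.
    (Hypothetical scheme; the real system may use learned weighting.)
    """
    parts = [np.asarray(x, dtype=float) / np.sqrt(len(x))
             for x in (mic, accel, pvdf)]
    return np.concatenate(parts)

def fit_centroids(X, y):
    """Compute one mean feature vector per texture class label."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, x):
    """Assign x to the class with the nearest centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
```

In practice the per-modality features would be statistics extracted from the raw streams (e.g., microphone band energies, PVDF vibration spectra), and the simple centroid classifier would be replaced by the regression and classification models mentioned above; the fusion step, however, looks the same.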