Kyun Kyu Kim1, Zhenan Bao1
Stanford University1
With the help of machine learning, electronic devices, including gloves and electronic skins, can track the movement of human hands and perform tasks such as object and gesture recognition. However, such devices can be bulky and often fail to conform to the curvature of the body. Furthermore, existing signal-processing models require massive amounts of labelled data for each individual task and user.

Here, we report a nanomesh receptor that is integrated with an unsupervised meta-learning scheme and can be used for data-efficient, user-independent recognition of different hand tasks. The nanomesh is based on biocompatible materials and can be printed directly onto the skin without an external substrate, which improves user comfort and avoids the mechanical constraints a substrate would impose. The system translates skin stretches into proprioceptive information, analogous to the way cutaneous receptors provide feedback for the hand. With this approach, complex proprioceptive signals can be decoded using a single sensor along the index finger, without the need for a multi-sensor array. Highly informative multi-joint proprioceptive information is thus captured as low-dimensional data, reducing the computational processing time of our learning network. Our learning framework does not require large amounts of data to be collected from each individual user. We develop a time-dependent contrastive learning algorithm that provides an awareness of temporal continuity and generates a motion feature space. Our system is pretrained on unlabelled signals collected from three different users to distinguish user-independent, task-specific sensor signal patterns from random hand motion. We show that the pretrained model can quickly adapt to different daily tasks, including motion command, keypad typing, two-handed keyboard typing and object recognition, using only a few personal hand signals. [1]

Reference:
1. K. K. Kim, Z. Bao*, Nature Electronics, in press.
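The time-dependent contrastive learning idea described above, in which temporally adjacent sensor windows are treated as positive pairs so the feature space respects temporal continuity, can be illustrated with a minimal sketch. This is not the authors' implementation: the InfoNCE-style loss, the function name `time_contrastive_loss`, and the temperature value are illustrative assumptions.

```python
import numpy as np

def time_contrastive_loss(embeddings, temperature=0.1):
    """InfoNCE-style loss with temporal positives (illustrative sketch).

    embeddings: (T, D) array of embeddings of consecutive sensor-signal
    windows; the positive for window t is the next window t+1, and all
    other windows serve as negatives.
    """
    # L2-normalise so dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = (z @ z.T) / temperature          # (T, T) scaled similarity matrix
    T = len(z)
    losses = []
    for t in range(T - 1):
        logits = np.delete(sim[t], t)      # drop self-similarity
        # after deleting index t, the entry for window t+1 lands at index t
        pos = logits[t]
        losses.append(np.log(np.sum(np.exp(logits))) - pos)
    return float(np.mean(losses))
```

Minimising such a loss pulls embeddings of neighbouring windows together while pushing apart windows sampled from distant times, which is one standard way to build a motion feature space from unlabelled, continuously recorded signals.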