Dec 4, 2024
8:00pm - 10:00pm
Hynes, Level 1, Hall A
Suyeon Kang1, Sangmin Lee1, Wanjun Park1
Hanyang University1
We present a wearable system that emulates the tactile perception of the human hand by distinguishing the shapes of interacting objects and patterns of finger motion. The system is built from ten simple strain sensors, each a carbon-elastomer composite. The strain sensor shows good performance, with moderate stretchability, durability, and repeatability, as well as robustness to the operating environment owing to the simplicity of the sensor architecture. Despite output variation caused by the inherent mechanical characteristics of polymers, the system can distinguish the shapes and sizes of objects and hand gestures from the data of the ten sensors by applying machine learning. A one-dimensional convolutional neural network (1D-CNN) was used to analyze the sensor data and predict objects or motions. After preprocessing with outlier elimination, the sensor datasets were used to train the network. As a result, we verify that the proposed system, built from an assembly of a single type of strain sensor, successfully predicts the shapes and sizes of test objects and human finger motions with high accuracy.
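To illustrate the classification stage described above, the following is a minimal sketch (not the authors' code) of a 1D-CNN forward pass over 10-channel strain-sensor time series. The layer sizes, sequence length, and number of classes are illustrative assumptions; a real implementation would also include the training loop and the outlier-elimination preprocessing.

```python
# Sketch of a 1D-CNN forward pass for classifying time series from
# ten strain sensors. All layer sizes and the class count are assumed
# for illustration; they are not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    """Valid 1D convolution: x (C_in, T), w (C_out, C_in, K), b (C_out,)."""
    c_out, c_in, k = w.shape
    t_out = x.shape[1] - k + 1
    y = np.empty((c_out, t_out))
    for t in range(t_out):
        y[:, t] = np.tensordot(w, x[:, t:t + k], axes=([1, 2], [0, 1])) + b
    return y

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# One sample: 10 sensor channels, 100 time steps (assumed length).
x = rng.standard_normal((10, 100))

# Conv layer: 16 filters of width 5 spanning all 10 channels (assumed).
w1 = rng.standard_normal((16, 10, 5)) * 0.1
b1 = np.zeros(16)

# Dense layer: 16 pooled features -> 5 object/gesture classes (assumed).
w2 = rng.standard_normal((5, 16)) * 0.1
b2 = np.zeros(5)

h = relu(conv1d(x, w1, b1))        # (16, 96) feature map
pooled = h.mean(axis=1)            # global average pooling -> (16,)
probs = softmax(w2 @ pooled + b2)  # per-class probabilities
```

The convolution over the time axis lets the network pick up motion patterns regardless of exactly when they occur in the recording, which is why 1D-CNNs are a common choice for this kind of multi-channel sensor data.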