Fig. 5: Demonstration of closed-loop human-machine interfacing through gesture-based robotic control and haptic feedback reproduction.

a Real-time control of a robotic arm using hand gestures captured by the AMSG wristband and interpreted by a deep learning model. b Confusion matrix showing the classification accuracy for 10 distinct gestures, each evaluated on current-signal sequences from 200 test samples. c Principal component analysis of the 10 gestures, showing their cluster separation. d User interface displaying the current signals of gesture commands and the corresponding robotic motion outputs. e Illustration of a use-case scenario in which the temperature and pressure detected by the AMSG on the robotic hand are reproduced on the user's wristband. f Modulation of the temperature and vibration responses as a function of current magnitude. g Current signals of the AMSG as the robotic hand approaches or retreats from heat and cold sources, illustrated by optical and infrared images. h Current signals while gripping a paper cup, where variations in signal amplitude and tangent angle determine the appropriate grip-force level. i User interface displaying real-time haptic signal waveforms and haptic reproduction parameters.
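The per-gesture accuracy in panel b follows the standard confusion-matrix construction: rows index the true gesture label, columns the predicted label, and the diagonal fraction of each row gives that class's accuracy. A minimal sketch (the class count and labels below are a toy example, not the paper's 10-gesture data):

```python
def confusion_matrix(y_true, y_pred, n_classes):
    """Row = true gesture label, column = predicted label."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

def per_class_accuracy(m):
    """Diagonal count divided by row total for each gesture class."""
    return [row[i] / sum(row) if sum(row) else 0.0
            for i, row in enumerate(m)]

# Toy example with 3 gesture classes; panel b uses 10 classes with
# 200 current-signal test samples per class.
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 0, 1, 2, 2, 2]
m = confusion_matrix(y_true, y_pred, 3)
acc = per_class_accuracy(m)  # class 1 has one of two samples misclassified
```

In panel b, a strongly diagonal matrix over the 200-sample test sequences indicates that the deep learning model separates the 10 gestures reliably.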
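Panel h determines grip force from the amplitude and tangent angle of the current signal. One plausible reading is that the amplitude is the peak deviation from baseline and the tangent angle is the arctangent of the steepest rise; the exact feature definitions and thresholds below are assumptions for illustration, not the paper's algorithm:

```python
import math

def grip_features(signal, dt):
    """Extract hypothetical grip features from a current-signal trace:
    amplitude = peak deviation from the initial baseline,
    angle     = arctangent (in degrees) of the steepest rise."""
    baseline = signal[0]
    amplitude = max(abs(s - baseline) for s in signal)
    max_slope = max((signal[i + 1] - signal[i]) / dt
                    for i in range(len(signal) - 1))
    return amplitude, math.degrees(math.atan(max_slope))

def force_level(amplitude, angle, amp_thresh=1.0, angle_thresh=45.0):
    """Map the features to a coarse force command: a large, steep
    response suggests firm contact; otherwise grip gently.
    Thresholds are illustrative placeholders."""
    return "firm" if amplitude > amp_thresh and angle > angle_thresh else "gentle"

# A sharp, large current transient (e.g. firm contact with the cup wall)
features = grip_features([0.0, 0.5, 1.5, 1.4], dt=0.1)
command = force_level(*features)
```

In the closed-loop scenario of panels e–i, such a mapping would let the robotic hand grade its grip on a deformable object like a paper cup without crushing it.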