Fig. 7: Identification and delivery of packaged objects in an HMI application.
From: Soft robotic hand with tactile palm-finger coordination

a Experimental setup of the human–machine interface (HMI) experiment. Five object clusters are packaged in blind bags. Tactile images are obtained by commanding the TacPalm SoftHand to grasp the blind bags; these images are then used to construct a dataset for machine learning. The trained model is integrated with the robotic hand–arm system to identify the objects in grasped blind bags during the experiment. Commands from the participant are input via a smart pad. b Time-lapse images of the HMI experiment. The identification results (classified object and its probability) are shown on the computer display. c The captured tactile images of the blind bags and the corresponding identification results in the HMI experiment. d The classification confusion matrix of the five objects based on the experimental data. e The probability of the classified object in each test.
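The quantities reported in panels c–e can be illustrated with a minimal sketch. This is not the authors' pipeline: the class names are hypothetical, and the model producing the raw scores is assumed. It shows only how classifier logits become the per-object probability displayed in each test (panel e) and how predictions accumulate into the 5×5 confusion matrix (panel d).

```python
import math

# Hypothetical names for the five object clusters in the blind bags.
CLASSES = ["cluster_1", "cluster_2", "cluster_3", "cluster_4", "cluster_5"]

def softmax(logits):
    """Convert raw model scores into probabilities that sum to 1."""
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return (predicted class index, probability of that class)."""
    probs = softmax(logits)
    idx = max(range(len(probs)), key=probs.__getitem__)
    return idx, probs[idx]

def confusion_matrix(true_labels, pred_labels, n_classes=5):
    """Rows index the true class, columns the predicted class."""
    cm = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(true_labels, pred_labels):
        cm[t][p] += 1
    return cm
```

In this sketch, `classify` yields both values shown on the display in panel b (the classified object and its probability), and repeated trials fill `confusion_matrix` as in panel d.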