Fig. 4: Structure and training results of hypergraph self-attention neural network.

A Workflow and detailed structure of the hypergraph attention network. B Structural details and processing pipeline of the HyperSA module. C Confusion matrix for the 15 behavior categories; the vertical axis shows the actual categories and the horizontal axis the predicted categories. D Attention weight matrix: distribution of attention weights between queries and keys in the HyperSA module. E Hyperedge initialization graph. Hyperedges are constructed by grouping joints within the same semantically defined body region (e.g., head, limbs). F Attention distribution map of the head region with the nose as the query joint. G Attention distribution map of the limb region with the back as the query joint. Red lines indicate inter-joint attention weights, with darker colors indicating higher weights.
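The hyperedge initialization described in panel E can be sketched as an incidence matrix that maps each skeletal joint to the semantic body region (hyperedge) containing it. The joint names and region groupings below are illustrative assumptions, not the paper's exact skeleton definition:

```python
import numpy as np

# Hypothetical 15-joint skeleton; names are assumptions for illustration.
JOINTS = ["nose", "left_ear", "right_ear", "neck", "back",
          "spine_mid", "left_hip", "right_hip",
          "left_front_paw", "right_front_paw",
          "left_hind_paw", "right_hind_paw",
          "tail_base", "tail_mid", "tail_tip"]

# Semantic body regions define hyperedges (cf. panel E): each hyperedge
# connects all joints in one region. The grouping here is an assumption.
REGIONS = {
    "head":  ["nose", "left_ear", "right_ear", "neck"],
    "torso": ["back", "spine_mid", "left_hip", "right_hip"],
    "limbs": ["left_front_paw", "right_front_paw",
              "left_hind_paw", "right_hind_paw"],
    "tail":  ["tail_base", "tail_mid", "tail_tip"],
}

def incidence_matrix(joints, regions):
    """Build the |V| x |E| hypergraph incidence matrix H,
    where H[v, e] = 1 iff joint v belongs to hyperedge e."""
    H = np.zeros((len(joints), len(regions)), dtype=np.float32)
    idx = {name: i for i, name in enumerate(joints)}
    for e, members in enumerate(regions.values()):
        for name in members:
            H[idx[name], e] = 1.0
    return H

H = incidence_matrix(JOINTS, REGIONS)
print(H.shape)  # (15, 4): 15 joints, 4 hyperedges
```

A matrix like H can then mask or bias the query-key attention scores in a hypergraph self-attention layer so that intra-region joint pairs (e.g., nose and ears in the head hyperedge, as in panel F) are emphasized.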