Fig. 7

Attention rollout matrix for UTD-MHAD. Across Transformer layers, attention becomes increasingly concentrated on semantically meaningful segments, reflecting progressively refined temporal abstraction.
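For reference, the rollout matrix visualised above can be computed by composing per-layer attention maps while accounting for residual connections, as in the standard attention-rollout formulation (Abnar & Zuidema, 2020). The sketch below is a minimal NumPy illustration, assuming head-averaged `(T, T)` attention matrices per layer; the function name and inputs are illustrative, not taken from the paper's code.

```python
import numpy as np

def attention_rollout(attentions):
    """Compose per-layer attention maps into a single rollout matrix.

    attentions: list of (T, T) head-averaged attention matrices,
                ordered from the first to the last Transformer layer.
    Returns a (T, T) row-stochastic matrix tracing attention flow
    from the final layer back to the input time steps.
    """
    T = attentions[0].shape[0]
    rollout = np.eye(T)
    for A in attentions:
        # Add the identity to model the residual connection,
        # then renormalise so each row remains a distribution.
        A_res = 0.5 * (A + np.eye(T))
        A_res = A_res / A_res.sum(axis=-1, keepdims=True)
        # Compose with the rollout of all earlier layers.
        rollout = A_res @ rollout
    return rollout
```

Because each augmented matrix is row-stochastic, the resulting rollout is also row-stochastic, so each row can be read directly as a distribution over input time steps.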