Fig. 4
From: Enhanced hierarchical attention mechanism for mixed MIL in automatic Gleason grading and scoring

Overview of the hierarchical attention mechanism used to increase the weight of valid labels. (a) The hierarchical attention process of the two branches: SliAtt is the attention mechanism at the slide level, InsAtt is the attention mechanism at the instance level, and HAA is the multi-level hierarchical attention mechanism of the model. (b) The attention process of our method; the weight values represent the weight of \(x_i\) over all of the labels.
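The two-level weighting shown in the figure can be sketched as follows. This is a minimal NumPy illustration of attention pooling applied first at the instance level (the InsAtt role) and then at the slide level (the SliAtt role); the function names, scoring form \(a_i = v^\top \tanh(W x_i)\), and dimensions are assumptions for illustration, not the paper's exact architecture:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a score vector.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def attention_pool(feats, w, v):
    # feats: (n, D) features; scores a_i = v^T tanh(W x_i).
    scores = np.tanh(feats @ w.T) @ v        # (n,)
    alphas = softmax(scores)                 # attention weights, sum to 1
    pooled = alphas @ feats                  # weighted sum -> (D,)
    return pooled, alphas

rng = np.random.default_rng(0)
D, H = 8, 4  # feature and hidden sizes (illustrative)

# Instance-level attention (InsAtt analogue): pool instances within each bag.
bags = [rng.normal(size=(n, D)) for n in (5, 3, 7)]
w_ins, v_ins = rng.normal(size=(H, D)), rng.normal(size=H)
bag_reprs = np.stack([attention_pool(b, w_ins, v_ins)[0] for b in bags])

# Slide-level attention (SliAtt analogue): pool bag representations
# into a single slide representation, as in the hierarchical (HAA) stage.
w_sli, v_sli = rng.normal(size=(H, D)), rng.normal(size=H)
slide_repr, slide_alphas = attention_pool(bag_reprs, w_sli, v_sli)
```

The per-instance weights returned by `attention_pool` play the role of the weight values in panel (b): instances that score higher under the learned attention contribute more to the pooled representation.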