Table 5 Performance vs. activation function (3-layer, 50% dropout).

From: Embedding-driven dual-branch approach for accurate breast tumor cellularity classification

| Activation | Accuracy (%) | Sensitivity (%) | F1 (%) | Average (%) |
|---|---|---|---|---|
| ReLU | 96.43 | 96.43 | 96.43 | 96.98 |
| Sigmoid | 92.86 | 92.86 | 92.86 | 93.21 |
| Tanh | 95.00 | 95.00 | 95.00 | 95.42 |
| LeakyReLU (0.01) | 97.86 | 97.86 | 97.86 | 98.15 |
| Swish | 97.14 | 97.14 | 97.14 | 97.52 |
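For reference, the activation functions compared above have standard textbook definitions; the sketch below is not taken from the paper, it only restates those definitions (the 0.01 slope for LeakyReLU matches the value shown in the table):

```python
import math

def relu(x: float) -> float:
    # Zero for negative inputs, identity for positive inputs.
    return max(0.0, x)

def sigmoid(x: float) -> float:
    # Squashes input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x: float) -> float:
    # Squashes input into (-1, 1).
    return math.tanh(x)

def leaky_relu(x: float, alpha: float = 0.01) -> float:
    # Like ReLU, but keeps a small slope alpha for negative inputs,
    # avoiding fully dead units.
    return x if x > 0 else alpha * x

def swish(x: float) -> float:
    # Smooth, non-monotonic: x * sigmoid(x).
    return x * sigmoid(x)
```

The small negative slope of LeakyReLU (0.01 here) keeps gradients flowing for negative pre-activations, which is one common explanation for its edge over plain ReLU in comparisons like the one above.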