Table 3 Comparative experiments of attention modules

From: An algorithm based on multi-branch feature cross fusion for archaeological illustration of murals

| Module      | MAE        | Precision  | Fβ-Score   | PSNR        |
|-------------|------------|------------|------------|-------------|
| w/o         | 0.1105     | 0.4641     | 0.461      | 11.8624     |
| SE          | 0.1112     | 0.4645     | 0.4614     | 11.8245     |
| ECA         | 0.1117     | 0.4635     | 0.4594     | 11.822      |
| CBAM        | 0.1127     | 0.4587     | 0.4562     | 11.8171     |
| DA          | 0.1118     | 0.4544     | 0.4493     | 11.7191     |
| SEconvBlock | **0.1098** | **0.4690** | **0.4648** | **11.8851** |

  1. Bold numbers indicate the results of the model proposed in this paper