Table 2 Comparison results on the Flare7K++ real test dataset. Best results are highlighted in bold and second-best in italic. * denotes models with reduced parameters due to limited GPU memory. † indicates methods without released code, whose metrics are taken from the original paper and may therefore be incomplete. Note: the average inference time per image is 0.0825 s for SMFR-Net and 0.0412 s for SMFR-Net-L, measured on an NVIDIA TITAN RTX (24 GB) GPU.
| Training dataset | Method | PSNR \(\uparrow\) | SSIM \(\uparrow\) | LPIPS \(\downarrow\) | G-PSNR \(\uparrow\) | S-PSNR \(\uparrow\) | Params (M) | MACs (G) |
|---|---|---|---|---|---|---|---|---|
| - | Input | 22.561 | 0.856 | 0.0777 | 19.555 | 13.104 | - | - |
| Previous Synthesis Pipelines | FF-Former†\(^{18}\) | 27.350 | 0.901 | 0.0440 | - | - | - | - |
| | Sharma\(^{33}\) | 20.492 | 0.826 | 0.1115 | 17.790 | 12.685 | 22.365 | 285.12 |
| | Wu\(^{10}\) | 24.613 | 0.871 | 0.0598 | 21.772 | 16.728 | 34.526 | 261.901 |
| | Flare7K\(^{1}\) | 26.978 | 0.890 | 0.0466 | 23.507 | 21.563 | 20.429 | 159.643 |
| Flare7K++ | Zhou et al.\(^{34}\) | 25.184 | 0.872 | 0.0548 | 22.112 | 20.543 | 20.628 | 327.347 |
| | Restormer*\(^{17}\) | 27.597 | 0.897 | 0.0447 | 23.828 | 22.452 | 2.981 | 57.975 |
| | MPRNet*\(^{35}\) | 27.036 | 0.893 | 0.0481 | 23.490 | 22.267 | 3.642 | 567.187 |
| | U-net\(^{11}\) | 27.189 | 0.894 | 0.0452 | 23.527 | 22.647 | 34.527 | 261.953 |
| | NAFNet\(^{26}\) | 27.042 | 0.888 | 0.0556 | 24.098 | 22.459 | 67.788 | 252.314 |
| | Uformer\(^{16}\) | 27.633 | 0.894 | 0.0428 | 23.949 | 22.603 | 20.601 | 164.361 |
| | HINet\(^{36}\) | 27.548 | 0.892 | 0.0464 | 24.081 | 22.907 | 88.674 | 685.127 |
| | Kotp and Torki\(^{19}\) | 27.662 | 0.897 | 0.0422 | 23.987 | 22.847 | 129.306 | 271.419 |
| | SPDDNet\(^{37}\) | 28.033 | 0.903 | 0.0420 | 24.537 | 23.614 | 25.620 | 105.010 |
| | LPFSformer\(^{38}\) | *28.238* | *0.905* | 0.0422 | *24.793* | *23.876* | 13.733 | 525.442 |
| Flare7K++ FlareReal600 | SMFR-Net-L (ours) | 28.225 | **0.907** | *0.0403* | 24.760 | 23.832 | 2.152 | 31.228 |
| | SMFR-Net (ours) | **28.352** | **0.907** | **0.0384** | **24.841** | **23.941** | 7.981 | 103.888 |
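
G-PSNR and S-PSNR score the restoration only inside the glare and streak regions of each test image. The sketch below illustrates the underlying idea of a mask-restricted PSNR; the function name `masked_psnr` and the tensor shapes are illustrative assumptions, and the official Flare7K++ evaluation script should be used to reproduce the reported values.

```python
import torch

def masked_psnr(pred, target, mask, max_val=1.0, eps=1e-8):
    """PSNR computed only over the pixels selected by a binary mask.

    pred, target: (B, C, H, W) tensors with values in [0, max_val].
    mask: (B, 1, H, W) binary tensor marking the glare or streak region.
    """
    mask = mask.bool().expand_as(pred)           # broadcast the mask over channels
    mse = (pred - target)[mask].pow(2).mean()    # MSE restricted to the masked region
    mse = mse.clamp_min(eps)                     # guard against log of zero
    return 10.0 * torch.log10(max_val ** 2 / mse)
```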
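The inference times quoted in the table note are averages of the per-image forward pass on a single GPU. The following is a minimal sketch of such a measurement in PyTorch, assuming a generic `model` and an iterable of preprocessed test images; the warm-up count and CUDA synchronization are common timing conventions, not necessarily the exact procedure used for Table 2.

```python
import time
import torch

@torch.no_grad()
def average_inference_time(model, images, warmup=10):
    """Average per-image forward-pass time in seconds on a CUDA GPU.

    `model` is the restoration network and `images` an iterable of
    preprocessed (1, C, H, W) tensors; both names are placeholders.
    """
    model = model.eval().cuda()
    times = []
    for i, image in enumerate(images):
        image = image.cuda()
        torch.cuda.synchronize()        # flush pending GPU work before timing
        start = time.perf_counter()
        _ = model(image)
        torch.cuda.synchronize()        # ensure the forward pass has finished
        if i >= warmup:                 # discard warm-up iterations
            times.append(time.perf_counter() - start)
    return sum(times) / max(len(times), 1)
```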