Table 7 Assessment of the image detection results of the five transfer learning algorithms and IFDAFHTL.
Accuracy (%)

Experimental Images | MTSL-DRDR | TATD | SRW | DTR | HDARMST | IFDAFHTL |
---|---|---|---|---|---|---|
Notebook Computer | 91.27 | 90.85 | 91.79 | 92.60 | 93.22 | 94.21 |
Bicycle | 94.86 | 93.57 | 91.33 | 92.80 | 93.03 | 96.05 |
Zebra | 93.57 | 91.37 | 92.57 | 91.78 | 92.76 | 93.72 |
Flower | 92.49 | 91.05 | 92.42 | 91.61 | 93.18 | 95.48 |

Precision (%)

Experimental Images | MTSL-DRDR | TATD | SRW | DTR | HDARMST | IFDAFHTL |
---|---|---|---|---|---|---|
Notebook Computer | 90.85 | 91.54 | 91.61 | 91.74 | 92.08 | 92.54 |
Bicycle | 91.15 | 92.06 | 90.21 | 92.07 | 91.76 | 93.07 |
Zebra | 90.37 | 91.31 | 91.64 | 90.46 | 92.39 | 91.23 |
Flower | 91.60 | 90.90 | 90.32 | 91.26 | 91.69 | 92.19 |

Recall (%)

Experimental Images | MTSL-DRDR | TATD | SRW | DTR | HDARMST | IFDAFHTL |
---|---|---|---|---|---|---|
Notebook Computer | 81.57 | 84.38 | 86.49 | 87.22 | 86.72 | 89.12 |
Bicycle | 80.44 | 85.56 | 83.55 | 86.44 | 86.01 | 87.39 |
Zebra | 83.29 | 83.04 | 84.21 | 83.25 | 85.47 | 86.80 |
Flower | 82.07 | 85.73 | 86.36 | 84.93 | 86.03 | 88.47 |

SSIM (range [0, 1])

Experimental Images | MTSL-DRDR | TATD | SRW | DTR | HDARMST | IFDAFHTL |
---|---|---|---|---|---|---|
Notebook Computer | 0.82 | 0.83 | 0.83 | 0.86 | 0.87 | 0.89 |
Bicycle | 0.80 | 0.85 | 0.86 | 0.85 | 0.90 | 0.92 |
Zebra | 0.84 | 0.88 | 0.81 | 0.87 | 0.84 | 0.90 |
Flower | 0.85 | 0.84 | 0.82 | 0.81 | 0.82 | 0.87 |

PSNR (dB; effective value ≥ 40)

Experimental Images | MTSL-DRDR | TATD | SRW | DTR | HDARMST | IFDAFHTL |
---|---|---|---|---|---|---|
Notebook Computer | 40.11 | 40.39 | 41.68 | 41.35 | 41.62 | 42.33 |
Bicycle | 39.84 | 41.51 | 41.39 | 40.72 | 40.91 | 41.97 |
Zebra | 38.97 | 41.34 | 40.26 | 41.69 | 41.55 | 43.06 |
Flower | 39.26 | 40.87 | 42.08 | 40.31 | 41.35 | 42.79 |
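
For reference, the five metrics reported in Table 7 can be computed with standard tooling. The sketch below is not the paper's evaluation pipeline; it only illustrates, under assumed inputs, how accuracy, precision, recall, SSIM, and PSNR could be obtained for one image category. The arrays `y_true`, `y_pred`, `reference_img`, and `detected_img` are hypothetical placeholders standing in for detection labels and image outputs.

```python
# Minimal sketch of the reported metrics; input data are hypothetical placeholders.
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

# Hypothetical binary detection labels (1 = object detected) for a small batch.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])
y_pred = np.array([1, 0, 1, 0, 0, 1, 0, 1, 1, 1])

accuracy = accuracy_score(y_true, y_pred) * 100    # Accuracy (%)
precision = precision_score(y_true, y_pred) * 100  # Precision (%)
recall = recall_score(y_true, y_pred) * 100        # Recall (%)

# Hypothetical 8-bit grayscale images: a reference image and a slightly perturbed output.
rng = np.random.default_rng(0)
reference_img = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
detected_img = np.clip(
    reference_img.astype(int) + rng.integers(-3, 4, size=(128, 128)), 0, 255
).astype(np.uint8)

ssim = structural_similarity(reference_img, detected_img, data_range=255)    # value in [0, 1]
psnr = peak_signal_noise_ratio(reference_img, detected_img, data_range=255)  # in dB; >= 40 dB counted as effective

print(f"Accuracy {accuracy:.2f}%  Precision {precision:.2f}%  Recall {recall:.2f}%")
print(f"SSIM {ssim:.2f}  PSNR {psnr:.2f} dB")
```

In this illustration the per-image values would be averaged over the test samples of each category (Notebook Computer, Bicycle, Zebra, Flower) to obtain entries comparable to those in Table 7.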