Table 1 Comparison of combinations of batch normalization (BN) and dropout rates. BN alone achieved the highest accuracy.
Regularization layer | Kernel size | Activation function | Optimizer | Number of epochs | Accuracy |
---|---|---|---|---|---|
BN | 2 × 2 | ReLU | Adam | 46 | 93.7% |
BN and Dropout (0.1) | 2 × 2 | ReLU | Adam | 28 | 74.6% |
BN and Dropout (0.2) | 2 × 2 | ReLU | Adam | 165 | 92.7% |
BN and Dropout (0.3) | 2 × 2 | ReLU | Adam | 48 | 86.6% |
BN and Dropout (0.4) | 2 × 2 | ReLU | Adam | 46 | 85.0% |
BN and Dropout (0.5) | 2 × 2 | ReLU | Adam | 55 | 73.9% |
Dropout (0.1) | 2 × 2 | ReLU | Adam | 59 | 86.6% |
Dropout (0.2) | 2 × 2 | ReLU | Adam | 116 | 88.2% |
Dropout (0.3) | 2 × 2 | ReLU | Adam | 96 | 87.7% |
Dropout (0.4) | 2 × 2 | ReLU | Adam | 112 | 87.9% |
Dropout (0.5) | 2 × 2 | ReLU | Adam | 28 | 72.1% |
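To make the configurations in Table 1 concrete, the sketch below shows a convolutional block in which the two regularizers can be toggled, using the table's fixed settings (2 × 2 kernel, ReLU activation, Adam optimizer). This is a minimal illustration, not the authors' network: the channel counts, depth, input channels, and class count are assumptions, since the full architecture is not given in the table.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch, use_bn=True, dropout_rate=0.0):
    """One convolutional unit with the table's fixed settings
    (2 x 2 kernel, ReLU) and a configurable regularization layer."""
    layers = [nn.Conv2d(in_ch, out_ch, kernel_size=2)]
    if use_bn:
        layers.append(nn.BatchNorm2d(out_ch))  # "BN" rows
    layers.append(nn.ReLU(inplace=True))
    if dropout_rate > 0.0:
        layers.append(nn.Dropout2d(p=dropout_rate))  # "Dropout (p)" rows
    return nn.Sequential(*layers)

# Best-performing row of Table 1: BN only, no dropout.
# Channel sizes and the 10-class output are illustrative placeholders.
model = nn.Sequential(
    conv_block(3, 32, use_bn=True, dropout_rate=0.0),
    conv_block(32, 64, use_bn=True, dropout_rate=0.0),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(64, 10),
)
optimizer = torch.optim.Adam(model.parameters())
```

Setting `use_bn=False` and a nonzero `dropout_rate` reproduces the dropout-only rows, and combining both reproduces the "BN and Dropout" rows.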