Table 3 Comparison of different optimizers. Optimization with Adam yielded the best performance.
Optimizer | Learning Rate | Number of Epochs | Validation Loss | Accuracy |
---|---|---|---|---|
AdaGrad | 0.001 | 23 | 0.83 | 55.7% |
Adam | 0.001 | 46 | 0.18 | 93.7% |
RMSprop | 0.001 | 62 | 0.18 | 91.4% |
SGD | 0.001 | 14 | 0.98 | 58.7% |
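
For reference, the following is a minimal sketch of how the four optimizers in Table 3 could be instantiated with the shared learning rate of 0.001. The framework and model are not specified in the table, so PyTorch and the placeholder model here are assumptions for illustration only.

```python
import torch

# Placeholder model; the actual architecture is not specified in Table 3.
model = torch.nn.Linear(128, 10)

# Shared learning rate used for all four optimizers in Table 3.
lr = 0.001

# The four optimizers compared in Table 3, each configured with the same learning rate.
optimizers = {
    "AdaGrad": torch.optim.Adagrad(model.parameters(), lr=lr),
    "Adam": torch.optim.Adam(model.parameters(), lr=lr),
    "RMSprop": torch.optim.RMSprop(model.parameters(), lr=lr),
    "SGD": torch.optim.SGD(model.parameters(), lr=lr),
}
```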