Table 2 Comparison of activation functions. ReLU achieved the best accuracy.
| Activation function | Number of epochs | Accuracy |
|---|---|---|
| LeakyReLU (0.1) | 37 | 84.2% |
| PReLU | 28 | 90.4% |
| ELU (0.1) | 73 | 91.6% |
| ReLU | 46 | 93.7% |
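For reference, the four activations compared in Table 2 can be sketched with their standard formulas. This is a minimal stdlib-only sketch: the 0.1 slope matches the table's LeakyReLU/ELU parameter, and PReLU's slope (here passed in explicitly) would in practice be a learned parameter.

```python
import math

def relu(x):
    # max(0, x): zero for negative inputs, identity otherwise
    return max(0.0, x)

def leaky_relu(x, alpha=0.1):
    # small fixed slope alpha on the negative side
    return x if x > 0 else alpha * x

def elu(x, alpha=0.1):
    # smooth exponential saturation toward -alpha for negative inputs
    return x if x > 0 else alpha * (math.exp(x) - 1)

def prelu(x, learned_alpha):
    # same form as LeakyReLU, but the slope is learned during training
    return x if x > 0 else learned_alpha * x

for f in (relu, leaky_relu, elu):
    print(f.__name__, [round(f(v), 4) for v in (-2.0, -0.5, 0.0, 1.5)])
```

PReLU generalizes LeakyReLU by letting each channel learn its own negative slope, which is why the two differ in the table despite sharing the same functional form.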