Table 3. Comparison of different optimizers. Optimization with Adam achieved the best performance.

From: Detection of Hepatocellular Carcinoma in Contrast-Enhanced Magnetic Resonance Imaging Using Deep Learning Classifier: A Multi-Center Retrospective Study

| Optimizer | Learning rate | Number of epochs | Validation loss | Accuracy |
|-----------|---------------|------------------|-----------------|----------|
| AdaGrad^a | 0.001         | 23               | 0.83            | 55.7%    |
| Adam^b    | 0.001         | 46               | 0.18            | 93.7%    |
| RMSprop^c | 0.001         | 62               | 0.18            | 91.4%    |
| SGD^d     | 0.001         | 14               | 0.98            | 58.7%    |

^a AdaGrad: adaptive gradient algorithm.
^b Adam: a method for stochastic optimization.
^c RMSprop: a mini-batch version of rprop.
^d SGD: stochastic gradient descent.
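As a point of reference, the four optimizer settings compared in Table 3 (each with a learning rate of 0.001) could be configured as in the minimal sketch below. The paper does not state the deep learning framework used; tf.keras is assumed here, and the compile call is illustrative only, not the authors' implementation.

```python
# Hypothetical sketch of the optimizer configurations listed in Table 3.
# Framework (tf.keras) and loss/metric choices are assumptions, not from the paper.
import tensorflow as tf

LEARNING_RATE = 0.001  # same learning rate for all four optimizers, per Table 3

optimizers = {
    "AdaGrad": tf.keras.optimizers.Adagrad(learning_rate=LEARNING_RATE),
    "Adam": tf.keras.optimizers.Adam(learning_rate=LEARNING_RATE),
    "RMSprop": tf.keras.optimizers.RMSprop(learning_rate=LEARNING_RATE),
    "SGD": tf.keras.optimizers.SGD(learning_rate=LEARNING_RATE),
}

# Each optimizer would be plugged into the same classifier and trained, e.g.:
# model.compile(optimizer=optimizers["Adam"],
#               loss="binary_crossentropy",
#               metrics=["accuracy"])
```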