Table 3 Hyperparameter configurations for the B5 dataset.

From: Application of state of health estimation and remaining useful life prediction for lithium-ion batteries based on AT-CNN-BiLSTM

| Model | Hidden layers | Hidden layer setup | Dropout | Optimization function |
| --- | --- | --- | --- | --- |
| CNN | \(\textrm{Cl}=1\), \(\textrm{Dl}=1\) | \(\textrm{Fn}=64\), \(\textrm{Du}=1\) | 0.3 | Adam |
| LSTM | \(\textrm{Ll}=1\), \(\textrm{Dl}=1\) | \(\textrm{Lu}=64\), \(\textrm{Du}=1\) | 0.3 | Adam |
| BiLSTM | \(\textrm{Bl}=1\), \(\textrm{Dl}=1\) | \(\textrm{Bu}=64\), \(\textrm{Du}=1\) | 0.3 | Adam |
| CNN-LSTM | \(\textrm{Cl}=1\), \(\textrm{Ll}=1\), \(\textrm{Dl}=1\) | \(\textrm{Fn}=64\), \(\textrm{Lu}=64\), \(\textrm{Du}=1\) | 0.3 | Adam |
| CNN-BiLSTM | \(\textrm{Cl}=1\), \(\textrm{Bl}=1\), \(\textrm{Dl}=1\) | \(\textrm{Fn}=64\), \(\textrm{Bu}=64\), \(\textrm{Du}=1\) | 0.3 | Adam |
| CNN-LSTM-Attention | \(\textrm{Cl}=1\), \(\textrm{Ll}=1\), \(\textrm{Al}=1\), \(\textrm{Dl}=1\) | \(\textrm{Fn}=64\), \(\textrm{Lu}=64\), \(\textrm{At}\), \(\textrm{Du}=1\) | 0.3 | Adam |
| CNN-BiLSTM-Attention | \(\textrm{Cl}=1\), \(\textrm{Bl}=1\), \(\textrm{Al}=1\), \(\textrm{Dl}=1\) | \(\textrm{Fn}=64\), \(\textrm{Bu}=64\), \(\textrm{At}\), \(\textrm{Du}=1\) | 0.3 | Adam |

  1. \(\textrm{Cl}, \textrm{Ll}, \textrm{Bl}, \textrm{Dl}, \textrm{Al}\), and \(\textrm{Fn}\) denote the numbers of convolutional layers, LSTM layers, BiLSTM layers, dense layers, attention layers, and filters, respectively; \(\textrm{Lu}, \textrm{Bu}\), and \(\textrm{Du}\) denote the numbers of LSTM cells, BiLSTM cells, and dense cells, respectively. \(\textrm{At}\) denotes the attention mechanism applied over time steps.
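To make the table concrete, the sketch below assembles the CNN-BiLSTM-Attention row in Keras: one Conv1D layer with \(\textrm{Fn}=64\) filters, one BiLSTM layer with \(\textrm{Bu}=64\) cells, attention over time steps, dropout of 0.3, a single-cell dense output, and the Adam optimizer. This is a minimal sketch under stated assumptions, not the authors' implementation: the window length, feature count, kernel size, activations, loss function, and the exact attention formulation are not given in Table 3 and are chosen here for illustration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn_bilstm_attention(window_length=16, n_features=1):
    # Input: a sliding window of degradation features
    # (window_length and n_features are assumptions, not from Table 3).
    inputs = layers.Input(shape=(window_length, n_features))

    # Cl = 1 convolutional layer with Fn = 64 filters
    # (kernel size, padding, and activation are assumptions).
    x = layers.Conv1D(64, kernel_size=3, padding="same",
                      activation="relu")(inputs)

    # Bl = 1 BiLSTM layer with Bu = 64 cells; return_sequences=True keeps
    # the per-time-step outputs so attention can weight them.
    h = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)

    # At: attention over time steps, in one common formulation -- score
    # each step, softmax-normalize over the time axis, and collapse the
    # sequence into a weighted context vector.
    scores = layers.Dense(1, activation="tanh")(h)    # (batch, T, 1)
    weights = layers.Softmax(axis=1)(scores)          # normalized over time
    context = layers.Lambda(
        lambda t: tf.reduce_sum(t[0] * t[1], axis=1)  # (batch, 2 * 64)
    )([h, weights])

    # Dropout = 0.3, then Dl = 1 dense layer with Du = 1 output cell.
    context = layers.Dropout(0.3)(context)
    outputs = layers.Dense(1)(context)

    model = models.Model(inputs, outputs)
    # Adam per Table 3; the MSE loss is an assumption.
    model.compile(optimizer="adam", loss="mse")
    return model
```

The remaining rows of the table follow by dropping blocks from this template; for example, the CNN-BiLSTM row omits the attention step and feeds the final BiLSTM state (return_sequences=False) directly into the dense layer.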