Table 2 Deep learning model hyperparameter configuration specifying the architectural design and training parameters for the LSTM-Attention hybrid neural network applied to ventilation parameter prediction.

From: Digital twin-driven deep learning prediction and adaptive control for coal mine ventilation systems

| Hyperparameter | Configuration value | Description |
|---|---|---|
| LSTM layers | 3 layers | Number of stacked LSTM layers |
| Hidden units per layer | 128, 64, 32 | Neuron count in each LSTM layer |
| Attention heads | 4 | Number of parallel attention mechanisms |
| Dropout rate | 0.3 | Probability for dropout regularization |
| Learning rate | 0.001 | Initial learning rate for the Adam optimizer |
| Batch size | 64 | Number of samples per training batch |
| Time window length | 60 steps | Historical input sequence length |
| Prediction horizon | 12 steps | Number of future time steps to forecast |
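The configuration above can be sketched as a PyTorch model. This is a minimal illustration, not the authors' implementation: the layer sizes, head count, dropout, optimizer, batch size, window, and horizon follow Table 2, while the input feature count (`n_features`) and the exact wiring between the LSTM stack, attention, and output head are assumptions.

```python
import torch
import torch.nn as nn

class LSTMAttention(nn.Module):
    """Sketch of the LSTM-Attention predictor per Table 2 (wiring assumed)."""
    def __init__(self, n_features=8, horizon=12):
        super().__init__()
        # Three stacked LSTM layers with 128, 64, and 32 hidden units
        self.lstm1 = nn.LSTM(n_features, 128, batch_first=True)
        self.lstm2 = nn.LSTM(128, 64, batch_first=True)
        self.lstm3 = nn.LSTM(64, 32, batch_first=True)
        self.drop = nn.Dropout(0.3)  # dropout rate 0.3 from Table 2
        # 4 parallel attention heads over the 32-dim LSTM outputs
        self.attn = nn.MultiheadAttention(32, num_heads=4, batch_first=True)
        self.head = nn.Linear(32, horizon)  # forecast 12 future steps

    def forward(self, x):  # x: (batch, 60, n_features)
        h, _ = self.lstm1(x)
        h, _ = self.lstm2(self.drop(h))
        h, _ = self.lstm3(self.drop(h))
        ctx, _ = self.attn(h, h, h)   # self-attention across the 60 time steps
        return self.head(ctx[:, -1])  # predict the horizon from the last step

model = LSTMAttention()
opt = torch.optim.Adam(model.parameters(), lr=0.001)  # Adam, lr from Table 2
x = torch.randn(64, 60, 8)  # one batch: 64 samples, 60-step window, 8 features
y = model(x)                # y: (64, 12), one 12-step forecast per sample
```

Note that the 60-step input window and 12-step horizon never appear in the layer definitions themselves: the LSTM and attention layers accept any sequence length, so the window is fixed only by how the training batches are sliced.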