Table 1 Parameters of the proposed model used in the experiments.
From: Enhancing heart disease prediction using a self-attention-based transformer model
Parameter | Description |
---|---|
Model | Self-attention-based transformer model |
Input dimension \(=14\) | Input features dimension |
Output dimension \(=2, 4\) | Number of output classes |
d-model \(=128\) | Dimensionality of the model's hidden states |
nhead \(=4\) | Attention heads in the multi-head self-attention |
Num-layers \(=4\) | Layers in the encoder |
Dropout \(=0.2\) | Dropout probability |
Batch-size \(=32, 64\) | Number of samples per training batch |
Epochs \(=90\) | Number of training epochs |
Learning-rate \(=0.001\) | Learning rate for the optimizer |
Optimizer \(=\) Adam | Optimizer used for updating the model parameters |
Train-loss | Average loss over the training dataset |
Cross entropy | Loss function |
Test-loss | Average loss over the testing dataset |
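Assuming a standard PyTorch implementation (the paper's code is not shown here), the settings in Table 1 correspond roughly to the configuration sketch below. The class name `HeartDiseaseTransformer`, the input projection, and the single-token sequence treatment are illustrative assumptions, not the authors' code:

```python
# Hypothetical configuration sketch of the Table 1 hyperparameters.
# Layer/class names are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn

class HeartDiseaseTransformer(nn.Module):
    def __init__(self, input_dim=14, d_model=128, nhead=4,
                 num_layers=4, dropout=0.2, num_classes=2):
        super().__init__()
        # Project the 14 input features into the model dimension.
        self.embed = nn.Linear(input_dim, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dropout=dropout,
            batch_first=True)
        # 4 stacked encoder layers, each with 4-head self-attention.
        self.encoder = nn.TransformerEncoder(encoder_layer,
                                             num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_classes)  # 2 or 4 classes

    def forward(self, x):
        # x: (batch, input_dim); treat each sample as a length-1 sequence.
        h = self.encoder(self.embed(x).unsqueeze(1))
        return self.classifier(h.squeeze(1))

model = HeartDiseaseTransformer()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)  # Adam, lr 0.001
criterion = nn.CrossEntropyLoss()  # cross-entropy loss, as in the table
```

Under this reading, training would run for 90 epochs with a batch size of 32 or 64, minimizing the cross-entropy criterion with Adam.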