Table 3 Parameters of the DEC architecture used.

| Parameter | Value |
|---|---|
| Activation function | Leaky ReLU |
| Normalization | Batch normalization |
| Dropout | 0.3 after each activation |
| Latent dimension | Encoder layers: 5-64-32; decoder layers: 32-64-5 |
| Batch size | 128 |
| Learning rate | 0.001 |
| Pretrain epochs | 30 |
| DEC epochs | 20 |
| Hyperparameter tuning | Grid search: alpha = {0.1, 0.5, 1.0, 2.0, 2.5}; beta = {0.5, 1.0, 2.0, 2.5} |
| Optimizer | Adam |
| Loss function | MSE and Kullback-Leibler (KL) divergence |
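The parameters in Table 3 can be sketched as a model definition. The following is a minimal PyTorch illustration, not the authors' implementation: the framework, layer ordering (BatchNorm before the activation, dropout after it), and the number of cluster centroids are assumptions; only the layer widths, dropout rate, batch size, learning rate, optimizer, and losses come from the table. The `soft_assign` helper follows the standard DEC Student's-t assignment, in which `alpha` plays the role tuned by the grid search.

```python
import torch
import torch.nn as nn

class DECAutoencoder(nn.Module):
    """Autoencoder with the Table 3 layout: encoder 5-64-32, decoder 32-64-5."""

    def __init__(self, input_dim=5, hidden_dim=64, latent_dim=32, dropout=0.3):
        super().__init__()
        # Leaky ReLU activations, batch normalization, dropout 0.3 after
        # each activation (placement is an assumption).
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.BatchNorm1d(hidden_dim),
            nn.LeakyReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden_dim, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim),
            nn.BatchNorm1d(hidden_dim),
            nn.LeakyReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden_dim, input_dim),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def soft_assign(z, centroids, alpha=1.0):
    """Standard DEC soft assignment: Student's t kernel over latent distances."""
    dist_sq = torch.cdist(z, centroids) ** 2
    q = (1.0 + dist_sq / alpha) ** (-(alpha + 1) / 2)
    return q / q.sum(dim=1, keepdim=True)  # normalize over clusters

model = DECAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)  # Adam, lr 0.001
mse = nn.MSELoss()  # pretraining reconstruction loss

x = torch.randn(128, 5)                      # one batch of size 128
recon, z = model(x)
loss = mse(recon, x)

centroids = torch.randn(4, 32)               # 4 clusters is a placeholder
q = soft_assign(z, centroids, alpha=1.0)     # KL divergence is taken against
                                             # the DEC target distribution p
print(z.shape, recon.shape, q.shape)
```

During DEC training (20 epochs here, after 30 pretraining epochs), the KL divergence between `q` and the sharpened target distribution replaces or augments the MSE term, per the loss functions listed in the table.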