Table 4 Training time and hyperparameter tuning configuration of ML models.
| Model | Hyperparameters Tuned | Tuning Method | Training Time (s) | Remarks |
|---|---|---|---|---|
| Random Forest (RF) | Number of trees (n_estimators), maximum tree depth (max_depth), minimum samples per split (min_samples_split) | Grid search + 5-fold cross-validation | 18.4 | Achieved highest accuracy with moderate computational cost; robust to feature heterogeneity |
| Gradient Boosting (GBM) | Learning rate, number of estimators, maximum depth | Grid search + 5-fold cross-validation | 24.7 | Stable performance with slightly higher training time due to sequential boosting |
| Artificial Neural Network (ANN) | Number of hidden layers, neurons per layer, learning rate, batch size, number of epochs | Grid search + 5-fold cross-validation | 61.3 | Required extensive tuning and longest training time due to iterative weight optimization |
| Support Vector Machine (SVM) | Kernel type (RBF), regularization parameter (C), gamma | Grid search + 5-fold cross-validation | 14.9 | Fast training but limited performance for high-dimensional nonlinear relationships |
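For reference, the grid search with 5-fold cross-validation shared by all four models can be sketched with scikit-learn, here for the Random Forest row. The dataset, grid values, and random seed are illustrative assumptions, not the study's exact configuration:

```python
import time
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic data stands in for the study's dataset (an assumption).
X, y = make_classification(n_samples=300, n_features=10, random_state=42)

# Grid covers the three RF hyperparameters listed in Table 4;
# the candidate values themselves are illustrative.
param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [None, 10],
    "min_samples_split": [2, 5],
}

start = time.perf_counter()
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid,
    cv=5,                 # 5-fold cross-validation, as in Table 4
    scoring="accuracy",
    n_jobs=-1,
)
search.fit(X, y)
elapsed = time.perf_counter() - start

print(f"Best parameters: {search.best_params_}")
print(f"Best CV accuracy: {search.best_score_:.3f}")
print(f"Training time: {elapsed:.1f} s")  # analogous to the Training Time column
```

The same pattern applies to the other models by swapping the estimator and grid (e.g., `SVC(kernel="rbf")` with `C` and `gamma` for the SVM row); the reported training times will vary with hardware and grid size.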