Table 3. Machine learning models and their parameter configurations.
| Model | Parameter | Value/setting | Description | Data partitioning |
|---|---|---|---|---|
| Gradient Boosting | Implementation | scikit-learn | Python library used for model development | 10-fold cross-validation (90% train / 10% test per fold) |
| | Number of trees | 100 | Total estimators in the ensemble | |
| | Learning rate | 0.1 | Step size controlling the contribution of each tree | |
| | Maximum depth | 3 | Limits the complexity of individual trees | |
| | Minimum samples split | 2 | Minimum number of samples required to split a node | |
| | Training fraction | 1.0 (full dataset) | Each tree is fit on the full dataset (no stochastic subsampling), making results replicable | |
| | Validation method | k-fold cross-validation | Evaluates model robustness and generalization | |
| AdaBoost | Implementation | scikit-learn | Python library used for model development | 10-fold cross-validation (90% train / 10% test per fold) |
| | Base estimator | Decision Tree Regressor | Weak learner for boosting | |
| | Number of estimators | 50 | Total boosting iterations | |
| | Learning rate | 1.0 | Weight applied to each estimator | |
| | Loss function | Linear loss | Appropriate for continuous targets | |
| | Validation method | k-fold cross-validation | Used for model robustness assessment | |
| Linear Regression | Implementation | scikit-learn | Python library used for model development | 10-fold cross-validation (90% train / 10% test per fold) |
| | Regularization | None | Standard linear regression without penalty | |
| | Model type | Ordinary Least Squares (OLS) | Direct linear mapping between inputs and outputs | |
| | Validation method | k-fold cross-validation | Consistent with the other models for comparison | |