Table 5 Previous modeling techniques used for LWHSC.
| S. No | Technique | Property | Best R² value | References |
|---|---|---|---|---|
| 1 | Gradient Boosting Regression (GBR) | CS of LWHSC | 0.95 | |
| 2 | GPR, Ensemble Learning (EL), Support Vector Machine Regression (SVMR), and optimized GPR, SVMR, and EL | CS of LWC | 0.9803 (optimized GPR); 0.9777 (SVMR); 0.9740 (GPR) | |
| 3 | Multilayer Perceptron (MLP), Support Vector Machine (SVM), and Decision Tree (DT) | CS of lightweight pumice concrete | 0.914 | |
| 4 | Gradient-Boosted Trees (GBT), RF, Tree Ensemble (TE), Extreme Gradient Boosting (XGB), Keras Neural Network (KNN), Simple Regression (SR), Probabilistic Neural Network (PNN), Multilayer Perceptron (MLP), and Linear Regression (LR) | CS of LWC | 0.90 | |
| 5 | Support Vector Machine (SVM), Artificial Neural Network (ANN), DT, GPR, and XGBoost | CS of lightweight aggregate concrete (LWAC) | 0.99 | |
| 6 | RF and GEP | CS of HSC | 0.96 | |
| 7 | ANN | CS | 0.83 | |
| 8 | XGBoost, CatBoost, Extra Trees Regressor, and Bagging Regressor | CS, STS, FS | 0.98 | |
| 9 | ANN | CS, STS, FS, E | 0.99 | |
| 10 | ELM | CS, STS, FS | 0.96 | |
| 11 | REG, CART, CHAID, ANN, and SVM | Slump, CS | 0.85 | |
| 12 | SVM and GPR | CS | 0.99 | |
| 13 | LR and XGBoost | CS, STS | 0.99 | |
| 14 | MEP and RF | CS, TS, FS | 0.99 | Present work |
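Every model in the table is ranked by its best coefficient of determination (R²). As a minimal sketch, R² can be computed directly from its definition, R² = 1 − SS_res/SS_tot; the measured and predicted compressive-strength values below are hypothetical and do not come from any of the referenced studies:

```python
def r2_score(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    mean_y = sum(y_true) / len(y_true)
    # Residual sum of squares: deviation of predictions from measurements
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    # Total sum of squares: deviation of measurements from their mean
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical measured vs. predicted compressive strengths (MPa)
measured = [30.0, 35.0, 40.0, 45.0, 50.0]
predicted = [31.0, 34.0, 41.0, 44.0, 51.0]
print(round(r2_score(measured, predicted), 4))  # → 0.98
```

A value of 1.0 indicates a perfect fit, which is why entries approaching 0.99 in the table represent near-ideal predictive performance on the reported data.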