Fig. 11 | Scientific Reports

From: Prediction and optimization of stretch flangeability of advanced high strength steels utilizing machine learning approaches

Schematic illustration of the XGBoost algorithm. The process begins with an initial prediction \(\hat{y}^{(0)} = 0\) and sequentially builds decision trees to minimize the loss function \(L\), which combines the prediction error \(l\) (from \(l(y, \hat{y}^{(0)})\), \(l(y, \hat{y}^{(1)})\), … to \(l(y, \hat{y}^{(K)})\)) and a regularization term \(\Omega\). Each tree \(f_k(x_i)\) is fit to the residuals of the previous iteration, and the final output is the sum of the contributions of all trees. This iterative process continues until convergence, with validation on a test set ensuring the model's generalization.
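The additive scheme described in the caption (start from \(\hat{y}^{(0)} = 0\), fit each new tree to the current residuals, sum all tree outputs) can be sketched as follows. This is a minimal illustration of gradient boosting with hand-rolled depth-1 regression stumps and a hypothetical learning rate; it is not XGBoost itself, which additionally uses second-order gradients and the regularization term \(\Omega\).

```python
import numpy as np

def fit_stump(x, residual):
    """Depth-1 regression tree: choose the split threshold that
    minimizes the squared error of the current residuals."""
    best = None
    for t in np.unique(x)[:-1]:
        left, right = residual[x <= t], residual[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, left_val, right_val = best
    return lambda q: np.where(q <= t, left_val, right_val)

def boost(x, y, n_trees=50, lr=0.1):
    """Additive boosting: y_hat^(0) = 0, then each stump f_k is fit
    to the residuals y - y_hat^(k-1); the model is the sum of stumps."""
    pred = np.zeros_like(y, dtype=float)
    trees = []
    for _ in range(n_trees):
        f = fit_stump(x, y - pred)   # fit residuals of previous iteration
        pred += lr * f(x)            # update running prediction
        trees.append(f)
    return lambda q: lr * sum(f(q) for f in trees)

# Toy data (hypothetical, for illustration only)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x)
model = boost(x, y)
```

In a real setting the number of trees and learning rate would be tuned, and generalization checked on a held-out test set, as the caption notes.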
