Table 4 Prediction Model Performance Evaluation Indicators.
| Evaluation Indicator | Calculation Formula | Applicable Scenario |
|---|---|---|
| Mean Absolute Error (MAE) | \(\frac{1}{n}\sum_{i=1}^{n} \lvert y_{i} - \hat{y}_{i} \rvert\) | Progress prediction, continuous quality metrics |
| Root Mean Square Error (RMSE) | \(\sqrt{\frac{1}{n}\sum_{i=1}^{n} \left( y_{i} - \hat{y}_{i} \right)^{2}}\) | High-precision numerical prediction tasks |
| F1-Score | \(\frac{2 \cdot \text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}}\) | Risk classification, quality categorization |
| Area Under Curve (AUC) | \(\int_{0}^{1} \text{TPR}(t)\, d\,\text{FPR}(t)\) | Binary risk assessment, anomaly detection |
| Mean Absolute Percentage Error (MAPE) | \(\frac{100\%}{n}\sum_{i=1}^{n} \left\lvert \frac{y_{i} - \hat{y}_{i}}{y_{i}} \right\rvert\) | Relative error assessment, percentage-based metrics |
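The sketch below illustrates how the indicators in Table 4 could be computed in practice; it is a minimal example on synthetic arrays (`y_true`, `y_pred`, `y_true_cls`, `y_score` are illustrative placeholders, not data from this study), with MAE, RMSE, and MAPE implemented directly from the formulas above and F1/AUC taken from scikit-learn.

```python
# Minimal sketch of the Table 4 indicators on synthetic data.
import numpy as np
from sklearn.metrics import f1_score, roc_auc_score

# --- regression-style indicators (progress / continuous quality prediction) ---
y_true = np.array([12.0, 15.5, 9.8, 20.1, 18.3])   # observed values (placeholder)
y_pred = np.array([11.4, 16.0, 10.5, 19.2, 18.9])  # model predictions (placeholder)

mae  = np.mean(np.abs(y_true - y_pred))                     # Mean Absolute Error
rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))             # Root Mean Square Error
mape = 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))  # MAPE, in percent

# --- classification-style indicators (risk classification / anomaly detection) ---
y_true_cls = np.array([0, 1, 1, 0, 1, 0, 1, 0])                   # true risk labels (placeholder)
y_score    = np.array([0.2, 0.8, 0.6, 0.3, 0.9, 0.4, 0.7, 0.1])   # predicted risk scores (placeholder)
y_pred_cls = (y_score >= 0.5).astype(int)                         # thresholded class labels

f1  = f1_score(y_true_cls, y_pred_cls)    # harmonic mean of precision and recall
auc = roc_auc_score(y_true_cls, y_score)  # area under the ROC curve

print(f"MAE={mae:.3f}  RMSE={rmse:.3f}  MAPE={mape:.2f}%  F1={f1:.3f}  AUC={auc:.3f}")
```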