Table 3 Quantitative evaluation metrics for accuracy verification.
| Metric | Equation | Description |
|---|---|---|
| Sensitivity | \(Sensitivity = \frac{TP}{TP + FN}\) | The ratio of the number of landslides successfully classified as landslides to the total number of landslides |
| Specificity | \(Specificity = \frac{TN}{FP + TN}\) | The ratio of the number of successfully classified non-landslides to the total number of non-landslides |
| Precision | \(Precision = \frac{TP}{TP + FP}\) | The ratio of correct landslide results to the number of landslide results predicted by the classifier |
| Accuracy | \(Accuracy = \frac{TP + TN}{TP + FP + TN + FN}\) | The ratio of correctly predicted landslide and non-landslide samples to the total number of samples |
| F1-score | \(\text{F1-score} = 2 \times \frac{Precision \times Sensitivity}{Precision + Sensitivity}\) | The harmonic mean of precision and sensitivity, taking both metrics into account together |
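The metrics in Table 3 can be sketched in code as functions of the four confusion-matrix counts (TP, FP, TN, FN). The counts used in the example are illustrative placeholders, not values from this study.

```python
# Compute the Table 3 metrics from confusion-matrix counts.
# The counts below are hypothetical, for illustration only.
tp, fp, tn, fn = 80, 10, 90, 20

sensitivity = tp / (tp + fn)                  # true positive rate
specificity = tn / (fp + tn)                  # true negative rate
precision = tp / (tp + fp)                    # positive predictive value
accuracy = (tp + tn) / (tp + fp + tn + fn)    # overall correct fraction
f1_score = 2 * precision * sensitivity / (precision + sensitivity)

print(f"Sensitivity={sensitivity:.3f}  Specificity={specificity:.3f}  "
      f"Precision={precision:.3f}  Accuracy={accuracy:.3f}  F1={f1_score:.3f}")
```

Note that the F1-score is the harmonic mean of precision and sensitivity, so it is high only when both are high.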