Table 6 Comparison of evaluation metrics for the decision tree with non-English-language stopword removal.

From: Key insights into recommended SMS spam detection datasets

Decision Tree

| Dataset | Accuracy | Precision (Ham) | Precision (Spam) | Recall (Ham) | Recall (Spam) | F1-Score (Ham) | F1-Score (Spam) |
|---------|----------|-----------------|------------------|--------------|---------------|----------------|-----------------|
| 2  | 97.85% | 0.99 | 0.97 | 0.97 | 0.99 | 0.98 | 0.98 |
| 4  | 81.19% | 0.81 | 0.82 | 0.88 | 0.73 | 0.84 | 0.77 |
| 6  | 91.75% | 0.92 | 0.92 | 0.92 | 0.92 | 0.92 | 0.92 |
| 7  | 91.33% | 0.92 | 0.91 | 0.91 | 0.92 | 0.91 | 0.91 |
| 8  | 88.21% | 0.84 | 0.92 | 0.91 | 0.86 | 0.87 | 0.89 |
| 9  | 76.19% | 0.80 | 0.75 | 0.50 | 0.92 | 0.62 | 0.83 |
| 10 | 78.38% | 0.69 | 0.86 | 0.79 | 0.78 | 0.73 | 0.82 |
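For context, the sketch below illustrates one way per-class metrics of this kind can be produced: a decision tree trained on TF-IDF features after removing the stopwords of a non-English language, with accuracy and per-class precision, recall, and F1 reported for Ham and Spam. The dataset path, column names, and the choice of Spanish stopwords are assumptions for illustration, not the authors' exact setup.

```python
# Minimal sketch (assumed pipeline, not the authors' exact code): train a decision
# tree on SMS text with stopwords of a non-English language removed, then report
# the per-class metrics shown in Table 6.
import pandas as pd
from nltk.corpus import stopwords          # requires: nltk.download("stopwords")
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, classification_report

# Hypothetical dataset with 'text' and 'label' (0 = ham, 1 = spam) columns.
df = pd.read_csv("sms_dataset.csv")

# Remove stopwords of a non-English language (Spanish here, as an example).
non_english_stops = set(stopwords.words("spanish"))
df["text"] = df["text"].apply(
    lambda msg: " ".join(w for w in msg.split() if w.lower() not in non_english_stops)
)

X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42, stratify=df["label"]
)

# TF-IDF features feeding a decision tree classifier.
vectorizer = TfidfVectorizer()
X_train_vec = vectorizer.fit_transform(X_train)
X_test_vec = vectorizer.transform(X_test)

clf = DecisionTreeClassifier(random_state=42)
clf.fit(X_train_vec, y_train)
y_pred = clf.predict(X_test_vec)

# Accuracy plus per-class precision, recall, and F1 for Ham and Spam,
# matching the columns of Table 6.
print(f"Accuracy: {accuracy_score(y_test, y_pred):.2%}")
print(classification_report(y_test, y_pred, target_names=["Ham", "Spam"]))
```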