Table 4 Comparison of evaluation metrics for the Decision Tree classifier with English-language stopword removal.

From: Key insights into recommended SMS spam detection datasets

Decision Tree

| Dataset | Accuracy | Precision (Ham) | Precision (Spam) | Recall (Ham) | Recall (Spam) | F1-Score (Ham) | F1-Score (Spam) |
|---|---|---|---|---|---|---|---|
| 1 | 96.86% | 0.97 | 0.95 | 0.99 | 0.83 | 0.98 | 0.89 |
| 2 | 98.28% | 0.98 | 0.98 | 0.98 | 0.99 | 0.98 | 0.98 |
| 3 | 98.35% | 0.99 | 0.94 | 0.99 | 0.94 | 0.99 | 0.94 |
| 4 | 78.22% | 0.80 | 0.76 | 0.82 | 0.73 | 0.81 | 0.74 |
| 5 | 76.55% | 0.87 | 0.65 | 0.73 | 0.82 | 0.80 | 0.72 |
| 6 | 92.50% | 0.92 | 0.93 | 0.93 | 0.92 | 0.93 | 0.92 |
| 7 | 89.94% | 0.89 | 0.91 | 0.91 | 0.89 | 0.90 | 0.90 |
| 8 | 89.96% | 0.88 | 0.92 | 0.90 | 0.90 | 0.89 | 0.91 |
| 9 | 76.19% | 0.64 | 0.90 | 0.88 | 0.69 | 0.74 | 0.78 |
| 10 | 86.49% | 0.76 | 0.95 | 0.93 | 0.83 | 0.84 | 0.88 |
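The per-class F1 scores in the table follow from the corresponding precision and recall columns as their harmonic mean, F1 = 2PR / (P + R). A minimal sketch verifying this against Dataset 1's reported values (the `f1` helper is illustrative, not from the paper):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Dataset 1 values from the table above.
ham_f1 = f1(precision=0.97, recall=0.99)    # reported as 0.98
spam_f1 = f1(precision=0.95, recall=0.83)   # reported as 0.89

print(round(ham_f1, 2))    # 0.98
print(round(spam_f1, 2))   # 0.89
```

Rounding to two decimals reproduces the tabulated F1 values, which is a quick consistency check when transcribing results like these.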