Table 6 Classification and loss metrics for the best four feature selection methods (accuracy and F1-score in %).
Algorithm | Accuracy | F1-score | AUC | AP | MCC | Log loss | Jaccard score | Hamming loss |
---|---|---|---|---|---|---|---|---|
Mutual Information | ||||||||
 Random forest | 91 | 90 | 0.97 | 0.98 | 0.80 | 3.19 | 0.86 | 0.09 |
 Logistic regression | 87 | 87 | 0.93 | 0.96 | 0.73 | 4.47 | 0.80 | 0.12 |
 Decision tree | 87 | 87 | 0.90 | 0.90 | 0.73 | 4.34 | 0.81 | 0.12 |
 KNN | 78 | 77 | 0.77 | 0.79 | 0.53 | 7.54 | 0.69 | 0.21 |
 AdaBoost | 89 | 89 | 0.96 | 0.98 | 0.77 | 3.70 | 0.83 | 0.10 |
 CatBoost | 91 | 91 | 0.97 | 0.98 | 0.81 | 3.07 | 0.86 | 0.08 |
 LightGBM | 90 | 89 | 0.97 | 0.98 | 0.78 | 3.45 | 0.85 | 0.10 |
 XGBoost | 91 | 91 | 0.97 | 0.98 | 0.81 | 3.07 | 0.86 | 0.08 |
 Stacking | 90 | 89 | 0.96 | 0.98 | 0.78 | 3.58 | 0.84 | 0.10 |
 Hard-Voting | 95 | 95 | 0.98 | 0.96 | 0.89 | 1.66 | 0.93 | 0.04 |
 Soft-Voting | 94 | 94 | 0.98 | 0.99 | 0.87 | 1.92 | 0.92 | 0.06 |
Bat Algorithm | ||||||||
 Random forest | 90 | 89 | 0.95 | 0.97 | 0.77 | 3.45 | 0.86 | 0.10 |
 Logistic regression | 80 | 78 | 0.92 | 0.96 | 0.60 | 7.03 | 0.71 | 0.20 |
 Decision tree | 87 | 85 | 0.92 | 0.95 | 0.71 | 4.47 | 0.82 | 0.12 |
 KNN | 77 | 76 | 0.79 | 0.85 | 0.53 | 7.00 | 0.70 | 0.22 |
 AdaBoost | 88 | 87 | 0.95 | 0.97 | 0.73 | 4.09 | 0.83 | 0.11 |
 CatBoost | 90 | 88 | 0.95 | 0.92 | 0.76 | 3.58 | 0.85 | 0.10 |
 LightGBM | 92 | 91 | 0.95 | 0.97 | 0.81 | 2.81 | 0.88 | 0.08 |
 XGBoost | 92 | 91 | 0.95 | 0.98 | 0.81 | 2.68 | 0.89 | 0.07 |
 Stacking | 92 | 91 | 0.95 | 0.97 | 0.81 | 2.80 | 0.88 | 0.08 |
 Hard-Voting | 93 | 92 | 0.97 | 0.94 | 0.84 | 2.55 | 0.88 | 0.07 |
 Soft-Voting | 91 | 90 | 0.97 | 0.98 | 0.81 | 3.07 | 0.86 | 0.08 |
Flower Pollination Algorithm | ||||||||
 Random forest | 87 | 86 | 0.95 | 0.97 | 0.73 | 4.47 | 0.81 | 0.12 |
 Logistic regression | 83 | 83 | 0.92 | 0.95 | 0.67 | 5.88 | 0.74 | 0.17 |
 Decision tree | 83 | 82 | 0.90 | 0.94 | 0.65 | 6.01 | 0.74 | 0.17 |
 KNN | 77 | 76 | 0.77 | 0.80 | 0.51 | 6.05 | 0.67 | 0.23 |
 AdaBoost | 84 | 84 | 0.93 | 0.96 | 0.67 | 5.32 | 0.77 | 0.15 |
 CatBoost | 88 | 87 | 0.95 | 0.97 | 0.75 | 4.22 | 0.81 | 0.12 |
 LightGBM | 89 | 88 | 0.95 | 0.97 | 0.76 | 3.83 | 0.83 | 0.11 |
 XGBoost | 86 | 85 | 0.94 | 0.96 | 0.69 | 4.98 | 0.78 | 0.14 |
 Stacking | 87 | 87 | 0.94 | 0.97 | 0.75 | 4.34 | 0.81 | 0.12 |
 Hard-Voting | 85 | 84 | 0.93 | 0.87 | 0.68 | 5.11 | 0.79 | 0.14 |
 Soft-Voting | 86 | 84 | 0.93 | 0.96 | 0.69 | 4.98 | 0.80 | 0.14 |
Jaya Algorithm | ||||||||
 Random forest | 89 | 87 | 0.94 | 0.97 | 0.75 | 3.83 | 0.84 | 0.11 |
 Logistic regression | 80 | 78 | 0.92 | 0.96 | 0.60 | 7.05 | 0.71 | 0.20 |
 Decision tree | 85 | 83 | 0.90 | 0.94 | 0.66 | 5.11 | 0.80 | 0.14 |
 KNN | 74 | 72 | 0.82 | 0.89 | 0.47 | 9.08 | 0.65 | 0.26 |
 AdaBoost | 89 | 87 | 0.95 | 0.97 | 0.73 | 3.96 | 0.84 | 0.11 |
 CatBoost | 89 | 87 | 0.94 | 0.97 | 0.74 | 3.96 | 0.84 | 0.11 |
 LightGBM | 90 | 88 | 0.94 | 0.97 | 0.77 | 3.45 | 0.87 | 0.10 |
 XGBoost | 90 | 88 | 0.95 | 0.97 | 0.75 | 3.58 | 0.86 | 0.10 |
 Stacking | 89 | 88 | 0.94 | 0.97 | 0.75 | 3.70 | 0.86 | 0.10 |
 Hard-Voting | 89 | 88 | 0.95 | 0.89 | 0.76 | 3.96 | 0.82 | 0.11 |
 Soft-Voting | 89 | 89 | 0.96 | 0.97 | 0.77 | 3.70 | 0.83 | 0.10 |
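As a minimal sketch of how the metric columns in Table 6 can be produced, the snippet below fits a soft-voting ensemble of three of the table's base classifiers and computes all eight metrics with scikit-learn. It is not the study's code: the synthetic dataset, the choice of three base estimators, and all hyperparameters are illustrative assumptions.

```python
# Illustrative sketch (not the study's code): a soft-voting ensemble of three
# of the classifiers in Table 6, evaluated with the table's eight metrics.
# The synthetic binary dataset stands in for the study's data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import (accuracy_score, f1_score, roc_auc_score,
                             average_precision_score, matthews_corrcoef,
                             log_loss, jaccard_score, hamming_loss)

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Soft voting averages predicted class probabilities; voting="hard" would
# instead take a majority vote over the predicted labels.
clf = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("dt", DecisionTreeClassifier(random_state=0))],
    voting="soft",
)
clf.fit(X_tr, y_tr)
y_pred = clf.predict(X_te)            # hard labels for accuracy/F1/MCC/...
y_prob = clf.predict_proba(X_te)[:, 1]  # probabilities for AUC/AP/log loss

metrics = {
    "Accuracy": accuracy_score(y_te, y_pred),
    "F1-score": f1_score(y_te, y_pred),
    "AUC": roc_auc_score(y_te, y_prob),
    "AP": average_precision_score(y_te, y_prob),
    "MCC": matthews_corrcoef(y_te, y_pred),
    "Log loss": log_loss(y_te, y_prob),
    "Jaccard score": jaccard_score(y_te, y_pred),
    "Hamming loss": hamming_loss(y_te, y_pred),
}
for name, value in metrics.items():
    print(f"{name}: {value:.2f}")
```

Note that for single-label binary classification the Hamming loss equals 1 minus the accuracy, which is a useful sanity check on rows of the table.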