Table 3 Hyper-parameters for each algorithm in the grid search.

From: An explainable machine learning-based clinical decision support system for prediction of gestational diabetes mellitus

Logistic regression

C: 0.1, 1, 10

solver: newton-cg, lbfgs, liblinear, sag, saga

penalty: l1 (liblinear and saga solvers only), l2, elasticnet (saga solver only)
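The penalty/solver restrictions above can be expressed with scikit-learn's "list of dicts" parameter-grid convention, so that incompatible combinations are never searched. A minimal sketch (parameter names are `LogisticRegression`'s; the paper's actual code is not shown in the table):

```python
# Logistic-regression grid as a list of compatible sub-grids.
# Restrictions follow Table 3: l1 only with liblinear/saga, elasticnet only with saga.
logreg_grid = [
    # l2 is accepted by every listed solver
    {"C": [0.1, 1, 10],
     "solver": ["newton-cg", "lbfgs", "liblinear", "sag", "saga"],
     "penalty": ["l2"]},
    # l1: liblinear and saga only
    {"C": [0.1, 1, 10],
     "solver": ["liblinear", "saga"],
     "penalty": ["l1"]},
    # elasticnet: saga only (sklearn additionally requires l1_ratio, which Table 3 omits)
    {"C": [0.1, 1, 10],
     "solver": ["saga"],
     "penalty": ["elasticnet"]},
]

# Total candidate models across the sub-grids: 15 + 6 + 3 = 24
n_candidates = sum(
    len(d["C"]) * len(d["solver"]) * len(d["penalty"]) for d in logreg_grid
)
```

A grid in this shape can be passed directly as `param_grid` to `sklearn.model_selection.GridSearchCV`.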

Random forest

n_estimators: 100, 200, 300, 500

max_depth: 10, 20, 30, 50

max_features: auto, sqrt

min_samples_leaf: 1, 2, 4

min_samples_split: 2, 5, 10
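The random-forest parameters have no cross-restrictions, so the grid is a single dictionary. A sketch with scikit-learn's `RandomForestClassifier` names (note that `max_features="auto"` was removed in scikit-learn 1.3; for classifiers it was equivalent to `"sqrt"`):

```python
from itertools import product

# Random-forest grid exactly as listed in Table 3.
rf_grid = {
    "n_estimators": [100, 200, 300, 500],
    "max_depth": [10, 20, 30, 50],
    "max_features": ["auto", "sqrt"],  # "auto" is removed in scikit-learn >= 1.3
    "min_samples_leaf": [1, 2, 4],
    "min_samples_split": [2, 5, 10],
}

# Exhaustive search size: 4 * 4 * 2 * 3 * 3 = 288 candidate forests
rf_candidates = len(list(product(*rf_grid.values())))
```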

Support vector machine

kernel: rbf, poly, sigmoid, linear

C: 0.1, 1, 10

degree: 2, 3, 4 (poly kernel only)

gamma: scale, auto (rbf, poly, and sigmoid kernels only)
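As with the logistic-regression penalties, the kernel-conditional parameters (`degree`, `gamma`) are naturally written as a list of sub-grids. A sketch using `sklearn.svm.SVC` parameter names:

```python
from math import prod

# SVM grid split by kernel so degree/gamma only appear where Table 3 allows them.
svm_grid = [
    # linear kernel: C only
    {"kernel": ["linear"], "C": [0.1, 1, 10]},
    # rbf and sigmoid kernels: C and gamma
    {"kernel": ["rbf", "sigmoid"], "C": [0.1, 1, 10], "gamma": ["scale", "auto"]},
    # poly kernel: C, gamma, and degree
    {"kernel": ["poly"], "C": [0.1, 1, 10], "gamma": ["scale", "auto"],
     "degree": [2, 3, 4]},
]

# Candidates per sub-grid: 3 + 12 + 18 = 33
svm_candidates = sum(prod(len(v) for v in d.values()) for d in svm_grid)
```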

Adaptive boosting

n_estimators: 20, 50, 100

learning_rate: 0.1, 0.2, 0.3
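The AdaBoost grid is unconditional; with `sklearn.ensemble.AdaBoostClassifier` names it is a single small dictionary:

```python
# AdaBoost grid from Table 3: 3 x 3 = 9 candidate models.
ada_grid = {
    "n_estimators": [20, 50, 100],
    "learning_rate": [0.1, 0.2, 0.3],
}
ada_candidates = len(ada_grid["n_estimators"]) * len(ada_grid["learning_rate"])
```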

Extreme gradient boosting

n_estimators: 20, 50, 100

learning_rate: 0.1, 0.2, 0.3

max_depth: 4, 6, 8

objective: binary:logistic

subsample: 0.6, 0.8, 1

colsample_bytree: 0.6, 0.8, 1
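The XGBoost parameters match the scikit-learn wrapper `xgboost.XGBClassifier`; `objective` takes a single value, so it is fixed rather than tuned. A sketch of the grid:

```python
from math import prod

# XGBoost grid from Table 3.
xgb_grid = {
    "n_estimators": [20, 50, 100],
    "learning_rate": [0.1, 0.2, 0.3],
    "max_depth": [4, 6, 8],
    "objective": ["binary:logistic"],  # fixed: binary classification objective
    "subsample": [0.6, 0.8, 1],
    "colsample_bytree": [0.6, 0.8, 1],
}

# Search size: 3 * 3 * 3 * 1 * 3 * 3 = 243 candidate boosters
xgb_candidates = prod(len(v) for v in xgb_grid.values())
```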