Table 30 Distillation quality scores, with F1 score as the performance metric.

From: Distilling knowledge from graph neural networks trained on cell graphs to non-neural student models

| Model | Number of Parameters | Best Performance (F1) | DQ Score |
|---|---|---|---|
| ExtraTrees trained on logits | 409.975 | 0.89539 | 0.0084 |
| XGBoost trained on logits | 462.97 | 0.8712 | 0.022 |
| Random trained on logits | 1331.635 | 0.8806 | 0.0248 |
| HistGrad trained on logits | 679.8 | 0.8666 | 0.02676 |
| LightGBM trained on logits | 692.15 | 0.8794 | 0.0197 |
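As a quick way to compare the student models above, the table rows can be ranked programmatically. The sketch below copies the values from Table 30 and sorts them by F1 and by DQ score; the reading that a *lower* DQ score indicates a higher-fidelity distillation is an assumption (consistent with ExtraTrees leading on both columns), not a definition from the paper.

```python
# Rows copied from Table 30: (model, n_params, best_F1, DQ_score).
rows = [
    ("ExtraTrees trained on logits", 409.975, 0.89539, 0.0084),
    ("XGBoost trained on logits", 462.97, 0.8712, 0.022),
    ("Random trained on logits", 1331.635, 0.8806, 0.0248),
    ("HistGrad trained on logits", 679.8, 0.8666, 0.02676),
    ("LightGBM trained on logits", 692.15, 0.8794, 0.0197),
]

# Rank by F1 (higher is better) and, separately, by DQ score
# (assumed here: lower means a closer match to the GNN teacher).
by_f1 = sorted(rows, key=lambda r: r[2], reverse=True)
by_dq = sorted(rows, key=lambda r: r[3])

print("Best F1:", by_f1[0][0])
print("Lowest DQ score:", by_dq[0][0])
```

Under that assumption, the ExtraTrees student is the strongest on both criteria while also using the fewest parameters.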