Table 1 Results on the MIMIC-IV dataset (test set), trained with the weighted BCE loss and a batch size of 20. The fusion models use the MAGBERT mechanism. The table also reports, for each setting, the average AUROC over its models (\(\overline{\text{AUROC}}\)) along with its confidence interval (CI).
| Data and Settings | Models | AUPR | AUROC | \(\overline{\text{AUROC}}\) | CI |
|---|---|---|---|---|---|
| Gentamicin, \(T = 4\), \(dt = 1\), learning rate \(1e-4\) | Line | 0.0886 | 0.5027 | 0.5980 | (0.5439, 0.6521) |
| | LSTM | 0.1136 | 0.5852 | | |
| | Star | 0.1274 | 0.5893 | | |
| | Encoder | 0.1151 | 0.5547 | | |
| | BERT | 0.1644 | 0.5807 | | |
| | BertLstm | 0.1445 | 0.6077 | | |
| | BertStar | 0.1631 | 0.6874 | | |
| | BertEncoder | 0.1249 | 0.6224 | | |
| | LstmBert | 0.1422 | 0.6156 | | |
| | StarBert | 0.1379 | 0.6495 | | |
| | EncoderBert | 0.1364 | 0.5824 | | |
| Gentamicin, \(T = 3\), \(dt = 1\), learning rate \(5e-5\) | LSTM | 0.0935 | 0.4659 | 0.5388 | (0.4818, 0.5958) |
| | Star | 0.1041 | 0.5173 | | |
| | BERT | 0.1513 | 0.4927 | | |
| | BertLstm | 0.0935 | 0.4659 | | |
| | BertStar | 0.1271 | 0.6103 | | |
| | LstmBert | 0.1395 | 0.5770 | | |
| | StarBert | 0.1223 | 0.5883 | | |
| | BertEncoder | 0.1474 | 0.6145 | | |
| | EncoderBert | 0.1041 | 0.5173 | | |
| P. aeruginosa, \(T = 3\), \(dt = 1\), learning rate \(1e-5\) | LSTM | 0.2166 | 0.5000 | 0.5775 | (0.5500, 0.6050) |
| | BERT | 0.2120 | 0.6000 | | |
| | BertLstm | 0.2405 | 0.5800 | | |
| | BertStar | 0.2407 | 0.6300 | | |
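
As a point of reference for the quantities reported in Table 1, the sketch below computes AUPR and AUROC for a single model and the mean AUROC over the models of one setting block. It assumes scikit-learn and SciPy; the helper names (`aupr_auroc`, `mean_auroc_ci`) are illustrative, and the Student-t interval is an assumption, since the table does not state how its CI was derived.

```python
# Minimal sketch (not the paper's implementation): AUPR, AUROC, and a
# mean AUROC with a confidence interval over the models of one block.
# Assumes scikit-learn and SciPy; the t-interval is an assumption, as the
# table does not specify how its CI was computed.
import numpy as np
from scipy import stats
from sklearn.metrics import average_precision_score, roc_auc_score


def aupr_auroc(y_true, y_score):
    """AUPR and AUROC for one model's test-set probabilities."""
    return average_precision_score(y_true, y_score), roc_auc_score(y_true, y_score)


def mean_auroc_ci(aurocs, alpha=0.05):
    """Mean AUROC across the models of a block, with a (1 - alpha) t-interval."""
    aurocs = np.asarray(aurocs, dtype=float)
    mean = aurocs.mean()
    ci = stats.t.interval(1 - alpha, df=len(aurocs) - 1,
                          loc=mean, scale=stats.sem(aurocs))
    return mean, ci


# Example: AUROC column of the Gentamicin T=4 block from Table 1.
block_aurocs = [0.5027, 0.5852, 0.5893, 0.5547, 0.5807, 0.6077,
                0.6874, 0.6224, 0.6156, 0.6495, 0.5824]
print(mean_auroc_ci(block_aurocs))  # mean ~= 0.598; the interval depends on the CI method
```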