Table 2 Ranking and characteristics of all algorithms evaluated in Task 2

From: Towards fair decentralized benchmarking of healthcare AI algorithms with the Federated Tumor Segmentation (FeTS) challenge

| Model ID | Rank | Architecture | Loss | Post-processing | Ensembling | nnU-Net |
|---|---|---|---|---|---|---|
| 15 | 1 | U-Net, larger encoder | CE, batch Dice, region-based | ET (small to NCR) | 10 | Yes |
| 35 | 2 | U-Net, larger encoder, multi-scale skip block | Focal loss, Jaccard, region-based | - | 30 | No |
| 37 | 3 | U-Net | CE, Dice, Top-K, region-based | - | 5 | Yes |
| 38 | 4 | U-Net, residual blocks, transformer in bottleneck | CE, Dice | ET (small to NCR) | 3 | Yes + other |
| 16 | 5 | U-Net | CE, Dice | ET (drop disconnected), TC (fill surrounded), WT (drop small components) | 5 | Yes |
| 14 | 6 | U-Net, larger encoder | CE, batch Dice, region-based | ET (small to NCR) | 5 | No |
| 11 | 7 | U-Net | CE, Dice | TC (fill surrounded) | 5 | Yes |
| 54 | 8 | CoTr, HR-Net, U-Net, U-Net++ | CE, Dice, Hausdorff, region-based | ET (small to NCR) | 5 | Yes + other |
| 10 | 9 | U-Net | CE, Dice, region-based | ET (small to NCR) | 5 | Yes |
| 31 | 10 | U-Net, larger encoder, residual blocks | Dice, focal loss | ET (small to NCR) | 5 | No |
| 51 | 11 | HNF-Net | CE, generalized Dice, region-based | ET (small to NCR) | 5 | No |
| 33 | 12 | U-Net, multiple encoders | CE, Dice, region-based | ET (small to NCR) | 4 | No |
| 46 | 13 | U-Net | CE, Dice, generalized Wasserstein Dice | - | 8 | No |
| 40 | 14 | U-Net, larger encoder, residual blocks | Dice, region-based | ET (small to NCR) | 4 | No |
| 27 | 15 | U-Net, modality co-attention, multi-scale skip block, transformer in bottleneck | CE, region-based | ET (drop small components) | - | No |
| 44 | 16 | U-Net | CE, Dice, region-based | ET (convert to NCR based on auxiliary network), drop small components | 10 | Yes + other |
| 19 | 17 | U-Net | CE, Dice, batch Dice, region-based | ET (small to NCR) | 15 | Yes + other |
| 32 | 18 | U-Net | Batch Dice, region-based | ET (small to neighboring label), drop small components | 5 | No |
| 42 | 19 | - | - | - | - | - |
| 18 | 20 | HarDNet | CE, Dice, focal loss, region-based | - | 3 | No |
| 48 | 21 | U-Net, attention | Dice, region-based | - | 1 | No |
| 25 | 22 | U-Net, attention | CE, Dice, region-based | - | 1 | No |
| 13 | 23 | - | - | - | - | - |
| 26 | 24 | U-Net, multiple decoders | CE, Dice, region-based | TC (remove outside of WT), drop small components, morph. closing | 1 | No |
| 30 | 25 | 2-stage, 2D, CNN, U-Net, U-Net++, residual blocks | Dice | - | 29 | No |
| 41 | 26 | CNN, neural architecture search | CE, Dice, region-based | - | 5 | No |
| 8 | 27 | Swin Transformer | CE, Dice, VAT, region-based | - | 1 | No |
| 12 | 28 | U-Net | Dice, region-based | - | 1 | No |
| 47 | 29 | U-Net | CE, Dice | - | 1 | No |
| 22 | 30 | 2D, U-Net, attention, residual blocks | CE, Dice | - | - | No |
| 45 | 31 | 2-stage, U-Net, residual blocks | CE, Dice, region-based | ET (small to NCR) | 5 | No |
| 52 | 32 | U-Net, attention, residual blocks | Dice, region-based | - | 5 | No |
| 36 | 33 | 2D, U-Net, residual encoder | Dice | - | 1 | No |
| 23 | 34 | 2D, U-Net, residual encoder, transformer | CE, Dice, region-based | - | 1 | No |
| 39 | 35 | 2-stage, U-Net | - | - | 1 | No |
| 43 | 36 | U-Net, multi-stage | BCE | fill holes | 1 | No |
| 21 | 37 | 2D, U-Net++ | Dice, boundary distance | - | 3 | No |
| 28 | 38 | 2-stage, CNN, Graph NN | CE | - | 1 | No |
| 53 | 39 | CNN, larger encoder, residual blocks | Dice, boundary, region-based | ET (small to NCR) | 1 | No |
| 29 | 40 | 2D, U-Net | Dice | - | 1 | No |
| 24 | 41 | - | - | - | - | - |

1. Four institutions were not used for ranking, as many models could not be evaluated on them due to technical problems. Brief explanations of the algorithm characteristics are provided in the participants' methods section. '-' denotes that nothing was reported for this field. CNN, convolutional neural network; (B)CE, (binary) cross-entropy; VAT, virtual adversarial training.
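Most entries in the Loss column pair cross-entropy with an overlap term such as Dice ('CE, Dice'). As a minimal NumPy sketch of what such a compound loss computes (illustrative only; function names are hypothetical and none of the ranked submissions is reproduced here, assuming a single binary class and flattened probability maps):

```python
import numpy as np

def soft_dice(probs, targets, eps=1e-6):
    """Soft Dice coefficient between a probability map and a binary target."""
    intersection = np.sum(probs * targets)
    return (2.0 * intersection + eps) / (np.sum(probs) + np.sum(targets) + eps)

def ce_dice_loss(probs, targets, eps=1e-6):
    """Compound loss: binary cross-entropy plus (1 - soft Dice).

    CE penalizes per-voxel errors; the Dice term rewards global overlap,
    which matters for small structures such as the enhancing tumor (ET).
    """
    probs = np.clip(probs, eps, 1.0 - eps)  # avoid log(0)
    ce = -np.mean(targets * np.log(probs) + (1 - targets) * np.log(1 - probs))
    return ce + (1.0 - soft_dice(probs, targets, eps))
```

The 'region-based' variants in the table apply the same idea to the composite tumor regions (WT, TC, ET) rather than to the raw labels; the structure of the loss is unchanged.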