Table 3 Benchmark results on each dataset of MedMNIST2D, in terms of AUC and ACC. Input resolution for the ResNet baselines is given in parentheses (28×28 or 224×224); bracketed numbers are the article's reference citations.

From: MedMNIST v2 - A large-scale lightweight benchmark for 2D and 3D biomedical image classification

| Methods | PathMNIST AUC | PathMNIST ACC | ChestMNIST AUC | ChestMNIST ACC | DermaMNIST AUC | DermaMNIST ACC | OCTMNIST AUC | OCTMNIST ACC | PneumoniaMNIST AUC | PneumoniaMNIST ACC | RetinaMNIST AUC | RetinaMNIST ACC |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ResNet-18 (28) [10] | 0.983 | 0.907 | 0.768 | 0.947 | 0.917 | 0.735 | 0.943 | 0.743 | 0.944 | 0.854 | 0.717 | 0.524 |
| ResNet-18 (224) [10] | 0.989 | 0.909 | 0.773 | 0.947 | 0.920 | 0.754 | 0.958 | 0.763 | 0.956 | 0.864 | 0.710 | 0.493 |
| ResNet-50 (28) [10] | 0.990 | 0.911 | 0.769 | 0.947 | 0.913 | 0.735 | 0.952 | 0.762 | 0.948 | 0.854 | 0.726 | 0.528 |
| ResNet-50 (224) [10] | 0.989 | 0.892 | 0.773 | 0.948 | 0.912 | 0.731 | 0.958 | 0.776 | 0.962 | 0.884 | 0.716 | 0.511 |
| auto-sklearn [11] | 0.934 | 0.716 | 0.649 | 0.779 | 0.902 | 0.719 | 0.887 | 0.601 | 0.942 | 0.855 | 0.690 | 0.515 |
| AutoKeras [12] | 0.959 | 0.834 | 0.742 | 0.937 | 0.915 | 0.749 | 0.955 | 0.763 | 0.947 | 0.878 | 0.719 | 0.503 |
| Google AutoML Vision | 0.944 | 0.728 | 0.778 | 0.948 | 0.914 | 0.768 | 0.963 | 0.771 | 0.991 | 0.946 | 0.750 | 0.531 |

| Methods | BreastMNIST AUC | BreastMNIST ACC | BloodMNIST AUC | BloodMNIST ACC | TissueMNIST AUC | TissueMNIST ACC | OrganAMNIST AUC | OrganAMNIST ACC | OrganCMNIST AUC | OrganCMNIST ACC | OrganSMNIST AUC | OrganSMNIST ACC |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ResNet-18 (28) [10] | 0.901 | 0.863 | 0.998 | 0.958 | 0.930 | 0.676 | 0.997 | 0.935 | 0.992 | 0.900 | 0.972 | 0.782 |
| ResNet-18 (224) [10] | 0.891 | 0.833 | 0.998 | 0.963 | 0.933 | 0.681 | 0.998 | 0.951 | 0.994 | 0.920 | 0.974 | 0.778 |
| ResNet-50 (28) [10] | 0.857 | 0.812 | 0.997 | 0.956 | 0.931 | 0.680 | 0.997 | 0.935 | 0.992 | 0.905 | 0.972 | 0.770 |
| ResNet-50 (224) [10] | 0.866 | 0.842 | 0.997 | 0.950 | 0.932 | 0.680 | 0.998 | 0.947 | 0.993 | 0.911 | 0.975 | 0.785 |
| auto-sklearn [11] | 0.836 | 0.803 | 0.984 | 0.878 | 0.828 | 0.532 | 0.963 | 0.762 | 0.976 | 0.829 | 0.945 | 0.672 |
| AutoKeras [12] | 0.871 | 0.831 | 0.998 | 0.961 | 0.941 | 0.703 | 0.994 | 0.905 | 0.990 | 0.879 | 0.974 | 0.813 |
| Google AutoML Vision | 0.919 | 0.861 | 0.998 | 0.966 | 0.924 | 0.673 | 0.990 | 0.886 | 0.988 | 0.877 | 0.964 | 0.749 |
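
For context on how the table's two metrics are computed: ACC is top-1 accuracy, and AUC is the area under the ROC curve, averaged over classes for the multi-class datasets. The sketch below is a minimal illustration using scikit-learn, assuming `y_true` (integer labels) and `y_score` (per-class probabilities) come from a trained classifier on a dataset's test split; the random arrays are hypothetical stand-ins, and the official `medmnist` package's `Evaluator` remains the authoritative implementation.

```python
# Minimal metric sketch for the table above; NumPy and scikit-learn only.
# y_true / y_score are random stand-ins for real model outputs.
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)
n_samples, n_classes = 1000, 9  # e.g. PathMNIST has 9 classes

y_true = rng.integers(0, n_classes, size=n_samples)  # ground-truth labels
logits = rng.normal(size=(n_samples, n_classes))
y_score = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # softmax

# AUC: one-vs-rest ROC AUC, macro-averaged over the classes.
auc = roc_auc_score(y_true, y_score, multi_class="ovr", average="macro")
# ACC: fraction of samples whose highest-scoring class matches the label.
acc = accuracy_score(y_true, y_score.argmax(axis=1))

print(f"AUC = {auc:.3f}, ACC = {acc:.3f}")  # ~0.5 / ~0.11 for random scores
```

For the multi-label ChestMNIST column, AUC and ACC are instead computed per binary label and averaged; the one-vs-rest averaging shown above applies only to the multi-class datasets.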