Table 5 Results of the inter-annotator agreement evaluation. The Cohen's Kappa and Macro F1 values shown are averaged over all pairwise annotator comparisons.

From: Visual WetlandBirds Dataset: Bird Species Identification and Behavior Recognition in Videos

Metric          Score
Cohen's Kappa   0.858
Fleiss' Kappa   0.855
Macro F1        0.946
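The caption states that the Cohen's Kappa score is the average of all pairwise annotator agreements. A minimal sketch of that pairwise averaging in plain Python follows; the annotator label sequences (`ann1`–`ann3`) are invented for illustration and are not the dataset's actual annotations:

```python
from itertools import combinations

def cohens_kappa(a, b):
    """Cohen's kappa between two annotators' label sequences."""
    n = len(a)
    labels = set(a) | set(b)
    po = sum(x == y for x, y in zip(a, b)) / n                      # observed agreement
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)   # chance agreement
    return 1.0 if pe == 1 else (po - pe) / (1 - pe)

def mean_pairwise_kappa(annotations):
    """Average Cohen's kappa over every pair of annotators."""
    pairs = list(combinations(annotations, 2))
    return sum(cohens_kappa(a, b) for a, b in pairs) / len(pairs)

# Hypothetical behavior labels from three annotators for four video clips
ann1 = ["feeding", "flying", "flying", "resting"]
ann2 = ["feeding", "flying", "resting", "resting"]
ann3 = ["feeding", "resting", "flying", "resting"]

print(round(mean_pairwise_kappa([ann1, ann2, ann3]), 3))  # → 0.491
```

Fleiss' Kappa, by contrast, is computed once over all annotators jointly rather than averaged pairwise, which is why it appears as a separate row in the table.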