
Fig. 6

From: A national-scale land cover reference dataset from local crowdsourcing initiatives in Indonesia


Locations, annotations, and annotation counts of the crowd-annotated LC reference data. (a) All samples (i.e., the crowd consensus/majority answer can be “Yes” or “No”), coloured by the LC label that annotators were asked to accept or reject in the annotation task (simplified LC legend). (b) Samples with crowd consensus/majority answer “Yes” (i.e., confirming the prior LC label) and at least two annotations by unique annotators. (c) Number of annotations by unique annotators (excluding control items). Note for all panels (a–c): (i) if an annotator made multiple annotations for a sample (item, VHR image chip), that annotator’s majority annotation for the sample was used; (ii) annotations from annotators with expert-agreement scores worse than chance were filtered out.
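The two aggregation rules in the caption, per-annotator majority reduction followed by filtering out annotators worse than chance, can be illustrated with a short sketch. This is a minimal, hypothetical implementation: the function name, the tuple-based data layout, and the 0.5 chance level are assumptions for illustration, not the authors' actual pipeline.

```python
from collections import Counter, defaultdict

def aggregate_crowd_answers(annotations, expert_scores, chance_level=0.5):
    """Aggregate yes/no crowd annotations per sample (sketch, not the
    paper's code).

    annotations: list of (annotator_id, sample_id, answer) tuples,
                 with answer in {"Yes", "No"}.
    expert_scores: dict mapping annotator_id -> expert-agreement score.
    """
    # (ii) Drop annotators whose expert-agreement score is worse than chance.
    kept = [a for a in annotations
            if expert_scores.get(a[0], 0.0) >= chance_level]

    # (i) If an annotator labelled the same sample more than once,
    # keep that annotator's majority answer for the sample.
    per_annotator = defaultdict(list)
    for annotator, sample, answer in kept:
        per_annotator[(annotator, sample)].append(answer)
    reduced = {key: Counter(answers).most_common(1)[0][0]
               for key, answers in per_annotator.items()}

    # Crowd consensus: majority answer across unique annotators per sample,
    # plus the number of unique annotators (as plotted in panel c).
    per_sample = defaultdict(list)
    for (annotator, sample), answer in reduced.items():
        per_sample[sample].append(answer)
    consensus = {sample: Counter(answers).most_common(1)[0][0]
                 for sample, answers in per_sample.items()}
    n_annotations = {sample: len(answers)
                     for sample, answers in per_sample.items()}
    return consensus, n_annotations
```

Under this reading, panel (b) would correspond to samples where the consensus is “Yes” and n_annotations is at least two.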
