Table 2 Summary of data files included in the dataset, Group 2: CROWD_ANNOTATIONS.

From: A national-scale land cover reference dataset from local crowdsourcing initiatives in Indonesia

| File name | Description | Rows | Columns |
| --- | --- | --- | --- |
| crowdAnnotationsRaw.csv | LC annotations (accepting or rejecting an a priori LC label) made by the crowd using the mobile application, unprocessed. | 1,866,901 | 2 |
| crowdAnnotationsRawInfo.csv | Further information about the annotations made by the crowd, unprocessed. | 1,866,901 | 4 |
| crowdAnnotationsPerAnnotatorMajority_.csv | Annotations made by the crowd, summarized to a unique (majority) record per annotator, per image. | 928,139 | 3 |
| crowdAnnotationsConsensusPerSample_.csv | Annotations made by the crowd, processed to obtain a consensus/majority annotation per image. | 69,800 | 4 |
| crowdAnnotators_expertAgreement_.csv | Agreement of the crowd annotators with the expert annotation on the in-app control images, summarized per annotator, per pile. | 872 | 6 |
| crowdAnnotators_intraAnnotatorAgreement_.csv | Intra-annotator agreement of the crowd annotators, summarized per annotator, per pile. | 498 | 6 |
| crowdAnnotators_interAnnotatorAgreement_.csv | Inter-annotator agreement of the crowd annotators, summarized per annotator, per pile. | 896 | 7 |
| crowdAnnotatorsSummaryScorePerSamplePerLabel_.csv | Summary of the crowd annotations and their credibility scores, per sample, per answer. | 83,943 | 11 |
| crowdAnnotatorsSummaryScorePerSamplePerLabel_annotatorsFiltered_.csv | Summary of the crowd annotations and their credibility scores, per sample, per answer, with annotations from low-performing annotators excluded. | 76,514 | 11 |

  1. The files with the suffix "_" in their names contain processed annotation data; see the header information for these tables in Supplementary File 1.
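
As a quick sanity check after downloading the dataset, the file shapes reported in Table 2 can be verified programmatically. The Python sketch below is illustrative only: the directory name CROWD_ANNOTATIONS is a hypothetical local path, and it assumes each CSV carries a single header row (the actual column headers are documented in Supplementary File 1).

```python
# Sketch: compare each Group 2 file's shape against the counts in Table 2.
# Assumes the files sit in a local directory named "CROWD_ANNOTATIONS"
# (hypothetical path) and that each CSV has exactly one header row.
from pathlib import Path

import pandas as pd

DATA_DIR = Path("CROWD_ANNOTATIONS")  # adjust to the actual download location

# (file name, expected rows, expected columns), taken from Table 2
EXPECTED = [
    ("crowdAnnotationsRaw.csv", 1_866_901, 2),
    ("crowdAnnotationsRawInfo.csv", 1_866_901, 4),
    ("crowdAnnotationsPerAnnotatorMajority_.csv", 928_139, 3),
    ("crowdAnnotationsConsensusPerSample_.csv", 69_800, 4),
    ("crowdAnnotators_expertAgreement_.csv", 872, 6),
    ("crowdAnnotators_intraAnnotatorAgreement_.csv", 498, 6),
    ("crowdAnnotators_interAnnotatorAgreement_.csv", 896, 7),
    ("crowdAnnotatorsSummaryScorePerSamplePerLabel_.csv", 83_943, 11),
    ("crowdAnnotatorsSummaryScorePerSamplePerLabel_annotatorsFiltered_.csv", 76_514, 11),
]

for name, n_rows, n_cols in EXPECTED:
    df = pd.read_csv(DATA_DIR / name)
    status = "OK" if df.shape == (n_rows, n_cols) else f"MISMATCH {df.shape}"
    print(f"{name}: {status}")
```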