Table 1 Examples of hand gesture datasets.

From: A richly annotated dataset of co-speech hand gestures across diverse speaker contexts

| Dataset | Medium | Description |
|---|---|---|
| HANDS69 | video frames | 29 static gestures, primarily emblems, produced by 5 people |
| 3DIG70 | video and motion capture | 1739 dynamic iconic gestures referring to 20 objects, produced by 30 people |
| BEAT71 | motion capture | number of gestures not reported; 76-hour 3D motion capture produced by 30 people; semantic relevancy and emotion category annotations |
| NEMO72 | video and motion capture | 3715 dynamic gestures; 2D & 3D motion capture produced by 428 people; iconic gestures referring to 35 objects |
| M3D-TED73 | annotations | 1139 gesture strokes produced by 5 English speakers; 23 minutes of annotated referential and non-referential gestures |
| EcoLang74 | video and annotations | naturalistic gestures during adult-child and adult-adult interaction, produced by 78 dyads |
| Talking With Hands Dataset75 | motion capture | number of gestures not reported; 50-hour 3D motion capture produced by 50 people; collaborative gestures captured during conversation |
| EGGNOG76 | video and motion capture | 20 dynamic gestures produced by 40 people; 8 hours of video; gestures during a collaborative object-oriented task |
| Speaker-Specific Gesture Dataset77 | video and motion capture | dynamic gestures; 144-hour 2D motion capture produced by 10 people (lecturers, TV show hosts, televangelists) |
| SaGA78 | video | 1764 gestures with detailed gesture annotations, produced in 25 dialogs in a direction-giving task; iconic and deictic gestures |
| Dutch dyadic dataset79 | video and annotations | 439 gestures produced by 34 dyads in Dutch; naturalistic gestures in conversation |

  1. For each dataset, we list its medium (video, motion capture) and briefly describe its main properties, including the type and number of gestures included.