Table 3 Eye gaze data.
Reference | Focus | N participants | Age | Input data/device used | Method | Dataset |
---|---|---|---|---|---|---|
Pierce et al.65 | Biomarker detection | 444 subjects from 6 distinct groups | | Eye tracking data, Tobii T120 eye tracker | | Own dataset |
Murias et al.66 | Biomarker detection | 25 ASD | 24–72 months | Eye tracking data, Tobii TX300 eye tracker | | Own dataset |
Chawarska et al.67 | Eye movement to determine prodromal symptoms of ASD | 84 ASD | 6 months | Gaze trajectories, SensoMotoric Instruments iView X RED eye-tracking system | | Own dataset |
Shi et al.68 | Visual stimuli design consideration | 13 ASD, 20 TD | 4–6 years | Infra-red eye-tracking recording, EyeLink 1000 | | Own dataset |
Shic et al.69 | Visual attention preference | 28 ASD, 16 DD, 34 TD | 20 months | Gaze patterns, SMI iView X RED dark-pupil 60 Hz eye-tracking system | | Own dataset |
Liu et al.73 | Eye movement for diagnosis | 29 ASD, 58 TD | 4–11 years | Gaze data, Tobii T60 eye tracker | Machine learning, k-nearest neighbours (kNN); see the kNN sketch after this table | Own dataset |
Tung et al.61 | Eye detection | 33 ASD | | Video, camera | | Own dataset |
Balestra et al.62 | Eye tracking to study language impairments and text comprehension and production deficits | 1 ASD | 25 years | Eye tracking data, Tobii 1750 eye tracker | | n/a |
Li et al.63 | Identification of fixations and saccades | 38 ASD, 179 TD | | Eye-tracking data | Modified DBSCAN algorithm; see the fixation-clustering sketch after this table | Own dataset |
Matthews et al.64 | Eye gaze analysis for affective state recognition | 19 ASD, 19 TD | ASD: 41.05 ± 32.15; TD: 32.15 ± 9.93 (in years) | Video, Gazepoint GP3 eye tracker | Scanpath trend analysis, arousal sensing and detection of focal attention | n/a |
Campbell et al.70 | Gaze pattern for saliency analysis | 15 ASD, 13 TD | 8–43 months | Gaze trajectories, SensoMotoric Instruments iView X RED eye-tracking system | Bayesian model | n/a |
Syeda et al.71 | Eye gaze for visual face scanning and emotion analysis | 21 ASD, 21 TD | 5–17 years | Gaze data, Tobii EyeX controller | | Own dataset |
Chrysouli et al.72 | Eye gaze analysis for affective state recognition | | | Video, Kinect camera | Deep learning, two-stream CNN | MaTHiSis |
Liu et al.74 | Eye movement for diagnosis | Children: 20 ASD, 21 TD; adults: 19 ASD, 22 intellectually disabled (ID), 28 TD | Children: ASD 7.85 ± 1.59, TD 7.73 ± 1.51; adults: ASD 20.84 ± 3.27, ID 23.59 ± 3.08, TD 20.61 ± 2.90 (in years) | Eye tracking data, Tobii T60 eye tracker | Bag-of-Words (BOW) framework and SVM | |
Vu et al.75 | Gaze pattern for diagnosis | 16 ASD, 16 TD | 2–10 years | Gaze data, Tobii EyeX controller | Machine learning, similarity matching + kNN | Own dataset |
Jiang and Zhao76 | Visual attention preference for diagnosis | 20 ASD, 19 TD | ASD: 30.8 ± 11.1; TD: 32 ± 10.4 (in years) | Eye tracking data | Deep learning | |
Higuchi et al.77 | Gaze direction for behaviour analysis | 2 ASD, 2 TD | | Video, camera | OpenFace toolkit | Own dataset |
Chong et al.78 | Eye contact detection for behaviour analysis | 50 ASD, 50 TD | | Video | Deep learning | Own dataset (subset from MMDB) |
Toshniwal et al.79 | Attention recognition for assistive technology | 10 ASD, 8 NDD | 12–18 years | Video, mobile phone | Android Face Detection API | Own dataset |
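Several of the entries above operate on raw gaze samples rather than pre-segmented fixations; Li et al.63 identify fixations by density-based clustering. The sketch below illustrates that idea with the stock scikit-learn DBSCAN. It is a minimal stand-in, not the authors' method: Li et al. describe a modified DBSCAN, and the `eps_px`/`min_samples` values and purely spatial clustering here are illustrative assumptions.

```python
"""Fixation identification from raw gaze samples via DBSCAN (illustrative sketch)."""
import numpy as np
from sklearn.cluster import DBSCAN

def find_fixations(gaze_xy, timestamps, eps_px=35.0, min_samples=5):
    """Cluster gaze samples into fixations.

    gaze_xy    : (N, 2) array of screen coordinates in pixels
    timestamps : (N,) array of sample times in seconds
    Returns a list of (centroid_x, centroid_y, duration_s), one per fixation.
    Note: stock DBSCAN ignores temporal order, one of the issues the
    modified algorithm in ref. 63 addresses.
    """
    labels = DBSCAN(eps=eps_px, min_samples=min_samples).fit_predict(gaze_xy)
    fixations = []
    for label in sorted(set(labels) - {-1}):      # label -1 marks noise/saccade samples
        mask = labels == label
        cx, cy = gaze_xy[mask].mean(axis=0)       # fixation centroid
        duration = timestamps[mask].max() - timestamps[mask].min()
        fixations.append((cx, cy, duration))
    return fixations

# Toy usage: two synthetic fixation clusters sampled at 120 Hz (as on a Tobii T120).
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal((300, 300), 8, (60, 2)),
                 rng.normal((800, 500), 8, (60, 2))])
t = np.arange(len(pts)) / 120.0
print(find_fixations(pts, t))                     # two fixations of ~0.5 s each
```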
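Refs. 73 and 75 classify participants with k-nearest neighbours over engineered gaze representations. The sketch below shows the overall shape of such a pipeline under stated assumptions: the normalised 2-D histogram of gaze positions used as the feature vector, the 8 × 8 grid, and k = 3 are all illustrative choices, since both papers design their own gaze features.

```python
"""Gaze-based group classification with k-nearest neighbours (illustrative sketch)."""
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def gaze_histogram(gaze_xy, screen=(1280, 1024), grid=(8, 8)):
    """Represent one recording as a normalised 2-D histogram of gaze positions."""
    hist, _, _ = np.histogram2d(
        gaze_xy[:, 0], gaze_xy[:, 1],
        bins=grid, range=[[0, screen[0]], [0, screen[1]]])
    return (hist / hist.sum()).ravel()            # flatten to one feature vector

# Toy data: one feature vector per participant; label 1 = ASD, 0 = TD.
# Real pipelines would extract these from recorded trials, not random points.
rng = np.random.default_rng(1)
X = np.vstack([gaze_histogram(rng.uniform((0, 0), (1280, 1024), (500, 2)))
               for _ in range(40)])
y = np.repeat([0, 1], 20)

clf = KNeighborsClassifier(n_neighbors=3)
print(cross_val_score(clf, X, y, cv=5).mean())    # chance-level on random data
```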