Fig. 2: The approach to establish the effects of ocular speech tracking.
From: Eye movements track prioritized auditory features in selective attention to natural speech

a Example trials of two participants (S5, S27) show how their measured gaze (green) on the horizontal and vertical planes follows envelope (grey) fluctuations (data were rescaled to 0–1 for illustration). b A regularized linear regression approach, temporal response functions (TRFs), was used to predict how features of speech are tracked by eye movements. The difference in prediction accuracy between a control model (C) and combined models that additionally contained the speech envelope (CE) or acoustic onsets (CO) was used to estimate the ocular speech tracking attributable solely to the acoustic features of interest, i.e. the speech envelope and acoustic onsets. Prediction accuracies were calculated as Fisher z-transformed Spearman's rank correlations (z') between measured (mr) and predicted (pr) eye movements. c We expected ocular speech tracking to be modulated by task-induced selective attention: the tracking difference between combined and control models (i.e. pure speech tracking) was expected to be higher whenever sentences were the target, in both the single-speaker and multi-speaker conditions. For statistical inference, we used Bayesian multilevel regression models and illustrate the posterior distributions.
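The model-comparison logic of panel b can be sketched in a few lines: fit a regularized (ridge) regression on time-lagged regressors for the control model (C) and a combined model (CE), score each on held-out data with a Fisher z-transformed Spearman correlation, and take the difference as the tracking estimate. This is a minimal illustration on simulated signals, not the authors' pipeline; the lag window, regularization strength, regressor names, and the simulated gaze signal are all hypothetical stand-ins.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

def lagged_design(feature, lags):
    """Build a time-lagged design matrix (one column per lag)."""
    n = len(feature)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        X[lag:, j] = feature[: n - lag]
    return X

def ridge_fit_predict(X_train, y_train, X_test, alpha=1.0):
    """Regularized linear regression (the core of a TRF)."""
    XtX = X_train.T @ X_train + alpha * np.eye(X_train.shape[1])
    w = np.linalg.solve(XtX, X_train.T @ y_train)
    return X_test @ w

def fisher_z_spearman(measured, predicted):
    """Fisher z-transformed Spearman correlation (prediction accuracy z')."""
    rho, _ = spearmanr(measured, predicted)
    return np.arctanh(rho)

# Simulated data: gaze depends on a delayed envelope plus a control regressor
n = 2000
lags = range(0, 25)                   # hypothetical lag window (samples)
envelope = rng.standard_normal(n)
control = rng.standard_normal(n)      # stand-in control-model regressor
gaze = 0.5 * np.roll(envelope, 5) + 0.3 * control + rng.standard_normal(n)

X_c = lagged_design(control, lags)                      # control model C
X_ce = np.hstack([X_c, lagged_design(envelope, lags)])  # combined model CE

# Fit on the first half, evaluate prediction accuracy on the second half
half = n // 2
z_c = fisher_z_spearman(gaze[half:],
                        ridge_fit_predict(X_c[:half], gaze[:half], X_c[half:]))
z_ce = fisher_z_spearman(gaze[half:],
                         ridge_fit_predict(X_ce[:half], gaze[:half], X_ce[half:]))

tracking = z_ce - z_c  # tracking attributable to the envelope alone
```

Because the simulated gaze genuinely contains a delayed copy of the envelope, the combined model predicts held-out gaze better than the control model, so `tracking` comes out positive; with a real null feature the difference would hover around zero.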