Abstract
We present a comprehensive dataset collected from 251 participants, comprising diverse features that characterize eye-gaze dynamics. In each session, participants performed several tasks typical of eye-tracking research: vanishing saccade, cued saccade, flickering cross, rotating ball, and free viewing. By covering these experimental paradigms, the dataset enables the analysis of human visual phenomena such as oculomotor control and perceptual processing. The features comprise timestamped gaze coordinates, pupil sizes, and event classifications labeled as fixations, saccades, or blinks. All data were recorded with the EyeLink Portable Duo eye tracker at 1000 Hz and processed from raw EyeLink Data Format (EDF) files into structured files via an automated pipeline. The dataset was anonymized in accordance with the ethical standards of the Norwegian Agency for Shared Services in Education and Research (SIKT). Possible applications include the study of visual attention, cognitive science, and assistive technology.
Data availability
The EDF and ASC data files for each participant are available in the Figshare repository (ref. 36), within the archive eyeGaze_dataSet_OslometEyetracking_Lab.zip.
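As an illustration only, the following minimal Python sketch unpacks the archive and enumerates the plain-text ASC exports. Only the archive name comes from the statement above; the extraction directory and internal folder layout are assumptions.

# Minimal sketch: unpack the Figshare archive and list recordings.
# Only the archive name comes from the data availability statement;
# the extraction directory and internal layout are assumptions.
import zipfile
from pathlib import Path

archive = Path("eyeGaze_dataSet_OslometEyetracking_Lab.zip")
target = Path("eye_gaze_data")

with zipfile.ZipFile(archive) as zf:
    zf.extractall(target)

# EDF is EyeLink's raw binary format; ASC is its plain-text export,
# readable without SR Research tooling.
for asc in sorted(target.rglob("*.asc")):
    print(asc.relative_to(target))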
Code availability
The Python library used to convert the raw EDF recordings into labeled CSV files for the various tasks is provided in the same Figshare repository (ref. 36), under the directory EyeGazeDynamicsPipeline.
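For orientation, a minimal sketch of loading one of the resulting labeled CSV files with pandas is given below. The file name and column names (timestamp, x, y, pupil_size, event) are hypothetical stand-ins for the features described in the abstract, not the pipeline's documented schema; consult EyeGazeDynamicsPipeline for the actual output format.

# Minimal sketch: read one participant's labeled gaze CSV (1000 Hz).
# File name and column names (timestamp, x, y, pupil_size, event) are
# hypothetical; see EyeGazeDynamicsPipeline for the actual schema.
import pandas as pd

df = pd.read_csv("participant_001_cued_saccade.csv")

# Blinks carry no usable gaze coordinates, so drop them before analysis.
valid = df[df["event"] != "blink"]

fixations = valid[valid["event"] == "fixation"]
print(f"{len(fixations)} fixation samples, "
      f"mean pupil size {fixations['pupil_size'].mean():.2f}")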
References
Buswell, G. How People Look at Pictures: A Study of the Psychology of Perception in Art (University of Chicago Press, 1935).
Yarbus, A. L. Eye Movements and Vision (Plenum Press, 1967). Originally published in Russian in 1965.
Holmqvist, K. et al. Eye Tracking: A Comprehensive Guide to Methods and Measures (Oxford University Press, Oxford, 2011).
Tullis, T. & Albert, B. Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics (Morgan Kaufmann, Waltham, MA, 2013).
Leigh, R. J. & Zee, D. S. The Neurology of Eye Movements 5th edn (Oxford University Press, 2015).
Ghosh, S., Dhall, A., Hayat, M., Knibbe, J. & Ji, Q. Automatic gaze analysis: A survey of deep learning based approaches. IEEE Transactions on Pattern Analysis and Machine Intelligence 46, 61–84 (2023).
Martinez-Conde, S., Macknik, S. L. & Hubel, D. H. The role of fixational eye movements in visual perception. Nature Reviews Neuroscience 5, 229–240 (2004).
Rayner, K. Eye movements and attention in reading, scene perception, and visual search. Quarterly Journal of Experimental Psychology 62, 1457–1506 (2009).
Carter, B. & Luke, R. Applications of eye-tracking to explore cognitive processing. Cognitive Science Review 15, 154–169 (2020).
Duchowski, A. T. & Duchowski, J. K. A broader perspective on eye tracking: Cognitive load and emotional response. ACM Transactions on Applied Perception 14, 25:1–25:16 (2017).
Gu, Q. et al. Microsaccades reflect attention shifts: a mini review of 20 years of microsaccade research. Frontiers in Psychology 15 (2024).
Purves, D. et al. Types of eye movements and their functions. https://www.ncbi.nlm.nih.gov/books/NBK10991/ (2016).
Rahal, R. & Fiedler, K. Eye-tracking in psychological research: A review of methods and findings. Psychological Bulletin 145, 689–708 (2019).
Just, M. A. & Carpenter, P. A. A theory of reading: From eye fixations to comprehension. Psychological Review 87, 329–354 (1980).
Alamia, A. et al. Unconscious surprises: Pupil responses and their link to visual processing. Consciousness Studies 26, 78–91 (2019).
Posner, M. I. Orienting of attention. Quarterly Journal of Experimental Psychology 32, 3–25 (1980).
Pastukhov, A. & Braun, J. Individual differences in the frequency of perceptual reversals are stable over time and across stimuli of different complexity. Vision Research 89, 24–34 (2013).
Lei, Y., He, S., Khamis, M. & Ye, J. An end-to-end review of gaze estimation and its interactive applications on handheld mobile devices. ACM Computing Surveys 56, 1–38 (2023).
Krafka, K. et al. Eye tracking for everyone. https://arxiv.org/abs/1606.05814 (2016).
Lohr, D., Aziz, S., Friedman, L. & Komogortsev, O. V. GazeBaseVR, a large-scale, longitudinal, binocular eye-tracking dataset collected in virtual reality. Scientific Data 10, 177 (2023).
Lencastre, P. et al. EyeT4Empathy: Dataset of foraging for visual information, gaze typing and empathy assessment. Scientific Data 9, 752 (2022).
Eye-Tracking Research Group. TüEyeQ: A dataset for IQ performance and eye movement analysis. Intelligence Studies (2024).
SR Research Ltd. EyeLink 1000 User Manual: Tower, Desktop, LCD Arm, Primate and Long Range Mounts; Remote, 2000 Hz and Fiber Optic Camera Upgrades. SR Research Ltd. https://natmeg.se/onewebmedia/EL1000_UserManual_1.52.pdf (2010).
SR Research Ltd. EyeLink Portable Duo. https://www.sr-research.com/wp-content/uploads/2021/07/eyelink-duo-eye-tracker-video-cover.jpg (2023).
Republic of Gamers. ASUS ROG Swift 360Hz PG259QNR. https://rog.asus.com/monitors/23-to-24-5-inches/rog-swift-360hz-pg259qnr-model/ (2023).
Saunders, D. R. & Woods, R. L. Direct measurement of the system latency of gaze-contingent displays. Behavior Research Methods 46, 439–447 (2014).
Barlow, H. B. Temporal and spatial summation in human vision at different background intensities. The Journal of Physiology 141, 337–350 (1958).
Breitmeyer, B. & Ogmen, H. Visual Masking: Time Slices Through Conscious and Unconscious Vision (Oxford University Press, 2006).
Ross, J., Morrone, M. C., Goldberg, M. E. & Burr, D. C. Changes in visual perception at the time of saccades. Trends in Neurosciences 24, 113–121 (2001).
Lencastre, P., Mathema, R. & Lind, P. G. From eyes’ microtremors to critical flicker fusion. PLoS One 20, e0325391, https://doi.org/10.1371/journal.pone.0325391 (2025).
Scocchia, L., Valsecchi, M. & Triesch, J. Top-down influences on ambiguous perception: the role of stable and transient states of the observer. Frontiers in Human Neuroscience 8, 979 (2014).
Poom, L. Divergent mechanisms of perceptual reversals in spinning and wobbling structure-from-motion stimuli. PLoS ONE 19, e0297963 (2024).
Miconi, T., Groomes, L. & Kreiman, G. There’s Waldo! A normalization model of visual search predicts single-trial human fixations in an object search task. Cerebral Cortex 26, 3064–3082, https://doi.org/10.1093/cercor/bhv129 (2016).
Birawo, B. & Kasprowski, P. Review and evaluation of eye movement event detection algorithms. Sensors 22, 8810 (2022).
Vikesdal, G. H. & Langaas, T. Saccade latency and fixation stability: Repeatability and reliability. Journal of Eye Movement Research 9, 1–13 (2016).
Mathema, R. Comprehensive datasets of features describing eye-gaze dynamics in different tasks. Figshare, https://doi.org/10.6084/m9.figshare.29312225 (2025).
Ehinger, B. V., Groß, K., Ibs, I. & König, P. A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000. PeerJ 7, e7086 (2019).
Bhandari, S., Lencastre, P., Mathema, R. et al. Modeling eye gaze velocity trajectories using GANs with spectral loss for enhanced fidelity. Scientific Reports 15, 19929 (2025).
Bhandari, S., Lencastre, P. & Lind, P. Modeling stochastic eye tracking data: A comparison of quantum generative adversarial networks and Markov models. In GECCO ’24 Companion, 1934–1941 (Association for Computing Machinery, New York, NY, USA, 2024).
Bhandari, S., Lencastre, P., Denisov, S., Bystryk, Y. & Lind, P. IntLevPy: A Python library to classify and model intermittent and Lévy processes. SoftwareX 31, 102334 (2025).
Acknowledgements
The authors thank the Research Council of Norway for supporting this work under the project “Virtual-Eye” (Ref. 335940-FORSKER22).
Funding
Open access funding provided by OsloMet - Oslo Metropolitan University.
Author information
Contributions
R.M., S.M.N., S.B. and P.L. were involved in the study design, participant recruitment, data collection, and primary research. R.M., S.M.N., S.B., M.R. and P.L. were involved in the data processing. R.M. and S.B. contributed to developing the GitHub library. The main manuscript was drafted by R.M. and S.B. with contributions from all authors. P.G.L., A.Y. and P.L. contributed to the study conceptualization and design, supervision, and coordination of ethical approvals, data storage, and sharing agreements. P.L. coordinated the overall data collection project. All authors reviewed and approved the final version of the manuscript.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Mathema, R., Nav, S.M., Bhandari, S. et al. Comprehensive dataset of features describing eye-gaze dynamics across multiple tasks. Sci Data (2026). https://doi.org/10.1038/s41597-026-06754-x


