Comprehensive dataset of features describing eye-gaze dynamics across multiple tasks

  • Data Descriptor
  • Open access
  • Published: 07 February 2026

  • Rujeena Mathema,
  • Nav Shamimeh M.,
  • Shailendra Bhandari (ORCID: 0000-0002-7860-4854),
  • Manoj Regmi,
  • Pedro G. Lind,
  • Anis Yazidi &
  • Pedro Lencastre

Scientific Data, Article number: (2026)

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Predictive markers
  • Psychophysics

Abstract

We present a comprehensive dataset collected from 251 participants, comprising diverse features that characterize eye-gaze dynamics. In each session, participants performed several tasks typical of eye-tracking research: vanishing saccade, cued saccade, flickering cross, rotating ball, and free viewing. By covering these experimental paradigms, the dataset enables the analysis of phenomena such as oculomotor control and perceptual processing. As features, the dataset includes timestamped gaze coordinates, pupil sizes, and event classifications labelled as fixations, saccades, or blinks. All data were recorded with the EyeLink Portable Duo eye tracker at 1000 Hz and processed from raw EyeLink Data Format (EDF) files into structured files via an automated pipeline. The dataset was anonymized according to the ethical standards of the Norwegian Agency for Shared Services in Education and Research (SIKT). Possible applications of this dataset include studies of visual attention, cognitive science, and assistive technology.
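
As a rough illustration of the tabular structure described above, the short sketch below loads one processed per-task file and tallies its event labels. It is a minimal sketch only: the file name and column names (gaze coordinates, pupil size, event label) are assumptions made for illustration, not the exact schema shipped with the dataset; the repository documentation (ref. 36) gives the authoritative layout.

```python
# Minimal sketch, assuming a per-participant, per-task CSV with timestamped
# gaze coordinates, pupil size, and an event label column, as described in the
# abstract. File name and column names are hypothetical.
import pandas as pd

gaze = pd.read_csv("participant_01_free_viewing.csv")  # hypothetical file name
print(gaze.columns.tolist())  # inspect the actual column names first

# Count samples per event class; at the 1000 Hz sampling rate, one sample
# corresponds to roughly one millisecond of recording.
counts = gaze["event"].value_counts()  # assumed labels: fixation/saccade/blink
print(counts)
print(f"Approximate time spent in fixations: {counts.get('fixation', 0) / 1000:.1f} s")
```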

Data availability

The EDF and ASC data files for each participant are available in the Figshare repository (ref. 36), within the archive eyeGaze_dataSet_OslometEyetracking_Lab.zip.

Code availability

The Python library used to convert raw EDF recordings into labeled CSV files for the various tasks is provided in the Figshare repository (ref. 36), under the directory EyeGazeDynamicsPipeline.
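
As a rough sketch of what such a conversion involves, the example below parses an EyeLink ASC export (produced from the binary EDF files with SR Research's edf2asc utility) into a CSV of labeled samples. This is not the authors' EyeGazeDynamicsPipeline: the assumed sample-line layout is the standard monocular ASC format, and the output column names are chosen here only for illustration.

```python
# Minimal sketch, NOT the EyeGazeDynamicsPipeline: convert an EyeLink ASC
# export into a CSV of gaze samples labeled as fixation, saccade, or blink.
# Assumes monocular sample lines of the form "<timestamp> <x> <y> <pupil>".
import csv
import re

# Sample lines start with a timestamp; missing-data samples ("." fields) won't match.
SAMPLE = re.compile(r"^(\d+)\s+(\d+\.?\d*)\s+(\d+\.?\d*)\s+(\d+\.?\d*)")
EVENT_LABELS = {"EFIX": "fixation", "ESACC": "saccade", "EBLINK": "blink"}


def asc_to_csv(asc_path: str, csv_path: str) -> None:
    events = []   # (start, end, label) from the tracker's own event parser
    samples = []  # (timestamp, x, y, pupil)
    with open(asc_path) as fh:
        for line in fh:
            tag = line.split(maxsplit=1)[0] if line.strip() else ""
            if tag in EVENT_LABELS:
                fields = line.split()
                # EFIX/ESACC/EBLINK lines carry start and end times in fields 2 and 3.
                events.append((float(fields[2]), float(fields[3]), EVENT_LABELS[tag]))
            else:
                m = SAMPLE.match(line)
                if m:
                    t, x, y, p = m.groups()
                    samples.append((int(t), float(x), float(y), float(p)))

    def label_for(t: int) -> str:
        # Linear scan over event intervals; fine for a sketch, slow for long recordings.
        for start, end, label in events:
            if start <= t <= end:
                return label
        return "unlabeled"

    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["timestamp_ms", "gaze_x", "gaze_y", "pupil", "event"])
        for t, x, y, p in samples:
            writer.writerow([t, x, y, p, label_for(t)])


asc_to_csv("participant_01.asc", "participant_01.csv")  # hypothetical file names
```

One design note: the sketch relies on the event intervals already present in the ASC output (EFIX, ESACC, EBLINK) rather than re-detecting events from the samples, which mirrors the common practice of trusting the EyeLink online parser for event classification.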

References

  1. Buswell, G. How People Look at Pictures: A Study of the Psychology of Perception in Art (University of Chicago Press, 1935).

  2. Yarbus, A. L. Eye Movements and Vision (Plenum Press, 1967). Originally published in Russian in 1965.

  3. Holmqvist, K. et al. Eye Tracking: A Comprehensive Guide to Methods and Measures (Oxford University Press, Oxford, 2011).

  4. Tullis, T. & Albert, B. Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics (Morgan Kaufmann, Waltham, MA, 2013).

  5. Leigh, R. J. & Zee, D. S. The Neurology of Eye Movements, 5th edn (Oxford University Press, 2015).

  6. Ghosh, S., Dhall, A., Hayat, M., Knibbe, J. & Ji, Q. Automatic gaze analysis: A survey of deep learning based approaches. IEEE Transactions on Pattern Analysis and Machine Intelligence 46, 61–84 (2023).

  7. Martinez-Conde, S., Macknik, S. L. & Hubel, D. H. The role of fixational eye movements in visual perception. Nature Reviews Neuroscience 5, 229–240 (2004).

  8. Rayner, K. Eye movements and attention in reading, scene perception, and visual search. Quarterly Journal of Experimental Psychology 62, 1457–1506 (2009).

  9. Carter, B. & Luke, R. Applications of eye-tracking to explore cognitive processing. Cognitive Science Review 15, 154–169 (2020).

  10. Duchowski, A. T. & Duchowski, J. K. A broader perspective on eye tracking: Cognitive load and emotional response. ACM Transactions on Applied Perception 14, 25:1–25:16 (2017).

  11. Gu, Q. et al. Microsaccades reflect attention shifts: a mini review of 20 years of microsaccade research. Frontiers in Psychology 15 (2024).

  12. Purves, D. et al. Types of eye movements and their functions. https://www.ncbi.nlm.nih.gov/books/NBK10991/ (2016).

  13. Rahal, R. & Fiedler, K. Eye-tracking in psychological research: A review of methods and findings. Psychological Bulletin 145, 689–708 (2019).

  14. Just, M. A. & Carpenter, P. A. A theory of reading: From eye fixations to comprehension. Psychological Review 87, 329–354 (1980).

  15. Alamia, A. et al. Unconscious surprises: Pupil responses and their link to visual processing. Consciousness Studies 26, 78–91 (2019).

  16. Posner, M. I. Orienting of attention. Quarterly Journal of Experimental Psychology 32, 3–25 (1980).

  17. Pastukhov, A. & Braun, J. Individual differences in the frequency of perceptual reversals are stable over time and across stimuli of different complexity. Vision Research 89, 24–34 (2013).

  18. Lei, Y., He, S., Khamis, M. & Ye, J. An end-to-end review of gaze estimation and its interactive applications on handheld mobile devices. ACM Computing Surveys 56, 1–38 (2023).

  19. Krafka, K. et al. Eye tracking for everyone. https://arxiv.org/abs/1606.05814 (2016).

  20. Lohr, D., Aziz, S., Friedman, L. & Komogortsev, O. V. GazeBaseVR, a large-scale, longitudinal, binocular eye-tracking dataset collected in virtual reality. Scientific Data 10, 177 (2023).

  21. Lencastre, P. EyeT4Empathy: Dataset of foraging for visual information, gaze typing and empathy assessment. Scientific Data 9, 752 (2022).

  22. IQ & Group, E.-T. R. TüEyeQ: A dataset for IQ performance and eye movement analysis. Intelligence Studies (2024).

  23. SR Research Ltd. EyeLink 1000 User Manual: Tower, Desktop, LCD Arm, Primate and Long Range Mounts; Remote, 2000 Hz and Fiber Optic Camera Upgrades. https://natmeg.se/onewebmedia/EL1000_UserManual_1.52.pdf (2010).

  24. SR Research. EyeLink Portable Duo. https://www.sr-research.com/wp-content/uploads/2021/07/eyelink-duo-eye-tracker-video-cover.jpg (2023).

  25. Republic of Gamers. ASUS ROG Swift 360Hz PG259QNR. https://rog.asus.com/monitors/23-to-24-5-inches/rog-swift-360hz-pg259qnr-model/ (2023).

  26. Saunders, D. R. & Woods, R. L. Direct measurement of the system latency of gaze-contingent displays. Behavior Research Methods 46, 439–447 (2014).

  27. Barlow, H. B. Temporal and spatial summation in human vision at different background intensities. The Journal of Physiology 141, 337–350 (1958).

  28. Breitmeyer, B. & Ogmen, H. Visual Masking: Time Slices Through Conscious and Unconscious Vision (Oxford University Press, 2006).

  29. Ross, J., Morrone, M. C., Goldberg, M. E. & Burr, D. C. Changes in visual perception at the time of saccades. Trends in Neurosciences 24, 113–121 (2001).

  30. Lencastre, P., Mathema, R. & Lind, P. G. From eyes' microtremors to critical flicker fusion. PLoS One 20, e0325391, https://doi.org/10.1371/journal.pone.0325391 (2025).

  31. Scocchia, L., Valsecchi, M. & Triesch, J. Top-down influences on ambiguous perception: the role of stable and transient states of the observer. Frontiers in Human Neuroscience 8, 979 (2014).

  32. Poom, L. Divergent mechanisms of perceptual reversals in spinning and wobbling structure-from-motion stimuli. PLoS ONE 19, e0297963 (2024).

  33. Miconi, T., Groomes, L. & Kreiman, G. There's Waldo! A normalization model of visual search predicts single-trial human fixations in an object search task. Cerebral Cortex 26, 3064–3082, https://doi.org/10.1093/cercor/bhv129 (2016).

  34. Birawo, B. & Kasprowski, P. Review and evaluation of eye movement event detection algorithms. Sensors 22, 8810 (2022).

  35. Vikesdal, G. H. & Langaas, T. Saccade latency and fixation stability: Repeatability and reliability. Journal of Eye Movement Research 9, 1–13 (2016).

  36. Mathema, R. Comprehensive datasets of features describing eye-gaze dynamics in different tasks. Figshare, https://doi.org/10.6084/m9.figshare.29312225 (2025).

  37. Ehinger, B. V., Groß, K., Ibs, I. & König, P. A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000. PeerJ 7, e7086 (2019).

  38. Bhandari, S., Lencastre, P., Mathema, R. et al. Modeling eye gaze velocity trajectories using GANs with spectral loss for enhanced fidelity. Scientific Reports 15, 19929 (2025).

  39. Bhandari, S., Lencastre, P. & Lind, P. Modeling stochastic eye tracking data: A comparison of quantum generative adversarial networks and Markov models. In GECCO '24 Companion, 1934–1941 (Association for Computing Machinery, New York, NY, USA, 2024).

  40. Bhandari, S., Lencastre, P., Denisov, S., Bystryk, Y. & Lind, P. IntLevPy: A Python library to classify and model intermittent and Lévy processes. SoftwareX 31, 102334 (2025).

Acknowledgements

The authors thank the Research Council of Norway for support under the project “Virtual-Eye” (Ref. 335940-FORSKER22).

Funding

Open access funding provided by OsloMet - Oslo Metropolitan University.

Author information

Author notes
  1. These authors contributed equally: Rujeena Mathema, Shamimeh M. Nav.

Authors and Affiliations

  1. Department of Computer Science, OsloMet - Oslo Metropolitan University, P.O. Box 4 St. Olavs plass, 0130, Oslo, Norway

    Rujeena Mathema, Nav Shamimeh M., Shailendra Bhandari, Manoj Regmi, Pedro G. Lind, Anis Yazidi & Pedro Lencastre

  2. OsloMet Artificial Intelligence Lab, Stensberggata 29, N-0166, Oslo, Norway

    Rujeena Mathema, Nav Shamimeh M., Shailendra Bhandari, Manoj Regmi, Pedro G. Lind, Anis Yazidi & Pedro Lencastre

  3. School of Economics, Innovation and Technology, Kristiania University of Applied Sciences, 0153, Oslo, Norway

    Pedro G. Lind

  4. Simula Research Laboratory, Numerical Analysis and Scientific Computing, Oslo, 0164, Norway

    Pedro G. Lind

  5. Department of Informatics, University of Oslo, 0316, Oslo, Norway

    Anis Yazidi

Authors
  1. Rujeena Mathema
  2. Nav Shamimeh M.
  3. Shailendra Bhandari
  4. Manoj Regmi
  5. Pedro G. Lind
  6. Anis Yazidi
  7. Pedro Lencastre

Contributions

R.M., S.M.N., S.B. and P.L. were involved in the study design, participant recruitment, data collection, and primary research. R.M., S.M.N., S.B., M.R. and P.L. were involved in the data processing. R.M. and S.B. contributed to developing the GitHub library. The main manuscript was drafted by R.M. and S.B. with contributions from all authors. P.G.L., A.Y. and P.L. contributed to the study conceptualization and design, supervision, as well as coordination with ethical approvals, data storage, and sharing agreements. P.L. conducted the overall coordination of the data collection project. All authors reviewed and approved the final version of the manuscript.

Corresponding author

Correspondence to Shailendra Bhandari.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Mathema, R., Nav, S.M., Bhandari, S. et al. Comprehensive dataset of features describing eye-gaze dynamics across multiple tasks. Sci Data (2026). https://doi.org/10.1038/s41597-026-06754-x

  • Received: 12 September 2025

  • Accepted: 27 January 2026

  • Published: 07 February 2026

  • DOI: https://doi.org/10.1038/s41597-026-06754-x
