  • Article
  • Open access
  • Published: 03 February 2026

Spatial-chromatic encoding cosmetic contact lenses for enhanced natural eye tracking

  • Hengtian Zhu1,
  • Heyu Huang1,
  • Huan Yang1,
  • Zixu Li1,
  • Zhenning Qi1,
  • Yuan Fang2,
  • Yining Xu1,
  • Yifeng Xiong1,
  • Ye Chen (ORCID: 0000-0003-0269-0722)3,
  • Songtao Yuan (ORCID: 0000-0001-9212-0664)2 &
  • Fei Xu (ORCID: 0000-0001-5239-4572)1,4,5

Nature Communications, Article number: (2026). Cite this article

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Biomedical engineering
  • Computer science
  • Glasses

Abstract

Vision-based analysis of ocular features is the predominant approach to eye tracking. However, these features are highly susceptible to interference from illumination, eyelid/eyelash occlusion, and individual variation, leading to low recognition rates and diminished tracking accuracy. To address these limitations, eye-movement-feature-enhanced (EMFE) cosmetic contact lenses implementing a spatial-chromatic encoding strategy are proposed. Paired with a head-mounted eye tracker integrated with RGB cameras, this system enables accurate and robust gaze tracking in natural environments. Under challenging illumination, the lenses achieve a 93% feature recognition rate, significantly surpassing pupil recognition and tolerating highly off-axis camera placement. Eye-movement-model and human eye-tracking experiments demonstrate superior accuracy (<1°) in gaze direction estimation and continuous fixation positioning. Using these lenses, diverse eye-tracking applications are demonstrated, including image identification, reading analysis, and outdoor interaction. This approach advances the development of lightweight, unobtrusive eye-tracking systems and facilitates broader application of gaze-based interaction technology in real-world settings.
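As a concrete illustration of how a spatial-chromatic feature might be decoded from a single RGB eye-camera frame, the sketch below thresholds a few assumed hue bands, locates the resulting colour patches, and averages their centroids as a crude lens-centre estimate. This is a minimal sketch only, not the authors' published pipeline: the marker layout, colours, HSV thresholds, and synthetic test frame are assumptions introduced here for illustration (requires opencv-python and numpy).

```python
"""Illustrative sketch: decoding hypothetical colour markers on a cosmetic lens
from one RGB frame. Marker colours, HSV ranges, and the synthetic image are
assumptions, not the published method."""
import cv2
import numpy as np

# Build a synthetic eye-camera frame with three hypothetical coloured markers.
img = np.full((240, 320, 3), 40, np.uint8)                 # dark background
cv2.circle(img, (160, 120), 60, (90, 60, 30), -1)          # iris-like disc
marker_pos = {"red": (220, 120), "green": (160, 60), "blue": (100, 120)}
marker_bgr = {"red": (0, 0, 255), "green": (0, 255, 0), "blue": (255, 0, 0)}
for name, centre in marker_pos.items():
    cv2.circle(img, centre, 8, marker_bgr[name], -1)

# Decode: threshold each assumed hue band and take the centroid of each patch.
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
hue_bands = {                                               # assumed HSV ranges
    "red":   ((0, 120, 120), (10, 255, 255)),
    "green": ((50, 120, 120), (70, 255, 255)),
    "blue":  ((110, 120, 120), (130, 255, 255)),
}
centres = {}
for name, (lo, hi) in hue_bands.items():
    mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
    m = cv2.moments(mask)
    if m["m00"] > 0:                                        # patch was found
        centres[name] = (m["m10"] / m["m00"], m["m01"] / m["m00"])

# Crude pose cue: the mean of the marker centroids approximates the lens centre
# in the image plane; a real system would map this through camera calibration.
if centres:
    cx = float(np.mean([c[0] for c in centres.values()]))
    cy = float(np.mean([c[1] for c in centres.values()]))
    print("detected markers:",
          {k: (round(x, 1), round(y, 1)) for k, (x, y) in centres.items()})
    print(f"estimated lens centre: ({cx:.1f}, {cy:.1f})")
```

In the actual system, the full spatial-chromatic code on the lens and the calibrated camera geometry would presumably be used to recover gaze direction, rather than only an image-plane centre as in this toy example.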

Data availability

The authors declare that all data supporting the results of this study are presented in the paper, and the data sources are provided in the Supplementary Information accompanying this paper. Any additional requests for information can be directed to, and will be fulfilled by, the corresponding authors. Source data are provided with this paper.

Code availability

The code supporting this study’s findings is available at https://github.com/yiyinju/feature-recognition-for-EMFE-cosmetic-contact-lens (ref. 53).

References

  1. Valliappan, N. et al. Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nat. Commun. 11, 4553 (2020).
  2. Chang, H. et al. Sleep microstructure organizes memory replay. Nature 637, 1161–1169 (2025).
  3. Gehmacher, Q. et al. Eye movements track prioritized auditory features in selective attention to natural speech. Nat. Commun. 15, 3692 (2024).
  4. Pärnamets, P. et al. Biasing moral decisions by exploiting the dynamics of eye gaze. Proc. Natl. Acad. Sci. USA 112, 4170–4175 (2015).
  5. Adhanom, I. B., MacNeilage, P. & Folmer, E. Eye tracking in virtual reality: a broad review of applications and challenges. Virtual Real. 27, 1481–1505 (2023).
  6. Song, J.-H., van de Groep, J., Kim, S. J. & Brongersma, M. L. Non-local metasurfaces for spectrally decoupled wavefront manipulation and eye tracking. Nat. Nanotechnol. 16, 1224–1230 (2021).
  7. Novák, J. S. et al. Eye tracking, usability, and user experience: a systematic review. Int. J. Hum.-Comput. Interact. 40, 4484–4500 (2024).
  8. Clark, R. et al. The potential and value of objective eye tracking in the ophthalmology clinic. Eye 33, 1200–1202 (2019).
  9. Fu, H. L. et al. Influence of cues on the safety hazard recognition of construction workers during safety training: evidence from an eye-tracking experiment. J. Civ. Eng. Educ. 150, 1 (2024).
  10. Xu, J. W. et al. Left gaze bias between LHT and RHT: a recommendation strategy to mitigate human errors in left- and right-hand driving. IEEE Trans. Intell. Veh. 8, 4406–4417 (2023).
  11. Homayounfar, S. Z. et al. Multimodal smart eyewear for longitudinal eye movement tracking. Matter 3, 1275–1293 (2020).
  12. Robinson, D. A. A method of measuring eye movement using a scleral search coil in a magnetic field. IEEE Trans. Bio-Med. Electron. 10, 137–145 (1963).
  13. Houben, M. M. J., Goumans, J. & van der Steen, J. Recording three-dimensional eye movements: scleral search coils versus video oculography. Investig. Ophthalmol. Vis. Sci. 47, 179–187 (2006).
  14. Ebisawa, Y. & Fukumoto, K. Head-free, remote eye-gaze detection system based on pupil-corneal reflection method with easy calibration using two stereo-calibrated video cameras. IEEE Trans. Biomed. Eng. 60, 2952–2960 (2013).
  15. Chi, J.-N. et al. Key techniques of eye gaze tracking based on pupil corneal reflection. In 2009 WRI Global Congress on Intelligent Systems, pp. 133–138 (2009).
  16. Frey, M., Nau, M. & Doeller, C. F. Magnetic resonance-based eye tracking using deep neural networks. Nat. Neurosci. 24, 1772–1779 (2021).
  17. Shi, Y. et al. Eye tracking and eye expression decoding based on transparent, flexible and ultra-persistent electrostatic interface. Nat. Commun. 14, 3315 (2023).
  18. Villanueva, A. & Cabeza, R. A novel gaze estimation system with one calibration point. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 38, 1123–1138 (2008).
  19. Ponz, V. et al. Topography-based detection of the iris centre using multiple-resolution images. In 2011 Irish Machine Vision and Image Processing Conference, pp. 32–37 (2011).
  20. Valenti, R., Sebe, N. & Gevers, T. Combining head pose and eye location information for gaze estimation. IEEE Trans. Image Process. 21, 802–815 (2012).
  21. Park, S., Spurr, A. & Hilliges, O. Deep pictorial gaze estimation. In Computer Vision – ECCV 2018, Cham, pp. 741–757 (2018).
  22. Martinikorena, I. et al. Fast and robust ellipse detection algorithm for head-mounted eye tracking systems. Mach. Vis. Appl. 29, 845–860 (2018).
  23. Wu, Z. et al. EyeNet: a multi-task deep network for off-axis eye gaze estimation. In 2019 IEEE/CVF International Conference on Computer Vision Workshop, pp. 3683–3687 (2019).
  24. Morimoto, C. H. et al. Pupil detection and tracking using multiple light sources. Image Vis. Comput. 18, 331–335 (2000).
  25. Coutinho, F. L. & Morimoto, C. H. Free head motion eye gaze tracking using a single camera and multiple light sources. In 19th Brazilian Symposium on Computer Graphics and Image Processing, Manaus, Brazil, p. 171 (2006).
  26. Mestre, C., Gautier, J. & Pujol, J. Robust eye tracking based on multiple corneal reflections for clinical applications. J. Biomed. Opt. 23, 1–9 (2018).
  27. Beymer, D. & Flickner, M. Eye gaze tracking using an active stereo head. In 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Madison, WI, USA, p. II-451 (2003).
  28. Guestrin, E. D. & Eizenman, M. General theory of remote gaze estimation using the pupil center and corneal reflections. IEEE Trans. Biomed. Eng. 53, 1124–1133 (2006).
  29. Wang, J. et al. Accurate eye tracking from dense 3D surface reconstructions using single-shot deflectometry. Nat. Commun. 16, 2902 (2025).
  30. Tonsen, M. et al. Labelled pupils in the wild: a dataset for studying pupil detection in unconstrained environments. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, South Carolina, pp. 139–142 (2016).
  31. Zhu, Z. & Ji, Q. Robust real-time eye detection and tracking under variable lighting conditions and various face orientations. Comput. Vis. Image Underst. 98, 124–154 (2005).
  32. Yao, G. et al. Snowflake-inspired and blink-driven flexible piezoelectric contact lenses for effective corneal injury repair. Nat. Commun. 14, 3604 (2023).
  33. Zhou, C. et al. Modulus-adjustable and mechanically adaptive dry microneedle electrodes for personalized electrophysiological recording. npj Flex. Electron. 9, 77 (2025).
  34. Yao, G. et al. A programmable and skin temperature-activated electromechanical synergistic dressing for effective wound healing. Sci. Adv. 8, eabl8379 (2022).
  35. Yao, G. et al. Smart contact lenses: catalysts for science fiction becoming reality. Innovation 5, 100710 (2024).
  36. Massin, L. et al. Multipurpose bio-monitored integrated circuit in a contact lens eye-tracker. Sensors 22, 595 (2022).
  37. Zhu, H. et al. Frequency-encoded eye tracking smart contact lens for human–machine interaction. Nat. Commun. 15, 3588 (2024).
  38. Gan, X. et al. Closed-eye intraocular pressure and eye movement monitoring via a stretchable bimodal contact lens. Microsyst. Nanoeng. 11, 83 (2025).
  39. Fradkin, I. M. et al. Contact lens with moiré labels for precise eye tracking. In Proc. SPIE 13414, Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR) VI, San Francisco, California, United States, p. 134141W (2025).
  40. Khaldi, A. et al. A laser emitting contact lens for eye tracking. Sci. Rep. 10, 14804 (2020).
  41. Othéguy, M., Nourrit, V. & de Bougrenet de la Tocnaye, J.-L. Instrumented contact lens to detect gaze movements independently of eye blinks. Transl. Vis. Sci. Technol. 13, 12 (2024).
  42. Mokatren, M., Kuflik, T. & Shimshoni, I. 3D gaze estimation using RGB-IR cameras. Sensors 23, 381 (2023).
  43. Fuhl, W. et al. ElSe: ellipse selection for robust pupil detection in real-world environments. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, South Carolina, pp. 123–130 (2016).
  44. Fuhl, W. et al. ExCuSe: robust pupil detection in real-world scenarios. In Computer Analysis of Images and Patterns, Cham, pp. 39–51 (2015).
  45. Santini, T., Fuhl, W. & Kasneci, E. PuRe: robust pupil detection for real-time pervasive eye tracking. Comput. Vis. Image Underst. 170, 40–50 (2018).
  46. Santini, T., Fuhl, W. & Kasneci, E. PuReST: robust pupil tracking for real-time pervasive eye tracking. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, Warsaw, Poland, pp. 1–5 (2018).
  47. Świrski, L., Bulling, A. & Dodgson, N. Robust real-time pupil tracking in highly off-axis images. In Proceedings of the Symposium on Eye Tracking Research and Applications, Santa Barbara, California, pp. 173–176 (2012).
  48. Zandi, B. et al. PupilEXT: flexible open-source platform for high-resolution pupillometry in vision research. Front. Neurosci. 15, 676220 (2021).
  49. Timm, F. & Barth, E. Accurate eye centre localisation by means of gradients. In International Conference on Computer Vision Theory and Applications, pp. 125–130 (2011).
  50. Poletti, M., Rucci, M. & Carrasco, M. Selective attention within the foveola. Nat. Neurosci. 20, 1413–1417 (2017).
  51. Chen, Z. & Huang, J.-B. A vision-based method for the circle pose determination with a direct geometric interpretation. IEEE Trans. Robot. Autom. 15, 1135–1140 (1999).
  52. Poletti, M. An eye for detail: eye movements and attention at the foveal scale. Vis. Res. 211, 108277 (2023).
  53. Zhu, H. Spatial-chromatic encoding cosmetic contact lenses for enhanced natural eye tracking: feature recognition for EMFE cosmetic contact lens. Zenodo https://doi.org/10.5281/zenodo.18254391 (2026).
  54. zhuoyi0904. Standing Man. https://skfb.ly/oQnnK.


Acknowledgments

This research was funded by the National Key R&D Program of China (2021YFA1401103), the National Natural Science Foundation of China (62305153 and 62501260), the Natural Science Foundation of Jiangsu Province (BK20243014), the Quantum Science and Technology National Science and Technology Major Project (2021ZD0300700), the China Postdoctoral Science Foundation (2024M761393 and 2025T180154), the Basic Research Program of Jiangsu (BK20251253), and the Jiangsu Funding Program for Excellent Postdoctoral Talent (2025ZB039). In addition, the authors gratefully acknowledge Dr. Lei Zhao and Dr. Qiyong Xu from the Yongjiang Laboratory for their assistance in the comparative experiments with the Pico Neo3 Pro eye tracker.

Author information

Authors and Affiliations

  1. National Laboratory of Solid State Microstructures, Collaborative Innovation Center of Advanced Microstructures, and College of Engineering and Applied Sciences, Nanjing University, Nanjing, 210023, China

    Hengtian Zhu, Heyu Huang, Huan Yang, Zixu Li, Zhenning Qi, Yining Xu, Yifeng Xiong & Fei Xu

  2. Department of Ophthalmology, The First Affiliated Hospital with Nanjing Medical University, Nanjing, 210094, China

    Yuan Fang & Songtao Yuan

  3. College of Physics, MIIT Key Laboratory of Aerospace Information Materials and Physics, State Key Laboratory of Mechanics and Control for Aerospace Structures, Nanjing University of Aeronautics and Astronautics, Nanjing, 211106, China

    Ye Chen

  4. Chemistry and Biomedicine Innovation Center (ChemBIC), Nanjing University, Nanjing, 210093, China

    Fei Xu

  5. Hefei National Laboratory, Hefei, 230088, China

    Fei Xu


Contributions

All authors provided active and valuable feedback on the paper. F.X., H.Z., and H.Y. initiated the concept and designed the studies; F.X. supervised the work; H.Z. led the experiments and collected the overall data; H.H. and Z.L. contributed to the recognition algorithm and eye tracking algorithm; Z.Q. contributed to the design and fabrication of the head-mounted eye tracker; S.Y., Y.F., and Y.N.X. contributed to the human experiment; Y.F.X., Y.C., and H.Y. advised on the experiment and manuscript; F.X. and H.Z. co-wrote the paper.

Corresponding author

Correspondence to Fei Xu.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Communications thanks Cyril Lahuec, Yuan Lin, and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. A peer review file is available.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Description of Additional Supplementary Information

Supplementary Video 1

Supplementary Video 2

Supplementary Video 3

Reporting Summary

Transparent Peer Review file

Source data

Source data

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.


About this article


Cite this article

Zhu, H., Huang, H., Yang, H. et al. Spatial-chromatic encoding cosmetic contact lenses for enhanced natural eye tracking. Nat Commun (2026). https://doi.org/10.1038/s41467-026-68918-y


  • Received: 11 August 2025

  • Accepted: 21 January 2026

  • Published: 03 February 2026

  • DOI: https://doi.org/10.1038/s41467-026-68918-y
