Abstract
Vision-based analysis of ocular features is the predominant approach to eye tracking. However, these features are highly susceptible to interference from illumination, eyelid/eyelash occlusion, and individual variation, leading to low recognition rates and diminished tracking accuracy. To address these limitations, eye-movement-feature-enhanced (EMFE) cosmetic contact lenses implementing a spatial-chromatic encoding strategy are proposed. Paired with a head-mounted eye tracker integrated with RGB cameras, the lenses enable accurate and robust gaze tracking in natural environments. Under challenging illumination, they achieve a 93% feature recognition rate, significantly surpassing pupil recognition, and tolerate highly off-axis camera placement. Experiments on an eye-movement model and in human eye tracking demonstrate superior accuracy (<1°) in gaze-direction estimation and continuous fixation positioning. Diverse eye-tracking applications are demonstrated with these lenses, including image identification, reading analysis, and outdoor interaction. This approach advances the development of lightweight, unobtrusive eye-tracking systems and facilitates broader adoption of gaze-based interaction in real-world settings.
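As a concrete illustration of the recognition principle, the sketch below shows one plausible way to decode spatial-chromatic lens markers from a single RGB eye-camera frame using OpenCV. It is a minimal, hypothetical example, not the authors' released implementation (available under "Code availability"); the colour ranges, marker classes, and function names are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch of spatial-chromatic marker decoding (illustrative only;
# the actual pipeline is in the repository cited under "Code availability").
import cv2
import numpy as np

# Hypothetical HSV ranges for two chromatic marker classes printed on the lens.
MARKER_HSV_RANGES = {
    "green": ((40, 80, 80), (80, 255, 255)),
    "red": ((0, 80, 80), (10, 255, 255)),
}

def detect_lens_markers(frame_bgr, min_area=20.0):
    """Return {colour: [(x, y), ...]} centroids of chromatic lens markers."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    centroids = {}
    for name, (lo, hi) in MARKER_HSV_RANGES.items():
        # Segment one chromatic class, then extract connected blobs.
        mask = cv2.inRange(hsv, np.array(lo, np.uint8), np.array(hi, np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        pts = []
        for c in contours:
            m = cv2.moments(c)
            if m["m00"] > min_area:  # reject speckle noise
                pts.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        centroids[name] = pts
    return centroids

def fit_marker_ellipse(points):
    """Fit an ellipse to >=5 marker centroids; its pose constrains gaze."""
    pts = np.array(points, dtype=np.float32)
    return cv2.fitEllipse(pts) if len(pts) >= 5 else None
```

In a scheme of this kind, the spatial arrangement of the recovered colour centroids, rather than the pupil contour, carries the gaze signal, which is consistent with the reported robustness to occlusion and off-axis camera placement.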
Data availability
The authors declare that all data supporting the results of this study are present in the paper, and the supporting datasets are included in the Supplementary Information. Any additional requests for information can be directed to, and will be fulfilled by, the corresponding authors. Source data are provided with this paper.
Code availability
The code supporting this study’s findings is available at https://github.com/yiyinju/feature-recognition-for-EMFE-cosmetic-contact-lens (ref. 53).
References
Valliappan, N. et al. Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nat. Commun. 11, 4553 (2020).
Chang, H. et al. Sleep microstructure organizes memory replay. Nature 637, 1161–1169 (2025).
Gehmacher, Q. et al. Eye movements track prioritized auditory features in selective attention to natural speech. Nat. Commun. 15, 3692 (2024).
Pärnamets, P. et al. Biasing moral decisions by exploiting the dynamics of eye gaze. Proc. Natl. Acad. Sci. USA 112, 4170–4175 (2015).
Adhanom, I. B., MacNeilage, P. & Folmer, E. Eye tracking in virtual reality: a broad review of applications and challenges. Virtual Real. 27, 1481–1505 (2023).
Song, J.-H., van de Groep, J., Kim, S. J. & Brongersma, M. L. Non-local metasurfaces for spectrally decoupled wavefront manipulation and eye tracking. Nat. Nanotechnol. 16, 1224–1230 (2021).
Novák, J. S. et al. Eye tracking, usability, and user experience: a systematic review. Int. J. Hum.-Comput. Interact. 40, 4484–4500 (2024).
Clark, R. et al. The potential and value of objective eye tracking in the ophthalmology clinic. Eye 33, 1200–1202 (2019).
Fu, H. L. et al. Influence of cues on the safety hazard recognition of construction workers during safety training: evidence from an eye-tracking experiment. J. Civ. Eng. Educ. 150, 1 (2024).
Xu, J. W. et al. Left gaze bias between LHT and RHT: a recommendation strategy to mitigate human errors in left- and right-hand driving. IEEE Trans. Intell. Veh. 8, 4406–4417 (2023).
Homayounfar, S. Z. et al. Multimodal smart eyewear for longitudinal eye movement tracking. Matter 3, 1275–1293 (2020).
Robinson, D. A. A method of measuring eye movement using a scleral search coil in a magnetic field. IEEE Trans. Bio-Med. Electron. 10, 137–145 (1963).
Houben, M. M. J., Goumans, J. & van der Steen, J. Recording three-dimensional eye movements: scleral search coils versus video oculography. Investig. Ophthalmol. Vis. Sci. 47, 179–187 (2006).
Ebisawa, Y. & Fukumoto, K. Head-free, remote eye-gaze detection system based on pupil-corneal reflection method with easy calibration using two stereo-calibrated video cameras. IEEE Trans. Biomed. Eng. 60, 2952–2960 (2013).
Chi, J.-N. et al. Key techniques of eye gaze tracking based on pupil corneal reflection. in 2009 WRI Global Congress on Intelligent Systems, pp 133–138 (2009).
Frey, M., Nau, M. & Doeller, C. F. Magnetic resonance-based eye tracking using deep neural networks. Nat. Neurosci. 24, 1772–1779 (2021).
Shi, Y. et al. Eye tracking and eye expression decoding based on transparent, flexible and ultra-persistent electrostatic interface. Nat. Commun. 14, 3315 (2023).
Villanueva, A. & Cabeza, R. A novel gaze estimation system with one calibration point. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 38, 1123–1138 (2008).
Ponz, V. et al. Topography-based detection of the iris centre using multiple-resolution images. in 2011 Irish Machine Vision and Image Processing Conference, pp 32–37 (2011).
Valenti, R., Sebe, N. & Gevers, T. Combining head pose and eye location information for gaze estimation. IEEE Trans. Image Process. 21, 802–815 (2012).
Park, S., Spurr, A. & Hilliges, O. Deep pictorial gaze estimation. in Computer Vision - ECCV 2018, Cham, pp 741–757 (2018).
Martinikorena, I. et al. Fast and robust ellipse detection algorithm for head-mounted eye tracking systems. Mach. Vis. Appl. 29, 845–860 (2018).
Wu, Z. et al. EyeNet: a multi-task deep network for off-axis eye gaze estimation. in 2019 IEEE/CVF International Conference on Computer Vision Workshop, pp 3683–3687 (2019).
Morimoto, C. H. et al. Pupil detection and tracking using multiple light sources. Image Vis. Comput. 18, 331–335 (2000).
Coutinho, F. L. & Morimoto, C. H. Free head motion eye gaze tracking using a single camera and multiple light sources. in 19th Brazilian Symposium on Computer Graphics and Image Processing, Manaus, Brazil, p 171 (2006).
Mestre, C., Gautier, J. & Pujol, J. Robust eye tracking based on multiple corneal reflections for clinical applications. J. Biomed. Opt. 23, 1–9 (2018).
Beymer, D. & Flickner, M. Eye gaze tracking using an active stereo head. in 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2003. Proceedings, Madison, WI, USA, p II-451 (2003).
Guestrin, E. D. & Eizenman, M. General theory of remote gaze estimation using the pupil center and corneal reflections. IEEE Trans. Biomed. Eng. 53, 1124–1133 (2006).
Wang, J. et al. Accurate eye tracking from dense 3D surface reconstructions using single-shot deflectometry. Nat. Commun. 16, 2902 (2025).
Tonsen, M. et al. Labelled pupils in the wild: a dataset for studying pupil detection in unconstrained environments. in Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, South Carolina, pp 139–142 (2016).
Zhu, Z. & Ji, Q. Robust real-time eye detection and tracking under variable lighting conditions and various face orientations. Comput. Vis. Image Underst. 98, 124–154 (2005).
Yao, G. et al. Snowflake-inspired and blink-driven flexible piezoelectric contact lenses for effective corneal injury repair. Nat. Commun. 14, 3604 (2023).
Zhou, C. et al. Modulus-adjustable and mechanically adaptive dry microneedle electrodes for personalized electrophysiological recording. npj Flex. Electron. 9, 77 (2025).
Yao, G. et al. A programmable and skin temperature-activated electromechanical synergistic dressing for effective wound healing. Sci. Adv. 8, eabl8379 (2022).
Yao, G. et al. Smart contact lenses: Catalysts for science fiction becoming reality. Innovation 5, 100710 (2024).
Massin, L. et al. Multipurpose bio-monitored integrated circuit in a contact lens eye-tracker. Sensors 22, 595 (2022).
Zhu, H. et al. Frequency-encoded eye tracking smart contact lens for human–machine interaction. Nat. Commun. 15, 3588 (2024).
Gan, X. et al. Closed-eye intraocular pressure and eye movement monitoring via a stretchable bimodal contact lens. Microsyst. Nanoeng. 11, 83 (2025).
Fradkin, I. M. et al. Contact lens with moiré labels for precise eye tracking. in Proc. SPIE 13414, Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR) VI, San Francisco, California, United States, p 134141W (2025).
Khaldi, A. et al. A laser emitting contact lens for eye tracking. Sci. Rep. 10, 14804 (2020).
Othéguy, M., Nourrit, V. & de Bougrenet de la Tocnaye, J.-L. Instrumented contact lens to detect gaze movements independently of eye blinks. Transl. Vis. Sci. Technol. 13, 12 (2024).
Mokatren, M., Kuflik, T. & Shimshoni, I. 3D gaze estimation using RGB-IR cameras. Sensors 23, 381 (2023).
Fuhl, W. et al. ElSe: ellipse selection for robust pupil detection in real-world environments. in Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, South Carolina, pp 123–130 (2016).
Fuhl, W. et al. ExCuSe: robust pupil detection in real-world scenarios. in Computer Analysis of Images and Patterns, Cham, pp 39–51 (2015).
Santini, T., Fuhl, W. & Kasneci, E. PuRe: Robust pupil detection for real-time pervasive eye tracking. Comput. Vis. Image Underst. 170, 40–50 (2018).
Santini, T., Fuhl, W. & Kasneci, E. PuReST: robust pupil tracking for real-time pervasive eye tracking. in Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, Warsaw, Poland, pp 1–5 (2018).
Świrski, L., Bulling, A. & Dodgson, N. Robust real-time pupil tracking in highly off-axis images. in Proceedings of the Symposium on Eye Tracking Research and Applications, Santa Barbara, California, pp 173–176 (2012).
Zandi, B. et al. PupilEXT: flexible open-source platform for high-resolution pupillometry in vision research. Front. Neurosci. 15, 676220 (2021).
Timm, F. & Barth, E. Accurate eye centre localisation by means of gradients. in International Conference on Computer Vision Theory and Applications, pp 125–130 (2011).
Poletti, M., Rucci, M. & Carrasco, M. Selective attention within the foveola. Nat. Neurosci. 20, 1413–1417 (2017).
Chen, Z. & Huang, J.-B. A vision-based method for the circle pose determination with a direct geometric interpretation. IEEE Trans. Robot. Autom. 15, 1135–1140 (1999).
Poletti, M. An eye for detail: eye movements and attention at the foveal scale. Vis. Res. 211, 108277 (2023).
Zhu, H. Spatial-Chromatic Encoding Cosmetic Contact Lenses for Enhanced Natural Eye Tracking, Feature recognition for EMFE cosmetic contact lens. https://doi.org/10.5281/zenodo.18254391 (2026).
zhuoyi0904. Standing Man. https://skfb.ly/oQnnK.
Acknowledgments
This research was funded by the National Key R&D Program of China (2021YFA1401103), the National Natural Science Foundation of China (62305153 and 62501260), the Natural Science Foundation of Jiangsu Province (BK20243014), the Quantum Science and Technology-National Science and Technology Major Project (2021ZD0300700), the China Postdoctoral Science Foundation (2024M761393 and 2025T180154), the Basic Research Program of Jiangsu (BK20251253), and the Jiangsu Funding Program for Excellent Postdoctoral Talent (2025ZB039). In addition, the authors gratefully acknowledge Dr. Lei Zhao and Dr. Qiyong Xu from the Yongjiang Laboratory for their assistance with the comparative experiments using the Pico Neo3 Pro eye tracker.
Author information
Authors and Affiliations
Contributions
All authors provided active and valuable feedback on the paper. F.X., H.Z., and H.Y. initiated the concept and designed the studies; F.X. supervised the work; H.Z. led the experiments and collected the overall data; H.H. and Z.L. contributed to the recognition and eye-tracking algorithms; Z.Q. contributed to the design and fabrication of the head-mounted eye tracker; S.Y., Y.F., and Y.N.X. contributed to the human experiments; Y.F.X., Y.C., and H.Y. advised on the experiments and manuscript; F.X. and H.Z. co-wrote the paper.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Communications thanks Cyril Lahuec, Yuan Lin, and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. A peer review file is available.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Source data
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Zhu, H., Huang, H., Yang, H. et al. Spatial-chromatic encoding cosmetic contact lenses for enhanced natural eye tracking. Nat Commun (2026). https://doi.org/10.1038/s41467-026-68918-y
Received:
Accepted:
Published:
DOI: https://doi.org/10.1038/s41467-026-68918-y