Abstract
Exploring high-latitude lunar regions presents a challenging visual environment for robots. The low solar elevation angle and minimal light scattering produce a visual field dominated by strong contrast and long, dynamic shadows. Reproducing these conditions on Earth requires sophisticated simulators and specialized facilities. We introduce a unique dataset recorded at the LunaLab of the SnT, University of Luxembourg, an indoor test facility designed to replicate the optical characteristics of multiple lunar latitudes. Our dataset includes images, inertial measurements, and wheel odometry data from robots navigating different trajectories under multiple illumination scenarios, simulating high-latitude lunar conditions from dawn to nighttime, with and without the aid of headlights, resulting in 88 distinct sequences containing a total of 1.3 M images. Data were captured using a stereo RGB-inertial sensor, a monocular monochrome camera, and, for the first time, a novel single-photon avalanche diode (SPAD) camera. We recorded both static and dynamic image sequences, with robots navigating at slow (5 cm/s) and fast (50 cm/s) speeds. All data are calibrated, synchronized, and timestamped, providing a valuable resource for validating perception tasks ranging from vision-based autonomous navigation to scientific imaging, whether for future lunar missions targeting high-latitude regions or for robots operating in other perceptually degraded environments.
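Because all sensor streams share synchronized timestamps, the different modalities can be related directly in time. The following minimal sketch pairs each camera frame with the nearest inertial measurement by timestamp; the file names and column layout are assumptions made for illustration only, and the repository README remains the authoritative description of the actual data format.

```python
import numpy as np

# Hypothetical inputs: per-frame camera timestamps and an IMU log whose
# first column is the timestamp (both in seconds, on a shared clock).
cam_t = np.loadtxt("camera_timestamps.txt")      # shape (N,)
imu = np.loadtxt("imu.csv", delimiter=",")       # shape (M, 7): t, accel, gyro
imu_t = imu[:, 0]

# For every camera frame, find the index of the closest IMU sample in time.
idx = np.searchsorted(imu_t, cam_t)
idx = np.clip(idx, 1, len(imu_t) - 1)
prev_closer = np.abs(cam_t - imu_t[idx - 1]) < np.abs(cam_t - imu_t[idx])
idx[prev_closer] -= 1

nearest_imu = imu[idx]  # nearest inertial measurement for each frame
print(f"Matched {len(cam_t)} frames; max time offset: "
      f"{np.max(np.abs(cam_t - imu_t[idx])) * 1e3:.2f} ms")
```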
Data availability
The dataset is publicly available at Zenodo6.
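For convenience, the record's file listing can also be retrieved programmatically through Zenodo's public REST API. The snippet below is a minimal sketch: the record identifier is taken from the DOI in reference 6, and the field names are those of the current Zenodo API (with fallbacks), so it should be treated as illustrative rather than definitive.

```python
import requests

# Zenodo record id taken from DOI 10.5281/zenodo.13970077 (reference 6).
RECORD_ID = "13970077"
resp = requests.get(f"https://zenodo.org/api/records/{RECORD_ID}", timeout=30)
resp.raise_for_status()
record = resp.json()

# List the files attached to the record with their sizes and download links.
for f in record.get("files", []):
    name = f.get("key", f.get("filename", "unknown"))
    size_mb = f.get("size", 0) / 1e6
    link = f.get("links", {}).get("self", "")
    print(f"{name}\t{size_mb:.1f} MB\t{link}")
```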
Code availability
All code described in this paper can be accessed at https://GitHub.com/spaceuma/spice-hl3. The Python and MATLAB scripts are designed to be easily adapted to end users' needs. Further descriptions, requirements, and version information are provided in the repository's README.md.
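For orientation, the sketch below shows how one of the image sequences might be iterated with OpenCV. The directory name and image format are assumptions for illustration; the scripts in the repository above, together with its README.md, remain the reference for the actual layout.

```python
from pathlib import Path
import cv2

# Hypothetical sequence directory holding one camera's image files;
# the actual naming and structure follow the repository's README.md.
seq_dir = Path("spice-hl3/sequence_01/mono_camera")

for img_path in sorted(seq_dir.glob("*.png")):
    frame = cv2.imread(str(img_path), cv2.IMREAD_GRAYSCALE)
    if frame is None:
        continue
    # Example per-frame operation: basic exposure statistics,
    # handy for a quick look at the low-light sequences.
    print(f"{img_path.name}: mean={frame.mean():.1f}, max={frame.max()}")
```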
References
Padma, T. India’s Moon mission: Four things Chandrayaan-3 has taught scientists. Nature 621, 456 (2023).
Kleinhenz, J. et al. Lunar surface missions for resource reconnaissance: NASA’s PRIME-1 and VIPER. Space Resources Round Table (2024).
Wang, C. et al. Scientific objectives and payload configuration of the Chang’E-7 mission. National Science Review 11(2), nwad329, https://doi.org/10.1093/nsr/nwad329 (2024).
Zhang, Y. et al. Analysis of illumination conditions in the Lunar South Polar Region using multi-temporal high-resolution orbital images. Remote Sensing 15(24), 5691, https://doi.org/10.3390/rs15245691 (2023).
Rodríguez-Martínez, D., Van Winnendael, M. & Yoshida, K. High-speed mobility on planetary surfaces: A technical review. Journal of Field Robotics 36(8), 1436–1455, https://doi.org/10.1002/rob.21912 (2019).
Rodríguez-Martínez, D., van der Meer, D., Bera, A., Pérez-del Pulgar, C. J. & Olivares-Mendez, M. A. SPICE-HL3: Single-Photon, Inertial, and Stereo Camera dataset for Exploration of High-Latitude Lunar Landscapes. Zenodo, v1.0.0, https://doi.org/10.5281/zenodo.13970077 (2024).
Ludivig, P., Calzada-Diaz, A., Olivares-Mendez, M. A., Voos, H. & Lamamy, J. Building a piece of the Moon: Construction of two indoor lunar analogue environments. In 71st International Astronautical Congress (IAC)–The CyberSpace Edition (2020).
Rodríguez-Martínez, D. & Pérez-del Pulgar, C.J. Fast vision in the dark: A case for single-photon imaging in planetary navigation. In 18th Symposium on Advanced Space Technologies in Robotics and Automation (ASTRA). ESA (2025).
Pessia, R., Ishigami, G. & Jodelet, Q. Artificial lunar landscape dataset, Kaggle, https://www.kaggle.com/dsv/489236 (2019).
Richard, A. et al. OmniLRS: A photorealistic simulator for lunar robotics. In IEEE International Conference on Robotics and Automation (ICRA). IEEE, https://doi.org/10.1109/ICRA57147.2024.10610026 (2024).
Fong, T. Digital Proving Ground: VIPER rover simulator (RSIM). In 2024 Lunar Surface Innovation Consortium (LSIC) Fall Meeting. LSIC (2024).
Wang, Y., Yuan, T., Liu, C., Wu, Q. & Qian, J. The real Chang’e lunar landscape dataset. IEEE DataPort https://doi.org/10.21227/9y8q-dx27 (2024).
Furgale, P., Carle, P., Enright, J. & Barfoot, T. D. The Devon Island rover navigation dataset. The International Journal of Robotics Research 31(6), 707–713, https://doi.org/10.1177/0278364911433135 (2012).
Vayugundla, M. et al. Datasets of long range navigation experiments in a Moon analogue environment on Mount Etna. In ISR 2018; 50th International Symposium on Robotics (2018).
Giubilato, R., Stürzl, W., Wedler, A. & Triebel, R. Challenges of SLAM in extremely unstructured environments: The DLR planetary stereo, solid-state LiDAR, inertial dataset. IEEE Robotics and Automation Letters 7(4), 8721–8728, https://doi.org/10.1109/LRA.2022.3188118 (2022).
Gerdes, L. et al. BASEPROD: The Bardenas semi-desert planetary rover dataset. Scientific Data 11, 1054, https://doi.org/10.1038/s41597-024-03881-1 (2024).
Wong, U. et al. Polar Optical Lunar Analog Reconstruction (POLAR) Stereo Dataset. NASA Ames Research Center (2017).
Hansen, M., Wong, U. & Fong, T. The POLAR Traverse dataset: A dataset of stereo camera images simulating traverses across lunar polar terrain under extreme lighting conditions. NASA Ames Research Center (2024).
Morimoto, K. et al. Megapixel time-gated SPAD image sensor for 2D and 3D imaging applications. Optica 7(4), 346–354, https://doi.org/10.1364/OPTICA.386574 (2020).
Gramuglia, F. et al. Sub-10 ps minimum ionizing particle detection with Geiger-mode APDs. Frontiers in Physics 10, https://doi.org/10.3389/fphy.2022.849237 (2022).
Michalet, X. et al. NIR Fluorescence lifetime macroscopic imaging with a time-gated SPAD camera. Multiphoton Microscopy in the Biomedical Sciences XXII 11965, 29–37, https://doi.org/10.1117/12.2607833 (2022).
Zhao, J. et al. Light detection and ranging with entangled photons. Optics Express 30(3), 3675–3683, https://doi.org/10.1364/OE.435898 (2022).
Song, J., Richard, A. & Olivares-Mendez, M. Joint spatial-temporal calibration for camera and global pose sensor. In 2024 International Conference on 3D Vision (3DV). IEEE, https://doi.org/10.1109/3DV62453.2024.00074 (2024).
Venkatanath, N., Praneeth, D., Sumohana, S.C. & Swarup, S.M. Blind image quality evaluation using perception based features. In 2015 Twenty First National Conference on Communications (NCC). IEEE, https://doi.org/10.1109/NCC.2015.7084843 (2015).
Labbé, M. & Michaud, F. RTAB-Map as an open-source lidar and visual simultaneous localization and mapping library for large-scale and long-term online operation. Journal of Field Robotics 36(2), 416, https://doi.org/10.1002/rob.21831 (2019).
Campos, C., Elvira, R., Rodríguez, J. J. G., Montiel, J. M. & Tardós, J. D. ORB-SLAM3: An accurate open-source library for visual, visual–inertial, and multimap SLAM. IEEE Transactions on Robotics 37(6), 1874–1890, https://doi.org/10.1109/TRO.2021.3075644 (2021).
Carrier, W. D. Lunar soil grain size distribution. The Moon 6(3), 250–263, https://doi.org/10.1007/BF00562206 (1973).
Acknowledgements
We would like to thank Pi Imaging Technologies for generously providing the SPAD512 camera used in the recording of this dataset. This work was supported in part by armasuisse Science and Technology (contract number 8003538860) under the project Monocular SPAD camera for enhanced vision in complex and uncertain environments. This work was partly conducted when the corresponding author was still affiliated with the Advanced Quantum Architecture Laboratory (AQUA) at EPFL, Switzerland.
Author information
Authors and Affiliations
Contributions
D.R.-M. conceived the experiments, formatted and prepared the dataset, built and wrote the code in the code repository, and drafted the original manuscript. D.R.-M., D.vdM., J.S., and A.B. refined the implementation of the experiments and prepared the rovers for data acquisition. D.vdM., J.S., A.B., and M.A.O.-M. arranged the experimental facility prior to the experiments. D.vdM. and A.B. operated the rovers and supervised the network. J.S. assisted with and conducted the sensor calibration. D.R.-M., D.vdM., and A.B. conducted the rover experiments. C.J.P.P. and M.A.O.-M. defined the data formatting and advised on data validation. All authors reviewed, edited, and approved the final manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Rodríguez-Martínez, D., van der Meer, D., Song, J. et al. SPICE-HL3: Single-Photon, Inertial, and Stereo Camera dataset for Exploration of High-Latitude Lunar Landscapes. Sci Data (2026). https://doi.org/10.1038/s41597-026-06668-8
Received:
Accepted:
Published:
DOI: https://doi.org/10.1038/s41597-026-06668-8