Abstract
Accurate, photorealistic, and operationally actionable 3D campus mapping is a key enabler for smart educational environments: digitally enabled campuses that integrate spatial data to support navigation, facility management, and infrastructure monitoring. This paper presents an end-to-end, replicable UAV workflow that combines multi-view RGB photogrammetry with UAV LiDAR point clouds to construct a high-fidelity 3D campus model, using King Fahd University of Petroleum and Minerals (KFUPM) as a case study. Data were collected with a DJI Matrice 300 RTK carrying a Zenmuse P1 (45 MP) camera and a Zenmuse L2 LiDAR payload, flying nadir grid missions (80% forward / 70% side overlap) and oblique orbits (~45°) at 60 m altitude (RGB GSD ≈ 2.5 cm/pixel; LiDAR mean point spacing ≈ 5 cm). LiDAR scans were georeferenced and cleaned, then co-registered with the photogrammetric reconstruction in a common RTK frame. To improve visual realism without altering metric geometry, a lightweight 2× U-Net super-resolution module (U-NetSR) was applied only to the RGB textures used for mesh texturing. Experiments show that combining nadir and oblique views improves facade completeness and reduces surface deviation by ~30% relative to nadir-only acquisition, while super-resolved textures raise SSIM from 0.88 to 0.93 and improve edge sharpness by ~28% at a modest post-processing cost. Finally, the model is exported to a WebGIS environment for interactive 3D exploration and integration with campus operations.
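Texture-quality gains of the kind reported above are typically quantified with SSIM. As a minimal illustration (not the paper's evaluation pipeline, whose windowing and implementation are not specified here), the sketch below computes a simplified SSIM from global image statistics; the standard metric applies the same formula over a sliding Gaussian window.

```python
import numpy as np

def global_ssim(x: np.ndarray, y: np.ndarray, data_range: float = 1.0) -> float:
    """Simplified SSIM from global image statistics.
    The standard metric evaluates this formula per local window and averages."""
    c1 = (0.01 * data_range) ** 2  # stabilizing constants from the SSIM definition
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return float(
        (2 * mx * my + c1) * (2 * cov + c2)
        / ((mx**2 + my**2 + c1) * (vx + vy + c2))
    )

# Toy check: an image matches itself perfectly; a noisy copy scores lower.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
noisy = np.clip(img + 0.1 * rng.normal(size=img.shape), 0.0, 1.0)
print(round(global_ssim(img, img), 3))  # 1.0
print(global_ssim(img, noisy) < 1.0)    # True
```

In a texture-evaluation setting, `x` would be a ground-truth high-resolution texture patch and `y` the super-resolved output, with `data_range=255` for 8-bit imagery.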
Data availability
The data that support the findings of this study are available from the corresponding author upon reasonable request.
Acknowledgements
King Fahd University of Petroleum & Minerals and Interdisciplinary Research Center for Aviation & Space Exploration.
Funding
This work was funded by King Fahd University of Petroleum & Minerals and the Interdisciplinary Research Center for Aviation & Space Exploration.
Author information
Authors and Affiliations
Contributions
H.K.: Conceptualization, Funding acquisition, Methodology, Project administration, Software, Validation, Visualization, Writing – review & editing. Ayman A.: Investigation, Data curation, Formal analysis, Writing – review & editing. S.A.: Methodology, Visualization, Writing – review & editing. Abdullah A.: Investigation, Data curation, Formal analysis, Writing – original draft.
Corresponding authors
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Keshk, H.M., Abdallah, A.M., Almutairi, S. et al. UAV photogrammetry and lidar integration for high-fidelity 3D campus mapping at KFUPM. Sci Rep (2026). https://doi.org/10.1038/s41598-026-39888-4


