Large-array sub-millimeter precision coherent flash three-dimensional imaging
  • Article
  • Open access
  • Published: 14 February 2026

  • Bin Wang1,2,3,
  • Junze Tian1,2,3,
  • Jianwei Wang4,5,
  • Shukang Xu1,2,3,
  • Shuangxiang Zhao1,2,3,
  • Jianhao Duan1,2,3,
  • Weifeng Zhang (ORCID: 0000-0001-6454-1314)1,2,3,
  • Yangyang Liu (ORCID: 0009-0001-2778-0372)4,5,
  • Tao Zeng1,2,3 &
  • Erke Mao1,2

Nature Communications (2026)

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Imaging and sensing
  • Microwave photonics

Abstract

High-precision three-dimensional (3D) imaging is essential for accurately perceiving environments, providing critical depth and spatial awareness. Among various approaches, solid-state LiDAR systems have garnered significant attention. However, depth precision, detection range and pixel scalability remain key challenges for their widespread adoption. Here, we report a large-array coherent flash 3D imaging system that achieves a sub-millimeter range precision through stepped-frequency modulation and coherent detection with CCD sensors. A coherent image sensor is developed, and a prototype system is demonstrated, providing 3D imaging with a depth precision as high as 0.47 mm over a range of 30.50 m at an optical power of 15.86 mW and a maximum frame rate of 10 Hz. Our system features high range precision, exceptional sensitivity across long distances, and robust pixel scalability by directly leveraging well-established CCD sensors. This advancement introduces a scalable approach to long-range high-precision 3D imaging, with substantial implications for deformation monitoring, virtual reality, and cultural heritage preservation.
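To make the ranging principle in the abstract concrete, here is a minimal, hypothetical sketch of stepped-frequency coherent ranging: a coherent receiver records a phase proportional to frequency and round-trip range at each frequency step, and an inverse DFT of those samples yields a range profile whose peak locates the target. The step count `N`, step size `df`, and noiseless model are illustrative assumptions, not the paper's actual waveform or processing; sub-millimeter precision in practice comes from SNR and sub-bin fitting, not the raw bin width shown here.

```python
import numpy as np

# Hypothetical waveform parameters (assumed for illustration only)
c = 299_792_458.0    # speed of light, m/s
N = 64               # number of frequency steps (assumed)
df = 25e6            # frequency step size in Hz (assumed); bandwidth B = N*df
R_true = 30.50       # target range, m (matches the reported demo range)

# Unambiguous range window and nominal range-bin width for this waveform
R_unamb = c / (2 * df)       # ranges beyond this alias back into the window
R_res = c / (2 * N * df)     # coarse bin width; precision is refined beyond this

# Coherent detection measures a phase 4*pi*f_k*R/c at each step f_k = k*df
k = np.arange(N)
phases = 4 * np.pi * (k * df) * R_true / c
samples = np.exp(-1j * phases)           # ideal noiseless echo samples

# Inverse DFT converts the stepped-frequency samples into a range profile
profile = np.abs(np.fft.ifft(samples))
bin_idx = int(np.argmax(profile))
R_est = bin_idx * R_unamb / N            # range within the ambiguity window

print(f"estimated range (mod {R_unamb:.2f} m): {R_est:.3f} m")
print(f"true range modulo window:             {R_true % R_unamb:.3f} m")
```

Because `R_true` exceeds the unambiguous window of this toy waveform, the estimate recovers the range modulo `R_unamb`; a real system resolves that ambiguity separately (e.g., with coarser waveforms or multiple step sizes).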

Data availability

The authors declare that the data supporting the findings of this study are available within the paper and its Supplementary Information. The experimental data generated in this study are deposited in Figshare at https://doi.org/10.6084/m9.figshare.30948620.


Acknowledgements

This work was financially supported by the National Natural Science Foundation of China (Grant No. 62227901 to T.Z. and Grant No. U22A2018 to W.F.Z.).

Author information

Author notes
  1. These authors contributed equally: Bin Wang, Junze Tian.

Authors and Affiliations

  1. Radar Technology Research Institute, School of Information and Electronics, Beijing Institute of Technology, Beijing, China

    Bin Wang, Junze Tian, Shukang Xu, Shuangxiang Zhao, Jianhao Duan, Weifeng Zhang, Tao Zeng & Erke Mao

  2. Chongqing Innovation Center, Beijing Institute of Technology, Chongqing, China

    Bin Wang, Junze Tian, Shukang Xu, Shuangxiang Zhao, Jianhao Duan, Weifeng Zhang, Tao Zeng & Erke Mao

  3. Chongqing Key Laboratory of Novel Civilian Radar, Chongqing, China

    Bin Wang, Junze Tian, Shukang Xu, Shuangxiang Zhao, Jianhao Duan, Weifeng Zhang & Tao Zeng

  4. Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing, China

    Jianwei Wang & Yangyang Liu

  5. University of Chinese Academy of Sciences, Beijing, China

    Jianwei Wang & Yangyang Liu


Contributions

B.W., J.Z.T., W.F.Z., and Y.Y.L. conceived and designed the project. J.Z.T., S.X.Z., J.H.D., and S.K.X. performed most experiments; J.Z.T., J.W.W., T.Z., and E.K.M. analyzed the data and organized the figures; all authors provided intellectual input and contributed to the text.

Corresponding authors

Correspondence to Weifeng Zhang, Yangyang Liu or Tao Zeng.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Communications thanks Xingjun Wang and the other anonymous reviewer(s) for their contribution to the peer review of this work. A peer review file is available.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Description of Additional Supplementary Files

Supplementary Movie S1

Supplementary Movie S2

Peer review file

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

Reprints and permissions

About this article


Cite this article

Wang, B., Tian, J., Wang, J. et al. Large-array sub-millimeter precision coherent flash three-dimensional imaging. Nat Commun (2026). https://doi.org/10.1038/s41467-026-69188-4

  • Received: 09 January 2025

  • Accepted: 20 January 2026

  • Published: 14 February 2026

  • DOI: https://doi.org/10.1038/s41467-026-69188-4

