A real-time mobile aquatic plant recognition algorithm based on deep learning for intelligent ecological monitoring
  • Article
  • Open access
  • Published: 12 January 2026

  • Daoli Wang1,
  • Zengchuan Dong1,
  • Guang Yang1,
  • Zhonglin Zhao1,
  • Ranyu Liu1,
  • Jitao Zhang1,
  • Wenzhuo Wang1 &
  • Youwei Qin2 

Scientific Reports (2026)


We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Ecology
  • Environmental sciences

Abstract

The importance of aquatic plants in aquatic ecosystems is drawing growing attention, and accurate species identification is essential for advancing intelligent and precise ecological monitoring. Traditional methods fall short in large-scale, real-time monitoring, and while YOLOv8 is effective, it lacks sufficient lightweight optimization for mobile devices, which limits its practical application. Existing lightweight models also struggle to balance accuracy and speed in complex environments involving dense growth, similar species, and occlusions. This paper introduces APlight-YOLOv8n, an enhanced YOLOv8n-based approach designed to address these challenges using the Faster Detect and Universal Inverted Bottleneck (UIB) modules. Evaluated on an aquatic plant dataset, APlight-YOLOv8n outperforms YOLOv8n: the mean average precision (mAP50) increased to 74.4%, a 1.9% improvement; the number of parameters (Params) was reduced to 2.74 M, a 13.3% decrease; floating-point operations (FLOPs) dropped to 5.5 G, a 32.9% reduction; and the inference speed remained stable at 32.70 frames per second (FPS). The model enables fast, accurate recognition in complex environments, providing efficient support for real-time field monitoring. In conclusion, APlight-YOLOv8n demonstrates superior performance in balancing accuracy and computational efficiency for aquatic plant detection and offers new insights for mobile ecological monitoring and broader smart environmental applications.
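
The abstract reports the improvements only as relative changes. As a quick worked example (not part of the original manuscript), the short Python snippet below backs out the YOLOv8n baseline values those percentages imply; the derived baselines, and the assumption that the mAP50 gain is expressed in percentage points, are illustrative assumptions rather than figures quoted by the authors.

    # Sketch: infer the implied YOLOv8n baseline metrics from the
    # APlight-YOLOv8n values and relative changes reported in the abstract.
    # The derived baselines are assumptions, not numbers stated in the paper.

    # Reported APlight-YOLOv8n metrics
    map50 = 74.4      # mAP50, %
    params_m = 2.74   # parameters, millions
    flops_g = 5.5     # floating-point operations, G

    # Reported changes versus YOLOv8n
    map50_gain_pts = 1.9   # assumed to be percentage points
    params_drop = 0.133    # 13.3% decrease
    flops_drop = 0.329     # 32.9% reduction

    # Implied YOLOv8n baselines, assuming changes are relative to the baseline
    baseline_map50 = map50 - map50_gain_pts
    baseline_params_m = params_m / (1.0 - params_drop)
    baseline_flops_g = flops_g / (1.0 - flops_drop)

    print(f"implied YOLOv8n mAP50:  {baseline_map50:.1f} %")   # ~72.5 %
    print(f"implied YOLOv8n Params: {baseline_params_m:.2f} M") # ~3.16 M
    print(f"implied YOLOv8n FLOPs:  {baseline_flops_g:.1f} G")  # ~8.2 G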

Data availability

The data used in this study are confidential and cannot be shared publicly. However, they are available from the first author, Daoli Wang, upon reasonable request (contact: 230301010003@hhu.edu.cn).


Funding

This work was supported by the National Key Research and Development Program of China (Grant No. 2023YFC3206800).

Author information

Authors and Affiliations

  1. College of Hydrology and Water Resources, Hohai University, Nanjing, 210024, China

    Daoli Wang, Zengchuan Dong, Guang Yang, Zhonglin Zhao, Ranyu Liu, Jitao Zhang & Wenzhuo Wang

  2. State Key Laboratory of Hydrology-Water Resources and Hydraulic Engineering, Centre for Global Change and Water Cycle, Hohai University, Nanjing, 210024, China

    Youwei Qin


Contributions

D.W.: Writing—original draft, Software, Methodology, Investigation, Conceptualization. Z.D.: Writing—review & editing, Validation, Supervision, Methodology, Funding acquisition. G.Y.: Writing—review & editing, Validation, Supervision. Z.Z.: Supervision. R.L.: Supervision. J.Z.: Supervision. W.W.: Writing—review & editing, Supervision. Y.Q.: Writing—review & editing, Validation, Supervision.

Corresponding author

Correspondence to Zengchuan Dong.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval and consent to participate

All authors agreed to publish this manuscript.

Consent for publication

Consent and approval for publication was obtained from all authors.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.


About this article


Cite this article

Wang, D., Dong, Z., Yang, G. et al. A real-time mobile aquatic plant recognition algorithm based on deep learning for intelligent ecological monitoring. Sci Rep (2026). https://doi.org/10.1038/s41598-026-35310-1


  • Received: 02 September 2025

  • Accepted: 05 January 2026

  • Published: 12 January 2026

  • DOI: https://doi.org/10.1038/s41598-026-35310-1


Keywords

  • Aquatic plants
  • Computer vision
  • Target detection
  • Mobile smart devices
  • YOLOv8