
Scientific Reports
Lightweight multiscale behavior recognition for caged laying hens using an enhanced YOLOv8 framework

  • Article
  • Open access
  • Published: 25 March 2026

  • Yurong Tang1,2,
  • JingGe Wei1,3,
  • Binbin Xie1,3,
  • Rui Kang1,3,
  • Chao Yuan1,3,
  • Jing Liu1,3,
  • Zhichao Mo1,3,
  • Longshen Liu1,3 &
  • Mingxia Shen1,3

Scientific Reports (2026). Cite this article


We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Engineering
  • Mathematics and computing

Abstract

This study presents a lightweight deep-learning framework for recognizing key health-related behaviors of caged laying hens, addressing the challenges of dense housing, frequent occlusion, and subtle differences between actions. Building on the YOLOv8n base model, we introduce three enhancements: (1) a C2f-FasterNet-EMA backbone that improves multiscale feature extraction; (2) a FasterNet-based neck combined with the Dysample upsampler, which refines small-object localization while reducing computational cost; and (3) an EMASlideLoss function that alleviates sample imbalance and stabilizes training. Evaluated on the Lukou Dataset, which covers four target behaviors (eating, open-mouth breathing, self-pecking, and mutual pecking), the improved model achieves mAP@50 scores of 98.15%, 81.03%, 93.65%, and 94.32% on these behaviors, respectively. Compared with the baseline YOLOv8n retrained under identical experimental settings, the proposed method improves overall mAP@50 by 2.26% while reducing model size by 22.92%.
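The abstract names an EMASlideLoss for mitigating sample imbalance but does not give its formulation. As a minimal sketch only: the commonly used SlideLoss re-weights samples by their IoU relative to a threshold μ, boosting "hard" samples whose IoU falls just below μ; one plausible reading of the "EMA" prefix is that μ tracks the mean IoU via an exponential moving average. The weighting breakpoints (μ − 0.1), the decay value, and the `EMAThreshold` class below are all illustrative assumptions, not the authors' published method.

```python
import math

def slide_weight(iou, mu):
    # Slide weighting (sketch of the common SlideLoss form):
    # easy negatives (low IoU) keep weight 1, borderline samples
    # just under the threshold mu get a boosted constant weight,
    # and confident positives decay smoothly toward lower weight.
    if iou <= mu - 0.1:
        return 1.0
    if iou < mu:
        return math.exp(1.0 - mu)   # boosted weight for hard, borderline samples
    return math.exp(1.0 - iou)      # gentle decay for easy, high-IoU samples

class EMAThreshold:
    # Hypothetical "EMA" component: track the slide threshold mu as an
    # exponential moving average of the per-batch mean IoU, so the
    # hard/easy boundary adapts as training progresses.
    def __init__(self, decay=0.999, init=0.5):
        self.decay = decay
        self.mu = init

    def update(self, batch_ious):
        batch_mean = sum(batch_ious) / len(batch_ious)
        self.mu = self.decay * self.mu + (1.0 - self.decay) * batch_mean
        return self.mu

# Usage sketch: update the threshold from one batch of predicted IoUs,
# then weight three samples of increasing quality.
ema = EMAThreshold(decay=0.9, init=0.5)
mu = ema.update([0.3, 0.6, 0.8])
weights = [slide_weight(x, mu) for x in (0.2, 0.5, 0.9)]
```

The per-sample classification loss would then be multiplied by these weights, so borderline detections (frequent under occlusion in dense cages) contribute more gradient than trivially easy ones.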

Data availability

The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.


Funding

This work was supported by the National Key R&D Program of China (Grant No. 2023YFD2000800).

Author information

Authors and Affiliations

  1. Key Laboratory of Breeding Equipment, Ministry of Agriculture and Rural Affairs of China, Nanjing, 210031, China

    Yurong Tang, JingGe Wei, Binbin Xie, Rui Kang, Chao Yuan, Jing Liu, Zhichao Mo, Longshen Liu & Mingxia Shen

  2. College of Engineering, Nanjing Agricultural University, Nanjing, 210031, China

    Yurong Tang

  3. School of Artificial Intelligence, Nanjing Agricultural University, Nanjing, 210031, China

    JingGe Wei, Binbin Xie, Rui Kang, Chao Yuan, Jing Liu, Zhichao Mo, Longshen Liu & Mingxia Shen


Contributions

Y.T. conducted the experiments, curated the dataset, and led the model development. J.W. contributed to data annotation, preprocessing, and experimental validation. B.X. assisted in methodology design and performed comparative analyses. R.K. contributed to equipment deployment and video acquisition in the poultry houses. C.Y. supported algorithm implementation and participated in result interpretation. J.L. contributed to data management, figure preparation, and manuscript editing. Z.M. assisted in system construction and visualization of detection results. L.L. supervised the engineering workflow, provided project coordination, and contributed to manuscript revisions. M.S. conceived and supervised the project, secured funding, and guided the overall research direction.

Corresponding author

Correspondence to Mingxia Shen.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.


About this article


Cite this article

Tang, Y., Wei, J., Xie, B. et al. Lightweight multiscale behavior recognition for caged laying hens using an enhanced YOLOv8 framework. Sci Rep (2026). https://doi.org/10.1038/s41598-026-43523-7

Download citation

  • Received: 05 December 2025

  • Accepted: 04 March 2026

  • Published: 25 March 2026

  • DOI: https://doi.org/10.1038/s41598-026-43523-7


Keywords

  • Caged laying hens
  • Behavior detection
  • Improved YOLOv8n
  • Smart farming
  • Intelligent chicken farming equipment

Associated content

Collection

AI and ML in veterinary medicine

Scientific Reports (Sci Rep)

ISSN 2045-2322 (online)
