Abstract
This study presents a lightweight deep-learning framework for recognizing key health-related behaviors of caged laying hens, addressing the challenges of dense housing, frequent occlusion, and subtle differences between actions. Building on a YOLOv8n base model, we introduce three enhancements: (1) a C2f-FasterNet-EMA backbone that improves multiscale feature extraction; (2) a FasterNet-based neck combined with the DySample upsampler, which refines small-object localization while reducing computational cost; and (3) an EMASlideLoss function that alleviates sample imbalance and stabilizes training. Evaluated on the Lukou Dataset, which covers four target behaviors (eating, open-mouth breathing, self-pecking, and mutual pecking), the improved model achieves mAP@50 scores of 98.15%, 81.03%, 93.65%, and 94.32% for these behaviors, respectively. Compared with the retrained baseline YOLOv8n under identical experimental settings, the proposed method improves overall mAP@50 by 2.26% while reducing model size by 22.92%.
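The abstract does not specify the EMASlideLoss formulation. As an illustration only, the following minimal Python sketch shows one common slide-loss weighting scheme on which such a loss could be based: samples near an IoU threshold μ are up-weighted relative to easy samples, and μ is tracked as an exponential moving average of the batch mean IoU. The function names, the 0.1 margin, and the decay value are assumptions for illustration, not the authors' exact implementation.

```python
import math

def slide_weight(iou: float, mu: float) -> float:
    """Piecewise sample weight in the style of SlideLoss.

    Easy negatives (IoU well below mu) keep weight 1; samples just
    below the threshold get a constant boost exp(1 - mu); samples at
    or above the threshold are down-weighted smoothly as IoU grows.
    """
    if iou <= mu - 0.1:
        return 1.0
    if iou < mu:
        return math.exp(1.0 - mu)
    return math.exp(1.0 - iou)

class EmaThreshold:
    """Tracks the slide threshold mu as an EMA of the batch mean IoU."""

    def __init__(self, decay: float = 0.999, init: float = 0.5):
        self.decay = decay
        self.mu = init

    def update(self, batch_mean_iou: float) -> float:
        # Standard exponential moving average update.
        self.mu = self.decay * self.mu + (1.0 - self.decay) * batch_mean_iou
        return self.mu
```

In practice the weight would multiply a per-sample classification loss, so hard examples near the adaptive threshold contribute more to the gradient than abundant easy ones.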
Data availability
The datasets generated and/or analyzed during the current study are available from the corresponding author on reasonable request.
References
National Bureau of Statistics. Livestock product [EB/OL]. https://data.stats.gov.cn/easyquery.htm?Cn=C01&zb=A060601&sj=2023 (2023). Accessed 20 March 2024.
Bloch, V. et al. Automatic broiler temperature measuring by thermal camera. Biosyst. Eng. 199, 127–134 (2020).
Sozzi, M. et al. Measuring comfort behaviours in laying hens using deep-learning tools. Animals 13 (1), 33 (2022).
Jacob, F. G. et al. The use of infrared thermography in the identification of pododermatitis in broilers. Engenharia Agrícola 36 (2), 253–259 (2016).
Chien, Y. R. & Chen, Y. X. An RFID-based smart nest box: an experimental study of laying performance and behavior of individual hens. Sensors 18 (3), 859 (2018).
Li et al. Temporal aggregation network using micromotion features for early lameness recognition in dairy cows. Comput. Electron. Agric. 204, 107562 (2023).
Amraei, S., Abdanan, M. S. & Salari, S. Broiler weight estimation based on machine vision and artificial neural network. Br. Poult. Sci. 58 (2), 200–205 (2017).
Taylor, P. S. et al. Ranging behaviour of commercial free-range broiler chickens 2: individual variation. Animals 7 (7), 55 (2017).
Fazzari, E. et al. ARTEMIS: animal recognition through enhanced multimodal integration system. Int. J. Mach. Learn. Cybern. 16, 5877–5892 (2025).
Ng, X. L. et al. Animal Kingdom: a large and diverse dataset for animal behavior understanding. In 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), New Orleans, LA, USA, 19001–19012. https://doi.org/10.1109/CVPR52688.2022.01844 (2022).
Mortensen, A. K., Lisouski, P. & Ahrendt, P. Weight prediction of broiler chickens using 3D computer vision. Comput. Electron. Agric. 123, 319–326 (2016).
Johansen, S. V. et al. Broiler weight forecasting using dynamic neural network models with input variable selection. Comput. Electron. Agric. 159, 97–109 (2019).
Linhoss, J. E. et al. Light intensity and uniformity in commercial broiler houses using lighting programs derived from Global Animal Partnership (GAP) lighting standards. J. Appl. Poult. Res. 32 (1), 100309 (2023).
Costantino, A. et al. Climate control in broiler houses: a thermal model for the calculation of the energy use and indoor environmental conditions. Energy Build. 169, 110–126 (2018).
Fazzari, E., Romano, D., Falchi, F. & Stefanini, C. Animal behavior analysis methods using deep learning: a survey. Expert Syst. Appl. 289, 128330 (2025).
Mohialdin, A. M., Elbarrany, A. M. & Atia, A. Chicken behavior analysis for surveillance in poultry farms. Int. J. Adv. Comput. Sci. Appl. (IJACSA) 14 (3) (2023). http://dx.doi.org/10.14569/IJACSA.2023.01403106.
Yao, W. et al. Poultry sub-health status monitoring and health warning prospect. J. Nanjing Agric. Univ. 46 (4), 635–644 (2023).
Wu, D. H. et al. Information perception in modern poultry farming: a review. Comput. Electron. Agric. 199, 107131 (2022).
Yang, X. et al. Monitoring activity index and behaviors of cage-free hens with advanced deep learning technologies. Poult. Sci. 103 (11), 104193 (2024).
Li, J. et al. YOLOv8-EMA: Enhanced multi-scale feature fusion for real-time object detection. Pattern Recognit. Lett. 178, 45–52 (2024). (EMA integration for YOLOv8, focusing on small-object detection).
Zhang, H. et al. Attention-driven YOLOv9 for fine-grained action recognition in agricultural scenes. Comput. Electron. Agric. 221, 108123 (2025). (EMA-based multi-scale feature aggregation for livestock behavior analysis).
Wang, Y. et al. Efficient multi-scale attention for lightweight YOLO models: Application to edge-device deployment. IEEE Access. 11, 98765–98778 (2023). (EMA optimization for resource-constrained YOLO variants).
Süzen, A. A., Duman, B. & Şen, B. Benchmark analysis of Jetson TX2, Jetson Nano and Raspberry Pi using deep-CNN. In 2020 International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA), 1–5 (IEEE, 2020).
Feroz, M. A. et al. Object detection and classification from a real-time video using SSD and YOLO models. In Computational Intelligence in Pattern Recognition: Proceedings of CIPR 2021, 37–47 (Springer Singapore, 2021).
Ilani, M. A. & Banad, Y. M. LabelImg CNN-based surface defect detection. arXiv preprint arXiv:2509.05813 (2025).
Li, J., Chen, C. & Wang, H. C2f-Net: Cross-level feature fusion for real-time object detection. IEEE Trans. Circuits Syst. Video Technol. 32 (7), 4561–4574 (2021).
Block, A. & Zhang, Z. FasterEMA: accelerated exponential moving average for efficient deep learning training. Adv. Neural Inf. Process. Syst. 36, 12890–12902 (2023).
Liu, G. et al. Image inpainting for irregular holes using partial convolutions. In Proc. Eur. Conf. Comput. Vis. (ECCV), Munich, Germany, 85–100 (2018).
van Hertem, T. et al. Predicting broiler gait scores from activity monitoring and flock data. Biosyst. Eng. 173, 93–102 (2018).
Fazzari, E., Romano, D., Falchi, F. & Stefanini, C. Real-time behavior recognition using a legged robot for animal–robot interaction. J. Field Robot. 1–12 (2025). https://doi.org/10.1002/rob.70123.
Fazzari, E. et al. Selective state models are what you need for animal action recognition. Ecol. Inf. 85, 102955 (2025).
Funding
This work was supported by the National Key R&D Program of China (Grant No. 2023YFD2000800).
Author information
Authors and Affiliations
Contributions
Y.T. conducted the experiments, curated the dataset, and led the model development. J.W. contributed to data annotation, preprocessing, and experimental validation. B.X. assisted in methodology design and performed comparative analyses. R.K. contributed to equipment deployment and video acquisition in the poultry houses. C.Y. supported algorithm implementation and participated in result interpretation. J.L. contributed to data management, figure preparation, and manuscript editing. Z.M. assisted in system construction and visualization of detection results. L.L. supervised the engineering workflow, provided project coordination, and contributed to manuscript revisions. M.S. conceived and supervised the project, secured funding, and guided the overall research direction.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Tang, Y., Wei, J., Xie, B. et al. Lightweight multiscale behavior recognition for caged laying hens using an enhanced YOLOv8 framework. Sci Rep (2026). https://doi.org/10.1038/s41598-026-43523-7
Received:
Accepted:
Published:
DOI: https://doi.org/10.1038/s41598-026-43523-7