Scientific Reports
Research on identification method and application of unsafe behavior of coal mine personnel
  • Article
  • Open access
  • Published: 03 April 2026

  • Liang Juan1,
  • Quanjie Zhu2,
  • Dongsheng Jiang2,
  • Yan Liu3,
  • Shaojie Chen2 &
  • Yingnan Hao2

Scientific Reports (2026)

We are providing an unedited version of this manuscript to give early access to its findings. Before final publication, the manuscript will undergo further editing. Please note there may be errors present which affect the content, and all legal disclaimers apply.

Subjects

  • Energy science and technology
  • Engineering
  • Mathematics and computing

Abstract

Accurate analysis of underground surveillance video in coal mines can prevent safety accidents caused by the unsafe behavior of underground personnel and protect their safety. To this end, this study enhances the YOLOv11 object-detection algorithm and proposes a fast, effective method for identifying unsafe behavior among underground personnel in the complex coal-mine environment. First, a statistical analysis of the most common types of unsafe behavior in underground coal mines is conducted, and miners’ unsafe behaviors are classified into item-type, action-type, and area-type categories. Second, based on the characteristics of these behaviors, we propose dataset augmentation and denoising preprocessing techniques to enhance fine-grained feature extraction, and we introduce the parameter-free SimAM attention module to improve the saliency mapping of miners’ behaviors. Finally, we optimize the YOLOv11 algorithm with a function-enhancement module and K-means++ anchor boxes, and we propose a dual-model recognition method that integrates the YOLOv11 detection algorithm with the YOLOv11-Pose algorithm. We validate the method on a self-constructed dataset. The results show that it quickly and effectively recognizes unsafe behavior among underground personnel; compared with traditional methods, it significantly improves recognition accuracy on both the self-constructed and public datasets, achieving a mean Average Precision (mAP) of 95.7%, a precision of 95.3%, and a recall of 95.1%. These findings are significant for preventing underground safety accidents.
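The K-means++ anchor-box step mentioned in the abstract is not spelled out in this early-access version. As a hedged illustration only, the sketch below shows one common way such anchors are derived: cluster the (width, height) pairs of the training-label boxes using K-means++ seeding under a 1 − IoU distance. All names here (`iou_wh`, `kmeanspp_anchors`) are our own assumptions for illustration, not the authors’ code.

```python
import random

def iou_wh(box, centroid):
    # IoU of two (w, h) boxes assumed to share the same top-left corner
    w = min(box[0], centroid[0])
    h = min(box[1], centroid[1])
    inter = w * h
    union = box[0] * box[1] + centroid[0] * centroid[1] - inter
    return inter / union

def kmeanspp_anchors(boxes, k, iters=100, seed=0):
    """K-means++ seeding with 1 - IoU distance, then Lloyd iterations.

    boxes: list of (w, h) tuples taken from the training labels.
    Returns k anchor (w, h) pairs sorted by area.
    """
    rng = random.Random(seed)
    centroids = [rng.choice(boxes)]
    while len(centroids) < k:
        # distance of each box to its nearest existing centroid
        d = [min(1 - iou_wh(b, c) for c in centroids) for b in boxes]
        r = rng.uniform(0, sum(d))
        acc = 0.0
        for b, di in zip(boxes, d):
            acc += di
            if acc >= r:
                centroids.append(b)
                break
        else:  # floating-point guard: fall back to the last box
            centroids.append(boxes[-1])
    for _ in range(iters):
        # assign each box to the centroid with the highest IoU
        clusters = [[] for _ in range(k)]
        for b in boxes:
            best = max(range(k), key=lambda i: iou_wh(b, centroids[i]))
            clusters[best].append(b)
        # recompute centroids as per-cluster mean width/height
        new = []
        for i, cl in enumerate(clusters):
            if cl:
                new.append((sum(w for w, _ in cl) / len(cl),
                            sum(h for _, h in cl) / len(cl)))
            else:
                new.append(centroids[i])
        if new == centroids:
            break
        centroids = new
    return sorted(centroids, key=lambda wh: wh[0] * wh[1])
```

The 1 − IoU distance (rather than Euclidean distance) is the standard choice for anchor clustering because it is invariant to box scale, so large boxes do not dominate the objective.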

Data availability

No datasets were generated or analysed during the current study.


Funding

This work was supported by the Langfang Science and Technology Program (Grant No. 2024013001), the Hebei Natural Science Foundation (Grant No. E2023508021), and the Fundamental Research Funds for the Central Universities (Grant Nos. 3142021002 and 3142024005).

Author information

Authors and Affiliations

  1. School of Humanities and Social Science, Institute of Disaster Prevention, Langfang, 065201, Hebei, China

    Liang Juan

  2. School of Safety Emergency Technology and Management, North China Institute of Science and Technology, Langfang, 065201, Hebei, China

    Quanjie Zhu, Dongsheng Jiang, Shaojie Chen & Yingnan Hao

  3. Shiyan Tobacco Company of Hubei Province, Shiyan, 442099, Hubei, China

    Yan Liu


Contributions

Conceptualization, J.L. and Q.Z.; Data curation, J.L., Q.Z. and D.J.; Formal analysis, J.L., Q.Z. and D.J.; Funding acquisition, Q.Z. and Y.L.; Investigation, J.L., Q.Z. and S.C.; Methodology, J.L., Q.Z., D.J. and Y.L.; Project administration, J.L. and Q.Z.; Resources, S.C., Y.L. and Q.Z.; Software, Q.Z., D.J. and Y.H.; Supervision, J.L. and Q.Z.; Validation, D.J., Y.H. and S.C.; Visualization, Q.Z., D.J. and Y.H.; Writing-original draft, J.L., Q.Z. and D.J.; Writing-review and editing, Q.Z. and D.J.

Corresponding author

Correspondence to Quanjie Zhu.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article

Cite this article

Juan, L., Zhu, Q., Jiang, D. et al. Research on identification method and application of unsafe behavior of coal mine personnel. Sci Rep (2026). https://doi.org/10.1038/s41598-026-47077-6


  • Received: 17 July 2025

  • Accepted: 29 March 2026

  • Published: 03 April 2026

  • DOI: https://doi.org/10.1038/s41598-026-47077-6


Keywords

  • Mine safety
  • Unsafe behavior
  • YOLOv11
  • Denoising
  • Object detection
Scientific Reports (Sci Rep)

ISSN 2045-2322 (online)
