Abstract
Sports and activity-related musculoskeletal injuries are a major cause of long-term disability, underscoring the need for proactive detection and data-driven rehabilitation strategies. Recent advancements in wearable sensing enable continuous, non-invasive biomechanical monitoring for early risk identification and recovery optimization. This study presents a real-time wearable biomechanics framework integrating inertial measurement units (IMUs) and surface electromyography (sEMG) for injury-risk assessment and rehabilitation tracking. Field experiments were conducted with 50 athletes at Dring Stadium, Bahawalpur. The IMUs were positioned on the knee, hip, and shoulder joints, while sEMG electrodes measured biceps, triceps, and quadriceps muscle activations. Recorded joint-angle ranges averaged 125° (knee during running), 110° (knee during jumping), and 90° (shoulder during lifting); corresponding mean muscle forces were 150 N (quadriceps), 170 N (hamstrings), and 230 N (deltoid). A multi-stage optimization algorithm minimized prediction errors by jointly tuning sensor calibration and computational latency. The hybrid IMU–sEMG model achieved 92.3% accuracy, 90.5% recall, and an AUC of 0.93 for injury-risk classification, with an average real-time feedback latency of 188 ± 15 ms. Early detection of joint-angle asymmetry (> 10°) and muscle-force imbalance (> 15%) accurately predicted emerging anterior cruciate ligament (ACL) and muscle-strain risks. Real-time monitoring guided individualized rehabilitation loads and progressive recovery milestones. By combining wearable sensing, physics-informed biomechanical modeling, and adaptive machine-learning optimization, the proposed system delivers a quantitatively validated, reproducible, and scalable framework for injury prevention and rehabilitation. 
Supporting UN Sustainable Development Goal 3, this work advances musculoskeletal health monitoring through validated sensor integration and empirically tested AI models, offering measurable benefits across athletic and occupational applications.
Introduction
Protecting industrial workers’ health and enhancing athletic performance are two domains where biomechanical risk assessment plays a pivotal role1. Conventional evaluation methods based on retrospective analysis and ergonomic guidelines2 often fail to capture real-time biomechanical stress during dynamic activities such as industrial tasks or athletic movements3. Therefore, it is essential to advance assessment techniques to reflect the evolving nature of modern workplaces and competitive sports4.
Despite progress in wearable biomechanics, a significant research gap persists in integrating real-time sensing with predictive modeling for injury prevention in both occupational and sports contexts. Current systems are either limited to controlled laboratory environments or lack adaptive predictive mechanisms for proactive intervention. The motivation for this study arises from the global incidence of musculoskeletal injuries, over 3.5 million annually in sports and 2.8 million in workplace settings, underscoring the need for a portable, data-driven, and unified injury-risk detection framework that also supports personalized rehabilitation.
Recent studies have highlighted the expanding intersection between wearable technologies and biomechanics, particularly for continuous motion tracking, rehabilitation, and performance optimization5,6,7,8. For example, LSTM-based architectures have shown high accuracy in predicting joint biomechanics during gait using inertial data9, while attention-enhanced models such as SETransformer improve robustness in unstructured environments10. Emerging wearable systems including the Equilivest robotic vest and the IMUPoser full-body tracking suit further demonstrate the potential of real-time feedback for adaptive balance and motor retraining11,12.
This study introduces an integrated framework that combines inertial measurement units (IMUs) and surface electromyography (sEMG) sensors with machine-learning models for biomechanical stress quantification and injury-risk prediction. Unlike previous work that separately addressed either athletic or occupational domains, the proposed approach provides a unified methodology capable of analyzing joint angles, asymmetry indices, and muscle-force estimations across both contexts. The framework aims to reduce biomechanical overloads and asymmetries while offering interpretable, evidence-based feedback for prevention and rehabilitation.
Multiple challenges remain in ensuring accuracy, algorithmic stability, and system integration13,14,15,16. Technical constraints such as signal drift, motion artifacts, and variable sampling rates continue to hinder real-time data reliability17,18. Furthermore, contextual variability in industrial and athletic environments complicates standardization19,20,21. This research addresses these issues through validated data acquisition pipelines, robust feature-extraction methods, and adaptive learning models tailored to real-world motion conditions22,23.
Injury prevention and rehabilitation remain central to sports medicine and occupational health. Traditional laboratory tools such as force plates and optical motion capture, though accurate, are impractical for continuous deployment. Wearable technologies now enable real-time monitoring of joint kinematics and muscle function, providing a scalable solution bridging laboratory precision with field applicability.
By integrating multi-sensor fusion, machine learning, and biomechanical modeling, this study advances a validated, real-time system for injury-risk assessment and rehabilitation support. The framework aims to improve both workplace safety and athletic performance by establishing a scientifically grounded, data-driven paradigm for biomechanical monitoring and predictive injury management.
In biomechanical risk assessment, the goal is to formulate a model that effectively characterizes the stress (\(S\)) on musculoskeletal structures during dynamic activities. Let \(F\) represent the external forces acting on the body, \(A\) denote the area of contact, and \(\tau\) signify the time duration of the applied force. The stress (\(S\)) can be mathematically expressed as:

$$S = \frac{F \cdot \tau}{A}$$
Considering the dynamic nature of activities, the rate of change of stress over time (\(\dot{S}\)) becomes crucial for identifying potential risks. Taking the derivative of stress with respect to time, we formulate the dynamic stress equation as:

$$\dot{S} = \frac{dS}{dt}$$
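In discrete sensor streams, the stress rate \(\dot{S}\) is typically approximated by finite differences rather than an analytic derivative. A minimal sketch (the 50 Hz sampling rate matches the study's IMU configuration; the linearly rising stress signal is purely illustrative):

```python
import numpy as np

fs = 50.0                       # IMU sampling rate from the study, Hz
t = np.arange(0, 1, 1 / fs)     # one second of samples
S = 100 * t                     # illustrative linearly rising stress signal
S_dot = np.gradient(S, 1 / fs)  # central-difference estimate of dS/dt
print(S_dot[5])                 # ~100 (stress units per second)
```

`np.gradient` uses central differences in the interior and one-sided differences at the edges, both of which are exact for a linear signal; for noisy real sensor data, a low-pass filter before differentiation is usually needed.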
To integrate wearable sensor data, let \(D\) represent the set of biomechanical parameters collected from the sensors, including joint angles (\(\theta\)), angular velocities (\(\omega\)), and muscle forces (\(M\)). The relationship between stress (\(S\)) and sensor data (\(D\)) can be formulated using a neural network model as:

$$S = f(D; \theta)$$

where \(f\) represents the neural network function with parameters \(\theta\).
The optimization problem involves minimizing the discrepancy between the stress predicted from wearable sensor data (\(D\)) and the actual stress measured using biomechanical models, while considering constraints such as sensor accuracy and computational efficiency. This can be formulated as:

$$\min_{\theta} \; \frac{1}{N}\sum_{i=1}^{N}\left(S_i - f(D_i; \theta)\right)^2 + \lambda R(\theta)$$

subject to constraints on sensor accuracy and computational resources, where \(N\) is the number of data samples, \(R(\theta)\) denotes regularization terms, and \(\lambda\) is the regularization parameter.
The primary objective of this study is to develop and validate a real-time biomechanical risk assessment framework using wearable sensor data and deep learning models. The specific goals are:
- To integrate inertial (IMU) and surface EMG data for real-time monitoring of joint angles and muscle activation during dynamic activities.
- To design and train a neural network model capable of estimating biomechanical stress and predicting injury-prone conditions based on wearable sensor input.
- To implement and test the proposed system in athletic and occupational scenarios, quantifying risk scores across various exercises.
- To assess the feasibility and performance of wearable-based risk assessment in unstructured environments compared with conventional marker-based systems.
Research topic statement
This study specifically investigates real-time biomechanical risk assessment using wearable IMU and sEMG sensors, focusing on predicting unsafe joint-loading patterns and muscle-activation imbalances in athletic populations. The research centers on developing and validating a sensor-fusion model capable of estimating biomechanical stress in real time and identifying movement patterns associated with elevated injury risk. The scope is limited to athletic biomechanics, with occupational applications discussed only as future extensions. This explicit focus ensures that the manuscript remains aligned with a single, well-defined research topic suitable for academic investigation.
The core research question guiding this work is: How effectively can wearable sensor data be integrated with biomechanical modeling and machine learning to quantify and predict stress-related risks during athletic and occupational activities?
Supporting sub-questions include:
- To what extent can wearable-based biomechanical indices replicate laboratory-grade motion-capture accuracy?
- How can real-time sensor feedback contribute to reducing asymmetry and joint overload during rehabilitation?
- What are the algorithmic and practical limitations of deploying such systems in uncontrolled, real-world environments?

This clarified research scope ensures a targeted investigation and minimizes interpretative ambiguity across the experimental and analytical sections.
The research contributions are as follows:
- Development of a unified framework merging industrial ergonomics and athletic performance optimization through wearable-integrated biomechanical risk assessment.
- Synthesis of methodologies catering to the distinct demands of industrial and athletic settings, enhancing the adaptability and applicability of wearable systems.
- Exploration of real-time data-acquisition techniques that offer immediate insight into biomechanical risks, enabling proactive mitigation strategies in both domains.
- Contribution to comprehensive analyses that bridge contextual differences between the industrial and athletic realms, addressing their simultaneous demands.
- Establishment of adaptable, wearable-based risk assessment methodologies, paving the way for holistic approaches that minimize risk and optimize performance.
The novelty of this work lies in the development of a hybrid IMU–EMG sensor framework coupled with a multi-stage machine learning pipeline for real-time biomechanical stress prediction. Unlike previous IMU-only or EMG-only frameworks, the proposed approach introduces a dual-stream fusion model, adaptive calibration, and latency-aware optimization layer to achieve sub-200 ms feedback. The validation strategy integrates multi-trial averaging, confidence intervals, and ablation-based assessment, establishing both reproducibility and deployment readiness across sports and occupational applications.
This gap is addressed by developing an integrated wearable biomechanical system that merges inertial and electromyographic sensing with machine learning–based risk modeling, enabling immediate feedback, individualized rehabilitation pathways, and adaptability across both athletic and industrial scenarios. This investigation is presented as a sport-specific proof-of-concept, with a planned Phase 2 expansion to industrial populations for validation under occupational task conditions. While the framework is designed for cross-domain applicability, the present analysis confines empirical validation to sports biomechanics and rehabilitation. Industrial ergonomics components are discussed only as prospective extensions.
The present study focuses on developing an integrated, real-time wearable biomechanical risk assessment and rehabilitation optimization framework that unifies IMU and EMG data through adaptive machine learning pipelines. Unlike descriptive reviews, this work emphasizes empirical modeling, field validation, and latency-aware optimization to ensure real-time applicability in both sports and occupational contexts.
This study investigates how hybrid IMU–EMG data fusion combined with real-time machine learning can improve biomechanical risk detection accuracy in athletic contexts. The central hypotheses (H1–H3) posit that multi-sensor fusion significantly enhances injury-risk prediction precision, reduces latency, and supports adaptive rehabilitation feedback compared with single-sensor baselines.
This work should be interpreted as a dual-phase study: (1) a technical validation against gold-standard motion capture and force plates, and (2) a pilot intervention demonstrating feasibility in applied field settings. While promising, these findings represent groundwork rather than definitive proof of injury risk reduction.
In this manuscript, the empirical analysis and all collected datasets strictly reflect athletic biomechanics. Occupational applications are presented conceptually for future expansion and are not part of the experimental results. The primary validated scope of this study is therefore athletic injury-risk monitoring, with occupational integration planned for Phase 2.
Definitions of key terms:
- Integrated wearable biomechanical system: a unified platform combining wearable sensors (e.g., IMUs, EMG) with real-time data processing and feedback mechanisms for injury-risk assessment and rehabilitation monitoring.
- Risk modeling: the computational process of estimating injury likelihood using biomechanical parameters (e.g., joint angles, muscle forces, asymmetry indices) as input features to machine-learning algorithms.
- Rehabilitation optimization: the adaptive adjustment of rehabilitation exercises and loads based on continuous monitoring of recovery indicators, including range of motion (ROM), muscle-activation patterns, and symmetry metrics.
The remainder of the paper is organized as follows. The introduction motivates biomechanical risk assessment and the role of wearable devices. The literature review presents a thorough analysis of current approaches and tools for biomechanical risk assessment, with emphasis on the use of wearables in athletic and occupational contexts. The methodology section explains the strategy developed to integrate data from wearable sensors and to construct the biomechanical risk assessment models. The results and analysis section reports the findings obtained by applying this methodology, followed by a discussion of their implications for athletic performance and workplace safety. The challenges and future directions section examines problems encountered and suggests avenues for future research. Finally, the conclusion summarizes the study's main points, contributions, and implications, and the reference list provides a comprehensive inventory of all sources used.
Literature review
Recent advancements in wearable technologies have significantly transformed sports medicine and biomechanical analysis, providing real-time insights into athletes’ physiological responses, performance metrics, and recovery progress. Studies by Li et al. (2016) and Seçkin et al. (2023) emphasized the widespread adoption of wearables in sports medicine, highlighting their capacity to monitor physical performance and optimize training programs. These reviews underscored the critical role of wearable devices in delivering immediate feedback and supporting personalized intervention strategies for athletes. Despite the growing popularity of wearables, several gaps and challenges persist in the literature. Montull et al. (2022) stressed the importance of incorporating subjective assessments alongside objective sensor-based measurements, noting that subjective feedback can, in some contexts, provide deeper insight into fatigue, effort, and readiness. This reinforces the need for a holistic sports monitoring framework that integrates both physiological data and perceptual measures to better capture the nuances of athletic performance.
Several studies have addressed the practical challenges of implementing wearable technologies. De Fazio et al. (2023) and Baca et al. (2022) identified concerns related to data accuracy, sensor calibration, comfort, and long-term user adherence, which limit large-scale adoption. These studies emphasized the necessity of improving device reliability and usability to fully leverage wearable potential in real-world settings. Pekas et al. (2023) further discussed the ethical and privacy implications of continuous biometric data collection, urging the development of clear regulatory and data protection frameworks. Similarly, Sousa et al. (2023) highlighted the urgent need for standardized validation protocols and consistent methodologies to ensure the reliability and comparability of wearable-derived biomechanical metrics. Collectively, these findings emphasize that usability, ethics, and standardization are key to the future integration of wearable technology into evidence-based sports and medical practice.
In parallel, numerous studies have examined the integration of wearable systems into biomechanical risk assessment and occupational ergonomics. Research in24 demonstrated that wearables can effectively monitor and analyze biomechanical stressors, including muscle exertion and joint strain, in industrial environments, enhancing physical task evaluation and workplace safety. Subsequent studies25,26 confirmed the value of wearable devices in sports science for gait analysis, muscle activation tracking, and injury prevention, extending their utility beyond laboratory-based experiments. These studies also recognized the potential of miniaturized, wireless inertial sensors to capture large-scale, individualized kinematic data across both controlled and field environments, marking a major shift toward real-time, personalized biomechanics.
Challenges regarding data reliability, system compatibility, and computational efficiency remain substantial. As highlighted in27, inconsistent sampling rates, limited interoperability, and high computational demands constrain seamless integration into applied biomechanical assessment. Addressing these challenges requires robust data fusion algorithms, efficient processing architectures, and improved hardware ergonomics to enhance comfort and compliance. Researchers in28 advanced algorithmic and machine learning methods for wearable data interpretation, using neural networks and pattern-recognition techniques to improve prediction accuracy and biomechanical insight. These algorithmic developments play a pivotal role in transforming raw sensor signals into actionable parameters for injury prediction, workload estimation, and postural correction.
Continuous progress in wearable sensor technology has been central to recent research29,30,31. Studies have reported advancements in sensor miniaturization, battery life, textile integration, and signal stability, all of which enhance continuous biomechanical monitoring. Moreover, wearable systems have increasingly been applied in healthcare and rehabilitation contexts30, supporting patient movement analysis, recovery tracking, and musculoskeletal rehabilitation. These interdisciplinary applications highlight the broad relevance of wearable systems across occupational, clinical, and athletic domains.
Ethical and societal considerations have also gained attention in recent research32. Scholars have raised concerns about data security, informed consent, and responsible data handling, emphasizing the need for transparent frameworks to maintain trust and protect user autonomy. Cost-effectiveness and scalability have been evaluated as well33, showing that while initial investment costs remain high, the long-term benefits—such as injury prevention, improved productivity, and enhanced recovery—outweigh the expense. Moreover, scalable solutions and open architectures have been proposed to promote the adoption of wearables across various industries and demographics.
Recent studies combining wearable sensors with machine learning have pushed the boundaries of biomechanical prediction. Xiang et al.5,9 demonstrated that LSTM networks coupled with personalized shape modeling can accurately estimate bone and ankle stress in runners. Han et al.6 and Bi et al.7 achieved over 88% accuracy in injury prediction using wearable data and data-driven movement analysis, while Cao and Liu8 discussed the long-term benefits of predictive modeling for athlete longevity. The SETransformer model10 further enhanced activity recognition in uncontrolled environments, offering improved robustness to noise and motion variability. Systems such as Equilivest11 and IMUPoser12 exemplify applied wearable frameworks for real-time posture correction and rehabilitation, providing actionable feedback for users and therapists. These advancements collectively demonstrate a paradigm shift from reactive injury treatment to proactive, predictive, and adaptive injury management.
Looking forward, future research is expected to extend the application of wearables toward personalized healthcare, precision rehabilitation, and performance optimization. Studies34,35,36,37 envision wearables as integral components of individualized biomechanical monitoring systems, capable of providing context-aware interventions and adaptive feedback. To achieve this vision, research must focus on validation frameworks, user-centered design, and ethical data governance to ensure reliability, inclusiveness, and privacy. By addressing these key challenges, the full potential of wearable technologies can be realized—transforming biomechanical risk assessment, promoting safer workplaces, and enhancing athletic and clinical performance outcomes.
Table 1 presents a comparative analysis of previous studies in the field, outlining each study's techniques, limitations, and outcomes. For instance, the study in38 emphasizes wearables for motor-skill optimization despite limited battery life, reporting enhanced motor skills for industrial tasks; another example46 introduces the ABLE exoskeleton in the automotive industry, noting limited adaptability but achieving reduced ergonomic strain in automobile assembly.
In surveying the landscape of wearable-integrated biomechanical risk assessment across the industrial and sporting sectors, a distinct research need emerges. While current studies comprehensively address the development and deployment of wearable technology for risk assessment and optimization, there is a significant paucity of evaluations that bridge the specific contextual variations between industrial and sporting environments48. The present body of work tends to focus either on industrial ergonomics or on sports performance enhancement alone, lacking a coherent approach that integrates methodology and findings across these disparate areas. Moreover, little research addresses optimizing wearable-based risk assessment approaches to fit the diverse and simultaneous demands of both industrial and sporting contexts. This gap presents an opportunity for future investigations to synthesize approaches, develop flexible wearable devices, and establish unified frameworks that cater to the varied needs of both areas, ultimately yielding holistic biomechanical risk assessment methodologies.
Recent developments have advanced the integration of wearable sensing modalities and machine learning for both injury prevention and motion analysis. Xu et al.49 presented a data-driven deep learning framework for predicting ligament fatigue failure risk mechanisms, combining multimodal biomechanical inputs with predictive modeling to identify high-risk loading patterns in dynamic sports tasks. Their approach demonstrated that real-time injury risk estimation is feasible when high-fidelity wearable sensor data are fused with neural network architectures. Similarly, Xu et al.50 developed a method for human gait pattern recognition with practical applications in both sports performance assessment and clinical gait analysis. By leveraging wearable IMU data and advanced classification algorithms, the study achieved enhanced gait phase detection accuracy and robustness in unstructured environments. These works highlight the potential of integrating inertial measurement units (IMUs), surface electromyography (sEMG), and deep learning to improve motion pattern recognition and injury risk prediction, providing empirical support for expanding wearable-based biomechanical systems into both preventive and rehabilitative domains.
Recent studies emphasize the growing convergence of mobile health and wearable systems for biomechanical monitoring and clinical rehabilitation. Tedeschi51 reviewed the potential of iPhone-based applications in podiatric diagnostics, highlighting the feasibility of smartphone sensors for motion tracking and clinical assessment. Similarly, Tedeschi et al.52 evaluated the Apple Watch as a rehabilitation-support tool, demonstrating its impact on physical activity regulation and cardiovascular health outcomes. These works collectively illustrate clinical momentum toward accessible, sensor-driven biomechanics. Building on this foundation, the present research advances from descriptive assessments to an integrated, data-validated framework that connects wearable sensor streams with machine-learning-based biomechanical risk prediction and adaptive rehabilitation modeling.
Recent advances in wearable robotics and adaptive actuation have further expanded the potential of biomechanical sensing and rehabilitation systems. Sadeghi et al.53 developed a shape-memory-alloy-based exo-glove that combines numerical modeling and experimental validation to support precise hand motion control during telerehabilitation. Similarly, Lee and Park54 introduced a soft wearable exoglove using knitted shape-memory-alloy actuators to provide flexible, user-adaptive assistance in hand rehabilitation tasks. These studies illustrate the growing convergence between wearable sensing, robotic actuation, and adaptive control, highlighting the importance of calibration accuracy, mechanical compliance, and multi-sensor feedback in human–machine interfaces. Integrating such robotic design principles into the proposed IMU- and EMG-driven biomechanical framework enhances its cross-domain robustness, enabling more precise calibration, dynamic modeling, and real-time adaptive control for both athletic and occupational applications.
In summary, while a wide range of studies have explored wearable technologies for performance monitoring, only a limited number directly link sensor-derived biomechanics to validated injury mechanisms. Benjaminse et al.55 established the relationship between joint loading and agility-related injury risk using physics-based modeling integrated with machine-learning methods, whereas Xiang et al.9 demonstrated accurate LSTM-based prediction of ankle joint stresses from inertial sensor data. However, current literature still lacks unified frameworks that bridge wearable sensing, biomechanical modeling, and biological validation under real-world conditions. The present study specifically addresses this gap by emphasizing experimentally validated sensor calibration, standardized protocols, and biologically interpretable parameters to improve both injury prediction and rehabilitation monitoring.
Methodology
Preliminary trials using the Samsung Galaxy Watch, Samsung Gear Fit, and Polar Vantage V2 were conducted for cross-device validity assessment. These exploratory results are supplementary and have been moved to Appendix D, while the main methodology reflects the final research-grade Xsens IMU and Delsys EMG configuration. Images illustrating the placement of these wearables on the user’s body are provided in Fig. 1 for clarity (see Appendix D).

Four subjects (Athletes A1, A2, A3, and A4) were recruited for these preliminary trials. Inclusion criteria were active participation in sports and ongoing physiotherapy; exclusion criteria encompassed any medical condition that could affect physiological responses during exercise. Each subject’s demographic information, including age, gender, and fitness level, was recorded to ensure diversity among participants. The trials employed a within-subject design in which each participant served as their own control. Subjects performed a series of standardized physical activities, including running, jumping, and strength exercises, while wearing the designated wearable devices. Each activity session lasted 30 to 60 min, with intermittent rest periods to prevent fatigue, and data collection took place in a controlled laboratory setting to minimize the influence of external variables on the measurements.

Raw sensor data collected from the wearable devices were processed using proprietary software provided by the manufacturers. Preprocessing steps included noise reduction, signal filtering, and synchronization of data streams from multiple sensors. Feature-extraction algorithms were then applied to derive relevant metrics such as heart rate, joint angles, and muscle-activation levels, and the processed data were analyzed to identify patterns and trends in physiological responses across activities. Statistical analysis was performed using SPSS software (IBM Corporation, Armonk, NY).
Descriptive statistics, including means, standard deviations, minimum, and maximum values, were calculated for each performance metric. Paired t-tests were conducted to compare differences in physiological parameters between conditions (with and without wearables) and to assess the significance of observed changes. A significance level of p < 0.05 was used for all statistical tests.
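The study ran its paired t-tests in SPSS; an equivalent check in Python's SciPy might look like the following. The heart-rate values here are made up purely for illustration and are not the study's data.

```python
from scipy import stats

# illustrative paired heart-rate measurements for the same 6 subjects
hr_without = [142, 138, 150, 147, 139, 145]   # condition A: without wearables
hr_with    = [136, 133, 144, 141, 135, 140]   # condition B: with wearables
t_stat, p_value = stats.ttest_rel(hr_without, hr_with)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("difference is significant at p < 0.05")
```

`ttest_rel` is the paired (related-samples) variant; using the unpaired `ttest_ind` on within-subject data would discard the pairing and inflate the variance estimate.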
Research design
The research design outlines the overall strategy employed to address the research questions and objectives. In this study, a mixed-methods approach is adopted, combining both qualitative and quantitative methodologies. Qualitative data, such as insights from expert interviews and user experiences, complement the quantitative analysis of biomechanical data collected from wearable sensors.
Figure 1 illustrates the integrated process of biomechanical risk assessment, wearables, and optimization through data collection and analysis. It visually represents how wearable devices contribute to gathering data, enabling comprehensive biomechanical risk assessment, and subsequently optimizing performance through informed data analysis. The figure shows the interconnectedness of these elements in a seamless workflow.
Population
The population in this study comprises athletes and industrial workers engaged in dynamic activities. Specifically, athletes from Dring Stadium Bahawalpur and industrial workers from relevant occupational settings form the target population for data collection. The focus on both athletic and occupational populations contributes to the comprehensive nature of biomechanical risk assessment.
Sample-size justification was performed through an a-priori power analysis using G*Power 3.1 (effect size f = 0.25, α = 0.05, power = 0.80), yielding a required minimum of 44 participants. The recruited 50 athletes thus satisfy statistical adequacy despite moderate gender imbalance.
Although the methodology and risk assessment framework are intended for both athletic and industrial settings, the current data collection phase involved only athletic subjects. Industrial validation remains a key future objective, and the comparative elements in later sections (e.g., Table 2) are presented as prospective analyses.
Ethical approval was granted by the Institutional Review Board of The Islamia University of Bahawalpur (Ref# IUB-IRB-2023-013). All participants provided informed consent.
The IMUs (Xsens MTw Awinda) were mounted at the upper arm, forearm, thigh, shank, and trunk, while surface EMG electrodes (Delsys Trigno) were positioned on the biceps brachii, triceps brachii, quadriceps, and hamstrings following SENIAM electrode-placement guidelines. Sensor calibration was performed using a static T-pose alignment and zero-offset correction, achieving angular accuracy below 1.2°. Synchronization between IMU (50 Hz) and EMG (1000 Hz) streams was ensured through a Bluetooth timestamp-alignment protocol with drift compensation every 5 s.
Feature extraction included mean absolute value, root mean square, zero-crossing rate, and waveform length from EMG, while IMU data yielded angular velocity, linear acceleration, and joint-orientation features. A bidirectional LSTM network was trained using the Adam optimizer (learning rate = 0.001, batch size = 32) with an 80/20 training–validation split and early stopping. The objective function combined mean-squared error with L2 regularization to prevent overfitting.
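As a concrete illustration, the four EMG time-domain features named above can be computed per analysis window in a few lines of NumPy. This is a minimal sketch, not the paper's implementation; `emg_features` is a hypothetical helper name, and the zero-crossing rate is taken as the fraction of consecutive samples that change sign:

```python
import numpy as np

def emg_features(window: np.ndarray) -> dict:
    """Compute the four time-domain EMG features named in the text
    for one analysis window (1-D array of amplitudes)."""
    x = np.asarray(window, dtype=float)
    mav = np.mean(np.abs(x))                       # mean absolute value
    rms = np.sqrt(np.mean(x ** 2))                 # root mean square
    # zero-crossing rate: fraction of adjacent sample pairs with a sign change
    zc = np.mean(np.signbit(x[:-1]) != np.signbit(x[1:]))
    wl = np.sum(np.abs(np.diff(x)))                # waveform length
    return {"MAV": mav, "RMS": rms, "ZC": zc, "WL": wl}
```

Each feature is scalar per window, so a trial yields one feature vector per window for the downstream LSTM input alongside the IMU-derived kinematic features.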
Raw signals were denoised using a fourth-order Butterworth low-pass filter with a 6 Hz cutoff to remove motion artifacts. The entire pipeline was implemented in TensorFlow Lite for real-time inference, achieving an average latency of 178 ± 12 ms, which meets real-time biomechanical feedback requirements.
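The denoising step described above can be sketched with SciPy's standard Butterworth design. This is an illustrative helper (`lowpass_6hz` is an assumed name, and the TensorFlow Lite deployment is not reproduced here); `filtfilt` applies the filter forward and backward, so the result is zero-phase:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_6hz(signal: np.ndarray, fs: float = 50.0, order: int = 4) -> np.ndarray:
    """Fourth-order zero-phase Butterworth low-pass at 6 Hz, as in the text.
    fs is the sampling rate (the IMU streams run at 50 Hz)."""
    b, a = butter(order, 6.0 / (fs / 2.0), btype="low")  # normalized cutoff
    return filtfilt(b, a, signal)                        # zero-lag filtering
```

With a 50 Hz IMU stream the 6 Hz cutoff preserves voluntary-movement frequencies while strongly attenuating impact and vibration artifacts near the Nyquist limit.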
Population characteristics
Table 3 consolidates the core components of wearable biomechanics integration into a single reference format. It links each sensor type to its corresponding biomechanical measurements and real-time applications, ensuring that readers can quickly grasp the practical role of each technology in the system.
Sample size
A representative sample was drawn from the active athlete population to ensure that the findings are generalizable and statistically robust. The final dataset comprises 50 athletes who participated in data collection at Dring Stadium, Bahawalpur during December 2023. Sample-size determination considered biomechanical variability, a 95% confidence level, and effect-size sensitivity for paired comparisons, consistent with the a-priori power analysis reported above.
Inclusion criteria: participants aged 18–30 years, engaged in competitive or semi-professional sports for at least 2 years, and free from any acute musculoskeletal injury within 6 weeks prior to testing.
Exclusion criteria: chronic orthopedic or neurological conditions, cardiovascular disease, or non-compliance with wearable-sensor calibration procedures.
Each participant completed a pre-participation medical screening verified by a certified physiotherapist, documenting previous injury history (e.g., ACL tear, hamstring strain, shoulder impingement) to reduce selection bias and ensure accurate injury-risk stratification.
All 50 participants performed standardized movement tasks—including running, lifting, and jumping—under both wearable and non-wearable conditions in a within-subject crossover design, allowing direct biomechanical comparison. A separate longitudinal validation study involving 80 athletes is planned over 12 months to evaluate long-term injury-prediction and rehabilitation-monitoring performance.
Hypothesis development
This study develops clear, measurable hypotheses linking wearable sensor data with biomechanical stress estimation and rehabilitation outcomes. Each hypothesis is grounded in empirical relationships observed in prior biomechanical studies and machine-learning-based stress modeling.
H1
Real-time integration of wearable sensor data significantly enhances biomechanical risk detection accuracy compared to baseline video or manual analysis, with an expected improvement (\(\Delta\)accuracy) greater than 15%.
H2
Machine-learning-based stress modeling demonstrates statistically significant predictive reliability across multiple joint types, with a coefficient of determination (R²) > 0.85 and significance level (p < 0.05).
H3
Real-time feedback from wearable systems reduces joint asymmetry and muscle imbalance by at least 10% during rehabilitation sessions.
To validate these hypotheses, the study applied paired t-tests for pre- and post-intervention comparisons, 10-fold cross-validation for model reliability assessment, and Cohen’s d to quantify effect sizes. Statistical analyses were performed using SPSS v29 and Python (TensorFlow 2.14) to ensure reproducibility and robustness (see STable 1 in Supplementary (S1) Material for complete software list).
Variable
The primary variable of interest is biomechanical stress, measured using wearable sensors. Secondary variables include joint angles, angular velocities, muscle forces, and other parameters captured by the wearables. Context-specific variables, such as the type of athletic activity or industrial task, are also considered to account for variations in stress levels.
Quantitative analysis
Quantitative analysis involves processing and interpreting the biomechanical data collected from the wearable sensors. Advanced algorithms, including machine-learning models, are employed to establish relationships between sensor data and biomechanical stress levels. The optimization algorithm described earlier is a key component of the quantitative analysis, aiming to minimize discrepancies and enhance the accuracy of risk assessment.
The collected data specifically includes biomechanical information from athletes at Dring Stadium Bahawalpur, allowing for a detailed analysis of their movements and stress levels during various activities. This localized data adds a contextual dimension to the quantitative analysis, making the findings relevant to the specific athletic environment under study.
Average joint angles were computed as the mean peak flexion across key joints (knee, hip, ankle) during mid-phase of movement using IMU recordings. Muscle forces were estimated using a regression model trained on EMG amplitude and joint torque patterns calibrated against reference isokinetic dynamometer data.
To ensure methodological rigor, a subject-wise 10-fold cross-validation strategy was employed, guaranteeing that data from any individual participant was not simultaneously present in both training and testing folds. This approach prevented information leakage and ensured that model performance reflected true generalization across subjects. Furthermore, hyperparameters were optimized using a nested cross-validation loop, providing an unbiased estimate of predictive performance and reducing the risk of overfitting.
Confirmatory factor analysis (CFA)
CFA was applied solely to verify the internal consistency of latent constructs (sensor reliability, user comfort, perceived fatigue) within the post-session survey, not to model biomechanical relationships. This ensures psychometric reliability of subjective measures complementing quantitative sensor data.
Following the exploratory phase, where preliminary factor structures were identified, we performed a Confirmatory Factor Analysis (CFA) to statistically validate the measurement model and assess the goodness-of-fit for a single-factor structure. CFA allows for explicit hypothesis testing regarding factor structure, providing fit indices to determine whether the observed data align with the proposed model. This step was essential because Exploratory Factor Analysis (EFA) alone does not test model fit.
Methodology
The CFA was conducted using AMOS 24.0 with the Maximum Likelihood Estimation (MLE) method. The measurement model comprised all retained items from the EFA phase, with each item specified to load on a single latent factor representing the construct under investigation. The following fit indices were computed to assess model adequacy:
-
Chi-square to degrees-of-freedom ratio \(\chi^{2}/df\)—values < 3 indicate acceptable fit.
-
Comparative Fit Index (CFI)—values ≥ 0.90 indicate good fit.
-
Tucker–Lewis Index (TLI)—values ≥ 0.90 indicate good fit.
-
Root Mean Square Error of Approximation (RMSEA)—values ≤ 0.08 indicate reasonable error.
-
Standardized Root Mean Square Residual (SRMR)—values ≤ 0.08 indicate good fit.
Results
The CFA results demonstrated an acceptable model fit for the single-factor structure:
-
\(\chi^{2}/df\) = 1.92;
-
CFI = 0.94;
-
TLI = 0.92;
-
RMSEA = 0.052 (90% CI: 0.044–0.060);
-
SRMR = 0.047.
All standardized factor loadings were statistically significant (p < 0.001) and ranged between 0.63 and 0.81, exceeding the recommended threshold of 0.50, indicating strong convergent validity.
Reliability and validity assessment:
-
Composite Reliability (CR) for the factor was 0.87, exceeding the 0.70 benchmark.
-
Average Variance Extracted (AVE) was 0.56, surpassing the minimum 0.50 threshold, confirming convergent validity.
Interpretation
The CFA results confirm that the proposed single-factor structure fits the data well and demonstrates high internal consistency, convergent validity, and reliability. This provides strong empirical support for the construct validity of the measurement instrument, addressing the limitation of relying solely on EFA.
Wearable sensor selection
The first step involves selecting appropriate wearable sensors capable of capturing high-fidelity biomechanical data relevant to injury risk assessment and rehabilitation monitoring. The selection process prioritized measurement accuracy, comfort, durability in varied environments, and compatibility with both sporting and occupational tasks. The sensor suite included devices for quantifying joint kinematics, angular velocities, segment accelerations, muscle activation patterns, and estimated muscle forces.
Sensor hardware and configuration
Data were collected using Polar Vantage V2 smartwatches equipped with custom firmware to access raw Inertial Measurement Unit (IMU) and heart rate data via the Polar SDK v3.6. IMU signals included tri-axial accelerometer, gyroscope, and magnetometer outputs, sampled at 50 Hz. Heart rate was recorded at 1 Hz. sEMG recordings were acquired using the Delsys Trigno system described above, with electrodes placed over target muscle bellies following SENIAM guidelines.
Two classes of wearable systems were employed to evaluate cross-device validity and adaptability across sensor ecosystems. Samsung Galaxy Watch 5 and Gear Fit Pro 2 were used for general motion and heart-rate data, while the Polar Vantage V2 was selected for higher-frequency IMU capture (200 Hz) and ECG-linked heart-rate precision. The dual-device approach enabled comparison of commercial-grade versus research-grade performance under identical test conditions.
Kinematic model and segment definitions
A reduced joint-set kinematic model was employed, focusing on the lower extremities and lumbar spine. Segments modeled included thigh, shank, foot, and lumbar segment, with the following degrees of freedom (DOF) assigned:
-
Hip: 3 DOF (flexion/extension, abduction/adduction, internal/external rotation).
-
Knee: 1 DOF (flexion/extension).
-
Ankle: 2 DOF (dorsiflexion/plantarflexion, inversion/eversion).
-
Lumbar spine: 3 DOF (flexion/extension, lateral flexion, rotation).
Joint coordinate systems were defined according to ISB recommendations to ensure anatomical consistency.
Sensor fusion and filtering
Raw IMU signals were fused using a Madgwick quaternion-based orientation filter to derive segment orientations in global coordinates. Angular velocities and accelerations were low-pass filtered at 6 Hz using a fourth-order zero-lag Butterworth filter to remove high-frequency noise. Magnetometer readings were calibrated prior to each session to minimize heading drift.
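To make the orientation step concrete, the sketch below shows only the gyroscope propagation core that the Madgwick filter builds on: integrating the quaternion derivative \(\dot{q} = \tfrac{1}{2}\, q \otimes (0, \omega_x, \omega_y, \omega_z)\) and renormalizing. The full filter additionally applies a gradient-descent correction from accelerometer and magnetometer readings, which is omitted here; both function names are illustrative:

```python
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def integrate_gyro(q, omega, dt):
    """One gyro-only orientation step: q_dot = 0.5 * q (x) (0, wx, wy, wz).
    The Madgwick filter adds an accelerometer/magnetometer correction term
    here; this sketch shows the propagation step alone."""
    q_dot = 0.5 * quat_multiply(q, np.array([0.0, *omega]))
    q = q + q_dot * dt
    return q / np.linalg.norm(q)   # renormalize to a unit quaternion
```

Without the correction term this integration drifts over time, which is exactly the error the accelerometer/magnetometer fusion and the per-session magnetometer calibration described above are meant to bound.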
Table 4 reports the final research-grade sensor suite (Xsens IMUs and Delsys EMG) used for validated analysis, while specifications for the consumer-grade Samsung/Polar pilot devices—along with an illustration of their placement on the user’s body (SFigure 5, Supplementary Material)—are provided in Appendix D as STable 3 (Supplementary File (S1)).
Figure 2 presents the complete architecture of the real-time wearable biomechanics system, integrating raw IMU and EMG data streams, drift correction, normalization, feature extraction, and predictive modeling within a unified pipeline. The layered structure visually separates preprocessing, defense modules, learning functions, and output generation to enhance interpretability. The model supports real-time estimation of biomechanical risk scores and fatigue levels for both athletic and clinical applications.
Calibration followed ISB alignment and functional axis-refinement protocols (Appendix B, Supplementary File (S1)).
sEMG-based muscle force estimation
Muscle forces were not measured directly but estimated from surface EMG using a Gaussian Process Regression (GPR) model trained on reference data from isometric maximum voluntary contractions (MVCs) for quadriceps, hamstrings, and biceps. MVC trials were 3–5 s in duration, repeated three times with 1-minute rest intervals. RMS EMG values were extracted after band-pass filtering (20–450 Hz) and full-wave rectification, then normalized to MVC before input to the GPR model. This model achieved an RMSE of ± 8.7 N and R² = 0.92 against isokinetic dynamometer data.
Soft tissue artifact mitigation
To reduce soft tissue motion artifacts, sensors were secured with elastic compression sleeves, and electrode sites were prepared by shaving and cleaning with alcohol to minimize impedance. Outlier removal was applied to kinematic signals using a Hampel filter, and residual movement noise was attenuated with smoothing splines during post-processing.
This multi-layered sensor configuration and processing workflow ensured high-quality biomechanical measurements suitable for both predictive modeling and rehabilitation monitoring applications.
To ensure consistency across subjects and trials, all electromyographic (EMG) signals were normalized using a maximum voluntary contraction (MVC) protocol. The target muscles for EMG acquisition included the biceps brachii, triceps brachii, rectus femoris, vastus lateralis, tibialis anterior, and gastrocnemius, as these are primary contributors to the studied motor tasks. Each participant performed isometric contractions in a standardized seated or standing posture depending on the muscle group under investigation. For each muscle, participants were instructed to maintain a submaximal warm-up contraction, followed by three MVC trials, each sustained for 5 s with a 60-second rest interval between trials to minimize fatigue. Raw EMG data were first high-pass filtered (20 Hz) to remove motion artifacts, full-wave rectified, and then low-pass filtered (5 Hz) to extract the linear envelope. This preprocessed signal was then normalized by dividing it by the peak amplitude recorded during MVC for the respective muscle, enabling inter-subject and inter-session comparison.
Muscle force estimation
A hybrid EMG-driven muscle force estimation framework was implemented to predict real-time muscle force outputs from recorded biosignals. The framework integrates physiological modeling with data-driven regression to leverage the strengths of both approaches. Normalized EMG signals from the selected muscles served as primary inputs, while joint kinematics and segment anthropometry provided additional constraints. Preprocessing included band-pass filtering (20–450 Hz), normalization via MVC as described in Sect. 3.7 and temporal alignment with motion capture data. The model architecture comprised a feedforward neural network with two hidden layers (64 and 32 neurons, respectively) activated by rectified linear units (ReLU), followed by an output layer producing continuous force estimates for each muscle. Physiological constraints, including muscle-specific maximum force capacity and activation dynamics, were embedded in the output post-processing stage to ensure biomechanical plausibility. Ground-truth force values for model training and validation were obtained from an isokinetic dynamometer during controlled contractions, enabling accurate calibration of EMG–force relationships for each subject.
Experimental setup
An experimental setup was designed to deploy wearable sensors in various settings, including industrial workstations and athletic training facilities. The setup aimed to capture real-time biomechanical data during dynamic activities, ensuring adaptability of the wearable devices to different tasks and movements.
Each subject performed eight exercises—running, jumping, squatting, bicep curl, tricep extension, leg curl, shoulder press, and arm extension. Each set lasted 30 s with 90 s of rest between sets. Three sets per exercise were performed, and muscle and joint data were averaged across trials to obtain representative biomechanical profiles.
Calibration and validation
Each wearable module was validated against laboratory-grade reference systems, including an OptiTrack Prime 13 motion-capture setup for kinematic validation and a Bertec force plate for kinetic benchmarking. Comparative calibration trials (n = 10) were conducted under standardized conditions. The results demonstrated a mean absolute error of 3.2° in joint-angle estimation and 5.4 N in ground-reaction-force measurement. These calibration outcomes confirm that the wearable system’s output closely aligns with biomechanical gold standards, ensuring methodological coherence and genuine data validity for subsequent analyses.
Although a control group was not established, baseline data were collected during low-intensity warm-up sessions for intra-subject normalization. This reference dataset functioned as a quasi-control condition, allowing within-participant comparison of stress variations while preserving ethical feasibility and resource limitations.
Movement task rationale and risk differentiation
The selection of movement tasks in this study was based on their high ecological validity, biomechanical relevance, and established use in injury risk assessment protocols. Specifically, tasks such as the drop vertical jump, cutting maneuvers, single-leg landing, and squat were chosen because they closely replicate the dynamic demands encountered in real-world sporting and occupational contexts. These tasks are biomechanically linked to known injury mechanisms—particularly anterior cruciate ligament (ACL) rupture, patellofemoral pain, and lumbar spine overuse—through factors such as knee valgus collapse, excessive ground reaction forces, and altered hip–knee–ankle kinematics. Moreover, they are sensitive to detecting changes in movement patterns associated with neuromuscular fatigue, compensatory strategies, and loss of proprioceptive control, as demonstrated in prior studies. This dual focus on ecological realism and biomechanical specificity ensures that the captured data are both representative and diagnostically meaningful.
To differentiate “risky” deviations from normal intra-individual variability, we implemented a multi-step variability classification framework. First, a baseline intra-individual variability profile was established for each participant during a rested, non-fatigue state across multiple repetitions. This baseline captured the natural variability range for each kinematic and kinetic parameter. Second, movement variability was classified into “good” variability (functional adaptations that maintain or enhance joint loading safety) and “bad” variability (maladaptive patterns linked to elevated joint stress or loss of control). Third, statistical control limits were applied (e.g., ± 2 standard deviations from baseline mean) to flag outlier patterns for further analysis. Finally, biomechanical thresholds derived from injury literature—for example, knee abduction angles > 10°, vertical ground reaction forces > 6× bodyweight, or asymmetrical indices > 15%—were used as hard cut-offs to label patterns as high-risk. This approach minimizes false positives by ensuring that not all deviations from baseline are deemed risky, but only those crossing both statistical and biomechanical thresholds.
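The two-stage decision rule above (a statistical outlier check against the participant's baseline, then a literature-derived hard cut-off) reduces to a small function. This is a sketch under the stated ± 2 SD and hard-threshold conventions; `classify_risk` is an illustrative name:

```python
import numpy as np

def classify_risk(values, baseline, hard_threshold, k=2.0):
    """Flag a trial as high-risk only if it is BOTH a statistical outlier
    (outside baseline mean +/- k*SD) AND beyond the literature-derived hard
    cut-off, mirroring the two-stage rule in the text."""
    mu, sd = np.mean(baseline), np.std(baseline)
    values = np.asarray(values, dtype=float)
    statistical_outlier = np.abs(values - mu) > k * sd   # stage 1: control limits
    exceeds_hard_limit = values > hard_threshold         # stage 2: hard cut-off
    return statistical_outlier & exceeds_hard_limit
```

Requiring both conditions is what suppresses false positives: a value can drift outside a participant's baseline band (ordinary variability) without ever crossing the injury-literature threshold, and only the conjunction is labeled risky.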
In the machine learning pipeline, the model was trained using expert-labeled datasets where each trial was classified by two certified sports biomechanists. This labeling distinguished between safe deviations and genuinely risky patterns, improving the model’s specificity in injury risk prediction. Training the model on expert annotations rather than raw deviation magnitudes also ensured alignment with clinically relevant definitions of “risk.”
Table 5 presents the rationale and characteristics of the selected movement tasks, highlighting their ecological validity, biomechanical relevance to known injury mechanisms, and sensitivity to detect performance changes such as fatigue or compensatory strategies. The table also summarizes established literature protocols used for comparison and outlines the biomechanical thresholds applied to classify “risky” deviations from normal variability. (See Appendix C)
Data collection and analysis
Biomechanical data collected from the wearable sensors (Table 5) are processed and analyzed using advanced algorithms, particularly neural networks. The goal is to establish a relationship between sensor data and biomechanical stress levels. Machine learning models are trained to interpret the data and predict stress on musculoskeletal structures.
Table 5 summarizes the dataset composition and collection protocol. Fifty athletes (aged 18–30) participated at Dring Stadium, representing diverse training backgrounds and injury histories. The dataset incorporated 20 sprinters, 15 weightlifters, and 15 jumpers, including 18 participants with prior injuries such as ACL tears or hamstring strains, providing biomechanical variability for model generalization.
Wearable sensors (IMU and sEMG) were placed on key joints—knees, elbows, and ankles—and major muscles to record motion kinematics and muscle activations during sprinting, endurance, and strength exercises. Raw IMU data were filtered with a 4th-order Butterworth low-pass filter (cut-off 6 Hz) to suppress motion noise, while EMG signals were rectified and smoothed using RMS with a 100 ms sliding window. Joint-angle variables included hip, knee, elbow, and shoulder flexion/extension; EMG data captured activation from biceps, triceps, quadriceps, hamstrings, and calves. Datasets were clearly segmented: Dataset A comprises real-world sensor recordings; Dataset B includes simulation-based augmentation for model training. Results referring to Dataset B are denoted as ‘synthetic’ in tables and captions. The dataset primarily includes athletic participants and has not yet been validated in industrial or occupational contexts; this limitation is recognized, and expansion to broader populations is planned in future work.
Standardized protocol and calibration
All wearable sensors were calibrated before each session following SENIAM guidelines for EMG placement and the International Society of Biomechanics (ISB) recommendations for joint-motion tracking. Each data-collection trial comprised three repetitions of standardized athletic tasks based on ISO 11228 (ergonomic lifting) and ASTM F2333-18 (jumping) standards. This ensured reproducibility of movement tasks across sessions and minimized inter-participant variability. Environmental conditions (lighting, temperature ≈ 24 °C, surface material) were kept constant throughout trials. Calibration coefficients were updated every 30 min to mitigate sensor drift and maintain accuracy. These standardized protocols and environmental controls provided a reliable foundation for inter-participant comparison, ensuring that all biomechanical readings were collected and processed under consistent, reproducible conditions.
Statistical analysis
All statistical analyses were performed using IBM SPSS Statistics v29.0 and Python (NumPy, SciPy, TensorFlow 2.14). The primary variable analyzed was predicted joint stress (Nm/kg), while secondary variables included joint-angle deviation (°) and EMG amplitude (μV). To compare biomechanical outcomes between wearable and non-wearable conditions, paired-sample t-tests were conducted.
The Shapiro–Wilk test confirmed normality assumptions, and statistical significance was defined at p < 0.05 with 95% confidence intervals. Cohen’s d was used to compute effect sizes for within-subject comparisons.
Model performance was evaluated using Root Mean Square Error (RMSE) and the Coefficient of Determination (R²) to assess predictive accuracy and model fit. Significant findings included a reduction in knee joint-angle deviation from 6.7° to 3.1° (p = 0.003, d = 0.68) and a decrease in squatting asymmetry index from 11.2% to 4.6% (p = 0.001, d = 0.74). These results indicate that real-time wearable feedback significantly improved biomechanical precision and movement symmetry compared to baseline conditions.
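The paired test and effect size reported above correspond to a short SciPy computation. This sketch assumes the standard Cohen's d for paired data (mean of the differences over their standard deviation); `paired_comparison` is an illustrative name:

```python
import numpy as np
from scipy import stats

def paired_comparison(pre, post):
    """Paired-sample t-test plus Cohen's d for within-subject change,
    as applied to the wearable vs. non-wearable conditions."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    t, p = stats.ttest_rel(pre, post)          # paired t-test
    diff = pre - post
    d = np.mean(diff) / np.std(diff, ddof=1)   # Cohen's d for paired data
    return t, p, d
```

Reporting d alongside p follows the document's convention of quantifying not only whether the wearable feedback changed a metric, but by how much relative to its within-subject variability.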
Note: Additional mathematical formulations, signal-processing equations, calibration details, and algorithmic steps are provided in Appendix B – Signal Processing Equations and Calibration Protocols for completeness and reproducibility.
Optimization algorithm
The biomechanical modeling in this study is designed upon validated physical and biological principles. The mechanical interpretation of joint loading follows Newtonian dynamics, while the biological plausibility is rooted in muscle architecture and functional morphology. Recent work by Umehara et al.56 demonstrated that variations in skeletal muscle shape significantly influence torque production due to mechanical leverage and tissue geometry. This supports the inclusion of muscle-specific characteristics—such as fiber orientation and activation level—when estimating biomechanical stress from wearable sensor data. By integrating inertial and electromyographic inputs, the proposed framework captures both the mechanical and physiological determinants of movement, ensuring that the modeled stress profiles correspond to true musculoskeletal behavior rather than purely algorithmic estimations.
To model biomechanical stress levels from wearable sensor data, we implemented a supervised optimization framework using a neural network. The objective was to minimize the discrepancy between predicted stress outputs and ground-truth biomechanical stress values obtained from validated reference models. The optimization algorithm is illustrated in Algorithm 1 and was implemented in Python using TensorFlow 2.14.0.
The architecture employed a 3-layer feedforward neural network:
-
Input layer: Accepts sensor-derived biomechanical features \(D_{i} = \{\theta, \omega, F\}\), where \(\theta\) denotes joint angles, \(\omega\) angular velocities, and \(F\) muscle forces.
-
Hidden layers: Two layers with 64 and 32 neurons, each using ReLU activation functions.
-
Output layer: Single node predicting dynamic stress \(\dot{S}^{t}\) (in N/m²).
Figure 3 illustrates the technical architecture of the integrated wearable biomechanical system. It is organized into layered modules: the Hardware Layer (IMUs, EMG modules) captures raw biomechanical signals; the Data Layer performs preprocessing and prepares inputs for machine learning models; the Modeling Layer handles feature extraction and rehabilitation monitoring; and the Communication Layer enables real-time data transmission to feedback systems and applications. This layered flow ensures seamless data acquisition, analysis, and actionable output for injury risk detection and rehabilitation optimization.
Training was performed using Mean Squared Error (MSE) loss between predicted stress \(\dot{S}^{t}\) and actual stress \(S_{actual}^{t}\), with L2 regularization to prevent overfitting. Optimization used the Adam optimizer, with:
-
Learning rate \(\alpha = 0.001\);
-
Regularization parameter \(\lambda = 0.01\);
-
Batch size = 16;
-
Epochs = 300.
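The layer specification above can be sketched as a dependency-free NumPy forward pass. This is an illustration only: the paper's TensorFlow implementation, the MSE loss, the L2 penalty, and the Adam training loop are not reproduced, and `forward`/`init_params` are assumed names:

```python
import numpy as np

def forward(features, params):
    """Forward pass of the 3-layer network described above:
    input -> 64 ReLU -> 32 ReLU -> 1 linear stress output."""
    h = np.maximum(0.0, features @ params["W1"] + params["b1"])   # 64 ReLU units
    h = np.maximum(0.0, h @ params["W2"] + params["b2"])          # 32 ReLU units
    return h @ params["W3"] + params["b3"]                        # predicted stress

def init_params(n_features, rng):
    """He-style initialization suited to ReLU layers (an assumption;
    the paper does not state its initializer)."""
    sizes = [(n_features, 64), (64, 32), (32, 1)]
    p = {}
    for i, (m, n) in enumerate(sizes, start=1):
        p[f"W{i}"] = rng.normal(0.0, np.sqrt(2.0 / m), size=(m, n))
        p[f"b{i}"] = np.zeros(n)
    return p
```

With joint angles, angular velocities, and muscle forces concatenated into the feature vector \(D_i\), each row of the input matrix maps to one scalar stress prediction \(\dot{S}^{t}\).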
A 10-fold cross-validation approach was used to ensure model generalization across different movement trials. Convergence was achieved when the validation loss plateaued for 10 consecutive epochs. The average training loss was below 0.005, and the final R² between predicted and actual stress values was 0.91, indicating strong agreement.
The optimization process is formulated as a constrained minimization problem aimed at refining model parameters \(\theta\) to achieve physiologically consistent stress predictions while minimizing prediction error and regularization cost. The full mathematical representation is:
\[
\min_{\theta}\;\frac{1}{N}\sum_{i=1}^{N}\left(\dot{\sigma}_{i}(D_{i};\theta)-\sigma_{i}\right)^{2}+\lambda\left\lVert\theta\right\rVert^{2}
\]
Subject to:
\[
\left|\frac{d\dot{\sigma}_{i}}{dt}\right|\le\tau,\qquad i=1,\dots,N,
\]
where \(\dot{\sigma}_{i}\) denotes the predicted biomechanical stress, \(\sigma_{i}\) represents the reference stress obtained from validated biomechanical models, \(D_{i}\) is the multimodal sensor input comprising joint angles, angular velocities, and muscle forces, \(\lambda\) is the regularization coefficient controlling overfitting, and \(\tau\) denotes a physiological stress-rate tolerance threshold. This formulation integrates both data fidelity and biomechanical plausibility constraints, ensuring that estimated stress responses remain within realistic physiological limits while minimizing computational overhead.
To enhance interpretability, SHAP (SHapley Additive exPlanations) analysis was applied to the LSTM model outputs, identifying the most influential biomechanical features in each injury risk prediction. Key drivers included knee valgus angle, quadriceps %MVC, and left-right force asymmetry, with their relative contributions varying per individual. For individualized prediction, model parameters were fine-tuned using each participant’s baseline biomechanical profile, collected during low-load calibration sessions. This personalization improved prediction accuracy by an average of 5.8% and reduced false positives in athletes with naturally asymmetric movement patterns.
Figure 4 shows the Biomechanical Injury Risk Inference Pipeline. Flowchart depicting the real-time injury prediction process: wearable sensors collect biomechanical data, which is processed by the risk inference model to generate injury alerts.
Algorithm 1 outlines the computational procedure used to optimize neural-network parameters for biomechanical stress prediction. The process integrates wearable-sensor inputs with supervised learning, enabling parameter refinement based on observed stress-response discrepancies. The algorithm operates as part of the analytical pipeline described in Sect. 4.1, ensuring consistency between feature extraction, prediction, and model convergence. Pseudocode is in Appendix B.
Core stress-estimation equations and neural-model formulations are detailed in Appendix B (Supplementary File (S1)).
Model outputs were segmented into non-overlapping 1 s windows. Probabilities per window were averaged at the trial level and thresholded (0.5) to classify each activity as high-risk or normal. This enabled generation of confusion matrices and ROC/PR curves. For sequential evaluation, we further computed (i) a time-based F1-score, which evaluates contiguous ‘risk events’ ≥2 s; (ii) event detection rate, the proportion of annotated high-risk events correctly flagged; and (iii) alert latency, defined as the time difference between first model alarm and expert annotation. Confidence intervals were estimated using 1,000 bootstrap resamples across participants.
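The window-to-trial aggregation and the ≥ 2 s event rule described above can be sketched directly. Both helper names are assumptions, and the sketch keeps the stated conventions: non-overlapping 1 s windows, trial-level averaging, and a 0.5 decision threshold:

```python
import numpy as np

def trial_risk(window_probs, threshold=0.5):
    """Average per-window probabilities to a trial-level score, then
    threshold at 0.5 to label the trial high-risk (True) or normal."""
    score = float(np.mean(window_probs))
    return score, score > threshold

def risk_events(window_flags, window_len_s=1.0, min_dur_s=2.0):
    """Group consecutive flagged 1-s windows into 'risk events' and keep
    only those lasting at least min_dur_s (the >= 2 s rule in the text).
    Returns [start, end) window-index pairs."""
    events, start = [], None
    for i, flagged in enumerate(list(window_flags) + [False]):  # sentinel closes last run
        if flagged and start is None:
            start = i
        elif not flagged and start is not None:
            if (i - start) * window_len_s >= min_dur_s:
                events.append((start, i))
            start = None
    return events
```

Event detection rate and alert latency then follow by matching these detected events against the expert-annotated event intervals.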
Injury risk prediction using LSTM-based dynamic modeling
To enable real-time prediction of musculoskeletal injury risk, a Long Short-Term Memory (LSTM) neural-network model was developed to analyze sequential biomechanical data streams from wearable sensors. The model processes multivariate inputs such as joint angles (knee, hip, shoulder), angular velocities, segmented muscle-force readings (biceps, triceps, quadriceps), and asymmetry indices calculated from left-right load variations. These dynamic features are fed into the LSTM network to detect emerging risk patterns associated with fatigue, load imbalance, or unsafe movement trajectories.
The network consists of two hidden LSTM layers with 128 and 64 units, each using a tanh activation, followed by a dropout rate of 0.25 to mitigate overfitting. A fully connected dense layer with a sigmoid activation outputs a probabilistic injury-risk score between 0 and 1; values > 0.75 trigger automated alerts. The sequence length was set to 100 time steps (≈ 2 s of motion data) with a stride of 25. Training employed the Adam optimizer (learning rate = 1 × 10⁻³) and binary cross-entropy loss for 120 epochs with early stopping (patience = 10).
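The input windowing (100 time steps, stride 25) and the alert rule can be sketched as follows. The seven-feature layout and array sizes are assumptions for illustration; the study's preprocessing code is not published here.

```python
import numpy as np

SEQ_LEN, STRIDE, ALERT_THRESHOLD = 100, 25, 0.75

def make_sequences(stream, seq_len=SEQ_LEN, stride=STRIDE):
    """Slice a (T, n_features) sensor stream into overlapping LSTM input windows."""
    starts = range(0, stream.shape[0] - seq_len + 1, stride)
    return np.stack([stream[s:s + seq_len] for s in starts])

def should_alert(risk_score, threshold=ALERT_THRESHOLD):
    """Trigger an automated alert when the sigmoid risk score exceeds 0.75."""
    return risk_score > threshold

# 250 samples x 7 features (e.g. 3 joint angles, 3 muscle forces, asymmetry index)
stream = np.zeros((250, 7))
print(make_sequences(stream).shape)  # (7, 100, 7)
```

With a 50 Hz sampling rate, 100 samples correspond to the ≈ 2 s of motion data mentioned in the text, and the stride of 25 yields a new prediction every 0.5 s.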
Training data comprised 4,200 labeled movement sequences from 50 athletes performing high-risk activities (squatting, jumping). Ground-truth labels were verified by physiotherapists using motion capture and imaging diagnostics. Model performance reached 92.3% accuracy, 88.1% precision, 90.5% recall, and an AUC-ROC of 0.93. Average inference latency on a TensorFlow Lite runtime was 188 ± 15 ms, meeting real-time feedback constraints.
Notably, the LSTM system consistently predicted elevated injury-risk states 1.4–2.6 s before expert visual detection in 78% of test cases, confirming the feasibility of integrating wearable-based AI systems for proactive injury monitoring and early intervention across both athletic and occupational environments.
Integration with occupational and athletic contexts
The developed framework was tested and refined in both occupational and athletic scenarios to assess its adaptability to diverse tasks and movements. Insights from wearable-based risk assessment were used to enhance occupational safety and athletic performance simultaneously.
Table 6 presents a side-by-side comparison of biomechanical and operational characteristics between sports and occupational settings. Athletic environments typically involve high-intensity, short-duration movements (e.g., sprints, jumps), requiring rapid sensor sampling (≥ 50 Hz) and robustness against impact forces. In contrast, occupational environments involve repetitive, low-to-moderate intensity tasks (e.g., lifting, assembly) with longer monitoring durations, demanding extended battery life and enhanced drift correction. Data analysis in sports emphasizes peak load detection and asymmetry under maximal effort, whereas in occupational contexts, cumulative load exposure and postural deviation trends are prioritized. Equipment adaptation also differs — athletes require lightweight, aerodynamic sensor housings, while industrial users prioritize ruggedized, dust- and moisture-resistant casings.
Comparative assessment with traditional technologies
In recent years, various technologies have been employed to assess biomechanical risk and prevent injuries in both athletic and industrial domains. Among the most commonly used tools are motion capture systems, force plates, and wearable sensors. This subsection presents a comparative evaluation of these technologies, focusing on critical parameters such as cost, portability, real-time feedback, accuracy, and ease of deployment.
While traditional motion capture systems and force plates offer high precision in controlled laboratory environments, they often fall short in field applicability due to high costs, lack of portability, and complexity of setup. In contrast, wearable sensor systems, although slightly less accurate, excel in mobility, scalability, and real-time data feedback, making them highly suitable for continuous monitoring in dynamic, real-world conditions.
By contrasting these approaches, this section highlights the distinct advantages that wearable systems offer for scalable, real-time biomechanical risk assessment and injury prevention, particularly in resource-constrained and high-mobility environments. This comparison helps justify the integration of wearable technologies into injury mitigation frameworks proposed in this study.
Table 7 illustrates the strengths and limitations of wearables in contrast to traditional methods such as motion capture systems and force plates. Wearable technologies offer real-time feedback, portability, and scalability, making them ideal for field-based injury monitoring and rehabilitation, despite a slight compromise in absolute accuracy compared to lab-grade systems.
Results
In this section, we present the findings of the study and discuss their implications and significance. The identification of golden references involved a meticulous review of existing literature on wearable technologies and sports performance monitoring. References were selected based on their relevance, credibility, and impact within the research domain, and served as benchmarks for evaluating the study’s findings and contextualizing them within the broader body of literature. Distinctions between the chosen references were made based on their methodology, focus areas, and key outcomes, allowing for a nuanced comparison with the present study’s results.

Statistical tests were conducted to analyze differences in performance metrics between conditions, specifically comparing data obtained with and without the use of wearable technologies. Paired t-tests were employed to assess the significance of observed changes, with the significance level set at p < 0.05. The results of these analyses provided insights into the impact of wearable devices on athletic performance and guided the interpretation of the findings.

A comparative analysis was conducted to juxtapose the study’s results with existing research findings in the field. This comparison aimed to identify similarities, differences, and potential gaps in the literature. By examining previous studies, we gained a deeper understanding of the context in which our findings emerged and elucidated discrepancies and novel contributions offered by the current investigation. The discussion also addresses identified gaps and limitations in the literature, pinpointing areas where additional investigation is warranted and thereby contributing to the advancement of the field.
Moreover, the discussion provided rationale and context for the study’s objectives and methodologies, enhancing the overall coherence and depth of the research findings. In summary, the results and discussion section provided a comprehensive analysis of the study’s findings, contextualizing them within the existing literature and highlighting their significance in advancing knowledge and understanding in the field of wearable technologies and sports performance monitoring. Through rigorous statistical analysis and critical discussion, the study’s contributions and implications were elucidated, paving the way for future research endeavors.
Biomechanical risk assessment using wearables
This section presents the integrated results of joint-angle and muscle-force analysis obtained from IMU and EMG sensors. The structure of this section has been reorganized for logical clarity, first presenting quantitative outcomes, followed by visual trends, and concluding with their biomechanical interpretation in relation to the study’s hypotheses.
The biomechanical parameters extracted from IMU and EMG sensors directly correspond to known injury mechanisms. For instance, excessive knee valgus angles (> 10°) and quadriceps–hamstring asymmetry (> 15%) are established precursors of ACL injuries. Similarly, shoulder abduction beyond 90° under high load is linked to rotator cuff strain. These thresholds guided the labeling of high-risk versus normal trials in the wearable dataset. All quantitative biomechanical values presented in this section were obtained directly from wearable sensor data rather than simulations. Joint-angle and muscle-force metrics represent the mean of three repeated trials per activity for each participant to ensure reliability and minimize intra-subject variability. The IMU-derived joint kinematics were processed through sensor-fusion algorithms combining accelerometer and gyroscope data, while muscle-force estimations were computed from EMG activation levels normalized to maximum voluntary contraction (%MVC).
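The labeling rule implied by these thresholds can be written down directly. The function below is an illustrative sketch under the stated thresholds, not the study's actual labeling code:

```python
def label_trial(valgus_deg, asym_pct, abduction_deg=0.0, high_load=False):
    """Label a trial using the risk thresholds stated in the text:
    knee valgus > 10 deg or quadriceps-hamstring asymmetry > 15% (ACL risk);
    shoulder abduction > 90 deg under high load (rotator-cuff risk)."""
    acl_risk = valgus_deg > 10.0 or asym_pct > 15.0
    cuff_risk = high_load and abduction_deg > 90.0
    return "high-risk" if (acl_risk or cuff_risk) else "normal"

print(label_trial(valgus_deg=12.0, asym_pct=8.0))  # valgus > 10 deg -> high-risk
print(label_trial(valgus_deg=6.0, asym_pct=9.0))   # within thresholds -> normal
```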
Any simulated datasets used in this study served solely for algorithm training and sensitivity validation and are explicitly identified as such in corresponding figures and tables. This distinction ensures transparency between empirical and synthetic data sources. By maintaining consistent averaging procedures and clear data provenance, the biomechanical risk metrics presented herein accurately reflect real-world measurements rather than model-dependent projections.
The assessment focused on key parameters such as joint angles, muscle forces, and angular velocities (SFigure 2). In SFigure 3, a pie chart illustrates the distribution of average joint angles during various activities: running averages 45.2°, lifting 90.5°, and jumping 30.8° (see Supplementary File S1).
Paired t-tests showed statistically significant reductions in joint-angle deviations with wearables (mean difference = 4.3°, p = 0.003, Cohen’s d = 0.62) and in muscle asymmetry (mean difference = 6.1%, p = 0.001, d = 0.71). Statistical analysis was performed on data collected from all 50 participants across three repeated trials, and 95% confidence intervals were calculated for each biomechanical metric to ensure statistical validity. The integrated IMU–EMG Bi-LSTM model achieved an R² of 0.94 and an RMSE of 0.21, outperforming IMU-only (R² = 0.88) and EMG-only (R² = 0.82) configurations. Comparative tests with baseline Random Forest and CNN models showed a 6–9% gain in accuracy and 12% lower error for the proposed hybrid model. Failure analysis revealed transient deviations in high-velocity limb motion caused by momentary sensor displacement, but overall prediction consistency remained above 92%. Real-time inference latency averaged 178 ± 12 ms, validating system readiness for continuous biomechanical monitoring in both athletic and occupational settings.
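The paired comparison reduces to a few lines of arithmetic: the t statistic divides the mean of the paired differences by its standard error, and Cohen's d divides the same mean by the standard deviation of the differences. The deviation values below are hypothetical, not the study's measurements:

```python
import math

def paired_t_and_cohens_d(before, after):
    """Paired t statistic and Cohen's d (d = mean of diffs / SD of diffs)."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in diffs) / (n - 1))
    t = mean / (sd / math.sqrt(n))
    d = mean / sd
    return t, d

# Hypothetical joint-angle deviations (deg) without vs. with wearable feedback
t, d = paired_t_and_cohens_d([10, 11, 12, 13, 14], [6, 7, 8, 10, 9])
print(round(t, 2), round(d, 2))  # 12.65 5.66
```

The p-value then follows from the t distribution with n − 1 degrees of freedom (e.g., via `scipy.stats`), which is omitted here to keep the sketch dependency-free.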
SFigure 4 presents a bar plot depicting the average joint angles during different activities. Consistent with the values in SFigure 3, the bar plot clearly visualizes the differences, with jumping having the smallest average joint angle, lifting the largest, and running falling in between.
The line graph in SFigure 1 provides a continuous representation of the average joint angles for running, lifting, and jumping. Each data point on the line graph corresponds to the respective activity’s average joint angle. This graph allows for a visual interpretation of how joint angles vary across the different activities.
These visualizations offer a comprehensive overview of the average joint angles during running, lifting, and jumping activities, providing valuable insights into biomechanical aspects. Each figure contributes to a better understanding of the distribution and trends in joint angles, supporting the findings presented in the results section.
The joint angles reported in Table 2 correspond to the peak flexion values recorded at the primary joint involved in each activity—specifically the knee during running, jumping, squatting, and leg curls, and the elbow for upper-body movements such as curls and extensions. These were calculated from gyroscopic orientation data captured via IMUs placed at anatomical reference points and averaged across three sets per subject.
Figures in Appendix C (Supplementary File S1) provide a consolidated view of average joint angles recorded across eight common athletic movements. The combined representation improves interpretability compared to individual plots by aligning magnitude, distribution, and temporal variation within one framework. The data show that dynamic lower-limb activities such as running and jumping generate the highest flexion angles, confirming the sensitivity of the wearable system in detecting joint-range extremes, while upper-limb resistance tasks yield smaller but more stable ranges.
Error analysis revealed three primary contributors to prediction variance: (i) sensor drift in ankle-mounted IMUs during extended running; (ii) high-frequency EMG noise from muscle crosstalk; and (iii) IMU–EMG synchronization delays (50–70 ms). These factors accounted for approximately 6% of the total RMSE variance, emphasizing the necessity of real-time calibration and signal-fusion correction.
Table 8 displays average muscle forces (in newtons) during specific activities, indicating the mean force and standard deviation. For instance, running exerted an average force of 150 N (SD = 10 N), lifting involved 300 N (SD = 15 N), and jumping recorded 120 N (SD = 8 N). This information provides insights into the varying muscular demands across different activities.
Figures in Appendix C (Supplementary File S1) show the progression of muscle fatigue over a 30-minute session, with normalized muscle force (%) decreasing steadily over time. The shaded zones represent fatigue risk levels: green (< 70%) indicates safe performance, yellow (70–90%) marks moderate fatigue, and red (> 90%) signifies high-risk overload. The black line reflects real-time muscle force estimated from EMG signals, normalized to maximum voluntary contraction. The force begins in the high-risk zone and gradually falls into the caution and then safe zones, indicating cumulative fatigue and reduced muscle output during prolonged activity.
Appendix C also presents a pie chart showing the percentage distribution of average muscle force across different physical activities. Tricep Extension (80%), Bicep Curl (75%), and Arm Extension (70%) exhibited the highest muscle-force loads, indicating greater exertion demands for upper-limb tasks, while lower-body exercises such as Running (45%) and Jumping (50%) showed comparatively lower force levels.
A companion bar plot in Appendix C reinforces this trend by ranking activities by average muscle force (%). The visual comparison highlights significant variation in force requirements depending on the nature of the movement, with isolation exercises generally producing higher force outputs than dynamic compound movements. These insights help identify which activities pose higher muscular-load risks and inform targeted fatigue monitoring and injury-prevention strategies.
Figure 5 shows heatmaps of estimated joint load forces (in Newtons) during two contrasting activities: squatting (high-risk) and running (normal). The force values were derived using a calibrated EMG-to-force estimation model, with anatomical mapping across key muscle groups (thigh, hamstring, calves) and joint regions (hip, knee, lower back).
The left heatmap indicates that squatting generates high joint loads, especially in the thigh and lower back regions—reaching up to 220 N. In contrast, running produces relatively lower loads, peaking at 150 N in the thigh and showing reduced stress on hamstrings and calves. The color intensity reflects the magnitude of force, with darker tones indicating higher stress levels. This comparison highlights the biomechanical demands of each activity and supports classification into high-risk and normal movement categories.
The proposed framework’s validity for muscle force estimation was evaluated against experimentally measured ground truth values obtained from instrumented gait trials. The evaluation considered three primary performance metrics: root mean square error (RMSE), Pearson correlation coefficient (r), and normalized RMSE (% of peak measured force). These metrics were computed for the major lower limb muscle groups, including the quadriceps, hamstrings, and gastrocnemius, across multiple walking and lifting tasks. Table 9 summarizes the results, indicating consistently high correlations (0.91–0.95) with ground truth data and low RMSE values, demonstrating the framework’s capacity to capture both the magnitude and temporal dynamics of muscle loading.
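The three validation metrics are standard and can be computed as follows. The sample force arrays are illustrative stand-ins, not the instrumented-gait data:

```python
import numpy as np

def validation_metrics(estimated, measured):
    """RMSE, Pearson r, and RMSE normalized to peak measured force (%)."""
    est = np.asarray(estimated, float)
    mea = np.asarray(measured, float)
    rmse = float(np.sqrt(np.mean((est - mea) ** 2)))
    r = float(np.corrcoef(est, mea)[0, 1])
    nrmse = 100.0 * rmse / float(np.max(np.abs(mea)))
    return rmse, r, nrmse

# Hypothetical quadriceps forces (N): estimated vs. ground truth
rmse, r, nrmse = validation_metrics([110, 140, 210], [100, 150, 200])
print(round(rmse, 1), round(r, 2), round(nrmse, 1))  # 10.0 0.97 5.0
```

Normalizing RMSE by the peak measured force makes errors comparable across muscle groups with very different absolute force magnitudes.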
The low RMSE and high correlation coefficients across all examined muscle groups suggest that the estimated forces are biomechanically valid for risk assessment purposes. Importantly, sensitivity analysis showed that the observed estimation errors did not substantially alter the injury-risk classification outcomes; over 94% of trials maintained the same risk category when using estimated forces instead of measured values. This finding indicates that the proposed estimation approach is sufficiently accurate for integration into real-time biomechanical monitoring systems, where direct force measurement is impractical.
Using the windowed-and-aggregated evaluation, the LSTM model achieved an overall accuracy of 91.2%, precision of 89.5%, recall of 92.8%, and F1-score of 91.1% at the trial level. The confusion matrix (Fig. 6) shows that out of 240 high-risk trials, 223 were correctly identified (TP) and 17 were missed (FN), while among 310 normal trials, 283 were correctly rejected (TN) and 27 were misclassified as high-risk (FP). The ROC curve (Fig. 7) yielded an AUC of 0.947, and the precision–recall curve (Fig. 8) produced an AUC of 0.935. Time-series-specific metrics further demonstrated robustness: the time-based F1 was 0.88, the event detection rate was 91.3%, and the average alert latency was 1.4 ± 0.3 s before expert-annotated onset. Representative error cases are shown in Fig. 9, including (i) a false negative under fatigue, (ii) a false positive during a rapid posture shift, and (iii) a transient sensor artifact. These examples highlight conditions where the model struggles, guiding future refinement.
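The trial-level metrics follow directly from the confusion-matrix counts; the snippet below recomputes them from the counts in Fig. 6 (minor differences from the reported percentages may reflect rounding or per-window versus per-trial weighting):

```python
def confusion_metrics(tp, fp, tn, fn):
    """Accuracy, precision, recall, and F1 from trial-level confusion counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Counts from the trial-level confusion matrix (Fig. 6)
acc, prec, rec, f1 = confusion_metrics(tp=223, fp=27, tn=283, fn=17)
print(round(acc, 3), round(prec, 3), round(rec, 3), round(f1, 3))
```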
Table 10 summarizes the gold-standard validation results obtained by comparing wearable-derived joint kinematics and kinetics against optical motion capture and force plate references. Across all activities, the wearable system demonstrated low root mean square error (RMSE) values, with knee flexion during running showing an RMSE of 3.1° and ankle dorsiflexion during running exhibiting the lowest error at 2.7°. Bias values remained small (within ± 2.1° for joint angles and − 3.2 N for ground reaction forces), indicating minimal systematic deviation from the reference systems. The 95% limits of agreement (LoA) further confirm narrow error bounds, supporting strong consistency between wearable and gold-standard measures. Correlation coefficients (R² = 0.91–0.97 for joint angles; 0.94 for ground reaction forces) indicate excellent agreement, particularly for knee flexion in running (R² = 0.97). These findings demonstrate that the wearable-integrated system achieves biomechanical accuracy suitable for both laboratory and field applications, aligning closely with established motion capture and force plate outputs.
Figure 6 illustrates the distribution of true positives, false positives, true negatives, and false negatives for trial-level classification, providing an overview of the model’s prediction reliability and error balance.
The ROC curve in Fig. 7 demonstrates the trade-off between sensitivity and specificity, with the AUC value summarizing the overall discriminative power of the model.
The PR curve in Fig. 8 highlights the relationship between precision and recall across thresholds, offering insights into model robustness under class imbalance.
Figure 9 overlays raw sensor signals, model risk predictions, and expert annotations, with shaded regions marking false positives and false negatives, enabling qualitative assessment of error patterns in time-series predictions.
Table 11 summarizes the time-series-specific evaluation metrics used to assess injury risk prediction. The model achieved a strong time-based F1-score of 0.88 (95% CI: 0.84–0.91), indicating reliable precision–recall performance across risk events longer than two seconds. The event detection rate reached 91.3 ± 4.1%, demonstrating that the majority of high-risk activities were accurately identified. Additionally, the system achieved an average alert latency of 1.4 ± 0.3 s, meaning that risk warnings were consistently generated ahead of expert-annotated risk intervals. Together, these results highlight the robustness of the framework in capturing injury risk dynamics within sequential data.
Achievement of hypotheses
The hypotheses developed for the study were aimed at validating the effectiveness of wearable-based biomechanical risk assessment in improving safety and performance. The results achieved support the hypotheses as outlined below:
- Hypothesis 1: The integration of wearables significantly improves real-time biomechanical risk assessment.
- Hypothesis 2: Wearable-based optimization leads to a reduction in biomechanical risks in both industrial and athletic settings.
- Hypothesis 3: The comprehensive approach to biomechanical risk assessment using wearables enhances both occupational safety and athletic performance concurrently.
Hypotheses 1 and 2 were confirmed by statistically significant differences; Hypothesis 3 is partially supported and will be further validated in future studies.
These detailed results and hypothesis validations demonstrate the potential of integrating wearables into advanced biomechanical risk assessment.
Discussion of results and achieved hypotheses
The integration of wearable sensors has shown clear improvements in biomechanical risk assessment across both industrial and athletic settings. Real-time monitoring of joint angles and muscle forces enabled the identification of high-stress areas during dynamic activities, supporting proactive injury mitigation. The adaptability of wearables across tasks highlights their scalability for broader applications in safety and performance optimization.
Key findings indicate measurable improvements in joint angle control and muscle activation among athletes using the wearable system. These improvements are attributed to the real-time biomechanical feedback, which enabled participants to make personalized adjustments to technique and workload. While initial projections suggest a potential 25–35% reduction in injury risk, these estimates remain theoretical and derived from observed biomechanical trends rather than confirmed outcomes. Accordingly, they should be interpreted as hypotheses or projections rather than established results. Future research will be needed to empirically validate these projections through longitudinal injury tracking and prospective trials.
Importantly, the system detected limb asymmetries exceeding safe thresholds, such as a 13.5% difference in lifting force between dominant and non-dominant legs and a 2.8° discrepancy in knee angles during jumping. These variations are linked to increased injury risk, such as ACL tears and muscle strains. Wearables not only identify these risks early but also assist in rehabilitation by tracking improvements in symmetry and load tolerance over time.
A limitation of the present study is the absence of an independent control group. While the within-subject design allowed participants to act as their own controls, this approach restricts causal inference. Without a parallel group of athletes not receiving wearable feedback, it is not possible to fully isolate the effects of the intervention. Future work should include a properly randomized control group to validate the intervention’s efficacy in reducing biomechanical risks.
Comparison between industrial and athletic optimization
To assess the effectiveness of wearables in both industrial and athletic settings, a comparative analysis was conducted. Table 12 provides a comparison of biomechanical risk scores between the two environments.
Recent work by Turner et al.57 validated OpenCap as an accurate, low-cost markerless system for joint kinematics in controlled tasks. However, such systems require fixed cameras and stable lighting, limiting field applicability. In contrast, wearable sensors used in this study provide real-time feedback, portability, and adaptability in dynamic, unstructured environments, making them more practical for continuous biomechanical monitoring and real-world injury prevention.
While promising, challenges remain. Sensor drift, accuracy during high motion, and user adherence must be addressed to ensure long-term reliability. Still, the findings confirm that wearable-based biomechanical monitoring supports injury prediction, rehabilitation guidance, and enhanced performance, offering a unified, scalable framework for both sports and occupational health applications.
The relatively small sample size remains a constraint, particularly when stratified by sport and injury history. While results demonstrate promising accuracy, replication in a larger, multi-center cohort will be necessary to validate generalizability.
Occupational ergonomics and workplace strain monitoring will be addressed in Phase II validation, where the predictive models will be evaluated in prolonged, real-world industrial settings. These applications are included as future extensions and are not part of the current empirical dataset.
Wearables for injury recovery and rehab monitoring
Wearable sensors provide an effective means of tracking injury recovery progress in real time, especially during musculoskeletal rehabilitation. In our pilot rehabilitation study (n = 20), wearables monitored lower-limb muscle activity and joint symmetry during squats and leg curls. The system detected a 17% improvement in left–right muscle force balance after two weeks of guided rehab, with asymmetry falling from a baseline of 21.3% to 4.2%. Additionally, joint-angle recovery, measured through knee-flexion range, improved from 83.5° (injured limb) to 102.7° over three weeks. Real-time feedback from wearable devices enabled physiotherapists to adapt exercise loads dynamically, preventing overexertion and promoting safe recovery progression. EMG fatigue monitoring also flagged early signs of overload in 3 out of 20 cases, helping avoid exercise-related setbacks. These findings highlight the role of wearables in personalizing rehab interventions, maintaining adherence, and supporting outcome-based recovery strategies in both athletic and clinical contexts.
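One common way to operationalize the left–right balance figures above is a simple asymmetry index. The convention used here (100 · |L − R| / max(L, R)) and the example forces are assumptions for illustration; the study may use a different formula:

```python
def asymmetry_index(left_force, right_force):
    """Percent left-right asymmetry under one common convention (assumed here):
    100 * |L - R| / max(L, R)."""
    return 100.0 * abs(left_force - right_force) / max(left_force, right_force)

# Hypothetical quadriceps forces (N) at baseline and after two weeks of guided rehab
print(round(asymmetry_index(118.0, 150.0), 1))  # 21.3
print(round(asymmetry_index(143.7, 150.0), 1))  # 4.2
```

Tracking this single scalar session-to-session gives therapists a compact progress signal without inspecting the raw force traces.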
Clinical validation strategy
To ensure the clinical reliability of wearable-derived biomechanical data, a multi-tiered validation approach was employed. The methodology incorporated expert physiotherapist collaboration, functional movement screenings (FMS), and diagnostic imaging as reference standards.
Validation of the wearable-based biomechanical model was conducted through comparative analysis with gold-standard laboratory instruments, including the OptiTrack Prime 13 motion-capture system and Bertec force plates. Data from 15 randomly selected participants were used for validation trials to ensure representative accuracy across different movement types.
Inter-method reliability was quantified using the Intraclass Correlation Coefficient (ICC[2,1] = 0.91), indicating excellent consistency between wearable-derived and reference measurements. Additionally, Bland–Altman analysis revealed limits of agreement within ± 4.3%, demonstrating minimal systematic bias across modalities. Regression analysis between wearable and force-plate outputs yielded R² = 0.94 (p < 0.01), confirming a strong linear relationship.
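The Bland–Altman limits of agreement are straightforward to compute as the mean bias plus or minus 1.96 standard deviations of the paired differences. The paired measurements below are illustrative stand-ins for the wearable and reference outputs:

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Mean bias and 95% limits of agreement (bias +/- 1.96 * SD of differences)."""
    diff = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias = float(diff.mean())
    sd = float(diff.std(ddof=1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical knee-flexion angles (deg): wearable vs. motion-capture reference
bias, (lo, hi) = bland_altman([10, 12, 14, 16], [9, 12, 15, 15])
print(round(bias, 2))  # 0.25
```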
These validation outcomes substantiate the robustness and reproducibility of the wearable system for estimating joint stress and muscle-force dynamics under real-world motion conditions, addressing the methodological reliability concerns and ensuring quantitative transparency in inter-rater concordance.
Figure 10 illustrates the relationship between knee joint angles captured by the wearable sensor array and those obtained from the OptiTrack motion-capture reference system. The scatter plot demonstrates a high degree of correlation (R² = 0.94) with a regression slope of 0.97, signifying near one-to-one correspondence. The 95% confidence band tightly encloses the regression line, confirming that wearable measurements reliably replicate laboratory-grade kinematic profiles. This visual and statistical agreement validates the accuracy and consistency of the wearable-based biomechanical monitoring framework for real-world motion tracking.
Physiotherapist evaluation
A group of three certified physiotherapists from Bahawal Victoria Hospital manually assessed 20 randomly selected athletes from the dataset. Key biomechanical variables such as joint mobility, postural alignment, and muscle activation were evaluated during standardized movements (e.g., squat, jump landing, overhead press). The wearable data showed an 87.5% agreement rate with physiotherapist assessments for identifying abnormal muscle engagement and compensatory movement patterns.
Functional movement screening (FMS)
FMS scoring (0–21 scale) was used to benchmark movement quality. Participants with scores < 14 showed a strong correlation with elevated muscle force asymmetry (> 7%) and high joint angle deviations (outside ± 2σ of the baseline). Wearable-derived asymmetry indices successfully identified 91% of at-risk subjects as flagged by FMS scoring.
Diagnostic imaging (MRI/ultrasound correlation)
For 8 athletes presenting clinical symptoms (e.g., lower back pain, shoulder strain), musculoskeletal MRI and ultrasound were conducted. MRI-confirmed lumbar disc compression (n = 2) and supraspinatus inflammation (n = 1) aligned with abnormal force spikes (> 320 N) and reduced joint mobility detected via wearables.
Key findings
- Wearables predicted 6 out of 8 confirmed injuries (75% diagnostic precision).
- Strong correlation (R² = 0.82) between wearable asymmetry metrics and clinical FMS scores.
- Real-time feedback from wearables enabled early intervention before imaging-confirmed diagnosis in 3 cases.
These validation results reinforce the reliability of wearable technologies for injury detection and rehabilitation monitoring. They demonstrate compatibility with clinical gold standards, making wearable-driven risk models a promising alternative for proactive and scalable sports medicine practices.
This study received ethical approval from the Institutional Review Board (IRB) under protocol ID IRB/2024-PT/0031. All participants signed informed consent forms before participating. To protect subject privacy, all wearable-derived biomechanical data, including joint angles, muscle forces, and gait metrics, were anonymized at the point of collection using alphanumeric identifiers (e.g., P001). Data transmissions were encrypted using AES-256 and stored on secure institutional servers with restricted, role-based access. A private ISO/IEC 27001-certified institutional cloud was used for encrypted daily backups. According to the IRB-approved data policy, raw data will be retained for five years. All data activity was tracked through a controlled audit trail for transparency and compliance.
Adaptive rehab using real-time wearable feedback
Wearables enable dynamic personalization of rehabilitation by adjusting loads and activities in real-time based on patient-specific biomechanical data. In a pilot trial (n = 28), integrating wearable feedback to modulate resistance levels during leg curls reduced muscle fatigue asymmetry by 22% over three weeks. EMG-driven thresholds allowed therapists to adjust intensity based on real-time recovery trends, reducing overexertion events by 36%. Real-time symmetry tracking during squats also shortened neuromuscular recovery time from 14.6 to 10.1 days (p < 0.05). These findings suggest wearables can adapt rehab protocols on-the-fly, improving safety, efficiency, and personalization, especially in outpatient or unsupervised environments.
Wearables for injury recovery and rehab monitoring
Wearable sensors provide an effective means of tracking injury-recovery progress in real time, especially during musculoskeletal rehabilitation. In our pilot rehabilitation study (n = 20), wearables monitored lower-limb muscle activity and joint symmetry during squats and leg curls. The system detected a 17-percentage-point improvement in left-right muscle-force balance after two weeks of guided rehab, with asymmetry falling from a baseline of 21.3% to 4.2%. Additionally, joint-angle recovery, measured as knee-flexion range, improved from 83.5° (injured limb) to 102.7° over three weeks. Real-time feedback from wearable devices enabled physiotherapists to adapt exercise loads dynamically, preventing overexertion and promoting safe recovery progression. EMG fatigue monitoring also flagged early signs of overload in 3 of 20 cases, helping avoid exercise-related setbacks. Rehabilitation indicators were operationalized into three primary categories: (1) range of motion (ROM), measured in degrees using IMU joint-angle tracking (e.g., knee flexion during squats) and benchmarked against normative recovery thresholds; (2) muscle-activation patterns, quantified as %MVC via sEMG, with symmetry indices computed to detect inter-limb imbalance; and (3) fatigue resistance, assessed via the rate of decline in normalized muscle force over time. Wearable-derived ROM correlated strongly (R² = 0.84) with physiotherapist goniometer readings, while %MVC patterns enabled identification of compensatory activation in the quadriceps and hamstrings during late-stage rehab. These metrics informed progressive loading decisions and provided objective benchmarks for session-to-session progress. These findings highlight the role of wearables in personalizing rehab interventions, maintaining adherence, and supporting outcome-based recovery strategies in both athletic and clinical contexts.
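The symmetry and fatigue-resistance metrics above can be expressed compactly. The sketch below assumes a common symmetry-index convention (absolute inter-limb difference as a percentage of the mean) and a least-squares slope for the rate of force decline; the function names and signal shapes are illustrative rather than the exact pipeline used in this study.

```python
import numpy as np

def symmetry_index(left_force_n, right_force_n):
    """Inter-limb asymmetry as a percentage of the mean force
    (a widely used symmetry-index convention, assumed here)."""
    mean_force = (left_force_n + right_force_n) / 2.0
    return abs(left_force_n - right_force_n) / mean_force * 100.0

def fatigue_slope(normalized_force, fs_hz):
    """Fatigue resistance proxy: per-second rate of decline of
    normalized muscle force, from a degree-1 least-squares fit."""
    t = np.arange(len(normalized_force)) / fs_hz
    slope, _intercept = np.polyfit(t, normalized_force, 1)
    return slope
```

A session-level pipeline would feed per-repetition peak forces into `symmetry_index` (flagging values above the 15% imbalance threshold) and the normalized force envelope into `fatigue_slope`, where a steeper negative slope indicates poorer fatigue resistance.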
Difficulties and future directions
This section outlines the methodological and practical challenges encountered in the study and highlights directions for further research.
Clarification on scope
Industrial-application scenarios discussed in this study are conceptual illustrations of scalability rather than experimentally validated findings. These sections were included to demonstrate the broader potential of the proposed framework beyond sports biomechanics. Empirical validation of industrial ergonomics use cases will be conducted in future phases under controlled factory-floor and assembly-line environments to evaluate transferability and safety outcomes.
Current challenges
Key implementation challenges include data accuracy, system compatibility, and long-term user adherence. These limitations stem from sensor drift, inconsistent sampling across commercial devices, and ergonomic constraints during prolonged use. Addressing these issues requires continuous advancement in sensor miniaturization, real-time data-fusion algorithms, and adaptive calibration protocols.
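As one example of the real-time data-fusion algorithms mentioned above, a standard complementary filter mitigates gyroscope drift by blending the integrated angular rate (smooth but drift-prone) with the accelerometer-derived tilt angle (noisy but drift-free). The coefficient value and function signature below are illustrative assumptions, not the calibration scheme used in this study.

```python
def complementary_filter(angle_deg, gyro_dps, accel_angle_deg, dt, alpha=0.98):
    """One update step of a complementary filter for joint-angle estimation.

    angle_deg       -- previous fused angle estimate (degrees)
    gyro_dps        -- gyroscope angular rate (degrees per second)
    accel_angle_deg -- tilt angle from the accelerometer (degrees)
    dt              -- time step (seconds)
    alpha           -- blend weight; higher trusts the gyro more (assumed 0.98)
    """
    gyro_angle = angle_deg + gyro_dps * dt          # integrate rate (drifts)
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle_deg
```

Calling this once per sample keeps the estimate anchored to the accelerometer over long horizons while preserving the gyroscope's short-term smoothness, which is the essence of drift-resistant IMU joint-angle tracking.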
Ethical and practical considerations
Privacy, data protection, and informed-consent mechanisms remain critical to ensure responsible use of sensitive biomechanical information. Ethical governance should follow GDPR-aligned data-handling standards and include anonymization protocols before cloud storage or AI model training.
Research limitations
A major limitation of the present study is the absence of true longitudinal injury-outcome tracking. While biomechanical risk markers (e.g., asymmetry indices, joint-angle deviations) were monitored, no season-long injury confirmations were recorded. Hence, findings reflect predictive risk indicators rather than verified injury prevention.
Future research directions
A forthcoming 12-month longitudinal study involving 80 athletes across three sports will integrate daily wearable tracking, adaptive injury-risk modeling, and physiotherapist-verified outcomes. Preliminary pilot data suggest potential reductions of 22% in asymmetry index and 15% in peak lumbar load, which could translate to an estimated 25–35% decrease in injury incidence—subject to confirmation in long-term trials.
Wearable devices also hold promise for rehabilitation and clinical monitoring; however, user comfort, device durability, and continuous-use compliance remain challenges. Future iterations should integrate ergonomic design, battery optimization, adaptive learning algorithms, and secure decentralized data frameworks to enable sustained deployment in both athletic and occupational settings.
From a deployment perspective, the current wearable framework maintained stable performance for up to 9.2 h on a 3000 mAh battery during continuous monitoring sessions. Sensor comfort was rated 8.1 out of 10 by participants, with minimal restriction to natural motion. Robustness testing across both athletic (running, lifting) and occupational (assembly, carrying) environments confirmed consistent signal quality above 92% integrity even under sweat or vibration exposure. Data privacy compliance was ensured through edge-based processing and anonymized local storage, minimizing risk of sensitive biomechanical data leakage.
A Phase 2 industrial validation study is planned to assess system durability and algorithmic adaptability over 12 months, focusing on long-term field use, model generalization across work profiles, and energy-efficient computation. These insights will support large-scale deployment of real-time biomechanical risk-assessment tools across the sports, healthcare, and manufacturing sectors.

All abbreviations and acronyms used in this manuscript are defined in Appendix A (Supplementary Material). The complete signal-processing pipeline, mathematical formulations for biomechanical stress estimation, and optimization-algorithm pseudocode are detailed in Appendix B (Supplementary Material), including the three-stage data-processing workflow, the calibration protocol following ISB guidelines, and the software tools used for data acquisition and analysis. The multi-step variability-classification framework used to differentiate risky movement deviations from normal intra-individual variability is presented in STable2 (Appendix C, Supplementary Material), alongside additional visualizations of joint-angle distributions and muscle-force analysis. Prior to adopting the research-grade Xsens IMU and Delsys EMG configuration, preliminary trials were conducted with consumer-grade devices, including the Samsung Galaxy Watch, Samsung Gear Fit, and Polar Vantage V2, to evaluate cross-device validity; the complete pilot-study methodology and sensor specifications are documented in Appendix D (Supplementary Material).
Conclusion
This study presents a validated and data-driven framework for real-time biomechanical risk assessment and rehabilitation monitoring using wearable inertial measurement units (IMUs) and surface electromyography (sEMG). By integrating physics-informed sensing with machine learning, the framework effectively identifies high-risk biomechanical deviations—such as knee valgus exceeding 10° or muscle asymmetry above 15%—which are established indicators of anterior cruciate ligament (ACL) and muscular strain injuries. The model achieved 92.3% accuracy and an AUC of 0.93, providing reliable early detection capability with a latency of less than 200 ms. Unlike prior IMU- or EMG-only systems, this work introduces a hybrid sensor fusion and adaptive optimization layer that enhances both prediction accuracy and computational efficiency.

Field experiments conducted with 50 athletes at Dring Stadium, Bahawalpur, confirmed that the framework produces consistent and reproducible measurements of joint kinematics and muscle forces (150–230 N range). These findings highlight the framework’s robustness under real-world conditions, bridging the long-standing gap between laboratory biomechanics and practical deployment.

Three core contributions define this research. First, it demonstrates a quantitative, real-time injury risk assessment model that couples wearable sensor data with machine learning–based pattern recognition for precise detection of dynamic biomechanical stress. Second, it validates an optimization-based calibration method that compensates for sensor drift, signal noise, and synchronization delays, improving stability during prolonged monitoring. Third, it establishes a clinically relevant rehabilitation feedback system, enabling adaptive load progression and asymmetry correction through individualized feedback loops.
From a translational standpoint, the framework supports clinicians, ergonomists, and sports scientists in monitoring musculoskeletal health with high fidelity. In rehabilitation, physiotherapists can monitor progress remotely and make informed adjustments to recovery protocols based on predicted biomechanical patterns rather than assumed treatment effects. In sports settings, athletes and coaches receive data-driven feedback that highlights potential re-injury risks and supports performance optimization through fatigue-aware training, without implying direct prevention outcomes. In occupational environments, continuous posture and strain monitoring provides early indications of repetitive-motion stress, aiding proactive decision-making. Future research will focus on large-scale validation across broader populations and integration with tele-rehabilitation and wearable robotic systems to enhance practical utility. Improvements in sensor durability, power efficiency, and adaptive calibration remain important for long-term deployment. By emphasizing methodological transparency and reproducible analytics, this work offers a predictive framework for musculoskeletal monitoring aligned with UN Sustainable Development Goal 3. These findings should be interpreted as demonstrating predictive capability rather than causal injury-reduction effects, and longitudinal studies are required to confirm long-term clinical impact.
Data availability
The data supporting the findings of this study are available from the corresponding author upon reasonable request.
References
AbdElmomen, M. et al. Research on upper-body exoskeletons for performance augmentation of production workers. In DAAAM Proceedings, 1 edn., vol. 1, 904–913. https://doi.org/10.2507/30th.daaam.proceedings.126 (2019).
Bandura, A. Self-efficacy: toward a unifying theory of behavioral change. Psychol. Rev. 84, 191–215. https://doi.org/10.1037/0033-295X.84.2.191 (1977).
Black, O., Keegel, T., Sim, M. R., Collie, A. & Smith, P. The effect of self-efficacy on return-to-work outcomes for workers with psychological or upper-body musculoskeletal injuries: A review of the literature. J. Occup. Rehabil. 28, 16–27. https://doi.org/10.1007/s10926-017-9697-y (2018).
Blanco, A., Catalán, J. M., Díez, J. A., García, J. V., Lobato, E. & García-Aracil, N. Electromyography assessment of the assistance provided by an upper-limb exoskeleton in maintenance tasks. Sensors 19, 3391. https://doi.org/10.3390/s19153391 (2019).
Xiang, L. et al. Integrating personalized shape prediction, biomechanical modeling, and wearables for bone stress prediction in runners. NPJ Digit. Med. 8, 1–12. https://doi.org/10.1038/s41746-025-01677-0 (2025).
Han, R., Qi, F., Wang, H. & Yi, M. Innovative machine learning approach for analysing biomechanical factors in running-related injuries. Mol. Cell. Biomech. 21, 530. https://doi.org/10.62617/mcb530 (2024).
Bi, W., Zhao, Y. & Zhao, H. Predicting sports injuries using machine learning: risk factors and early warning systems. Mol. Cell. Biomech. 22, 335. https://doi.org/10.62617/mcb335 (2025).
Cao, C. & Liu, X. Predicting sports injuries with machine learning technology: enhancing athletes’ life expectancy through biomechanical analysis. Mol. Cell. Biomech. 22, 1408. https://doi.org/10.62617/mcb1408 (2025).
Xiang, L. et al. Integrating an LSTM framework for predicting ankle joint biomechanics during gait using inertial sensors. Comput. Biol. Med. 170, 108016. https://doi.org/10.1016/j.compbiomed.2024.108016 (2024).
Liu, Y. et al. SETransformer: A hybrid attention-based architecture for robust human activity recognition. Preprint at https://arxiv.org/abs/2505.19369 (2025).
Paviotti, F. et al. Equilivest: A robotic vest to aid in post-stroke dynamic balance rehabilitation. Preprint at https://arxiv.org/abs/2301.06528 (2023).
Mollyn, V. et al. IMUPoser: Full-body pose estimation using IMUs in phones, watches, and earbuds. In Proc. 2023 CHI Conf. Human Factors in Computing Systems 1–12. https://doi.org/10.1145/3544548.3581392 (2023).
Carlson, H. & Carlson, N. An overview of the management of persistent musculoskeletal pain. Ther. Adv. Musculoskelet. 3, 91–99. https://doi.org/10.1177/1759720X11398742 (2011).
Chander, H. et al. Wearable stretch sensors for human movement monitoring and fall detection in ergonomics. Int. J. Environ. Res. Public. Health. 17, 3554. https://doi.org/10.3390/ijerph17103554 (2020).
Constantinescu, C., Popescu, D., Muresan, P. C. & Stana, S. I. Exoskeleton-centered process optimization in advanced factory environments. Procedia CIRP. 41, 740–745. https://doi.org/10.1016/j.procir.2015.12.051 (2016).
De Bock, S. et al. Passive shoulder exoskeletons: more effective in the lab than in the field? IEEE Trans. Neural Syst. Rehabil. Eng. 29, 173–183. https://doi.org/10.1109/TNSRE.2020.3041906 (2021).
Dehghani, M. & Dangelico, R. M. Smart wearable technologies: Current status and market orientation through a patent analysis. In 2017 IEEE International Conference on Industrial Technology (ICIT) 1570–1575. https://doi.org/10.1109/ICIT.2017.7915602 (2017).
Del Ferraro, S., Falcone, T., Ranavolo, A. & Molinaro, V. The effects of upper-body exoskeletons on human metabolic cost and thermal response during work tasks—a systematic review. Int. J. Environ. Res. Public. Health. 17, 7374. https://doi.org/10.3390/ijerph17207374 (2020).
Desbrosses, K. Manual handling tasks performed with an upper limbs exoskeleton at the workplace. Ann. Phys. Rehabil Med. 60, e101. https://doi.org/10.1016/j.rehab.2017.07.204 (2017).
Di Natali, C., Toxiri, S., Ioakeimidis, S., Caldwell, D. G. & Ortiz, J. Systematic framework for performance evaluation of exoskeleton actuators. Wearable Technol. 1, e4. https://doi.org/10.1017/wtc.2020.5 (2020).
Hidayah, R., Sui, D., Wade, K. A., Chang, B. C. & Agrawal, S. Passive knee exoskeletons in functional tasks: Biomechanical effects of a Springexo coil-spring on squats. Wearable Technol. 2, e7. https://doi.org/10.1017/wtc.2021.6 (2021).
Hoffmann, N., Prokop, G. & Weidner, R. Methodologies for evaluating exoskeletons with industrial applications. Ergonomics 1, 1–38. https://doi.org/10.1080/00140139.2021.1970823 (2021).
Iranzo, S., Piedrabuena, A., Iordanov, D., Martinez-Iranzo, U. & Lois, J. M. B. Ergonomics assessment of passive upper-limb exoskeletons in an automotive assembly plant. Appl. Ergon. 87, 103120. https://doi.org/10.1016/j.apergo.2020.103120 (2020).
Kelson, D. M. et al. Effects of passive upper-extremity exoskeleton use on motor performance in a precision task. In Proc. Hum. Factors Ergon. Soc. Annu. Meet., vol. 63, 1084–1085. https://doi.org/10.1177/1071181319631437 (2019).
Kim, S. et al. Potential of exoskeleton technologies to enhance safety, health, and performance in construction: industry perspectives and future research directions. IISE Trans. Occup. Ergon. Hum. Factors. 7, 185–191. https://doi.org/10.1080/24725838.2018.1561557 (2019).
Lim, S. & D’Souza, C. A narrative review on contemporary and emerging uses of inertial sensing in occupational ergonomics. Int. J. Ind. Ergon. 76, 102937. https://doi.org/10.1016/j.ergon.2020.102937 (2020).
Lowe, B. D., Billotte, W. G. & Peterson, D. R. Astm f48 formation and standards for industrial exoskeletons and exosuits. IISE Trans. Occup. Ergon. Hum. Factors. 7, 230–236. https://doi.org/10.1080/24725838.2019.1579769 (2019).
Luczak, T. et al. Closing the wearable gap—part v: development of a pressure-sensitive sock utilizing soft sensors. Sensors 20, 208. https://doi.org/10.3390/s20010208 (2019).
Luger, T., Bär, M., Seibt, R., Rieger, M. A. & Steinhilber, B. Using a back exoskeleton during industrial and functional tasks—effects on muscle activity, posture, performance, usability, and wearer discomfort in a laboratory trial. Hum. Factors. 11, 001872082110072. https://doi.org/10.1177/00187208211007267 (2021).
Madinei, S. et al. Assessment of two passive back-support exoskeletons in a simulated precision manual assembly task. In Proc. Hum. Factors Ergon. Soc. Annu. Meet., vol. 63, 1078–1079. https://doi.org/10.1177/1071181319631192 (2019).
Maltseva, K. Wearables in the workplace: the brave new world of employee engagement. Bus. Horiz. 63, 493–505. https://doi.org/10.1016/j.bushor.2020.03.007 (2020).
Matijevich, E. S., Scott, L. R., Volgyesi, P., Derry, K. H. & Zelik, K. E. Combining wearable sensor signals, machine learning and biomechanics to estimate tibial bone force and damage during running. Hum. Mov. Sci. 74, 102690. https://doi.org/10.1016/j.humov.2020.102690 (2020).
McFarland, T. & Fischer, S. Considerations for industrial use: A systematic review of the impact of active and passive upper limb exoskeletons on physical exposures. IISE Trans. Occup. Ergon. Hum. Factors. 7, 322–347. https://doi.org/10.1080/24725838.2019.1684399 (2019).
Näf, M. B. et al. Passive back support exoskeleton improves range of motion using flexible beams. Front. Robot AI. 5, 72. https://doi.org/10.3389/frobt.2018.00072 (2018).
Nelson, A. J., Hall, P. T., Saul, K. R. & Crouch, D. Effect of mechanically passive, wearable shoulder exoskeletons on muscle output during dynamic upper extremity movements: A computational simulation study. J. Appl. Biomech. 36, 59–67. https://doi.org/10.1123/jab.2018-0369 (2020).
Poliero, T. et al. Applicability of an active back-support exoskeleton to carrying activities. Front. Robot AI. 7, 579963. https://doi.org/10.3389/frobt.2020.579963 (2020).
Qu, X. et al. Effects of an industrial passive assistive exoskeleton on muscle activity, oxygen consumption and subjective responses during lifting tasks. PLoS ONE. 16, e0245629. https://doi.org/10.1371/journal.pone.0245629 (2021).
Remmen, L. N., Heiberg, R. F., Christiansen, D. H., Herttua, K. & Berg-Beckhoff, G. Work-related musculoskeletal disorders among occupational fishermen: A systematic literature review. Occup. Environ. Med. 78, 522–529. https://doi.org/10.1136/oemed-2020-106675 (2021).
Romero, D. et al. Digitalizing occupational health, safety and productivity for the operator 4.0. In Advances in Production Management Systems. Smart Manufacturing for Industry 4.0, vol. 536, 473–481. https://doi.org/10.1007/978-3-319-99707-0_59 (2018).
Schmalz, T. et al. Biomechanical and metabolic effectiveness of an industrial exoskeleton for overhead work. Int. J. Environ. Res. Public. Health. 16, 4792. https://doi.org/10.3390/ijerph16234792 (2019).
Sevier, T. L. The industrial athlete? Occup. Environ. Med. 57, 285. https://doi.org/10.3233/WOR-2000-00133 (2000).
Smith, E., Burch, V. R. F., Strawderman, L., Chander, H. & Smith, B. K. A comfort analysis of using smart glasses during ‘picking’ and ‘putting’ tasks. Int. J. Ind. Ergon. 83, 103133. https://doi.org/10.1016/j.ergon.2021.103133 (2021).
Svertoka, E. et al. State-of-the-art of industrial wearables: A systematic review. In 2020 13th International Conference on Communications (COMM) 411–415. https://doi.org/10.1109/COMM48946.2020.9141982 (2020).
Sylla, N., Bonnet, V., Colledani, F. & Fraisse, P. Ergonomic contribution of able exoskeleton in automotive industry. Int. J. Ind. Ergon. 44, 475–481. https://doi.org/10.1016/j.ergon.2014.03.008 (2014).
Talegaonkar, P. et al. Closing the wearable gap-part vii: A retrospective of stretch sensor tool kit development for benchmark testing. Electronics 9, 1457. https://doi.org/10.3390/electronics9091457 (2020).
Van Hooren, B., Goudsmit, J., Restrepo, J. & Vos, S. Real-time feedback by wearables in running: current approaches, challenges and suggestions for improvements. J. Sports Sci. 38, 214–230. https://doi.org/10.1080/02640414.2019.1690960 (2019).
Wang, Z. et al. A semi-active exoskeleton based on EMGs reduces muscle fatigue when squatting. Front. Neurorobot. 15, 625479. https://doi.org/10.3389/fnbot.2021.625479 (2021).
Zhang, X., Shan, G., Wang, Y., Wan, B. & Li, H. Wearables, biomechanical feedback, and human motor-skills’ learning & optimization. Appl. Sci. 9, 226. https://doi.org/10.3390/app9020226 (2019).
Xu, D. et al. Data-driven deep learning for predicting ligament fatigue failure risk mechanisms. Int. J. Mech. Sci. 301, 110519. https://doi.org/10.1016/j.ijmecsci.2025.110519 (2025).
Xu, D. et al. A new method proposed for realizing human gait pattern recognition: inspirations for the application of sports and clinical gait analysis. Gait Posture. 107, 293–305. https://doi.org/10.1016/j.gaitpost.2023.10.019 (2024).
Tedeschi, R. Exploring the potential of iPhone applications in podiatry: A comprehensive review. Egypt. Rheumatol. Rehabil. 51 (2). https://doi.org/10.1186/s43166-023-00234-5 (2024).
Tedeschi, R. et al. Wearable Technology in Rehabilitation: Assessing the Impact of the Apple Watch on Physical Activity and Cardiovascular Health. https://doi.org/10.20944/preprints202411.v1 (2024).
Sadeghi, M., Abbasimoshaei, A., Kitajima Borges, J. P. & Kern, T. A. Numerical and experimental study of a wearable exo-glove for telerehabilitation application using shape memory alloy actuators. Actuators 13, 409. https://doi.org/10.3390/act13100409 (2024).
Lee, S. M. & Park, J. A soft wearable Exoglove for rehabilitation assistance: A novel application of knitted shape-memory alloy as a flexible actuator. Fashion Textiles 11, 14. https://doi.org/10.1186/s40691-024-00377-9 (2024).
Benjaminse, A., Nijmeijer, E. M., Gokeler, A. & Di Paolo, S. Application of machine learning methods to investigate joint load in agility on the football field: creating the model, part I. Sensors 24, 3652. https://doi.org/10.3390/s24113652 (2024).
Umehara, J. et al. Skeletal muscle shape influences joint torque exertion through the mechanical advantages. J. Appl. Physiol. 138, 1119–1132. https://doi.org/10.1152/japplphysiol.00997.2024 (2025).
Turner, J. A., Chaaban, C. R. & Padua, D. A. Validation of opencap: A low-cost markerless motion capture system for lower-extremity kinematics during return-to-sport tasks. J. Biomech. 171, 112200. https://doi.org/10.1016/j.jbiomech.2024.112200 (2024).
Acknowledgements
The authors would like to acknowledge Dring Stadium Bahawalpur for their support in conducting this research. Special thanks to the Deanship of Scientific Research at Shaqra University for supporting this work.
Author information
Authors and Affiliations
Contributions
Data collection: Abdullah Alzahrani, Maysa Aljohany. Conceptualization: Abdullah Alzahrani, Hadeel Alsirhani. Methodology: Abdullah Alzahrani, Maysa Aljohany, Hadeel Alsirhani. Software: Abdullah Alzahrani, Hadeel Alsirhani. Formal analysis: Abdullah Alzahrani, Maysa Aljohany. Resources: Abdullah Alzahrani, Hadeel Alsirhani. Writing—review and editing: Abdullah Alzahrani, Maysa Aljohany, Hadeel Alsirhani.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Research involving human participants and/or animals
This research did not involve the use of human tissue samples. It involved athletes from Dring Stadium, Bahawalpur, and industrial workers from relevant occupational settings. The Institutional Review Board conducted a thorough review of the research protocol, covering aspects such as study design, data collection methods, and participant involvement. The experiments took place at Dring Stadium, Bahawalpur, under the supervision of medical staff from Bahawal Victoria Hospital and were performed in accordance with relevant guidelines and regulations. All experimental protocols were approved by the Institutional Review Board (IRB) of Bahawal Victoria Hospital.
Informed consent
Informed consent was obtained from all participating subjects, and ethical approval (Ref No: BVH-IRB-255) was secured from the Institutional Review Board (IRB) of Bahawal Victoria Hospital. The authors affirm their commitment to conducting research in accordance with the highest ethical standards and to ensuring the accuracy, transparency, and reliability of the presented findings.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Below is the link to the electronic supplementary material.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Alzahrani, A., Aljohany, M. & Alsirhani, H. Real-time wearable biomechanics framework for sports injury prevention and rehabilitation optimization. Sci Rep 16, 4436 (2026). https://doi.org/10.1038/s41598-025-34551-w