Introduction

Engaging in exercise and physical activity (PA) provides tangible health benefits at every age, especially at advanced ages. Our bodies require regular exercise or PA to maintain a healthy state. Several studies have shown that regular exercise or PA can maintain physical functions, reduce falls, prevent diseases, and enhance physical performance and daily living activities among older adults1,2,3,4,5,6. However, more than one-fourth of the world’s older adults have PA levels lower than those recommended by the World Health Organization7. Additionally, the global aging population, particularly in Eastern and Southeastern Asian countries, is expected to increase significantly to 1.5 billion by 20508. Encouraging and motivating older adults to engage in regular exercise or physical activity is crucial to support the development of a healthy and active aging society, which is a major priority for health care and socioeconomic advancement9.

Generally, community-dwelling older adults and those who attend senior day care centers are eager to exercise with their community friends. However, older adults who are not involved in these communities tend to have lower PA levels and may spend more time at home. Research indicates that such older adults can spend up to 80% of their day sedentary10, particularly those with chronic conditions or walking difficulties11. This could be due to a lack of motivation or of access to proper exercise guidance tools. Additionally, a systematic review of reviews by Zubala et al. highlighted the importance of providing older adults with sustainable options that meet their needs and preferences for engaging in PA in the long term12. Therefore, having interactive exercise training or guidance tools at home could be an excellent option to facilitate and motivate older adults to increase their exercise and PA levels in a convenient and sustainable manner.

Typically, a simple exercise guidance or training tool is a series of instructional exercise videos produced by fitness experts or exercise trainers. These videos provide two-dimensional (2D) cues to guide users through exercises. However, users watch and follow the instructor in the video without receiving feedback on their technique, which may result in incorrect execution and reduced benefits. Furthermore, in recent studies on the visual perception of 2D and three-dimensional (3D) objects, Korisky and Mudrik13 reported that participants perceive real 3D objects more readily than 2D images, and Ozana and Ganel14 reported that, unlike grasping real 3D objects, grasping 2D images is vulnerable to irrelevant perceptual information. In addition, Snow et al. reported that 3D objects are more memorable than 2D objects15 and that grasping 3D objects tends to increase attention and manual responses compared with grasping 2D objects16. Unfortunately, to the best of our knowledge, no studies have developed an exercise guidance system that uses a real 3D guidance object and provides interactive real-time feedback to the user.

Owing to technological advancements, the emergence of small flyable robots known as “drones”—small 3D objects that can move freely in 3D space—has opened new research areas and spawned numerous subfields. Drones have had a significant impact on society in various fields, including entertainment (games and sports)17,18,19,20,21, the military (high-risk exploration and reconnaissance)22, agriculture (soil and field analysis)23,24, and even health care services (drug delivery)25,26. Within human‒robot interaction, human‒drone interaction (HDI) research has grown steadily over the past decade17,18,19,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42. Some researchers and developers have applied drones in sports education, such as dancing17,18 and boxing19, or in meditation, such as Tai-Chi Qigong28,30,31, to encourage strong and intimate interactions between humans and drones.

However, the aforementioned HDI innovations for sports education are limited to young people and are not suitable for older adults to safely and effectively perform physical exercises at home, owing to either the system designs themselves or the applied tasks and environments. Although the HDI meditation system from Delfa et al.31 seems to benefit older people, its design and interaction focus mainly on mental exercise and mind–body connections rather than on physical training of the neuromusculoskeletal system. Accordingly, we recognized the potential of developing an HDI system using a palm-sized drone to encourage and guide older adults in performing physical exercises or movements, in line with the principle of promoting healthy aging. Using a drone as part of physical exercise introduces a novel approach to facilitating movement and activity through interactive technology.

In this article, we present a home-based 3D exercise drone guidance system for older adults named the Pei-Wo Drone. Pei-Wo means “accompany me” in Mandarin. Pei-Wo Drone was designed to support and encourage older adults when performing specific exercises at home. Instead of following an exercise trainer, the user will be guided by a simple and easy-to-recognize real 3D object. Additionally, Pei-Wo Drone includes an interactive feedback mechanism that enables the user to receive guidance and correction while performing the exercises to ensure proper and safe execution.

Compared to other indoor exercise guidance technologies, such as mobile applications43,44, Wii Gaming45,46,47, Kinect48,49 and wearable motion tracking systems (e.g., IMUs)50, which still rely on a 2D screen for visualization, the Pei-Wo Drone offers a distinct advantage. While systems like Wii, Kinect, and motion tracking technologies offer real-time feedback, they rely on 2D visual cues displayed on a screen, which may restrict the user’s depth perception and spatial awareness. In contrast, the Pei-Wo Drone uses a 3D movable object as a guiding medium, potentially enhancing the user’s visual perception by providing a more natural and dynamic 3D visual cue. This distinctive feature may improve engagement and help users connect more intuitively with their physical movements, creating a more immersive exercise experience.

When compared to immersive VR and AR systems, which also offer 3D visual cues and immersive experiences, the Pei-Wo Drone avoids the potential downsides of prolonged screen exposure. While VR and AR technologies can be highly immersive, they still rely on digital screens that can cause visual strain and discomfort over time, similar to 2D systems. Previous research has shown that screen-based exercise interventions, such as VR and AR, can enhance physical activity engagement and motor function across diverse populations, including healthy young individuals51,52, older adults53,54,55, and stroke patients51,56. However, prolonged exposure to digital screens has been associated with visual strain, headaches, impaired vision, and dry eyes57,58,59. It can also disrupt sleep patterns, increase stress levels, and contribute to mental health issues such as anxiety and depression60. The Pei-Wo Drone addresses these concerns by providing a 3D screen-free exercise assistance tool that not only encourages physical activity but also minimizes screen time and reduces the associated risks.

Results

Design and overview of Pei-Wo Drone, an exercise guidance system for older adults

The system consists of a palm-sized drone with a time-of-flight (ToF) sensor and an optical flow sensor deck (bottom deck) and an ultrawideband (UWB) loco positioning sensor or tag (top deck) (Fig. 1a), two wearable wrist tags (Fig. 1b, c), eight UWB loco positioning anchors (Fig. 1d and Methods), and a laptop with a USB radio dongle (Fig. 1e) to communicate with the drone and the wearable devices. The drone determines its current position from a sensor fusion algorithm, which outputs the value from the most accurate sensor for each axis of the drone. Once the program is executed and the user places the wearable devices on their wrists, the system receives the absolute positions of the drone and the wearable devices in real time, which enables it to track the drone and the participant’s wrists simultaneously. The system provides interactive sound feedback (a beep) if neither of the wrist sensors is within the threshold range of the drone’s current position (Fig. 1h). An executable program was written to control the drone’s path based on the trajectory of a preselected exercise or movement. In this study, two movements were selected: (1) the lateral arm reach movement (Fig. 1f) and (2) the arm up and down movement (Fig. 1g).
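To make the feedback rule concrete, the following minimal sketch expresses the threshold check in Python; the function names, coordinate convention, and example values are hypothetical and do not reproduce the system’s actual implementation.

```python
# Hypothetical sketch of the feedback logic described above; names, the
# axis-wise comparison, and the example values are illustrative assumptions.
THRESHOLD_M = 0.20  # +/- 20 cm acceptance range along the movement's major axis (see Methods)

def out_of_range(drone_pos, wrist_pos, axis):
    """True if the wrist deviates from the drone beyond the threshold along
    the movement's major axis (e.g., mediolateral or vertical)."""
    return abs(drone_pos[axis] - wrist_pos[axis]) > THRESHOLD_M

def should_beep(drone_pos, right_wrist, left_wrist, axis):
    # Feedback is triggered only when neither wrist is within range of the drone.
    return out_of_range(drone_pos, right_wrist, axis) and out_of_range(drone_pos, left_wrist, axis)

# Example: drone at y = 0.85 m, right wrist lagging at y = 0.55 m, left wrist at rest.
print(should_beep((1.2, 0.85, 1.0), (1.2, 0.55, 1.0), (1.2, -0.10, 1.0), axis=1))  # True
```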

Fig. 1

Schematics and images of the home-based exercise guidance system with the Pei-Wo Drone. (a) Assembly of a custom Crazyflie 2.1 nano drone with a combined ToF and optical flow sensor deck (bottom layer), motion capture marker deck (2nd layer from the top), and UWB loco positioning deck or UWB tag (top layer). The drone was powered by a 240 mAh LiPo battery (2nd layer from the bottom). Scale bar, 35 mm. (b, c) Each custom wearable wrist sensor contains a Crazyflie Bolt (bottom layer), a 240 mAh LiPo battery (middle layer), and a UWB loco positioning deck (top layer) (b). The participant wears one wrist sensor on each wrist, with the UWB sensor (a small green chip) pointing out toward the fingers (c). Scale bars, 38 mm (b) and 50 mm (c). (d) A UWB loco positioning node or UWB anchor used to build a global 3D coordinate system for the drone and wrist sensors. Scale bar, 40 mm. (e) A laptop with a Crazyradio PA 2.4 GHz USB dongle to program the drone. Scale bar, 18 mm. (f) A selected movement for this guidance system, the ‘lateral arm reach’ movement, in which the arms move in the mediolateral (ML) direction with respect to the human body. (g) Another selected movement, the ‘arm up and down’ movement, in which the arms move in the vertical direction with respect to the human body. (h) A diagram of the real-time sound feedback of the Pei-Wo Drone.

Evaluation of the Pei-Wo Drone for trajectory guidance in 3D space

The system was evaluated in terms of the accuracy (relative error) and precision (relative uncertainty) of the drone, as defined in Eqs. (1) and (2), respectively, after the drone was programmed to reach the three predefined target positioning ranges for each selected movement. We attached reflective markers on top of the drone and then tracked its output trajectories with the motion capture system while it performed each selected movement (Methods). High accuracy and precision were observed for both selected movements. Example trajectories of the drone while performing both movements at one target positioning range are shown in Fig. 2a, d.

Fig. 2

Schematics of the testing results of the autonomous drone guidance system. (a) Drone trajectories while performing the lateral arm reach movement with a target range of 0.85 m in five trials. (b, c) Accuracy and precision values of the drone while performing the lateral arm reach movement for each target range (0.80 m, 0.85 m, and 0.90 m) when guiding the right arm (b) and the left arm (c). (d) Drone trajectories while performing the arm up and down movement with a target range of 0.9 m in five trials. (e, f) Accuracy and precision values of the drone while performing the arm up and down movement for each target range (0.80 m, 0.90 m, and 1.0 m) when guiding the right arm (e) and the left arm (f). (g, h) RMS errors between the participants’ wrist positions and the drone positions in two phases with two subphases each, for the lateral arm reach movement (g) and the arm up and down movement (h).

Lateral arm reach movement. The moving trajectory of the drone for guiding the lateral arm reach movement is in the mediolateral direction of the user’s body (Fig. 1f). The accuracy and precision results of the drone while guiding the user in performing this movement are presented in Fig. 2b, c. After the results from the three preset target ranges were averaged, the accuracy and precision of the drone were approximately 88.42% (− 11.58% relative error, lower than the target range) and 78.59% (21.41% relative uncertainty), respectively, for right-arm guidance (Fig. 2b). For left-arm guidance (Fig. 2c), the average accuracy and precision of the drone were approximately 97.41% (− 2.59% relative error, lower than the target range) and 79.93% (20.07% relative uncertainty), respectively.

Arm up and down movement. In contrast, the moving trajectory of the drone for guiding the arm up and down movement is in the vertical direction with respect to the user’s anatomical position (Fig. 1g). The accuracy and precision results of the drone while guiding the user in performing this movement are presented in Fig. 2e, f. After averaging, the accuracy and precision were 99.64% (− 0.36% relative error, lower than the target range) and 97.88% (2.12% relative uncertainty), respectively, for right-arm guidance (Fig. 2e). For left-arm guidance (Fig. 2f), the accuracy and precision of the drone were approximately 92.35% (7.65% relative error, higher than the target range) and 98.77% (1.23% relative uncertainty), respectively.
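The reported accuracy and precision percentages appear to be the complements of the relative error and relative uncertainty defined in Eqs. (1) and (2) (Methods); the minimal check below, using the right-arm lateral reach values quoted above, illustrates this reading. The conversion is our interpretation of the reported numbers, not a formula stated explicitly in the text.

```python
# Illustrative check of how the reported percentages relate to Eqs. (1) and (2);
# values are taken from the text, and the conversion is an assumed reading.
relative_error_pct = -11.58       # lateral arm reach, right-arm guidance
relative_uncertainty_pct = 21.41

accuracy_pct = 100 - abs(relative_error_pct)    # 88.42
precision_pct = 100 - relative_uncertainty_pct  # 78.59
print(f"accuracy = {accuracy_pct:.2f}%, precision = {precision_pct:.2f}%")
```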

Ability of older adults to follow trajectories guided by the Pei-Wo Drone

We evaluated whether the Pei-Wo Drone system can be used as an exercise guidance tool for older adults by computing the root mean square (RMS) errors between the wrist positions of the users (n = 15, mean age: 67.40 ± 5.85 years) and the positions of the drone. The real-time positions of the drone and the participants’ wrists were tracked by an eight-camera motion capture system (Fig. 3c and Methods). Before the RMS error was analyzed, each recorded trial was segmented into two phases with two subphases each (Fig. 3d–g) for both movements. The RMS errors in each phase of the selected movement are shown in Fig. 2g and Table 1 for the lateral arm reach movement and in Fig. 2h and Table 2 for the arm up and down movement, indicating the ability of the older adults to follow the drone.

Fig. 3

Illustrations and schematics of performing two selected movements guided by Pei-Wo Drone, system and experimental setup, and movement phase analysis. (a) The lateral arm reach movement steps with respect to the starting position (1) and six reference points (2–7) of the drone while guiding the participant in performing this movement. (b) The arm up and down movement steps with respect to the starting position (1) and four reference points (2–5) of the drone while guiding the participant in performing this movement. (c) System setup with eight anchors in the LPS and experimental setup with an eight-camera motion capture system. (d, f) Phase segmentation for the analysis of the lateral arm reach movement (d) and data segmentation in MATLAB programming (f). (e, g) Phase segmentation of the arm up and down movement (e) and data segmentation in MATLAB programming (g).

Table 1 The distance errors in the mediolateral direction of lateral arm reach movement between the drone and wrist positions of all participants.
Table 2 The distance errors in the vertical direction of arm up and down movement between the drone and wrist positions of all participants.

Participant feedback on Pei-Wo Drone

Participant feedback on the Pei-Wo Drone for Baduanjin movement guidance was collected with a questionnaire using a 5-point Likert scale ranging from “Strongly Agree” to “Strongly Disagree” (Table 3). Most participants rated the drone’s guidance as clear, with strong agreement or agreement dominating the responses. They found it easier to perform correct movements under the drone’s guidance than with video or audio guidance. The system was praised for its interactive nature, safety, and comfort, with many participants agreeing that it was easy to operate and expressing satisfaction with the experience. Additionally, most participants indicated a willingness to use the system again. However, some challenges were noted, such as adapting to the drone’s guidance, and one participant suggested integrating the drone with video illustrations for enhanced practicality. Overall, the feedback reflected a positive reception with potential for improvement.

Table 3 The 5-point Likert scale questionnaire reflecting user feedback on the Pei-Wo Drone.

Discussion

We have developed an exercise guidance system to support and encourage older adults to perform regular exercises or movements at home. Unlike other physical exercise or movement guidance systems, our system, Pei-Wo Drone, guides users with a real 3D flyable object—a small palm-sized drone. The drone provides real-time sound feedback to notify users if they are not following it. The system can be used as part of a physical rehabilitation program for upper limb motor skill training. Because the drone can move three-dimensionally and its flying speed is controllable, training tasks can be customized to fit patients’ or individuals’ needs. This introduces an innovative concept of personalized physical training and rehabilitation through an interactive physical movement guidance system.

However, the maximum flight time of a small quadcopter is currently limited to 5–6 minutes19. Thus, this limitation should be considered when designing any application that uses a small drone. Additionally, the appearance, interaction type, and safety of drones should be considered when designing drones to interact with humans, as mentioned in the study by Yeh et al.42.

The processed drone trajectory data revealed that the drone motion accuracies were greater than 88% for each selected movement. However, the drone motion precision was markedly higher for the arm up and down movement (approximately 98%) than for the lateral arm reach movement (approximately 80%). This is presumably due to the limitations of the positioning sensors that the drone controller uses in different movement directions. According to the specifications of the drone and its positioning sensors (Methods), the drone determines its current position from the sensor fusion algorithm, which outputs the value from the most accurate sensor in each axis. Since the drone obtains a highly accurate absolute position along the vertical axis from the flow deck sensor, higher precision and accuracy were observed for the arm up and down movement (vertical movement) than for the lateral arm reach movement (horizontal movement).

In terms of the RMS errors between the real-time wrist positions of the users and the position of the drone during each phase of the two movements (Fig. 2g, h), the mean real-time RMS error ranged between 0.18 and 0.22 m for the lateral arm reach movement (Fig. 2g) and between 0.10 and 0.18 m for the arm up and down movement (Fig. 2h). When the results in each phase were considered, the average real-time RMS errors of the right arm (Phases 1a, 1b, and 1) tended to be smaller than those of the left arm (Phases 2a, 2b, and 2) for both selected movements. We assume that this may be due to the impact of the user’s handedness on reaction time, since almost every participant in our study was right-handed; one individual did not specify her dominant hand. Additionally, Karim et al. reported that reaction times were faster in the dominant hand than in the nondominant hand61. However, further study is needed to confirm this assumption.

After the experiment, we administered a 5-point Likert scale questionnaire (Table 3) to gather user feedback on the drone guidance system. Of the fifteen participants, ten agreed that the Pei-Wo Drone was easy to use and expressed interest in exercising with the drone again, with four participants strongly agreeing on these points. Overall, the majority of participants were satisfied with the drone’s guidance and found it to be clear. The results of this study, combined with the positive user feedback, suggest the potential for the Pei-Wo Drone to serve as an effective exercise guidance tool for older adults, supporting their physical well-being through a practical and engaging home-based approach.

Several key points of user feedback should be addressed to improve the system in future iterations. While the drone is small, lightweight (approximately 30 g), and safe for users—even in the event of a malfunction, with no participants expressing concerns about injury—it could benefit from design refinements that enhance its user-friendliness and comfort, especially when compared to traditional, simpler guidance tools like video and audio. For instance, redesigning the drone to resemble a flying animal, such as a butterfly, bird, or bee, could make it more appealing to older adult users, potentially increasing motivation to continue exercising. These design improvements would likely enhance the overall user experience.

In addition, enhancing interactive feedback by incorporating features such as colored lights or replacing the beep sound with relaxing music could heighten the sense of interaction and engagement. Adjusting the feedback threshold to create a self-adaptive system would allow the drone to better respond to individual user needs, potentially enhancing physical training outcomes. The difficulty level of a task significantly impacts user performance and cognitive load, as demonstrated in various studies62,63,64,65. Research examining the impact of game-task difficulty on cognitive load and performance suggests that different types of challenges elicit varying responses from players, and a proper level of game challenge can minimize cognitive load65. Finally, as suggested by participants, incorporating a video demonstration alongside the drone during the initial exercise session would further enhance the system’s practicality, especially for users who are unfamiliar with this type of exercise.

Furthermore, the sample size in this study was limited to 15 participants, most of whom were right-handed. While the findings offer valuable preliminary insights into the effectiveness of the Pei-Wo Drone for exercise guidance, the sample’s homogeneity poses a limitation. A larger and more diverse participant pool—including individuals from various age groups, physical conditions, and with different hand dominances—would improve the generalizability of the results. Notably, motor coordination and responsiveness may vary between right-handed and left-handed users, which could impact interaction with the system66,67 and should be explored in future research. Expanding the sample size and ensuring greater diversity would strengthen the validity and broader applicability of the study’s conclusions.

In addition, the use of a drone as a guiding medium offers a more realistic and engaging experience compared to screen-based or immersive VR/AR guidance systems. As a tangible 3D object, the drone allows users to focus directly on their physical tasks68 without the distractions or visual strain associated with digital screens. Additionally, similar to other exercise guidance systems—such as mobile applications43,44, Wii45,46,47, Kinect48,49, wearable motion trackers50, and AR/VR51,52,53,54,55,56—the Pei-Wo Drone can deliver interactive real-time feedback, allowing users to adjust and refine their movements during exercise for improved accuracy and effectiveness. However, to enhance accessibility for older adults, the system’s setup and operation should be simplified. Improving the drone’s local positioning system—similar to advancements in AR/VR headsets that no longer require external devices to establish a coordinate system—could streamline the process and make it more user-friendly.

In summary, the developed home-based 3D exercise drone guidance system for older adults, named Pei-Wo Drone, demonstrated sufficient accuracy and precision, particularly for vertical movement. However, the accuracy and precision of the drone guidance can be further improved with advancements in indoor autonomous drone technology and its positioning system. In addition, the older adult participants interacted well with the drone. According to the analysis of the real-time RMS error between the participants’ wrist positions and the drone positions, the participants tended to follow the drone well, particularly with their right arm, which was the dominant arm for most participants. Moreover, the positive feedback received from the users further supports the system’s effectiveness.

In conclusion, our home-based 3D exercise drone guidance system demonstrates the potential of utilizing an indoor autonomous drone system to facilitate human exercise and promote healthy aging. It offers promise as a versatile exercise tool and as an interactive system for home-based training or rehabilitation. The system lays a foundation for further development, incorporating the concept of personalized physical rehabilitation to support diverse user needs.

To enhance the system’s general applicability, future research could focus on refining the feedback mechanism to improve user engagement, incorporating multimodal feedback (e.g., visual cues or haptic feedback) to accommodate different user preferences, and expanding the participant pool to include a more diverse demographic. Additionally, further studies could investigate user feedback and assess the physical and cognitive benefits of training with the drone-based system compared to other exercise guidance technologies. These advancements would contribute to the system’s adaptability, effectiveness, and broader applicability in exercise and rehabilitation settings.

Methods

Materials and system integration

The Pei-Wo Drone was developed using a commercial programmable and customizable drone, the Crazyflie 2.1 (Bitcraze AB, Sweden), and an indoor loco positioning system from the same manufacturer to preprogram the drone’s path and allow it to fly autonomously. The system consists of four main parts: (1) a drone (Fig. 1a); (2) two wearable wrist sensors (Fig. 1b, c); (3) a loco positioning system (LPS) that includes eight loco positioning nodes or anchors (Figs. 1d, 3c) and three loco positioning decks or tags, one on the drone (Fig. 1a) and one on each wrist sensor (Fig. 1b, c); and (4) a laptop computer with a Crazyradio PA 2.4 GHz USB dongle (Fig. 1e), which is used for communication among the drone, wrist sensors, and LPS. The drone used in this system is a Crazyflie 2.1 micro quadcopter powered by a rechargeable 240 mAh LiPo battery. A flow deck v2 (a VL53L1x ToF sensor integrated with a PMW3901 optical flow sensor) and a loco positioning deck (tag), which is part of the LPS, are also attached. Each wrist sensor consists of a Crazyflie Bolt as the main control board, a loco positioning deck, and a 240 mAh LiPo battery. The LPS creates a global 3D coordinate system for the drone and wrist sensors. The anchors and tags communicate with each other through ultrawideband (UWB) sensors. According to the specifications of the UWB chip (Decawave DWM1000), the UWB sensor consumes little energy while communicating over a very wide bandwidth in a high radio-frequency band. Additionally, it has a maximum range of 10 m with a distance error of approximately ± 10 cm (https://www.bitcraze.io/products/loco-positioning-deck/). Moreover, the VL53L1x ToF sensor on the flow deck has an error margin of a few millimeters, depending on the surface and light conditions, and can measure distances of up to 4 m in the vertical direction (https://www.bitcraze.io/products/flow-deck-v2/).

Selected movement and drone path planning

The trajectories of the drone were preprogrammed on the basis of the specific arm movement steps of each selected motion. The endpoint of each step is a checkpoint for the drone (Fig. 3a, b). The primary direction of the lateral arm reach movement aligns with the mediolateral (ML) axis of the human body (Fig. 1f). To perform this movement (Fig. 3d), the participant first stands in the default position. Second, they extend their right leg in the ML direction while simultaneously crossing their arms over their chest. Third, they extend their right arm in the ML direction. Finally, they sweep their right arm down and return to the default position while concurrently returning their left arm to the default position. The left-side exercise is then performed with the same pattern as the right side. In contrast, the arm up and down movement involves vertical movement of the arms with respect to the human body (Fig. 1g). For this movement (Fig. 3e), the participants begin in the default position, as in the previous movement, and then reach their right arm up while simultaneously pushing their left arm down. After that, they return to the default position. For the left-side exercise, the participants perform a mirrored version of the right-side movement. The duration of each step of the “lateral arm reach” and “arm up and down” movements was determined from the Baduanjin exercise practice video (https://www.youtube.com/watch?v=oqiENrM30Yk), corresponding to the second movement (Fig. 1f) and the third movement (Fig. 1g), respectively.

To preprogram the drone’s movements, we used the starting position of the drone, which was located in front of the participant along the midline, 1.2 m away from the participant (Fig. 3c), based on an investigation by Wojciechowska et al.69 on the optimal proximity range between humans and drones. In addition, we defined six other reference points for the lateral arm reach movement (Fig. 3a) and four reference points for the arm up and down movement (Fig. 3b). However, the values of these reference points were specific to each participant and depended on their body dimensions, e.g., arm length, height, and the width between the legs after stepping in the ML direction (for the lateral arm reach movement only). Therefore, we measured the body segments of each participant before the experiment and used them as inputs to our program.
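To illustrate how participant-specific reference points could be derived from such body measurements, the sketch below shows one possible formulation; the data-class fields, coordinate convention (x forward, y mediolateral, z vertical), and geometric offsets are illustrative assumptions rather than the study’s actual formulas.

```python
from dataclasses import dataclass

# Hypothetical sketch of deriving a participant-specific drone checkpoint
# from measured body segments; names and offsets are assumptions.
@dataclass
class BodySegments:
    arm_length_m: float       # shoulder to wrist
    shoulder_height_m: float
    step_width_m: float       # distance between the feet after the lateral step

def lateral_reach_checkpoint(body: BodySegments, start_xyz, side=+1):
    """Approximate drone checkpoint for the maximal lateral arm reach.

    start_xyz: drone starting position, 1.2 m in front of the participant on the midline.
    side: +1 for the participant's right side, -1 for the left side.
    """
    x0, y0, _ = start_xyz
    # Shift laterally by the arm length plus half the stepped stance width,
    # and hold the drone near shoulder height.
    y = y0 + side * (body.arm_length_m + 0.5 * body.step_width_m)
    return (x0, y, body.shoulder_height_m)

# Example participant: 0.60 m arm, 1.35 m shoulder height, 0.50 m step width.
print(lateral_reach_checkpoint(BodySegments(0.60, 1.35, 0.50), (1.2, 0.0, 1.0)))
# -> (1.2, 0.85, 1.35)
```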

Drone control and interactive feedback

We used the Python 3.7 API and the open-source library from Bitcraze (https://github.com/bitcraze) to program the drone, allowing it to fly autonomously along the paths we designed. For drone control, the sensor fusion algorithm developed by Bitcraze determines the drone’s current position from the most accurate sensor attached to it. For vertical movement, the drone can obtain its absolute position from either the UWB tag or the flow deck; because the flow deck sensor has a smaller error than the UWB tag, the drone refers mainly to the flow deck for its current vertical position. For horizontal movement, the drone can obtain its absolute position only from the UWB sensor, which has an error of approximately ± 10 cm. To obtain the real-time positions of the participant’s wrists and generate sound feedback to interact with the participant, the participant must wear the wrist sensors (Fig. 1c). To obtain the best accuracy when more than one tag is used in the LPS, as recommended on the official Bitcraze website, we used the time difference of arrival 2 (TDoA2) method with eight anchors to transmit signals among the anchors and tags (https://www.bitcraze.io/documentation/system/positioning/loco-positioning-system/).
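As a rough illustration of how such autonomous waypoint flight can be scripted with the Bitcraze Python library (cflib), the following minimal sketch commands a drone through a short list of checkpoints; the radio URI, checkpoint coordinates, and dwell times are placeholders, and the snippet does not reproduce the study’s actual control program.

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.positioning.position_hl_commander import PositionHlCommander

# Placeholder radio URI and checkpoints (x, y, z) in the LPS frame, metres.
URI = 'radio://0/80/2M/E7E7E7E7E7'
CHECKPOINTS = [(1.2, 0.0, 1.0), (1.2, 0.85, 1.0), (1.2, 0.0, 1.0)]

def fly_guidance_path():
    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        # PositionHlCommander takes off on entry and lands on exit.
        with PositionHlCommander(scf, default_velocity=0.3, default_height=1.0) as pc:
            for x, y, z in CHECKPOINTS:
                pc.go_to(x, y, z)
                time.sleep(2.0)  # dwell so the user can match the pose

if __name__ == '__main__':
    fly_guidance_path()
```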

In this study, we developed a drone guidance system to assist older adult participants in performing exercise-based movements by providing real-time interactive audio feedback. If a participant fails to follow the drone (i.e., when the absolute positions of their wrists exceed the defined threshold range), the system immediately emits a “beep” sound through the laptop’s speaker. This sound has a constant frequency of 2500 Hz and lasts 500 ms per beep. The threshold range refers to the acceptable error gap between the drone and the participant’s wrists in the major direction of each selected movement (e.g., the major direction of the arm up and down movement is aligned with the vertical axis relative to the anatomical position). In other words, the system does not provide feedback (a beep) if the wrist sensor positions remain within the threshold range (± 20 cm, the summed errors of the UWB tags (± 10 cm each) on the drone and the wrist sensor). The volume of the feedback was adjusted according to the participants’ preferences, and before data collection we always confirmed with the participants that they could hear the sound feedback clearly.
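The paper specifies the tone (2500 Hz, 500 ms) but not how it was generated; one possible realization on a Windows laptop, using only the standard library, is sketched below. In practice, such a call would be triggered from the monitoring loop whenever the wrist deviation exceeds the ± 20 cm threshold described above.

```python
import winsound  # Windows-only standard-library module; how the tone was
                 # actually generated in the study is not specified, so this
                 # is only one possible realization.

BEEP_FREQ_HZ = 2500     # constant frequency reported in the text
BEEP_DURATION_MS = 500  # duration per beep reported in the text

def play_feedback_beep():
    """Emit the feedback tone through the laptop speaker."""
    winsound.Beep(BEEP_FREQ_HZ, BEEP_DURATION_MS)
```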

System validation

To validate the drone flight for each selected movement, the output trajectories of the drone in the major flight axis, observed by the motion capture system, were used to calculate the accuracy and precision of the drone. We defined three target positioning ranges for each movement, and each target range was tested in five trials. The target ranges for the lateral arm reach movement were 0.80, 0.85, and 0.90 m from the drone position after takeoff (along the ML direction). The target ranges for the arm up and down movement were 0.80, 0.90, and 1.0 m above the drone position after takeoff (along the vertical direction).

The selection of these specific target positioning ranges was based on the approximate movement ranges programmed for the drone during real-time guidance. For instance, in Phase 1 of the lateral arm reach movement, the target range along the ML axis extended from the drone’s starting position (Fig. 3a-1) or Event 0 (Fig. 3d) to the maximum range it reached (Fig. 3a-3) or Event 2 (Fig. 3d). Similarly, in Phase 1 of the arm up and down movement, the target range along the vertical axis spanned from the drone’s starting position (Fig. 3b-1) or Event 0 (Fig. 3e) to the maximum range achieved (Fig. 3b-2) or Event 2 (Fig. 3e).

These target ranges were applied for both right (Phase 1) and left (Phase 2) arm guidance. The accuracy and precision of the drone guidance system were quantified using the relative error, as given in Eq. (1), and the relative uncertainty, as given in Eq. (2), respectively70.

$$\text{Accuracy (relative error)} = \left| \frac{\text{measured value} - \text{target range}}{\text{target range}} \right|$$
(1)
$$\text{Precision (relative uncertainty)} = \left| \frac{\text{standard deviation of measured value (uncertainty)}}{\text{measured value}} \right|$$
(2)

MATLAB software was used to analyze the data and calculate the accuracy and precision of the drone. For each target range, the measured value was the mean of the five recorded trials, and the uncertainty was the standard deviation across those trials.
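A Python equivalent of this calculation might look as follows; the five trial values are invented for illustration, and the use of the sample standard deviation is an assumption.

```python
import numpy as np

# Hypothetical Python counterpart of the MATLAB calculation of Eqs. (1) and (2).
def accuracy_and_precision(trial_values, target_range):
    measured = np.mean(trial_values)                           # mean of the five trials
    rel_error = (measured - target_range) / target_range       # Eq. (1), signed
    rel_uncertainty = np.std(trial_values, ddof=1) / measured  # Eq. (2); sample SD assumed
    accuracy_pct = 100 * (1 - abs(rel_error))
    precision_pct = 100 * (1 - abs(rel_uncertainty))
    return accuracy_pct, precision_pct

# Illustrative measured drone displacements (m) for a 0.85 m target range.
trials = [0.76, 0.74, 0.77, 0.73, 0.75]
print(accuracy_and_precision(trials, target_range=0.85))
```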

Participant recruitment

In this study, we recruited participants who were aged 60 years or older, were capable of performing regular exercise, were proficient in Mandarin Chinese—the language used in the Baduanjin practice video—and had no prior experience with or practice of the Baduanjin exercise. Participants were recruited from various communities in Tainan, Taiwan. A total of fifteen healthy individuals (mean age: 67.40 ± 5.85 years; twelve females and three males) participated in this study to evaluate the usability of the Pei-Wo Drone as an exercise guidance tool for older adults. Fourteen participants were right-handed, while the dominant hand of one participant was not specified. None of the participants had any illnesses or conditions, such as musculoskeletal or chronic diseases, that would prevent them from engaging in physical activities or regular exercise. The study design and protocol were approved by the National Cheng Kung University Human Research Ethics Committee (NCKU HREC) under approval number NCKU HREC_E_109-423-2. All methods were conducted in accordance with the relevant guidelines and regulations. Additionally, the study was registered at http://www.clinicaltrials.gov under the identifier NCT05362214.

Data collection

An eight-camera Kestrel-4200 digital real-time motion analysis system (Motion Analysis Corp., Santa Rosa, CA) was used to record the trajectories of the drone and the participant’s wrists at a sampling rate of 120 Hz. The markers on the participant’s wrists were ball-shaped reflective markers 25 mm in diameter, whereas the markers on the drone were ball-shaped reflective markers 6.5 mm in diameter, fixed on the motion capture marker deck of the drone (Fig. 1a). For every participant, we recorded motion data for five trials of each selected movement performed with the Pei-Wo Drone.

Data analysis and phase segmentation of each selected movement

The recorded data from the motion capture system were postprocessed in MATLAB. A 4th-order, zero-lag, low-pass Butterworth filter with a 20 Hz cutoff frequency was applied to remove high-frequency noise71. Before the RMS errors between the participant’s wrist positions and the drone positions were analyzed, the data were divided into two phases, each with two subphases, as follows (a minimal sketch of the filtering and RMS computation is given after the phase definitions below).

Lateral arm reach movement (Fig. 3d):

  • Phase 1a: The participant stood in the default position. They simultaneously extended their right leg along the ML direction while crossing their arms over their chest. After that, they extended their right arm along the ML direction.

  • Phase 1b: The participant swept their right arm down to move it back to the default position and returned their left arm back to the default position at the same time.

  • Phase 1: Phase 1a + Phase 1b

  • Phase 2a: Like phase 1a but for the left side of the participant instead (mirrored movement of that in phase 1a).

  • Phase 2b: Like phase 1b but for the left side of the participant instead (mirrored movement of that in phase 1b).

  • Phase 2: Phase 2a + Phase 2b

Arm up and down movement (Fig. 3e):

  • Phase 1a: The participant stood in the default position. Then, they raised their right arm while simultaneously lowering their left arm.

  • Phase 1b: The participant returned both arms back to the default position.

  • Phase 1: Phase 1a + Phase 1b

  • Phase 2a: Like phase 1a but for the left side of the participant instead (mirrored movement of that in phase 1a).

  • Phase 2b: Like phase 1b but for the left side of the participant instead (mirrored movement of that in phase 1b).

  • Phase 2: Phase 2a + Phase 2b

The data in each phase were analyzed separately. For the right-side phases (Phases 1a, 1b, and 1), only the positioning data of the right wrist were analyzed, whereas for the left-side phases (Phases 2a, 2b, and 2), only the positioning data of the left wrist were analyzed.
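For completeness, the sketch below shows how the filtering and per-phase RMS error described in this subsection could be reproduced in Python under stated assumptions; the phase boundary indices would come from the event segmentation illustrated in Fig. 3d, e, and the exact MATLAB implementation may differ.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hypothetical Python counterpart of the MATLAB post-processing: a 2nd-order
# Butterworth design applied forward and backward with filtfilt gives a
# zero-lag filter with an effective 4th order (whether the reported order
# refers to the single-pass or dual-pass filter is an assumption here).
FS_HZ = 120.0     # motion capture sampling rate
CUTOFF_HZ = 20.0  # low-pass cutoff

def lowpass_zero_lag(signal_1d):
    b, a = butter(2, CUTOFF_HZ / (FS_HZ / 2), btype='low')
    return filtfilt(b, a, signal_1d)

def phase_rms_error(wrist_major_axis, drone_major_axis, start_idx, end_idx):
    """RMS error between the wrist and drone positions along the movement's
    major axis within one phase (indices mark the phase boundaries)."""
    wrist = lowpass_zero_lag(np.asarray(wrist_major_axis, dtype=float))
    drone = lowpass_zero_lag(np.asarray(drone_major_axis, dtype=float))
    diff = wrist[start_idx:end_idx] - drone[start_idx:end_idx]
    return float(np.sqrt(np.mean(diff ** 2)))
```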