Abstract
Active and healthy aging is in great demand in an aging society. Maintaining a high level of physical function is a key aspect of healthy aging. However, many older adults engage in low levels of exercise and physical activity, often due to limited motivation or access to proper guidance. Here, we demonstrate a novel home-based exercise guidance system named Pei-Wo Drone, which assists users in performing exercises or physical movements effectively in limited space. By following the drone’s trajectory and receiving real-time feedback, users can ensure correct movements and make necessary adjustments. The system provides real-time continuous sound feedback to notify users if they deviate from the drone’s path. The study results indicate that older adult participants (12 females, 3 males; 67.40 ± 5.85 years) successfully followed the drone’s movements. In summary, using a drone with real-time feedback to guide physical exercises has the potential to support healthy aging in older adults.
Introduction
Engaging in exercise and physical activity (PA) provides tangible health benefits at every age, especially at advanced ages. Our bodies require regular exercise or PA to maintain a healthy state. Several studies have shown that regular exercise or PA can maintain physical functions, reduce falls, prevent diseases, and enhance physical performance and daily living activities among older adults1,2,3,4,5,6. However, more than one-fourth of the world’s older adults have PA levels lower than those recommended by the World Health Organization7. Additionally, the global aging population, particularly in Eastern and Southeastern Asian countries, is expected to increase significantly to 1.5 billion by 20508. Encouraging and motivating older adults to engage in regular exercise or physical activity is crucial to support the development of a healthy and active aging society, which is a major priority for health care and socioeconomic advancement9.
Generally, community-dwelling older adults and those who attend senior day care centers are eager to exercise with their community friends. However, older adults who are not involved in these communities tend to have lower levels of PA and may spend more time at home; research indicates that such individuals can spend up to 80% of their day sedentary10, particularly those with chronic conditions or walking difficulties11. This could be due to a lack of motivation or access to proper exercise guidance tools. Additionally, a systematic review of reviews by Zubala et al. highlighted the importance of providing older adults with sustainable options that meet their needs and preferences for engaging in PA in the long term12. Therefore, having interactive exercise training or guidance tools at home could be an excellent option to facilitate and motivate older adults to increase their exercise and PA levels in a convenient and sustainable manner.
Typically, a simple exercise guidance or training tool is a series of instructional exercise videos produced by fitness experts or exercise trainers. These videos provide two-dimensional (2D) cues to guide users through exercises. However, users watch and follow the instructor in the video without receiving feedback on their technique, which may result in incorrect execution and reduced benefits. Furthermore, in recent studies on the visual perception of 2D and three-dimensional (3D) objects, Korisky and Mudrik13 reported that participants perceive real 3D objects more readily than 2D images, and Ozana and Ganel14 reported that grasping 2D images, unlike grasping real 3D objects, is vulnerable to irrelevant perceptual information. In addition, Snow et al. reported that 3D objects are more memorable than 2D objects15 and that grasping 3D objects tends to increase attention and manual responses compared with grasping 2D objects16. Unfortunately, to the best of our knowledge, no studies have developed an exercise guidance system that uses a real 3D guidance object and provides interactive real-time feedback to the user.
Owing to technological advancements, the emergence of small flyable robots, known as “drones”—small 3D objects that can move freely in 3D space—has opened new research areas and had a significant impact on society in various fields, including entertainment (games and sports)17,18,19,20,21, military (high-risk exploration and spying)22, agriculture (soil and field analysis)23,24, and even health care services (drug delivery)25,26. In the area of human‒robot interaction, human‒drone interaction (HDI) research has grown continually over the past decade17,18,19,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42. Some researchers and developers have applied drones in sports education, such as dancing17,18 and boxing19, or in meditation, such as Tai-Chi Qigong28,30,31, to encourage strong and intimate interactions between humans and drones.
However, the aforementioned HDI innovations for sports education target young people and are not suitable for older adults to safely and effectively perform physical exercises at home, because of either the system designs themselves or the applied tasks and environments. Although the HDI in meditation from Delfa et al.31 seems to benefit older people, the system design and interaction focus mainly on mental exercise and mind–body connections rather than on physical training of the neuromusculoskeletal system. Accordingly, we recognized the potential of developing an HDI system using a palm-sized drone to encourage and guide older adults in performing physical exercises or movements, as a proof of principle for promoting healthy aging. Using a drone as part of physical exercise introduces a novel approach to facilitating movement and activity through interactive technology.
In this article, we present a home-based 3D exercise drone guidance system for older adults named the Pei-Wo Drone. Pei-Wo means “accompany me” in Mandarin. Pei-Wo Drone was designed to support and encourage older adults when performing specific exercises at home. Instead of following an exercise trainer, the user will be guided by a simple and easy-to-recognize real 3D object. Additionally, Pei-Wo Drone includes an interactive feedback mechanism that enables the user to receive guidance and correction while performing the exercises to ensure proper and safe execution.
Compared to other indoor exercise guidance technologies, such as mobile applications43,44, Wii Gaming45,46,47, Kinect48,49 and wearable motion tracking systems (e.g., IMUs)50, which still rely on a 2D screen for visualization, the Pei-Wo Drone offers a distinct advantage. While systems like Wii, Kinect, and motion tracking technologies offer real-time feedback, they rely on 2D visual cues displayed on a screen, which may restrict the user’s depth perception and spatial awareness. In contrast, the Pei-Wo Drone uses a 3D movable object as a guiding medium, potentially enhancing the user’s visual perception by providing a more natural and dynamic 3D visual cue. This distinctive feature may improve engagement and help users connect more intuitively with their physical movements, creating a more immersive exercise experience.
When compared to immersive VR and AR systems, which also offer 3D visual cues and immersive experiences, the Pei-Wo Drone avoids the potential downsides of prolonged screen exposure. While VR and AR technologies can be highly immersive, they still rely on digital screens that can cause visual strain and discomfort over time, similar to 2D systems. Previous research has shown that screen-based exercise interventions, such as VR and AR, can enhance physical activity engagement and motor function across diverse populations, including healthy young individuals51,52, older adults53,54,55, and stroke patients51,56. However, prolonged exposure to digital screens has been associated with visual strain, headaches, impaired vision, and dry eyes57,58,59. It can also disrupt sleep patterns, increase stress levels, and contribute to mental health issues such as anxiety and depression60. The Pei-Wo Drone addresses these concerns by providing a 3D screen-free exercise assistance tool that not only encourages physical activity but also minimizes screen time and reduces the associated risks.
Results
Design and overview of Pei-Wo Drone, an exercise guidance system for older adults
The system consists of a palm-sized drone with a time-of-flight (ToF) sensor and an optical flow sensor deck (bottom deck) and an ultrawideband (UWB) loco positioning sensor or tag (top deck) (Fig. 1a), two wearable wrist tags (Fig. 1b, c), eight UWB loco positioning anchors (Fig. 1d and Methods), and a laptop with a USB radio dongle (Fig. 1e) to communicate with the drone and the wearable devices. The drone determines its current position from a sensor fusion algorithm, which outputs the value from the most accurate sensor in each rotational axis of the drone. Once the program is executed and the user places the wearable devices on their wrists, the system receives the absolute positions of the drone and the wearable devices in real time, which enables it to track the drone and the participant’s wrists simultaneously. The system provides interactive sound feedback (a beep) if neither of the wrist sensors is within the threshold range of the drone’s current position (Fig. 1h). An executable program was written to control the drone’s path based on the trajectory of a preselected exercise or movement. In this study, two movements were selected: (1) the lateral arm reach movement (Fig. 1f) and (2) the arm up and down movement (Fig. 1g).
Schematics and images of the home-based exercise guidance system with Pei-Wo Drone. (a) Assembly of a custom Crazyflie 2.1 nano drone with a combined ToF and optical flow sensor deck (bottom layer), motion capture marker deck (the 2nd layer from the top), and UWB loco positioning deck or UWB tag (top layer). The drone was powered by a 240 mAh LiPo battery (the 2nd layer from the bottom). Scale bar, 35 mm. (b, c) Each custom wearable wrist sensor contains a Crazyflie Bolt (bottom layer), 240 mAh LiPo battery (middle layer), and UWB loco positioning deck (top layer) (b). The participant must wear one wrist sensor on each wrist, pointing the UWB sensor (a small green chip) out toward the fingers (c). Scale bars, 38 mm (b) and 50 mm (c). (d) A UWB loco positioning node or UWB anchor for building up a global 3D coordinate system for the drone and wrist sensors. Scale bar, 40 mm. (e) A laptop with a Crazyradio PA 2.4 GHz USB dongle to program the drone. Scale bar, 18 mm. (f) A selected movement for this guidance system named the ‘lateral arm reach’ movement, where the arms move in the ML direction with respect to the human body. (g) Another selected movement named the ‘arm up and down’ movement, where the arms move in the vertical direction with respect to the human body. (h) A diagram of the real-time sound feedback of the Pei-Wo Drone.
Evaluation of the Pei-Wo Drone for trajectory guidance in 3D space
The system was evaluated in terms of the accuracy (relative error) and precision (relative uncertainty) of the drone, as shown in Eqs. (1) and (2), respectively, after it was programmed to reach three predefined target positioning ranges for each selected movement. We attached reflective markers to the top of the drone and tracked its output trajectories with the motion capture system while it performed each selected movement (Methods). High accuracy and precision were observed for both selected movements. Example trajectories of the drone performing both movements with a target positioning range are shown in Fig. 2a, d.
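As an illustration of this evaluation, the per-target computation can be sketched as follows. Eqs. (1) and (2) are not reproduced here, so this sketch assumes the conventional definitions — relative error = (mean reached position − target)/target and relative uncertainty = sample standard deviation/mean, both in percent — with accuracy = 100% − |relative error| and precision = 100% − relative uncertainty. The five trial values below are hypothetical.

```python
import statistics

def accuracy_precision(trials, target):
    """Estimate accuracy and precision (%) over repeated trials.

    Assumed definitions (not taken verbatim from Eqs. (1)-(2)):
    relative error       = (mean - target) / target * 100
    relative uncertainty = stdev / mean * 100
    """
    mean = statistics.mean(trials)
    rel_error = (mean - target) / target * 100.0
    rel_uncertainty = statistics.stdev(trials) / mean * 100.0
    return 100.0 - abs(rel_error), 100.0 - rel_uncertainty

# Hypothetical peak positions (m) over five trials for the 0.85 m lateral target
acc, prec = accuracy_precision([0.80, 0.78, 0.82, 0.79, 0.81], target=0.85)
```

With these illustrative trials, the mean (0.80 m) undershoots the 0.85 m target, giving an accuracy near 94% and a precision near 98%, mirroring how the paper reports “lower than the target range” alongside the percentage.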
Schematics of the testing results of the autonomous drone guiding system. (a) Drone trajectories while performing the lateral arm reach movement with a target range of 0.85 m in five trials. (b, c) Accuracy and precision values of the drone while performing the lateral arm reach movement for each target range (0.80 m, 0.85 m, and 0.90 m) when guiding the right arm (b) and the left arm (c). (d) Drone trajectories while performing the arm up and down movement with a target range of 0.9 m in five trials. (e, f) Accuracy and precision values of the drone while performing the arm up and down movement for each target range (0.80 m, 0.90 m, and 1.0 m) when guiding the right arm (e) and the left arm (f). (g, h) RMS errors between the participants’ wrist positions and positions of the drone in two phases with two subphases each; for lateral arm reach movement (g) and arm up and down movement (h).
Lateral arm reach movement. The moving trajectory of the drone for guiding the lateral arm reach movement is in the mediolateral direction of the user’s body (Fig. 1f). The accuracy and precision results of the drone while guiding the user in performing this movement are presented in Fig. 2b, c. After the results from the three preset positions were averaged, the accuracy and precision of the drone were approximately 88.42% (− 11.58% relative error, lower than the target range) and 78.59% (21.41% relative uncertainty), respectively, for right-arm guidance (Fig. 2b). For left-arm guidance (Fig. 2c), the average accuracy and precision of the drone were approximately 97.41% (− 2.59% relative error, lower than the target range) and 79.93% (20.07% relative uncertainty), respectively.
Arm up and down movement. On the other hand, the moving trajectory of the drone for guiding the arm up and down movement is in the vertical direction relative to the user’s anatomical position (Fig. 1g). The results of the accuracy and precision of the drone while guiding the user in performing this movement are presented in Fig. 2e, f. After averaging, the results were 99.64% accurate (− 0.36% relative error, lower than the target range) and 97.88% precise (2.12% relative uncertainty) for right-arm guidance (Fig. 2e). Moreover, for left-arm guidance (Fig. 2f), the accuracy and precision of the drone were approximately 92.35% (7.65% relative error, higher than the target range) and 98.77% (1.23% relative uncertainty), respectively.
Ability of older adults to follow trajectories guided by the Pei-Wo Drone
We evaluated whether the Pei-Wo Drone system can be used as an exercise guidance tool for older adults by observing the root mean square (RMS) errors between the wrist positions of the users (n = 15, mean age: 67.40 ± 5.85 years) and the positions of the drone. The real-time positions of the drone and the participant’s wrists were tracked by an eight-camera motion capture system (Fig. 3c and Methods). Before the RMS error was analyzed, each recording was segmented into two phases with two subphases each (Fig. 3d-g) for both movements. The RMS errors in each phase of the selected movement are shown in Fig. 2g and Table 1 for the lateral arm reach movement and in Fig. 2h and Table 2 for the arm up and down movement, indicating the ability of the older adults to follow the drone.
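The per-phase RMS error described above can be sketched in a few lines. The sketch below uses hypothetical one-dimensional samples along the movement's major axis (ML for the lateral reach, vertical for arm up and down); the actual analysis was performed in MATLAB on motion capture data.

```python
import math

def rms_error(wrist, drone):
    """RMS error (m) between time-synchronized wrist and drone positions,
    sampled along the movement's major axis within one phase."""
    assert len(wrist) == len(drone)
    return math.sqrt(sum((w - d) ** 2 for w, d in zip(wrist, drone)) / len(wrist))

# Hypothetical vertical samples (m): the wrist lags the drone slightly mid-phase
drone_z = [0.0, 0.2, 0.4, 0.6, 0.8]
wrist_z = [0.0, 0.1, 0.3, 0.5, 0.8]
err = rms_error(wrist_z, drone_z)  # one such value per phase and subphase
```

Computing this separately per phase and subphase, as in Fig. 2g, h, lets transient lag at movement reversals be distinguished from steady tracking error.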
Illustrations and schematics of performing two selected movements guided by Pei-Wo Drone, system and experimental setup, and movement phase analysis. (a) The lateral arm reach movement steps with respect to the starting position (1) and six reference points (2–7) of the drone while guiding the participant in performing this movement. (b) The arm up and down movement steps with respect to the starting position (1) and four reference points (2–5) of the drone while guiding the participant in performing this movement. (c) System setup with eight anchors in the LPS and experimental setup with an eight-camera motion capture system. (d, f) Phase segmentation for the analysis of the lateral arm reach movement (d) and data segmentation in MATLAB programming (f). (e, g) Phase segmentation of the arm up and down movement (e) and data segmentation in MATLAB programming (g).
Participant feedback on Pei-Wo Drone
Participant feedback on the Pei-Wo Drone for Baduanjin movement guidance was collected via a questionnaire using a 5-point Likert scale ranging from “Strongly Agree” to “Strongly Disagree” (Table 3). Most participants rated the drone’s guidance as clear, with strong agreement or agreement dominating the responses. They found it easier to perform correct movements under the drone’s guidance than with video or audio. The system was praised for its interactive nature, safety, and comfort, with many agreeing that it was easy to operate and expressing satisfaction with the experience. Additionally, most participants indicated a willingness to use the system again. However, some challenges were noted, such as adapting to the drone’s guidance, and one participant suggested integrating the drone with video illustrations for enhanced practicality. Overall, the feedback reflected positive reception with potential for improvement.
Discussion
We have developed an exercise guidance system to support and encourage older adults to perform regular exercises or movements at home. Unlike other physical exercise or movement guidance systems, our system, Pei-Wo Drone, guides users with a real 3D flyable object—a small palm-sized drone. The drone provides real-time sound feedback to notify users if they are not following it. The system can be used as part of a physical rehabilitation program for upper limb motor skill training. Because the drone can move three-dimensionally and its flying speed is controllable, training tasks can be customized to fit patients’ or individuals’ needs. This introduces an innovative concept of personalized physical training and rehabilitation through an interactive physical movement guidance system.
However, the maximum flight time of a small quadcopter is currently limited to 5–6 min19. Thus, this limitation should be considered when designing any type of application using a small drone. Additionally, the appearance, interaction type, and safety of drones should be considered when designing drones to interact with humans, as mentioned in a study by Yeh et al.42.
The processed drone trajectory data revealed that the drone motion accuracies were greater than 88% for each selected movement. However, the drone motion precision was notably higher in the arm up and down movement (approximately 98%) than in the lateral arm reach movement (approximately 80%). This is presumably due to the limitations of the positioning sensors that the drone controller uses in different movement directions. According to the specifications of the drone and the positioning sensors we used on it (Methods), the drone determines its current position from the sensor fusion algorithm, which outputs the value from the most accurate sensor in each axis. Since the drone can obtain a highly accurate absolute position in the vertical axis from the flow deck sensor, higher precision and higher accuracy were observed in the arm up and down movement (vertical movement) than in the lateral arm reach movement (horizontal movement).
In terms of the RMS errors between the real-time wrist positions of the users and the position of the drone during each phase of the two movements (Fig. 2g, h), the mean real-time RMS error ranged between 0.18 and 0.22 m for the lateral arm reach movement (Fig. 2g), whereas for the arm up and down movement, it ranged between 0.10 and 0.18 m (Fig. 2h). Considering the results in each phase, the averages of the real-time RMS errors of the right arm (phases 1a, 1b, and 1) tended to be smaller than those of the left arm (phases 2a, 2b, and 2) for both selected movements. We assume that this may be due to the impact of the user’s handedness on reaction time, since almost every participant in our study was right-handed; one individual did not specify which hand was her dominant hand. Additionally, Karim et al. reported that the reaction time of all participants was faster in the dominant hand than in the nondominant hand61. However, further study is needed to confirm this assumption.
After the experiment, we administered a 5-point Likert scale questionnaire (Table 3) to gather user feedback on the drone guidance system. Out of fifteen participants, ten agreed that the Pei-Wo Drone was easy to use and expressed interest in exercising with the drone again, with four participants strongly agreeing on these points. Overall, the majority of participants were satisfied with the drone’s guidance and found it to be clear. The results of this study, combined with positive user feedback, suggest the potential for the Pei-Wo Drone to serve as an effective exercise guidance tool for older adults, supporting their physical well-being through a practical and engaging home-based approach.
Several key points of user feedback should be addressed to improve the system in future iterations. While the drone is small, lightweight (approximately 30 g), and safe for users—even in the event of a malfunction, with no participants expressing concerns about injury—it could benefit from design refinements that enhance its user-friendliness and comfort, especially when compared to traditional, simpler guidance tools like video and audio. For instance, redesigning the drone to resemble a flying animal, such as a butterfly, bird, or bee, could make it more appealing to older adult users, potentially increasing motivation to continue exercising. These design improvements would likely enhance the overall user experience.
In addition, enhancing interactive feedback by incorporating features such as colored lights or replacing the beep sound with relaxing music could heighten the sense of interaction and engagement. Adjusting the feedback threshold to create a self-adaptive system would allow the drone to better respond to individual user needs, potentially enhancing physical training outcomes. The difficulty level of a task significantly impacts user performance and cognitive load, as demonstrated in various studies62,63,64,65. Research examining the impact of game-task difficulty on cognitive load and performance suggests that different types of challenges elicit varying responses from players, and a proper level of game challenge can minimize cognitive load65. Finally, as suggested by participants, incorporating a video demonstration alongside the drone during the initial exercise session would further enhance the system’s practicality, especially for users who are unfamiliar with this type of exercise.
Furthermore, the sample size in this study was limited to 15 participants, most of whom were right-handed. While the findings offer valuable preliminary insights into the effectiveness of the Pei-Wo Drone for exercise guidance, the sample’s homogeneity poses a limitation. A larger and more diverse participant pool—including individuals from various age groups, physical conditions, and with different hand dominances—would improve the generalizability of the results. Notably, motor coordination and responsiveness may vary between right-handed and left-handed users, which could impact interaction with the system66,67 and should be explored in future research. Expanding the sample size and ensuring greater diversity would strengthen the validity and broader applicability of the study’s conclusions.
In addition, the use of a drone as a guiding medium offers a more realistic and engaging experience compared to screen-based or immersive VR/AR guidance systems. As a tangible 3D object, the drone allows users to focus directly on their physical tasks68 without the distractions or visual strain associated with digital screens. Additionally, similar to other exercise guidance systems—such as mobile applications43,44, Wii45,46,47, Kinect48,49, wearable motion trackers50, and AR/VR51,52,53,54,55,56—the Pei-Wo Drone can deliver interactive real-time feedback, allowing users to adjust and refine their movements during exercise for improved accuracy and effectiveness. However, to enhance accessibility for older adults, the system’s setup and operation should be simplified. Improving the drone’s local positioning system—similar to advancements in AR/VR headsets that no longer require external devices to establish a coordinate system—could streamline the process and make it more user-friendly.
In summary, the developed home-based 3D exercise drone guidance system for older adults, named Pei-Wo Drone, demonstrated sufficient accuracy and precision, particularly for vertical movement. However, the accuracy and precision of drone guidance can be further improved with advancements in indoor autonomous drone technology and its positioning system. In addition, the interaction between the older adult participants and the drone was encouraging. According to the analysis of the real-time RMS error between the participants’ wrist positions and the drone positions, the participants tended to follow the drone well, particularly with their right arm, which was the dominant arm for most participants. Moreover, the positive feedback received from the users of the system further supports its effectiveness.
In conclusion, our home-based 3D exercise drone guidance system demonstrates the potential of utilizing an indoor autonomous drone system to facilitate human exercise and promote healthy aging. It offers promise as a versatile exercise tool and as an interactive system for home-based training or rehabilitation. The system lays a foundation for further development, incorporating the concept of personalized physical rehabilitation to support diverse user needs.
To enhance the system’s general applicability, future research could focus on refining the feedback mechanism to improve user engagement, incorporating multimodal feedback (e.g., visual cues or haptic feedback) to accommodate different user preferences, and expanding the participant pool to include a more diverse demographic. Additionally, further studies could investigate user feedback and assess the physical and cognitive benefits of training with the drone-based system compared to other exercise guidance technologies. These advancements would contribute to the system’s adaptability, effectiveness, and broader applicability in exercise and rehabilitation settings.
Methods
Materials and system integration
Pei-Wo Drone was developed using a commercial programmable and customizable drone, the Crazyflie 2.1 (Bitcraze AB, Sweden), and an indoor loco positioning system from the same manufacturer to preprogram the drone’s path and allow it to fly autonomously. The system consists of four main parts: (1) a drone (Fig. 1a); (2) two wearable wrist sensors (Fig. 1b, c); (3) a loco positioning system (LPS) that includes eight loco positioning nodes or anchors (Figs. 1d, 3c) and three loco positioning decks or tags, one on the drone (Fig. 1a) and one on each wrist sensor (Fig. 1b, c); and (4) a laptop computer with a Crazyradio PA 2.4 GHz USB dongle (Fig. 1e), which is used for communication among the drone, wrist sensors, and LPS. The drone we used in this system is a Crazyflie 2.1 micro quadcopter powered by a rechargeable 240 mAh LiPo battery. A flow deck v2 (a VL53L1x ToF sensor integrated with a PMW3901 optical flow sensor) and a loco positioning deck (tag), which is a subpart of the LPS, are also attached. Each wrist sensor consists of a Crazyflie Bolt, which is the main control board, a loco positioning deck, and a 240 mAh LiPo battery. The LPS creates a global 3D coordinate system for the drone and wrist sensors. The anchors and tags communicate with each other through ultrawideband (UWB) sensors. According to the specifications of the UWB chip (Decawave DWM1000), the UWB sensor consumes little energy yet communicates over a very wide bandwidth in a high radio-frequency band. Additionally, it has a maximum range of 10 m with a distance error of approximately ± 10 cm (https://www.bitcraze.io/products/loco-positioning-deck/). Moreover, the VL53L1x ToF sensor on the flow deck has an error margin of a few millimeters depending on the surface and light conditions, and it can measure a distance of up to 4 m in the vertical direction (https://www.bitcraze.io/products/flow-deck-v2/).
Selected movement and drone path planning
The trajectories of the drone were preprogrammed on the basis of the specific arm movement steps associated with each selected motion. The endpoint of each step is a checkpoint for the drone (Fig. 3a, b). The primary direction of the lateral arm reach movement aligns with the mediolateral (ML) aspect of the human body (Fig. 1f). To perform this movement (Fig. 3d), first, the participant stands in the default position. Second, they simultaneously extend their right leg along the ML direction while crossing their arms over their chest. Third, they extend their right arm along the ML direction. Finally, they sweep their right arm down and return to the default position while concurrently returning their left arm to the default position. The left-side exercise is then performed with the same pattern as the right side. On the other hand, the arm up and down motion involves vertical movement of the arms relative to the human body (Fig. 1g). For this movement (Fig. 3e), the participants begin in the default position, as in the previous movement, and then reach their right arm up while simultaneously pushing their left arm down. After that, they return to the default position. For the left-side exercise, the participants perform a mirrored version of the right-side movement. The duration of each step of the “lateral arm reach” and “arm up and down” movements was determined from the Baduanjin exercise practice video (https://www.youtube.com/watch?v=oqiENrM30Yk) for the second movement (Fig. 1f) and the third movement (Fig. 1g), respectively.
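A preprogrammed path of this kind reduces to a checkpoint schedule: a list of target positions, each paired with a step duration taken from the exercise video. The sketch below uses hypothetical coordinates and durations (the study's actual checkpoints depend on each participant's body measurements) and linearly interpolates the commanded position at any time during the flight.

```python
# Hypothetical checkpoint schedule for one lateral-reach repetition:
# ((x, y, z) in metres, step duration in seconds). Real values come from
# participant measurements and the Baduanjin video step timings.
SCHEDULE = [
    ((0.0, 0.00, 1.0), 0.0),  # starting position after takeoff
    ((0.0, 0.85, 1.0), 3.0),  # reach laterally along the ML axis over 3 s
    ((0.0, 0.00, 1.0), 3.0),  # sweep back to the start over 3 s
]

def position_at(t):
    """Linearly interpolate the commanded drone position at time t (s)."""
    elapsed = 0.0
    pos = SCHEDULE[0][0]
    for target, duration in SCHEDULE[1:]:
        if t <= elapsed + duration:
            f = (t - elapsed) / duration  # fraction of the current step done
            return tuple(p + f * (q - p) for p, q in zip(pos, target))
        elapsed += duration
        pos = target
    return pos  # after the last step, hold the final checkpoint
```

Because the checkpoints and durations are plain data, the same interpolation code serves both movements; only the schedule changes.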
To preprogram the drone’s movements, we used the starting position of the drone, which was positioned in front of the participant along the midline and 1.2 m away from the participant (Fig. 3c), based on an investigation by Wojciechowska et al.69 on the optimal proximity range between humans and drones. In addition, we identified six other reference points for the lateral arm reach movement (Fig. 3a) and four reference points for the arm up and down movement (Fig. 3b). However, the values of these reference points were specific to each participant and depended on differences in their body dimensions, e.g., arm length, height, and the width between their legs after stepping in the ML direction (relevant only for the lateral arm reach movement). Therefore, we measured the body segments of each participant before the experiment and used them as inputs to our program.
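The mapping from body measurements to reference points might look like the following sketch. The function name and scaling rules here are purely illustrative, not the study's actual formulas; the point is that each participant's measured segments parameterize the checkpoint coordinates.

```python
def reference_points_lateral(arm_length, shoulder_height, step_width):
    """Hypothetical per-participant reference points (m) for the lateral
    arm reach movement, in a frame centred on the participant's midline.
    The scaling rules are illustrative assumptions, not the paper's own."""
    reach = arm_length + step_width / 2  # lateral extent after the side step
    return {
        "start": (0.0, 0.0, shoulder_height),
        "reach_right": (reach, 0.0, shoulder_height),
        "reach_left": (-reach, 0.0, shoulder_height),
    }

# e.g. a participant with a 0.65 m arm, 1.35 m shoulder height, 0.40 m step
pts = reference_points_lateral(0.65, 1.35, 0.40)
```

Measured once before the session, these values feed directly into the checkpoint schedule, so the drone's path fits each participant's reach.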
Drone control and interactive feedback
We used the Python 3.7 API and the open-source library from Bitcraze (https://github.com/bitcraze) to program the drone, allowing it to fly autonomously along the paths we designed. For drone control, the sensor fusion algorithm developed by Bitcraze determines the drone’s absolute current position from the most accurate sensor attached to it. For vertical movement, the drone can refer to its absolute position from the sensors on either the UWB tag or the flow deck. However, since the sensor on the flow deck has a smaller error than the UWB tag, the drone refers mainly to the flow deck sensor for its current position. For horizontal movement, the drone can refer only to its absolute position from the UWB sensor, which has an approximately ± 10 cm error. To obtain the real-time positions of the participant’s wrists and generate sound feedback to interact with the participant, the participant must wear wrist sensors (Fig. 1c). To obtain the best accuracy when more than one tag is used in the LPS, as mentioned on the official website of Bitcraze, we used the time difference of arrival 2 (TDoA2) method with eight anchors to transmit signals among anchors and tags (https://www.bitcraze.io/documentation/system/positioning/loco-positioning-system/).
In this study, we developed a drone guidance system to assist older adult participants in performing exercise-based movements by providing real-time interactive audio feedback. If a participant fails to follow the drone (i.e., when the absolute positions of their wrists exceed the defined threshold range), the system immediately emits a “beep” through the laptop’s speaker. This sound has a constant frequency of 2500 Hz and lasts 500 ms per beep. The threshold range is the acceptable error gap between the drone and the participant’s wrists in the major direction of each selected movement (e.g., the major direction of the arm up and down movement was aligned with the vertical axis in reference to the anatomical position). In other words, the system provides no feedback (beep) while the wrist sensor positions remain within the threshold range of ± 20 cm, the sum of the errors of the UWB tags (± 10 cm each) on the drone and on the wrist sensor. The volume of the feedback was adjusted to each participant’s preference, and before data collection we always confirmed that the participants could hear the sound feedback clearly.
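The feedback rule reduces to a simple threshold test on the gap between the wrist and drone positions along the movement’s major axis. A minimal sketch with a hypothetical function name (on Windows, the 2500 Hz/500 ms beep itself could be produced with the standard-library call winsound.Beep(2500, 500)):

```python
# ±20 cm: the summed UWB errors of the drone tag and the wrist tag (±10 cm each)
THRESHOLD_M = 0.20

def should_beep(wrist_pos, drone_pos):
    """Return True when the wrist deviates from the drone beyond the threshold
    along the movement's major axis (modeled here as a single coordinate, m)."""
    return abs(wrist_pos - drone_pos) > THRESHOLD_M

# e.g. on Windows: if should_beep(w, d): winsound.Beep(2500, 500)
```

A deviation of 25 cm would trigger the beep, while one of 15 cm would remain silent, matching the behavior described above.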
System validation
To validate the drone flight for each selected movement, we used the drone’s output trajectories along the major flight axis, as observed by the motion capture system, to calculate the drone’s accuracy and precision. We defined three target positioning ranges for each movement and tested each target range over five trials. The target ranges for the lateral arm reach movement were 0.80, 0.85, and 0.90 m from the drone position after takeoff (along the ML direction), and those for the arm up and down movement were 0.80, 0.90, and 1.0 m above the drone position after takeoff (along the vertical direction).
The selection of these specific target positioning ranges was based on the approximate movement ranges programmed for the drone during real-time guidance. For instance, in Phase 1 of the lateral arm reach movement, the target range along the ML axis extended from the drone’s starting position (Fig. 3a-1) or Event 0 (Fig. 3d) to the maximum range it reached (Fig. 3a-3) or Event 2 (Fig. 3d). Similarly, in Phase 1 of the arm up and down movement, the target range along the vertical axis spanned from the drone’s starting position (Fig. 3b-1) or Event 0 (Fig. 3e) to the maximum range achieved (Fig. 3b-2) or Event 2 (Fig. 3e).
These target ranges were applied for both right (Phase 1) and left (Phase 2) arm guidance. To quantify the accuracy and precision of the drone guidance system, we calculated the relative error, as given in Eq. (1), and the relative uncertainty, as given in Eq. (2), respectively70.
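Assuming the standard textbook definitions from ref. 70 (relative error as the deviation of the mean measured value from the target, relative uncertainty as the standard deviation over the mean), Eqs. (1) and (2) can be sketched as follows; the five trial values are hypothetical:

```python
import statistics

def relative_error(measured_mean, target):
    """Eq. (1), assumed form: accuracy as |measured - target| / target."""
    return abs(measured_mean - target) / target

def relative_uncertainty(trials):
    """Eq. (2), assumed form: precision as sample std. deviation / mean."""
    return statistics.stdev(trials) / statistics.mean(trials)

# Hypothetical five trials for the 0.80 m target range:
trials = [0.79, 0.81, 0.80, 0.82, 0.78]
acc = relative_error(statistics.mean(trials), 0.80)
prec = relative_uncertainty(trials)
```

Each of the three target ranges per movement would be evaluated this way from its five recorded trials.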
MATLAB software was used to analyze the data and calculate the accuracy and precision of the drone. The measured value and standard deviation for each target range were calculated from the five recorded trials.
Participant recruitment
In this study, we recruited participants who were aged 60 years or older, capable of performing regular exercise, proficient in Mandarin Chinese—the language used in the Baduanjin practice video—and who had no prior experience with or practice of Baduanjin exercise. Participants were recruited from various communities in Tainan, Taiwan. A total of fifteen healthy individuals (mean age: 67.40 ± 5.85 years; twelve females and three males) participated in this study to evaluate the usability of the Pei-Wo Drone as an exercise guidance tool for older adults. Fourteen participants were right-handed, while the dominant hand of one participant was not specified. None of the participants had any illnesses or conditions, such as musculoskeletal diseases or chronic diseases, that would prevent them from engaging in physical activities or regular exercise. The study design and protocol were approved by the National Cheng Kung University Human Research Ethics Committee (NCKU HREC) under approval number NCKU HREC_E_109-423-2. All methods were conducted in accordance with the relevant guidelines and regulations. Additionally, the study was registered at http://www.clinicaltrials.gov under the identifier NCT05362214.
Data collection
An eight-camera Kestrel-4200 digital real-time motion analysis system (Motion Analysis Corp., Santa Rosa, CA) recorded the trajectories of the drone and the participant’s wrists at a sampling rate of 120 Hz. The markers on the participant’s wrists were ball-shaped reflective markers 25 mm in diameter, whereas those on the drone were 6.5 mm in diameter and were fixed on the motion capture marker deck of the drone (Fig. 1a). For every participant, we recorded motion data over five trials of each selected movement performed with the Pei-Wo Drone.
Data analysis and phase segmentation of each selected movement
The recorded data from the motion capture system were postprocessed via MATLAB programming. A 4th-order, zero-lag, low-pass Butterworth filter with a 20-Hz cutoff frequency was applied to remove high-frequency noise71. Before the RMS errors between the participant’s wrist positions and the drone positions were analyzed, the data were divided into two phases, each with two subphases, as follows.
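A common way to obtain a 4th-order, zero-lag Butterworth response is to design a 2nd-order filter and apply it forward and backward; whether the authors used this exact construction in MATLAB is an assumption. A minimal Python sketch with synthetic data, using SciPy’s filtfilt for the zero-lag (forward–backward) pass:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 120.0     # motion capture sampling rate (Hz)
CUTOFF = 20.0  # low-pass cutoff frequency (Hz)

# A 2nd-order design run forward and backward by filtfilt gives the
# 4th-order, zero-lag response described in the text.
b, a = butter(2, CUTOFF, btype="low", fs=FS)

t = np.arange(0, 2, 1 / FS)
signal = np.sin(2 * np.pi * 1.0 * t)               # slow movement component (1 Hz)
noisy = signal + 0.2 * np.sin(2 * np.pi * 50 * t)  # high-frequency noise (50 Hz)
smoothed = filtfilt(b, a, noisy)                   # noise removed, no phase lag
```

The 1-Hz movement component passes essentially unchanged, while the 50-Hz noise, well above the 20-Hz cutoff, is strongly attenuated.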
Lateral arm reach movement (Fig. 3d):
-
Phase 1a: The participant stood in the default position, extended their right leg along the ML direction while crossing their arms over their chest, and then reached their right arm out along the ML direction.
-
Phase 1b: The participant swept their right arm down to return it to the default position while simultaneously returning their left arm to the default position.
-
Phase 1: Phase 1a + Phase 1b
-
Phase 2a: The same as Phase 1a but performed on the participant’s left side (a mirrored version of Phase 1a).
-
Phase 2b: The same as Phase 1b but performed on the participant’s left side (a mirrored version of Phase 1b).
-
Phase 2: Phase 2a + Phase 2b
Arm up and down movement (Fig. 3e):
-
Phase 1a: The participant stood in the default position and then raised their right arm while simultaneously lowering their left arm.
-
Phase 1b: The participant returned both arms to the default position.
-
Phase 1: Phase 1a + Phase 1b
-
Phase 2a: The same as Phase 1a but performed on the participant’s left side (a mirrored version of Phase 1a).
-
Phase 2b: The same as Phase 1b but performed on the participant’s left side (a mirrored version of Phase 1b).
-
Phase 2: Phase 2a + Phase 2b
The data in each phase were analyzed separately. For the right-side phases (Phases 1a, 1b, and 1), only the right wrist positioning data were analyzed, whereas for the left-side phases (Phases 2a, 2b, and 2), only the left wrist positioning data were analyzed.
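The per-phase RMS error between the wrist and drone positions can be sketched as follows; the function name and sample data are illustrative:

```python
import math

def rms_error(wrist, drone):
    """RMS of the gap between matched wrist and drone position samples (m)
    over one phase, along the movement's major axis."""
    assert len(wrist) == len(drone), "phase samples must be time-aligned"
    return math.sqrt(
        sum((w - d) ** 2 for w, d in zip(wrist, drone)) / len(wrist)
    )

# Hypothetical three samples from a right-side phase:
err = rms_error([1.0, 1.1, 0.9], [1.0, 1.0, 1.0])  # ~0.082 m
```

For a right-side phase this would be fed with the right-wrist trajectory; for a left-side phase, with the left-wrist trajectory, as described above.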
Data availability
The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
References
Langhammer, B., Bergland, A. & Rydwik, E. The importance of physical activity exercise among older people. BioMed Res. Int. vol. 2018 Preprint at https://doi.org/10.1155/2018/7856823 (2018).
Gillespie, L. D. et al. Interventions for preventing falls in older people living in the community. Cochrane Database Syst. Rev. vol. 2012 Preprint at https://doi.org/10.1002/14651858.CD007146.pub3 (2012).
El-Khoury, F., Cassou, B., Charles, M. A. & Dargent-Molina, P. The effect of fall prevention exercise programmes on fall induced injuries in community dwelling older adults: Systematic review and meta-analysis of randomised controlled trials. BMJ (Online) 347, (2013).
Tricco, A. C. et al. Comparisons of interventions for preventing falls in older adults: A systematic review and meta-analysis. JAMA – J. Am. Med. Assoc. 318, 1687–1699 (2017).
McPhee, J. S. et al. Physical activity in older age: perspectives for healthy ageing and frailty. Biogerontology 17, 567–580 (2016).
Cheng, I. F., Kuo, L. C., Tsai, Y. J. & Su, F. C. The comparisons of physical functional performances between older adults with and without regular physical activity in two different living settings. Int J Environ Res Public Health 18, (2021).
WHO. Physical Activity Fact Sheet. https://www.who.int/publications/i/item/WHO-HEP-HPR-RUN-2021.2 (2021).
UN. Population Division. World Population Ageing 2019: Highlights. (United Nations, New York, 2019).
Kalache, A. & Gatti, A. Active ageing: A policy framework. Adv Gerontol 11, 7–18 (2003).
Harvey, J. A., Chastin, S. F. M. & Skelton, D. A. How sedentary are older people? A systematic review of the amount of sedentary behavior. J Aging Phys Act 23, 471–487 (2015).
Rosenberg, D. et al. Device-assessed physical activity and sedentary behavior in a community-based cohort of older adults. BMC Public Health 20, 1256 (2020).
Zubala, A. et al. Promotion of physical activity interventions for community dwelling older adults: A systematic review of reviews. PLoS One 12, (2017).
Korisky, U. & Mudrik, L. Dimensions of perception: 3D real-life objects are more readily detected than their 2D images. Psychol Sci 32, 1636–1648 (2021).
Ozana, A. & Ganel, T. Dissociable effects of irrelevant context on 2D and 3D grasping. Atten Percept Psychophys 80, 564–575 (2018).
Snow, J. C., Skiba, R. M., Coleman, T. L. & Berryhill, M. E. Real-world objects are more memorable than photographs of objects. Front. Hum. Neurosci. 8, (2014).
Gomez, M. A., Skiba, R. M. & Snow, J. C. Graspable objects grab attention more than images do. Psychol Sci 29, 206–218 (2018).
Kim, H. & Landay, J. A. Aeroquake: Drone augmented dance. DIS 2018 - Proceedings of the 2018 Designing Interactive Systems Conference 691–702 (2018) https://doi.org/10.1145/3196709.3196798.
Eriksson, S. et al. Dancing with drones crafting novel artistic expressions through intercorporeality. Conference on Human Factors in Computing Systems - Proceedings 1–12 (2019) https://doi.org/10.1145/3290605.3300847.
Zwaan, S. G. & Barakova, E. I. Boxing against drones: Drones in sports education. Proceedings of IDC 2016 —The 15th International Conference on Interaction Design and Children 607–612 (2016) https://doi.org/10.1145/2930674.2935991.
Graether, E. & Mueller, F. Joggobot: A flying robot as jogging companion. Conference on Human Factors in Computing Systems—Proceedings 1063–1066 (2012) https://doi.org/10.1145/2212776.2212386.
Mueller, F. F. & Muirhead, M. Jogging with a quadcopter. Conference on Human Factors in Computing Systems—Proceedings 2015-April, 2023–2032 (2015).
Hayat Adnan, W. & Fadly Khamis, M. Drone use in military and civilian application: Risk to national security. Journal of Media and Information Warfare 15, 60–70 (2022).
Kalamkar, R. B., Ahire, M. C., Ghadge, P. A., Dhenge, S. A. & Anarase, M. S. Drone and its applications in agriculture. Int J Curr Microbiol Appl Sci 9, 3022–3026 (2020).
Huuskonen, J. & Oksanen, T. Soil sampling with drones and augmented reality in precision agriculture. Comput Electron Agric 154, 25–35 (2018).
De Silvestri, S., Pagliarani, M., Tomasello, F., Trojaniello, D. & Sanna, A. Design of a service for hospital internal transport of urgent pharmaceuticals via drones. Drones 6, (2022).
Jain, P. Medicine delivery drone. Int. J. Eng. Res. Technol. 9, 191–193 (2020).
Baytaş, M. A., La Delfa, J., Ljungblad, S. & Fjeld, M. Agent archetypes for human-drone interaction: Social robots or objects with intent? CEUR Workshop Proc 2617, (2020).
La Delfa, J., Jarvis, R., Khot, R. A. & Mueller, F. Tai Chi in the clouds: Using micro UAVs to support Tai Chi practice. CHI PLAY 2018 - Proceedings of the 2018 Annual Symposium on Computer-Human Interaction in Play Companion Extended Abstracts 513–519 (2018) https://doi.org/10.1145/3270316.3271511.
La Delfa, J., Baytaş, M. A., Luke, E., Koder, B. & Mueller, F. F. Designing drone chi: Unpacking the thinking and making of somaesthetic human-drone interaction. DIS 2020 - Proceedings of the 2020 ACM Designing Interactive Systems Conference 575–586 (2020) https://doi.org/10.1145/3357236.3395589.
La Delfa, J., Wichtowski, O., Baytaş, M. A., Khot, R. A. & Mueller, F. F. Are drones meditative? Conference on Human Factors in Computing Systems - Proceedings 1–4 (2019) https://doi.org/10.1145/3290607.3313274.
La Delfa, J. et al. Drone Chi: Somaesthetic human–drone interaction. Conference on Human Factors in Computing Systems - Proceedings (2020) https://doi.org/10.1145/3313831.3376786.
Cauchard, J. R. et al. Drone.io: A Gestural and visual interface for human–drone interaction. in 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI) 153–162 (2019). https://doi.org/10.1109/HRI.2019.8673011.
Obaid, M., Kistler, F., Kasparavičiute, G., Yantaç, A. E. & Fjeld, M. How would you gesture navigate a drone? A user-centered approach to control a drone. AcademicMindtrek 2016 - Proceedings of the 20th International Academic Mindtrek Conference 113–121 (2016) https://doi.org/10.1145/2994310.2994348.
Treurniet, T. et al. Drones with eyes: expressive Human-Drone Interaction. International workshop on Human-Drone Interaction (2019).
Cleland-Huang, J. & Agrawal, A. Human–drone interactions with semi-autonomous cohorts of collaborating drones. CEUR Workshop Proc 2617, (2020).
Hansen, J. P., Alapetite, A., MacKenzie, I. S. & Møllenbach, E. The use of gaze to control drones. Eye Tracking Research and Applications Symposium (ETRA) 27–34 (2014) https://doi.org/10.1145/2578153.2578156.
Malliaraki, E. Social interaction with drones using human emotion recognition. ACM/IEEE International Conference on Human-Robot Interaction 187–188 (2018) https://doi.org/10.1145/3173386.3176966.
Knierim, P. et al. Tactile drones—providing immersive tactile feedback in virtual reality through quadcopters. Conference on Human Factors in Computing Systems - Proceedings Part F1276, 433–436 (2017).
Jane, L. E., Ilene, L. E., Landay, J. A. & Cauchard, J. R. Drone & Wo: Cultural influences on human–drone interaction techniques. Conference on Human Factors in Computing Systems—Proceedings 2017-May, 6794–6799 (2017).
Abtahi, P., Zhao, D., E., J. & Landay, J. Drone near me: Exploring touch-based human–drone interaction. Proc ACM Interact Mob Wearable Ubiquitous Technol 1, 1–8 (2017).
Tan, H., Lee, J. & Gao, G. Human–drone interaction: Drone delivery and services for social events. DIS 2018—Companion Publication of the 2018 Designing Interactive Systems Conference 183–187 (2018) https://doi.org/10.1145/3197391.3205433.
Yeh, A. et al. Exploring proxemics for human-drone interaction. HAI 2017 - Proceedings of the 5th International Conference on Human Agent Interaction 81–88 (2017) https://doi.org/10.1145/3125739.3125773.
Chae, H. J. et al. An artificial intelligence exercise coaching mobile app: Development and randomized controlled trial to verify its effectiveness in posture correction. Interact J Med Res 12, e37604 (2023).
Huang, R., Wang, J., Lou, H., Lu, H. & Wang, B. Miss yoga: A yoga assistant mobile application based on keypoint detection. in 2020 Digital Image Computing: Techniques and Applications (DICTA) 1–3 (2020). https://doi.org/10.1109/DICTA51227.2020.9363384.
Saposnik, G. et al. Effectiveness of virtual reality using wii gaming technology in stroke rehabilitation: A pilot randomized clinical trial and proof of principle. Stroke 41, 1477–1484 (2010).
Fan, S.-C. et al. Improved intrinsic motivation and muscle activation patterns in reaching task using virtual reality training for stroke rehabilitation: A pilot randomized control trial. J Med Biol Eng 34, 399–407 (2014).
Marques-Sule, E. et al. Effectiveness of nintendo wii and physical therapy in functionality, balance, and daily activities in chronic stroke patients. J Am Med Dir Assoc 22, 1073–1080 (2021).
Zhao, W., Feng, H., Lun, R., Espy, D. D. & Reinthal, M. A. A kinect-based rehabilitation exercise monitoring and guidance system. in 2014 IEEE 5th International Conference on Software Engineering and Service Science 762–765 (2014). https://doi.org/10.1109/ICSESS.2014.6933678.
Dash, A., Yadav, A., Chauhan, A. & Lahiri, U. Kinect-assisted performance-sensitive upper limb exercise platform for post-stroke survivors. Front Neurosci 13 (2019).
Milosevic, B., Leardini, A. & Farella, E. Kinect and wearable inertial sensors for motor rehabilitation programs at home: State of the art and an experimental comparison. Biomed Eng Online 19, 25 (2020).
Lin, C. W. et al. Development and testing of a virtual reality mirror therapy system for the sensorimotor performance of upper extremity: A pilot randomized controlled trial. IEEE Access 9, 14725–14734 (2021).
Inoue, N. et al. Effect of Display location on finger motor skill training with music-based gamification. in Human Aspects of IT for the Aged Population. Healthy and Active Aging (eds. Gao, Q. & Zhou, J.) 78–90 (Springer International Publishing, Cham, 2020).
Liao, Y., Hsu, M. H. K., Xie, F. L., Dai, H. X. & Liu, M. The role of virtual reality on physical and psychological health among older adults: A systematic review. Edelweiss Applied Science and Technology 8, 2762–2777 (2024).
Peng, Y. et al. Virtual reality exergames for improving physical function, cognition and depression among older nursing home residents: A systematic review and meta-analysis. Geriatr Nurs (Minneap) 57, 31–44 (2024).
Ferreira, S. et al. Effects of an exercise program with augmented reality on functional fitness and physical activity of community-dwelling older adults. Front Sports Act Living 6, (2024).
Hsu, H. Y. et al. Effects of a virtual reality-based mirror therapy program on improving sensorimotor function of hands in chronic stroke patients: A randomized controlled trial. Neurorehabil Neural Repair 36, 335–345 (2022).
Kaur, K. et al. Digital eye strain: A comprehensive review. Ophthalmol. Ther. vol. 11 1655–1680 Preprint at https://doi.org/10.1007/s40123-022-00540-9 (2022).
Pavel, I. A. et al. Computer vision syndrome: An ophthalmic pathology of the modern era. Medicina (Lithuania) vol. 59 Preprint at https://doi.org/10.3390/medicina59020412 (2023).
Beeson, D. et al. Digital eye strain symptoms worsen during prolonged digital tasks, associated with a reduction in productivity. Comput. Hum. Behav. Rep. 16, 100489 (2024).
Nakshine, V. S., Thute, P., Khatib, M. N. & Sarkar, B. Increased screen time as a cause of declining physical, psychological health, and sleep patterns: A literary review. Cureus https://doi.org/10.7759/cureus.30051 (2022).
Karim Chouamo, A., Griego, S. & Susana Martinez Lopez, F. Reaction time and hand dominance. J. Sci. Med. 3, 1–12 (2021).
Leiker, A. M. et al. The effects of autonomous difficulty selection on engagement, motivation, and learning in a motion-controlled video game task. Hum Mov Sci 49, 326–335 (2016).
Abbas, S. & Jeong, H. Task difficulty impact on multitasking in mixed reality environments. Comput. Educ.: X Reality 4, 100065 (2024).
Jaquess, K. J. et al. Changes in mental workload and motor performance throughout multiple practice sessions under various levels of task difficulty. Neuroscience 393, 305–318 (2018).
Seyderhelm, A. J. A. & Blackmore, K. L. How hard is it really? Assessing game-task difficulty through real-time measures of performance and cognitive load. Simul Gaming 54, 294–321 (2023).
Hiraoka, K. et al. The laterality of stop and go processes of the motor response in left-handed and right-handed individuals. Laterality 23, 51–66 (2018).
Przybyla, A., Good, D. C. & Sainburg, R. L. Dynamic dominance varies with handedness: Reduced interlimb asymmetries in left-handers. Exp Brain Res 216, 419–431 (2012).
Butz, B., Jussen, A., Rafi, A., Lux, G. & Gerken, J. A Taxonomy for augmented and mixed reality applications to support physical exercises in medical rehabilitation: A literature review. Healthcare (Switzerland) vol. 10 Preprint at https://doi.org/10.3390/healthcare10040646 (2022).
Wojciechowska, A., Frey, J., Sass, S., Shafir, R. & Cauchard, J. R. Collocated human–drone interaction: Methodology and approach strategy. in 2019 14th ACM/IEEE International Conference on Human–Robot Interaction (HRI) 172–181 (2019). https://doi.org/10.1109/HRI.2019.8673127.
Advanced Instructional Systems, Inc. & Department of Physics and Astronomy at the University of North Carolina. Measurements and Error Analysis. © 2011 Advanced Instructional Systems, Inc. and the University of North Carolina https://www.webassign.net/question_assets/unccolphysmechl1/measurements/manual.html (2011).
Choi, A., Joo, S. Bin, Oh, E. & Mun, J. H. Kinematic evaluation of movement smoothness in golf: Relationship between the normalized jerk cost of body joints and the clubhead. Biomed Eng. Online 13, 1–12 (2014).
Acknowledgements
The authors would like to thank the National Science and Technology Council (NSTC) of Taiwan for grant support (grant no. 110-2221-E-006-011-MY3).
Author information
Authors and Affiliations
Contributions
K.C. and F.C.S. conceived the hypothesis. K.C., F.C.S., C.J.L. and H.F.C. designed the experiments. K.C. performed the experiments. K.C., C.J.L., H.F.C., K.N.A. and F.C.S. analyzed the results. K.C., C.J.L., H.F.C., K.N.A. and F.C.S. contributed to the writing and revision of the manuscript.
Corresponding authors
Ethics declarations
Competing interests
The authors declare no competing interests.
Ethical approval
This study was approved by the National Cheng Kung University Human Research Ethics Committee (Approval No. NCKU HREC-E-109–423-2), and all methods were conducted in accordance with relevant guidelines and regulations. Informed consent was obtained from all study participants for their involvement in this study. Additionally, the author whose image appears in Fig. 3 has provided informed consent for the publication of this image in an online open-access format.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Chaitika, K., Chieh, HF., An, KN. et al. Pei-Wo Drone: a home-based exercise guidance system with a drone for older adults. Sci Rep 15, 10668 (2025). https://doi.org/10.1038/s41598-025-94708-5
Received:
Accepted:
Published:
Version of record:
DOI: https://doi.org/10.1038/s41598-025-94708-5