Abstract
The human hand is dexterous and versatile, allowing it to interact with physical and virtual environments. The ability to track hand movements during daily activities could be of use in the development of spatial computing, virtual and augmented reality, robotics and prosthetics. However, current techniques based on cameras, strain and inertial sensors, and electromyography sensors have limited view angles and hand positions, have constrained hand activities and sensations, and can track only discrete hand gestures, respectively. Here we report a fully integrated, wireless and wearable ultrasound imaging wristband that is combined with an artificial intelligence algorithm. The wristband can continuously track arbitrary hand configurations of the five fingers and the palm in real time during daily activities with a delay of less than 120 ms. We show that the wristband can be used for intuitive and versatile controls in virtual-reality and robotic-hand applications.
Data availability
The data that support the findings of this study are available from the corresponding author upon reasonable request. A sample of the dataset for hand tracking is available (ref. 61); more datasets are available from the corresponding author on request. Source data are provided with this paper.
Code availability
An example code is available at https://github.com/gengxilu/HTwithUSImage. All scripts and algorithms in this work are available from the corresponding author on reasonable request.
References
Bejczy, A. K. Sensors, controls, and man-machine interface for advanced teleoperation. Science 208, 1327–1335 (1980).
Cheok, M. J., Omar, Z. & Jaward, M. H. A review of hand gesture and sign language recognition techniques. Int. J. Mach. Learn. Cybern. 10, 131–153 (2019).
Oudah, M., Al-Naji, A. & Chahl, J. Hand gesture recognition based on computer vision: a review of techniques. J. Imaging 6, 73 (2020).
Wang, M. et al. Gesture recognition using a bioinspired learning architecture that integrates visual data with somatosensory data from stretchable sensors. Nat. Electron. 3, 563–570 (2020).
Lee, Y. et al. Visual-inertial hand motion tracking with robustness against occlusion, interference, and contact. Sci. Robot. 6, eabe1315 (2021).
Reissner, L., Fischer, G., List, R., Giovanoli, P. & Calcagni, M. Assessment of hand function during activities of daily living using motion tracking cameras: a systematic review. Proc. Inst. Mech. Eng. H 233, 764–783 (2019).
Araromi, O. A. et al. Ultra-sensitive and resilient compliant strain gauges for soft machines. Nature 587, 219–224 (2020).
Kim, K. K. et al. A substrate-less nanomesh receptor with meta-learning for rapid hand task recognition. Nat. Electron. 6, 64–75 (2023).
Yamada, T. et al. A stretchable carbon nanotube strain sensor for human-motion detection. Nat. Nanotechnol. 6, 296–301 (2011).
Furui, A. et al. A myoelectric prosthetic hand with muscle synergy-based motion determination and impedance model-based biomimetic control. Sci. Robot. 4, eaaw6339 (2019).
Moin, A. et al. A wearable biosensing system with in-sensor adaptive machine learning for hand gesture recognition. Nat. Electron. 4, 54–63 (2021).
Zbinden, J. et al. Improved control of a prosthetic limb by surgically creating electro-neuromuscular constructs with implanted electrodes. Sci. Transl. Med. 15, eabq3665 (2023).
Gu, G. et al. A soft neuroprosthetic hand providing simultaneous myoelectric control and tactile feedback. Nat. Biomed. Eng. 7, 589–598 (2023).
Hahne, J. M., Schweisfurth, M. A., Koppe, M. & Farina, D. Simultaneous control of multiple functions of bionic hand prostheses: performance and robustness in end users. Sci. Robot. 3, eaat3630 (2018).
Chen, C. et al. Hand gesture recognition based on motor unit spike trains decoded from high-density electromyography. Biomed. Signal Process. Control 55, 101637 (2020).
Disselhorst-Klug, C., Schmitz-Rode, T. & Rau, G. Surface electromyography and muscle force: limits in sEMG–force relationship and new approaches for applications. Clin. Biomech. 24, 225–235 (2009).
Jiang, N., Dosen, S., Muller, K.-R. & Farina, D. Myoelectric control of artificial limbs—is there a need to change focus? [In the spotlight]. IEEE Signal Process. Mag. 29, 150–152 (2012).
Sgambato, B. G. et al. High performance wearable ultrasound as a human–machine interface for wrist and hand kinematic tracking. IEEE Trans. Biomed. Eng. 71, 484–493 (2024).
Bimbraw, K., Nycz, C. J., Schueler, M., Zhang, Z. & Zhang, H. K. Simultaneous estimation of hand configurations and finger joint angles using forearm ultrasound. IEEE Trans. Med. Robot. Bionics 5, 120–132 (2023).
Huang, Y. et al. Ultrasound-based sensing models for finger motion classification. IEEE J. Biomed. Health Inform. 22, 1395–1405 (2017).
Lian, Y. et al. A transfer learning strategy for cross-subject and cross-time hand gesture recognition based on A-mode ultrasound. IEEE Sens. J. 24, 17183–17192 (2024).
Gao, X. et al. A wearable echomyography system based on a single transducer. Nat. Electron. 7, 1035–1046 (2024).
Spacone, G. et al. Tracking of wrist and hand kinematics with ultra low power wearable A-mode ultrasound. IEEE Trans. Biomed. Circuits Syst. 19, 536–548 (2025).
Peng, X. et al. A novel transformer-based approach for simultaneous recognition of hand movements and force levels in amputees using flexible ultrasound transducers. IEEE Trans. Neural Syst. Rehabil. Eng. 31, 4580–4590 (2023).
Zadok, D., Salzman, O., Wolf, A. & Bronstein, A. M. Towards predicting fine finger motions from ultrasound images via kinematic representation. In Proc. IEEE International Conference on Robotics and Automation 12645–12651 (IEEE, 2023).
Lu, Z. et al. Wearable real-time gesture recognition scheme based on A-mode ultrasound. IEEE Trans. Neural Syst. Rehabil. Eng. 30, 2623–2629 (2022).
Li, J., Zhu, K. & Pan, L. Wrist and finger motion recognition via M-mode ultrasound signal: a feasibility study. Biomed. Signal Process. Control 71, 103112 (2022).
Fernandes, A. J., Ono, Y. & Ukwatta, E. Evaluation of finger flexion classification at reduced lateral spatial resolutions of ultrasound. IEEE Access 9, 24105–24118 (2021).
Yang, X., Yan, J., Fang, Y., Zhou, D. & Liu, H. Simultaneous prediction of wrist/hand motion via wearable ultrasound sensing. IEEE Trans. Neural Syst. Rehabil. Eng. 28, 970–977 (2020).
He, J., Luo, H., Jia, J., Yeow, J. T. & Jiang, N. Wrist and finger gesture recognition with single-element ultrasound signals: a comparison with single-channel surface electromyogram. IEEE Trans. Biomed. Eng. 66, 1277–1284 (2018).
Yan, J., Yang, X., Sun, X., Chen, Z. & Liu, H. A lightweight ultrasound probe for wearable human–machine interfaces. IEEE Sens. J. 19, 5895–5903 (2019).
Yang, X., Sun, X., Zhou, D., Li, Y. & Liu, H. Towards wearable A-mode ultrasound sensing for real-time finger motion recognition. IEEE Trans. Neural Syst. Rehabil. Eng. 26, 1199–1208 (2018).
McIntosh, J., Marzo, A., Fraser, M. & Phillips, C. EchoFlex: hand gesture recognition using ultrasound imaging. In Proc. CHI Conference on Human Factors in Computing Systems 1923–1934 (Association for Computing Machinery, 2017).
Akhlaghi, N. et al. Real-time classification of hand motions using ultrasound imaging of forearm muscles. IEEE Trans. Biomed. Eng. 63, 1687–1698 (2015).
Baker, C. A., Akhlaghi, N., Rangwala, H., Kosecka, J. & Sikdar, S. Real-time, ultrasound-based control of a virtual hand by a trans-radial amputee. In Proc. Annual International Conference of the IEEE Engineering in Medicine and Biology Society 3219–3222 (IEEE, 2016).
Li, Y., He, K., Sun, X. & Liu, H. Human-machine interface based on multi-channel single-element ultrasound transducers: a preliminary study. In Proc. IEEE International Conference on e-Health Networking, Applications and Services 1–6 (IEEE, 2016).
Ortenzi, V., Tarantino, S., Castellini, C. & Cipriani, C. Ultrasound imaging for hand prosthesis control: a comparative study of features and classification methods. In Proc. IEEE International Conference on Rehabilitation Robotics 1–6 (IEEE, 2015).
Sikdar, S. et al. Novel method for predicting dexterous individual finger movements by imaging muscle activity using a wearable ultrasonic system. IEEE Trans. Neural Syst. Rehabil. Eng. 22, 69–76 (2013).
Guo, J.-Y., Zheng, Y.-P., Xie, H.-B. & Koo, T. K. Towards the application of one-dimensional sonomyography for powered upper-limb prosthetic control using machine learning models. Prosthet. Orthot. Int. 37, 43–49 (2013).
Castellini, C., Passig, G. & Zarka, E. Using ultrasound images of the forearm to predict finger positions. IEEE Trans. Neural Syst. Rehabil. Eng. 20, 788–797 (2012).
Shi, J., Guo, J.-Y., Hu, S.-X. & Zheng, Y.-P. Recognition of finger flexion motion from ultrasound image: a feasibility study. Ultrasound Med. Biol. 38, 1695–1704 (2012).
Kauer, J. M. Functional anatomy of the wrist. Clin. Orthop. Relat. Res. 149, 9–20 (1980).
Wang, C. et al. Bioadhesive ultrasound for long-term continuous imaging of diverse organs. Science 377, 517–523 (2022).
Feix, T., Romero, J., Schmiedmayer, H.-B., Dollar, A. M. & Kragic, D. The grasp taxonomy of human grasp types. IEEE Trans. Hum.-Mach. Syst. 46, 66–77 (2015).
Pyun, K. R. et al. Machine-learned wearable sensors for real-time hand-motion recognition: toward practical applications. Natl Sci. Rev. 11, nwad298 (2024).
Si, Y. et al. Flexible strain sensors for wearable hand gesture recognition: from devices to systems. Adv. Intell. Syst. 4, 2100046 (2022).
Li, Z., Hayashibe, M., Fattal, C. & Guiraud, D. Muscle fatigue tracking with evoked EMG via recurrent neural network: toward personalized neuroprosthetics. IEEE Comput. Intell. Mag. 9, 38–46 (2014).
Kaifosh, P. & Reardon, T. R. A generic non-invasive neuromotor interface for human–computer interaction. Nature 645, 702–711 (2025).
Hoffmann, J. et al. An empirical analysis of compute-optimal large language model training. Adv. Neural Inf. Process. Syst. 35, 30016–30030 (2022).
Zhai, X., Kolesnikov, A., Houlsby, N. & Beyer, L. Scaling vision transformers. In Proc. IEEE/CVF Conference on Computer Vision and Pattern Recognition 12104–12113 (IEEE, 2022).
Qin, C. et al. Biomechanics-informed neural networks for myocardial motion tracking in MRI. In International Conference on Medical Image Computing and Computer-Assisted Intervention (eds Martel, A. L.) 296–306 (Springer, 2020); https://link.springer.com/chapter/10.1007/978-3-030-59716-0_29
Rothberg, J. M. et al. Ultrasound-on-chip platform for medical imaging, analysis, and collective intelligence. Proc. Natl Acad. Sci. USA 118, e2019339118 (2021).
Ramalli, A., Boni, E., Roux, E., Liebgott, H. & Tortoli, P. Design, implementation, and medical applications of 2-D ultrasound sparse arrays. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 69, 2739–2755 (2022).
Nuckols, R. W. et al. Individualization of exosuit assistance based on measured muscle dynamics during versatile walking. Sci. Robot. 6, eabj1362 (2021).
Hu, H. et al. A wearable cardiac ultrasound imager. Nature 613, 667–675 (2023).
Zhang, L. et al. A conformable phased-array ultrasound patch for bladder volume monitoring. Nat. Electron. 7, 77–90 (2024).
Cannata, J. M., Ritter, T. A., Chen, W.-H., Silverman, R. H. & Shung, K. K. Design of efficient, broadband single-element (20-80 MHz) ultrasonic transducers for medical imaging applications. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 50, 1548–1557 (2003).
Zhou, Q. et al. Alumina/epoxy nanocomposite matching layers for high-frequency ultrasound transducer application. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 56, 213–219 (2009).
Duck, F. A. Medical and non-medical protection standards for ultrasound and infrasound. Prog. Biophys. Mol. Biol. 93, 176–191 (2007).
Malešević, N. et al. A database of high-density surface electromyogram signals comprising 65 isometric hand gestures. Sci. Data 8, 63 (2021).
Lu, G. HTwithUSImage. Harvard Dataverse https://doi.org/10.7910/DVN/GQDA01 (2024).
Acknowledgements
We thank A. Takahashi from MIT Martinos Imaging Center (NIH grant number 1S10OD021569-01) for help with the magnetic resonance imaging, C. Liu and B. Wang for discussions on the hand anatomy and H. Kang for help on the ultrasonic circuit design. This work was supported in part by the Massachusetts Institute of Technology (George N. Hatsopoulos Faculty Fellowship and Uncas and Helen Whitaker Professorship; X.Z.), the National Institutes of Health (grant numbers 1R01HL153857-01 and 1R01HL167947-01; X.Z.), the National Science Foundation (grant numbers 2430106 and EFMA-1935291; X.Z.) and Department of Defense Congressionally Directed Medical Research Programs (grant number PR200524P1; X.Z.). The revision of the paper was supported in part by the National Research Foundation, Prime Minister’s Office, Singapore, under its Campus for Research Excellence and Technological Enterprise (CREATE) programme through the Singapore-MIT Alliance for Research and Technology (SMART): Wearable Imaging for Transforming Elderly Care (WITEC) Inter-Disciplinary Research Group (X.Z.). X.Z. acknowledges support from a Humboldt Research Award.
Author information
Authors and Affiliations
Contributions
G.L. and X.Z. conceived the project. G.L., Y.Z., R.L. and Q.Z. designed, fabricated and tested the ultrasound probe. G.L., B.L., J.Z., C.G., Y.Z. and Q.Z. designed and tested the ultrasound system. X.C., S.H.K., S.W., S.L. and G.L. designed, fabricated and characterized the ultrasound couplant. G.L., S.H.K., X.C. and D.L. set up the environment for experiments and performed the tests and data acquisition. G.L. and S.H.K. performed data processing, algorithm development, machine learning model development and real-time demonstrations. G.L., C.G., B.D. and A.P.C. analysed the machine learning results. G.L. and Y.Z. performed the acoustic simulations. G.L. and X.Z. wrote the paper and incorporated comments and edits from all authors. X.Z. supervised the project.
Corresponding author
Ethics declarations
Competing interests
G.L. and X.Z. are inventors of a patent application (US Provisional Application No. 63/721,387) describing the ultrasonic hand-tracking system and technology. X.Z. has a financial interest in SanaHeal, Magnendo and Sonologi. The other authors declare no competing interests.
Peer review
Peer review information
Nature Electronics thanks Honghai Liu, Siddhartha Sikdar and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Extended data
Extended Data Fig. 1 Illustration of the design of the wearable high-quality ultrasound probe.
a, The layered structure of the wearable high-quality ultrasound probe. (See Supplementary Table 3 for design specifications.) b, A photo of the 1–3 composite piezoelectric material from the top view. The pitch shows the element of the 1–3 composite structure. c, A photo of the 1–3 composite piezoelectric material from the side view. The pitch shows the element of the 1–3 composite structure. Scale bars indicate 100 µm. d, Photos and detailed dimensions of the fully integrated wireless ultrasound system.
Extended Data Fig. 2 Fully integrated ultrasound wristband stably attached to the wrist via the soft yet tough couplant and mechanical straps.
a, A top-view photo of the fully integrated ultrasound wristband with the couplant and mechanical straps. b, A bottom-view photo of the fully integrated ultrasound wristband with the couplant and mechanical straps. c, A photo of a user wearing the fully integrated ultrasound wristband, which is stably attached to the wrist via mechanical straps. d, A zoom-in photo of the soft yet tough couplant on top of the ultrasound probe. The soft couplant conformally fills the gap between the flat, rigid ultrasound probe and the wrist, and isolates the ultrasound probe from body tissue deformation.
Extended Data Fig. 3 The architecture of the T-R regression model with 22-dimension output.
‘Ch’ indicates channel number.
Extended Data Fig. 4 Tracking performance from 8 subjects (3 females, 5 males) with wrist circumference ranging from 12.8 cm to 18.4 cm (see Supplementary Table 4 for details).
The AI model for each subject is retrained using the data from that individual subject. RMSE: root mean square error. (n = 5 independent measurements of 22 DOFs).
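The tracking accuracy above is reported as RMSE over repeated measurements of the 22 DOFs. As a minimal illustration of how such a per-DOF error is computed (the function name and the angle values below are ours, not from the paper's code), in plain Python:

```python
import math

def rmse(predicted, actual):
    """Root mean square error between two equal-length joint-angle traces (degrees)."""
    assert len(predicted) == len(actual) and len(predicted) > 0
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(predicted))

# Hypothetical tracked vs ground-truth angles (degrees) for one DOF.
tracked = [10.0, 25.0, 40.0, 55.0]
ground_truth = [12.0, 24.0, 43.0, 53.0]
print(round(rmse(tracked, ground_truth), 3))  # 2.121
```

Averaging this quantity over the 22 DOFs and the n = 5 measurements gives the summary accuracy reported in the figure.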
Extended Data Fig. 5 Wearing-position-robust tracking performance from 8 subjects (3 females, 5 males) with wrist circumference ranging from 12.8 cm to 18.4 cm (see Supplementary Table 4 for details).
The AI model for each subject is retrained using the data from that individual subject. RMSE: root mean square error. (n = 5 independent measurements of 22 DOFs).
Extended Data Fig. 6 Evaluation of the noise sensitivity of the ultrasound wristband.
a, Ultrasound images with different levels of white Gaussian noise. b, Tracking results of all 22 DOFs on data with different levels of noise. The model was trained only on the data without added noise (data are presented as mean values +/− standard deviation, n = 5 independent measurements). RMSE: root mean square error. c, Average tracking accuracy (averaged RMSE of all 22 DOFs) as a function of the signal-to-noise ratio (SNR). Data are presented as mean values and error bars show the standard deviation (n = 110, 5 independent measurements of 22 DOFs). δ is the standard deviation of the added white Gaussian noise.
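The noise sweep above parameterizes the added white Gaussian noise by its standard deviation δ and the resulting SNR. Under the usual definition SNR(dB) = 10·log10(P_signal/P_noise), with zero-mean noise so P_noise = δ², the δ required for a target SNR follows directly. A small sketch (function names and pixel values are ours, purely illustrative):

```python
import math
import random

def noise_std_for_snr(pixels, snr_db):
    """Standard deviation δ of zero-mean white Gaussian noise that yields the
    requested SNR (in dB) relative to the mean signal power of `pixels`."""
    p_signal = sum(v * v for v in pixels) / len(pixels)
    return math.sqrt(p_signal / 10 ** (snr_db / 10))

def add_noise(pixels, snr_db, seed=0):
    """Return a copy of `pixels` corrupted by white Gaussian noise at the given SNR."""
    rng = random.Random(seed)
    delta = noise_std_for_snr(pixels, snr_db)
    return [v + rng.gauss(0.0, delta) for v in pixels]

image_row = [0.2, 0.5, 0.8, 0.4]  # hypothetical normalized pixel values
print(round(noise_std_for_snr(image_row, 20), 4))  # 0.0522
```

Sweeping `snr_db` downward and re-evaluating the frozen model on the noisy images reproduces the kind of robustness curve shown in panel c.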
Extended Data Fig. 7 Evaluation of the hysteresis effect of the ultrasound wristband.
a, Photos showing the hand performing palm open and close actions, and the corresponding ultrasound images. b, Comparison of ultrasound images between ‘palm open’ action and ‘palm close’ action. High similarity between corresponding images indicates a low hysteresis effect.
Extended Data Fig. 8 Evaluation of the drifting effect of the ultrasound wristband.
a, Illustrations of hand and wrist training to cause nerve and muscle fatigue. b, Two representative photos and corresponding ultrasound images before and after the 1-hour exercise. c, Comparison of tracking results on the data acquired before and after the 1-hour exercise. The model was trained only on the data acquired before the exercise. RMSE: root mean square error. (Data are presented as mean values +/− standard deviation, n = 5 independent measurements).
Extended Data Fig. 9 Evaluation of the resolution-dependent tracking accuracy using the ultrasound wristband.
Artificial ultrasound datasets with reduced resolution are generated by applying Gaussian blur to the original images (a), simulating imaging at lower center frequencies (b, 8 MHz; c, 5 MHz; d, 3 MHz). The AI model is trained and tested on each dataset to assess performance under degraded resolution. a–d, Left: representative examples of artificially blurred ultrasound images. Right: corresponding tracking accuracy for each condition. (All data in this figure are presented as mean values +/− standard deviation, n = 5 independent measurements.)
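The degraded-resolution datasets above are produced by Gaussian blurring, with a wider kernel standing in for a lower center frequency. A separable 1-D Gaussian convolution of this kind can be sketched as follows (the kernel width and the test signal are our choices for illustration, not the paper's parameters):

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalized 1-D Gaussian kernel of length 2*radius + 1."""
    weights = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def blur_1d(signal, sigma, radius):
    """Convolve `signal` with a Gaussian kernel, clamping indices at the borders."""
    kernel = gaussian_kernel(sigma, radius)
    n = len(signal)
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in zip(range(-radius, radius + 1), kernel):
            j = min(max(i + k, 0), n - 1)  # clamp at the edges
            acc += w * signal[j]
        out.append(acc)
    return out

scan_line = [0.0, 0.0, 1.0, 0.0, 0.0]  # hypothetical A-line with one bright echo
blurred = blur_1d(scan_line, sigma=1.0, radius=2)
print(round(max(blurred), 3))  # 0.403
```

Applying the same kernel along both image axes (a separable 2-D blur) and increasing `sigma` yields progressively lower-resolution training sets of the kind evaluated in panels b–d.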
Extended Data Fig. 10 Demonstration of tracking robustness to arm motions.
a–d, Representative ultrasound images of the wrist at different arm positions. e,f, Real-time robotic-hand control during arm motion. See Supplementary Video 11 for a continuous tracking demonstration.
Supplementary information
Supplementary Information
Supplementary Notes 1–3, Figs. 1–21, Tables 1–5, captions to Supplementary Videos 1–11 and references.
Supplementary Video 1
Independent donning and doffing of the fully integrated ultrasound wristband by the user.
Supplementary Video 2
Demonstration of instant hand tracking after putting on the ultrasound wristband.
Supplementary Video 3
Ultrasound images showing the anatomical changes corresponding to the 22 individual DOFs.
Supplementary Video 4
Dexterous virtual hand continuously performs hand gestures that represent all 33 human grasp types categorized by the GRASP taxonomy.
Supplementary Video 5
Dexterous virtual hand enabled by the wristband manipulates a 3D object in virtual reality.
Supplementary Video 6
Demonstration of hand-tracking robustness against variations in wearing positions.
Supplementary Video 7
Demonstration of hand tracking that is immune to visual occlusions.
Supplementary Video 8
Dexterous hand tracking enabled by the ultrasound wristband in controlling a 3D object in virtual reality with trained motions.
Supplementary Video 9
Dexterous hand tracking enabled by the ultrasound wristband in controlling a 3D object in virtual reality with untrained motions.
Supplementary Video 10
Dexterous hand tracking enabled by the ultrasound wristband in controlling a robotic hand.
Supplementary Video 11
Dexterous hand tracking enabled by the ultrasound wristband in controlling a robotic hand with arm motions.
Source data
Source data Figs. 1–6 and Extended Data Figs. 1–10
Statistical source data.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Lu, G., Kim, S., Chen, X. et al. Hand tracking using wearable wrist imaging. Nat Electron (2026). https://doi.org/10.1038/s41928-026-01594-4