Human-centred design and fabrication of a wearable multimodal visual assistance system

Abstract

Artificial intelligence-powered wearable electronic systems offer promising solutions for non-invasive visual assistance. However, state-of-the-art systems have not sufficiently considered human adaptation, resulting in a low adoption rate among blind people. Here we present a human-centred, multimodal wearable system that advances usability by blending software and hardware innovations. For the software, we customize the artificial intelligence algorithm to match the requirements of the application scenario and of human behaviour. For the hardware, we improve wearability by developing stretchable sensory-motor artificial skins that complement the audio feedback in visual tasks. Self-powered triboelectric smart insoles align real users with their virtual avatars, supporting effective training in carefully designed scenarios. The harmonious cooperation of visual, audio and haptic senses enables significant improvements in navigation and postnavigation tasks, as demonstrated experimentally with humanoid robots and participants with visual impairment in both virtual and real environments. Postexperiment surveys highlight the system’s reliable functionality and high usability. This research paves the way for user-friendly visual assistance systems, offering alternative avenues to enhance the quality of life for people with visual impairment.

Fig. 1: Overview of the wearable multimodal visual assistance system.
Fig. 2: Personalizing artificial vision.
Fig. 3: Customization to improve mobility.
Fig. 4: Sensory-motor artificial skin for efficient haptic feedback.
Fig. 5: Metaverse for immersive training.
Fig. 6: Real-world testing.

Data availability

All the data required to assess the conclusions of the article are available via Figshare at https://doi.org/10.6084/m9.figshare.26103583 (ref. 39) and may be reused for ethical and scientific purposes.
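
For readers who prefer scripted access, the following is a minimal sketch of how the deposited files could be retrieved programmatically. It assumes the public Figshare REST API (api.figshare.com/v2) and uses an illustrative output directory; it is not part of the authors' released materials.

```python
# Minimal sketch: download the Figshare deposit accompanying this article (ref. 39).
# Assumes the public Figshare API; the output directory name is illustrative.
import os
import requests

ARTICLE_ID = "26103583"  # taken from the DOI 10.6084/m9.figshare.26103583
OUT_DIR = "figshare_data"
os.makedirs(OUT_DIR, exist_ok=True)

# The article record lists every downloadable file with a direct download URL.
record = requests.get(f"https://api.figshare.com/v2/articles/{ARTICLE_ID}", timeout=30)
record.raise_for_status()

for item in record.json().get("files", []):
    target = os.path.join(OUT_DIR, item["name"])
    with requests.get(item["download_url"], stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(target, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                fh.write(chunk)
    print(f"Saved {target}")
```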

Code availability

Exemplary scripts for the data processing and analysis in this study are available in the GitHub repository at https://github.com/JianTang2000/wearableSystem (ref. 40).
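
As an illustration of the kind of processing the repository covers, the sketch below runs object detection with Ultralytics YOLOv8, which the article cites (ref. 19). The weights file and image path are placeholders rather than files from the repository; consult the repository itself for the authors' actual scripts and parameters.

```python
# Minimal sketch: object detection with Ultralytics YOLOv8 (ref. 19).
# "yolov8n.pt" and "example_scene.jpg" are placeholders, not repository files.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # pretrained nano weights, downloaded on first use

# Run inference on a single image and print detected classes with confidences.
results = model("example_scene.jpg")
for result in results:
    for box in result.boxes:
        label = result.names[int(box.cls)]
        print(f"{label}: {float(box.conf):.2f}")
```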

References

  1. Roska, B. & Sahel, J. A. Restoring vision. Nature 557, 359–367 (2018).

  2. Maidenbaum, S., Abboud, S. & Amedi, A. Sensory substitution: closing the gap between basic research and widespread practical visual rehabilitation. Neurosci. Biobehav. Rev. 41, 3–15 (2014).

  3. Tapu, R., Mocanu, B. & Zaharia, T. Wearable assistive devices for visually impaired: a state of the art survey. Pattern Recogn. Lett. 137, 37–52 (2020).

  4. Herskovitz, J. et al. Hacking, switching, combining: understanding and supporting DIY assistive technology design by blind people. In Proc. 2023 CHI Conference on Human Factors in Computing Systems (eds Albrecht, S. et al.) 1–17 (ACM, 2023).

  5. Shah, D., Osiński, B. & Levine, S. LM-Nav: robotic navigation with large pre-trained models of language, vision, and action. In Proc. 2024 Conference on Robot Learning (eds Marc, T. et al.) 492–504 (PMLR, 2024).

  6. Wu, W. et al. Vision-language navigation: a survey and taxonomy. Neural Comput. Appl. 36, 3291–3316 (2024).

  7. Spoendlin, H. & Schrott, A. Analysis of the human auditory nerve. Hear. Res. 43, 25–38 (1989).

  8. Bach‐y‐Rita, P. Tactile sensory substitution studies. Ann. N. Y. Acad. Sci. 1013, 83–91 (2004).

  9. Nguyen, T. H. et al. A wearable assistive device for the blind using tongue-placed electrotactile display: design and verification. In Proc. 2013 International Conference on Control, Automation and Information Sciences (eds Jin, B. et al.) 42–47 (IEEE, 2013).

  10. Huang, Y. et al. A skin-integrated multimodal haptic interface for immersive tactile feedback. Nat. Electron. 6, 1020–1031 (2023).

  11. Guo, H. et al. A highly sensitive, self-powered triboelectric auditory sensor for social robotics and hearing aids. Sci. Robot. 3, eaat2516 (2018).

  12. Liu, Y. et al. Soft, miniaturized, wireless olfactory interface for virtual reality. Nat. Commun. 14, 2297 (2023).

  13. Hoffmann, R. et al. Evaluation of an audio-haptic sensory substitution device for enhancing spatial awareness for the visually impaired. Optom. Vis. Sci. 95, 757–765 (2018).

  14. Li, G., Xu, J., Li, Z., Chen, C. & Kan, Z. Sensing and navigation of wearable assistance cognitive systems for the visually impaired. IEEE Trans. Cogn. Dev. Syst. 15, 122–133 (2022).

  15. Xu, C. et al. A physicochemical-sensing electronic skin for stress response monitoring. Nat. Electron. 7, 168–179 (2024).

  16. Yang, Q. et al. Mixed-modality speech recognition and interaction using a wearable artificial throat. Nat. Mach. Intell. 5, 169–180 (2023).

  17. Gu, L. et al. A biomimetic eye with a hemispherical perovskite nanowire array retina. Nature 581, 278–282 (2020).

  18. Yao, K. et al. Encoding of tactile information in hand via skin-integrated wireless haptic interface. Nat. Mach. Intell. 4, 893–903 (2022).

  19. Jocher, G. et al. Ultralytics YOLOv8. GitHub https://github.com/ultralytics/ultralytics (2023).

  20. Krausz, N. E., Lenzi, T. & Hargrove, L. J. Depth sensing for improved control of lower limb prosthesis. IEEE Trans. Biomed. Eng. 62, 2576–2587 (2015).

  21. González, J. & Yu, W. Multichannel audio aided dynamical perception for prosthetic hand biofeedback. In Proc. 2009 IEEE International Conference on Rehabilitation Robotics (eds Kiyoshi, N. et al.) 240–245 (IEEE, 2009).

  22. George, J. A. et al. Biomimetic sensory feedback through peripheral nerve stimulation improves dexterous use of a bionic hand. Sci. Robot. 4, eaax2352 (2019).

  23. Meijer, P. B. An experimental system for auditory image representations. IEEE Trans. Biomed. Eng. 39, 112–121 (1992).

  24. Levy-Tzedek, S., Hanassy, S., Abboud, S., Maidenbaum, S. & Amedi, A. Fast, accurate reaching movements with a visual-to-auditory sensory substitution device. Restor. Neurol. Neurosci. 30, 313–323 (2012).

  25. Yang, Y. et al. Breathable electronic skins for daily physiological signal monitoring. Nano-Micro Lett. 14, 161 (2022).

  26. Chortos, A., Liu, J. & Bao, Z. Pursuing prosthetic electronic skin. Nat. Mater. 15, 937–950 (2016).

  27. Yu, Y. et al. All-printed soft human-machine interface for robotic physicochemical sensing. Sci. Robot. 7, eabn0495 (2022).

  28. Yu, X. et al. Skin-integrated wireless haptic interfaces for virtual and augmented reality. Nature 575, 473–479 (2019).

  29. Sanes, J. N. & Donoghue, J. P. Plasticity and primary motor cortex. Annu. Rev. Neurosci. 23, 393–415 (2000).

  30. Susal, J., Krauss, K., Tsingos, N. & Altman, M. Immersive audio for VR. In Proc. 2016 AES International Conference on Audio for Virtual and Augmented Reality (eds Andres, M. et al.) 99–124 (Audio Engineering Society, 2016).

  31. Zhao, Y. et al. Enabling independent navigation for visually impaired people through a wearable vision-based feedback system. In Proc. 2018 CHI Conference on Human Factors in Computing Systems (eds Regan, M. et al.) 1–14 (ACM, 2018).

  32. Slade, P., Tambe, A. & Kochenderfer, M. J. Multimodal sensing and intuitive steering assistance improve navigation and mobility for people with impaired vision. Sci. Robot. 6, eabg6594 (2021).

  33. Jordan, P. W., Thomas, B., McClelland, I. L. & Weerdmeester, B. Usability Evaluation in Industry (CRC, 1996).

  34. Sauro, J. A Guide to the System Usability Scale: Background, Benchmarks & Best Practices (Measuring Usability LLC, 2011).

  35. Country, M. W. Retinal metabolism: a comparative look at energetics in the retina. Brain Res. 1672, 50–57 (2017).

  36. Liao, F. et al. Bioinspired in-sensor visual adaptation for accurate perception. Nat. Electron. 5, 84–91 (2022).

  37. Park, J. et al. Avian eye-inspired perovskite artificial vision system for foveated and multispectral imaging. Sci. Robot. 9, eadk6903 (2024).

  38. Zhang, Z. et al. All-in-one two-dimensional retinomorphic hardware device for motion detection and recognition. Nat. Nanotechnol. 17, 27–32 (2022).

  39. Tang, J. et al. Human-centred design and fabrication of a wearable multimodal visual assistance system. Figshare https://doi.org/10.6084/m9.figshare.26103583 (2025).

  40. Tang, J. et al. Human-centred design and fabrication of a wearable multimodal visual assistance system. Zenodo https://doi.org/10.5281/zenodo.14752720 (2025).

Acknowledgements

This work was funded by the STI 2030—Major Projects (grant no. 2022ZD0210000), the National Natural Science Foundation of China (grant no. 62274110) and the Shanghai Rising-Star Program (grant no. 21QA1404000). The individuals involved in project 2022ZD0210000 include L.G. and B.Y.; project 62274110 includes L.G.; and project 21QA1404000 includes L.G. We sincerely thank all the participants who generously volunteered their time, provided valuable feedback and contributed to the study.

Author information

Authors and Affiliations

Contributions

L.G. and J.T. were responsible for the system design. J.T., Y.Z., G.J., L.X. and W.R. conducted the experiments. J.T., L.G., H.B. and Q.G. developed the AI algorithm. B.Y., J.Z., X.W. and Z.F. assisted with data collection and analysis. All authors participated in manuscript preparation.

Corresponding author

Correspondence to Leilei Gu.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Machine Intelligence thanks Wei Gao and Xinge Yu for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Figs. 1–28, Discussion and Tables 1 and 2.

Reporting Summary

Supplementary Video 1

Humanoid navigates to the target.

Supplementary Video 2

Humanoid navigates to the target with obstacle avoidance.

Supplementary Video 3

Artificial skins assist dynamic obstacle avoidance.

Supplementary Video 4

Artificial skins assist postnavigation task.

Supplementary Video 5

VR training.

Supplementary Video 6

A representative VIP navigates through a maze under the guidance of the system.

Supplementary Video 7

Real-world testing. A VIP navigates through a cluttered conference room.

Supplementary Video 8

Real-world testing. A VIP navigates through an outdoor dynamic environment.

Supplementary Video 9

Real-world testing. A VIP navigates through a narrow corridor.

Supplementary Video 10

Real-world testing. A VIP completes a comprehensive task emulating real-world challenge.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Tang, J., Zhu, Y., Jiang, G. et al. Human-centred design and fabrication of a wearable multimodal visual assistance system. Nat Mach Intell 7, 627–638 (2025). https://doi.org/10.1038/s42256-025-01018-6
