Abstract
Magnetic resonance imaging-guided acoustic trapping is expected to enable the manipulation of drug carriers (e.g., microbubbles) within the body, potentially increasing carrier concentration at tumor sites and thereby improving targeted therapy outcomes. However, accurate trap generation remains challenging because of complex wave propagation through multiple tissue materials, and respiration-induced tissue motion imposes stringent demands on computational efficiency for rapid phase updates. Here we propose a machine learning-based model and a closed-loop control scheme that modulate phase patterns rapidly. The model delivers precise time-of-flight predictions (mean error ≤ 0.24 μs) within 26 ms for 196 transducer elements. In proof-of-concept experiments, computer vision feedback permits fast (about 15 frames per second) position adjustment of a trapped polystyrene ball (Ø2.7 mm), and the control scheme suppresses the ball's spatial drift induced by the time-varying multi-medium environment. These robotic manipulation experiments support our model's potential for future magnetic resonance imaging-guided targeted therapy.
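For orientation, the sketch below illustrates the two ingredients the abstract describes: converting per-element time-of-flight predictions into drive phases that focus the array while superposing a trap signature, and a proportional closed-loop correction of the commanded trap position from vision feedback. This is a minimal Python sketch under stated assumptions, not the released implementation (see Code availability): the function names (`phases_from_tof`, `closed_loop_step`), the 1 MHz drive frequency, the twin-trap signature, and the feedback gain are all illustrative.

```python
import numpy as np

F0 = 1.0e6  # drive frequency in Hz; illustrative, not the paper's value


def phases_from_tof(tof, element_xy, signature="twin"):
    """Turn per-element time-of-flight predictions (s) into drive
    phases (rad): delay-and-sum focusing plus a trap signature."""
    tof = np.asarray(tof, dtype=float)
    # Elements with longer flight times fire earlier (phase advance),
    # so all wavefronts arrive at the target location in phase.
    phase = 2.0 * np.pi * F0 * (tof.max() - tof)
    if signature == "twin":
        # Twin trap: a pi offset between the two halves of the aperture
        # creates a pressure null (the trap) between two focal lobes.
        left = np.asarray(element_xy)[:, 0] < 0.0
        phase = phase + np.where(left, np.pi, 0.0)
    return np.mod(phase, 2.0 * np.pi)


def closed_loop_step(target, measured, commanded, gain=0.5):
    """One proportional update of the commanded trap position from
    stereo-vision feedback: move a fraction of the observed error."""
    target, measured, commanded = map(np.asarray, (target, measured, commanded))
    return commanded + gain * (target - measured)


# Example: 196 elements on a 14 x 14 grid with dummy time-of-flight values.
xy = np.stack(np.meshgrid(np.arange(14) - 6.5, np.arange(14) - 6.5), -1).reshape(-1, 2)
tof = 1e-4 + 1e-6 * np.random.rand(196)
drive = phases_from_tof(tof, xy)
new_cmd = closed_loop_step(target=[0, 0, 0.05], measured=[0.001, 0, 0.049], commanded=[0, 0, 0.05])
```

In the actual system, the time-of-flight values would come from the learned model rather than dummy numbers, and the phase pattern would be recomputed each time the vision feedback reports a new ball position.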
Data availability
The detailed data sets are available at a public GitHub repository (https://github.com/mengjwu/acoustictrap3D).
Code availability
The source code is available at a public GitHub repository (https://github.com/mengjwu/acoustictrap3D).
References
Wu, M. & Liao, W. Machine learning-empowered real-time acoustic trapping: an enabling technique for increasing MRI-guided microbubble accumulation. Sensors 24, 6342 (2024).
Marzo, A. et al. Holographic acoustic elements for manipulation of levitated objects. Nat. Commun. 6, 8661 (2015).
Lo, W.-C., Fan, C.-H., Ho, Y.-J., Lin, C.-W. & Yeh, C.-K. Tornado-inspired acoustic vortex tweezer for trapping and manipulating microbubbles. Proc. Natl. Acad. Sci. 118, e2023188118 (2021).
Zhou, Q., Zhang, J., Ren, X., Xu, Z. & Liu, X. Multi-bottle beam generation using acoustic holographic lens. Appl. Phys. Lett. 116, 133502 (2020).
Ozcelik, A. et al. Acoustic tweezers for the life sciences. Nat. Methods 15, 1021–1028 (2018).
Tang, T., Shen, C. & Huang, L. Propagation of acoustic waves and determined radiation effects on axisymmetric objects in heterogeneous medium with irregular interfaces. Phys. Fluids 36, 012023 (2024).
Baudoin, M. et al. Spatially selective manipulation of cells with single-beam acoustical tweezers. Nat. Commun. 11, 4244 (2020).
Yang, Y. et al. 3D acoustic manipulation of living cells and organisms based on 2D array. IEEE Trans. Biomed. Eng. 69, 2342–2352 (2022).
Hammarström, B., Laurell, T. & Nilsson, J. Seed particle-enabled acoustic trapping of bacteria and nanoparticles in continuous flow systems. Lab Chip 12, 4296 (2012).
Rizzitelli, S. et al. Sonosensitive theranostic liposomes for preclinical in vivo MRI-guided visualization of doxorubicin release stimulated by pulsed low intensity non-focused ultrasound. J. Control. Release 202, 21–30 (2015).
Dai, J. et al. Learning-based efficient phase-amplitude modulation and hybrid control for MRI-guided focused ultrasound treatment. IEEE Robot. Autom. Lett. 9, 995 (2024).
Cheung, C. L. et al. Omnidirectional monolithic marker for intra-operative MR-based positional sensing in closed MRI. IEEE Trans. Med. Imaging 43, 439–448 (2024).
Marzo, A. & Drinkwater, B. W. Holographic acoustic tweezers. Proc. Natl. Acad. Sci. 116, 84–89 (2019).
Tang, T. & Huang, L. Soundiation: a software in evaluation of acoustophoresis driven by radiation force and torque on axisymmetric objects. J. Acoust. Soc. Am. 152, 2934–2945 (2022).
Tang, T., Shen, C. & Huang, L. Acoustic rotation of non-spherical micro-objects: Characterization of acoustophoresis and quantification of rotational stability. J. Sound Vib. 554, 117694 (2023).
Yang, Y. et al. In-vivo programmable acoustic manipulation of genetically engineered bacteria. Nat. Commun. 14, 3297 (2023).
Cao, H. X. et al. Holographic acoustic tweezers for 5-DoF manipulation of nanocarrier clusters toward targeted drug delivery. Pharmaceutics 14, 1490 (2022).
Yang, Y. et al. Self-navigated 3D acoustic tweezers in complex media based on time reversal. Research 2021, 9781394 (2021).
Zhong, C., Jia, Y., Jeong, D. C., Guo, Y. & Liu, S. AcousNet: a deep learning based approach to dynamic 3D holographic acoustic field generation from phased transducer array. IEEE Robot. Autom. Lett. 7, 666–673 (2022).
Schoen, S. & Arvanitis, C. D. Heterogeneous angular spectrum method for trans-skull imaging and focusing. IEEE Trans. Med. Imaging 39, 1605–1614 (2020).
Wu, F., Thomas, J. L. & Fink, M. Time reversal of ultrasonic fields. Part II: Experimental results. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 39, 567–578 (1992).
Ley, M. W. & Bruus, H. Three-dimensional numerical modeling of acoustic trapping in glass capillaries. Phys. Rev. Appl. 8, 024020 (2017).
Zhong, C. et al. Real-time acoustic holography with physics-based deep learning for robotic manipulation. IEEE Trans. Autom. Sci. Eng. 21, 1–10 (2023).
Gor’kov, L. P. On the forces acting on a small particle in an acoustical field in an ideal fluid. Dokl. Akad. Nauk SSSR 140, 88–91 (1961).
Wu, M. et al. A method to detect circle based on Hough transform. In Proc. First International Conference on Information Sciences, Machinery, Materials and Energy 2013–2016 (Atlantis Press, 2015).
Treeby, B. E. & Cox, B. T. k-Wave: MATLAB toolbox for the simulation and reconstruction of photoacoustic wave fields. J. Biomed. Opt. 15, 021314 (2010).
Rumelhart, D. E., Hinton, G. E. & Williams, R. J. Learning representations by back-propagating errors. Nature 323, 533–536 (1986).
Acorda, J. A., Yamada, H. & Ghamsari, S. M. Evaluation of fatty infiltration of the liver in dairy cattle through digital analysis of hepatic ultrasonograms. Vet. Radiol. Ultrasound 35, 120–123 (1994).
Tempany, C., McDannold, N., Stewart, E. A. & Hynynen, K. Tumor Ablation: Principles and Practice (Springer, 2005).
Gierga, D. P. et al. Quantification of respiration-induced abdominal tumor motion and its impact on IMRT dose distributions. Int. J. Radiat. Oncol. Biol. Phys. 58, 1584–1595 (2004).
Im, K. & Park, Q.-H. Omni-directional and broadband acoustic anti-reflection and universal acoustic impedance matching. Nanophotonics 11, 2191–2198 (2022).
Suchenek, M. & Borowski, T. Measuring sound speed in gas mixtures using a photoacoustic generator. Int. J. Thermophys. 39, 11 (2018).
Schoen, S. Jr et al. Towards controlled drug delivery in brain tumors with microbubble-enhanced focused ultrasound. Adv. Drug Deliv. Rev. 180, 114043 (2022).
Auboiroux, V. et al. ARFI-prepared MRgHIFU in liver: simultaneous mapping of ARFI-displacement and temperature elevation, using a fast GRE-EPI sequence. Magn. Reson. Med. 68, 932–946 (2012).
Acknowledgements
This work was supported in part by the National Natural Science Foundation of China (Grant No. 12504533), the Guangdong Basic and Applied Basic Research Foundation (Grant No. 2023A1515110927), and the Guangdong University Featured Innovation Project (Grant No. 2025KTSCX129), and in part by the Shaanxi Provincial Department of Education (Grant No. 24JS024). We sincerely thank Prof. Huang Lixi for his valuable suggestions on the experiments. We are also grateful to Prof. Kwok Ka-Wai and Dr. Dai Jing for their early insights that contributed to the development of the idea.
Author information
Authors and Affiliations
Contributions
Mengjie Wu is with the Department of Mechanical Engineering, The University of Hong Kong, Hong Kong SAR, China. M. Wu conceptualized the study, designed the learning model and the data analysis, developed the stereo vision algorithm, designed the chamber and printed circuit board, conducted the COMSOL study and the manipulation experiments, and drafted and edited the manuscript. Xiaohan Li is with the School of Information and Control Engineering, Xi'an University of Architecture and Technology, Xi'an, China. X. Li designed the stereo vision algorithm, recommended the camera modules, and revised the manuscript. Tianquan Tang is with the Department of Mechanical Engineering, The University of Hong Kong, Hong Kong SAR, China, and also with the School of Mechatronic Engineering and Automation, Foshan University, Foshan, China. T. Tang is the corresponding author (tianquan@connect.hku.hk). T. Tang conceptualized the study, designed the chamber, advised on the experiments, and edited the manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Communications Engineering thanks Gordon Dobie and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. Primary Handling Editors: Liangfei Tian and Philip Coatsworth. A peer review file is available.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Wu, M., Li, X. & Tang, T. Machine learning-facilitated real-time acoustic trapping in time-varying multi-medium environments toward magnetic resonance imaging-guided microbubble manipulation. Commun Eng (2026). https://doi.org/10.1038/s44172-026-00600-z
Received:
Accepted:
Published:
DOI: https://doi.org/10.1038/s44172-026-00600-z


