Abstract
Robots integrated into biological systems as sociable partners offer a promising route to a mechanistic understanding of social behaviours. These biohybrid systems introduce controllability, helping to elucidate aspects of biological intelligence previously inaccessible to traditional techniques. However, state-of-the-art interactive robots still struggle to convey the multilevel, heterogeneous information present within biological systems, making it challenging to mediate the complex interaction process effectively. Here we propose an autonomous, interactive rat-like robot that can engage with freely behaving rats by learning from their anatomical structure, dynamic motions and social interactions. Imitation learning from animal demonstrations equips the robot with subtle templates of social behaviour, allowing it to capture the attention of rats and significantly arouse their interest. The robot also integrates visual perception, target tracking and behavioural decision-making to substantially improve interaction efficiency. We demonstrate that the robot can interact with rats continuously for half an hour. Moreover, the robot can modulate the emotional states of rats through different interaction patterns during robot–rat social interaction. These results attest that the proposed interactive robot, with its capacity for long-term, repeatable interaction, overcomes limitations of natural social interaction within biological systems. Such biohybrid systems, capable of modulating the internal states of organisms, may open the door to understanding the ‘social’ interactions between humans and artificial intelligence.
Data availability
All data needed to evaluate the conclusions in the paper are present in the article and/or Supplementary Information. The behaviour dataset for imitation learning and the generated templates of social behaviour data are available via Zenodo at https://doi.org/10.5281/zenodo.13968598 (ref. 65) and via GitHub at https://github.com/BIT-SMuRo/Imitation-Learning.
Code availability
Code for the imitation learning used in this study is available via Zenodo at https://doi.org/10.5281/zenodo.13968598 (ref. 65) and via GitHub at https://github.com/BIT-SMuRo/Imitation-Learning.
References
Alexander, R. D. The evolution of social behavior. Annu. Rev. Ecol. Syst. 5, 325–383 (1974).
Clutton-Brock, T. Social evolution in mammals. Science 373, eabc9699 (2021).
Wei, D., Talwar, V. & Lin, D. Neural circuits of social behaviors: innate yet flexible. Neuron 109, 1600–1620 (2021).
Robinson, G. E., Fernald, R. D. & Clayton, D. F. Genes and social behavior. Science 322, 896–900 (2008).
Remedios, R. et al. Social behaviour shapes hypothalamic neural ensemble representations of conspecific sex. Nature 550, 388–392 (2017).
Romano, D., Donati, E., Benelli, G. & Stefanini, C. A review on animal-robot interaction: from bio-hybrid organisms to mixed societies. Biol. Cybern. 113, 201–225 (2019).
Melo, K., Horvat, T. & Ijspeert, A. J. Animal robots in the African wilderness: lessons learned and outlook for field robotics. Sci. Robot. 8, eadd8662 (2023).
Gribovskiy, A., Halloy, J., Deneubourg, J. L. & Mondada, F. Designing a socially integrated mobile robot for ethological research. Robot. Auton. Syst. 103, 42–55 (2018).
Bonnet, F. et al. Robots mediating interactions between animals for interspecies collective behaviors. Sci. Robot. 4, eaau7897 (2019).
Szopa-Comley, A. W. & Ioannou, C. C. Responsive robotic prey reveal how predators adapt to predictability in escape tactics. Proc. Natl Acad. Sci. USA 119, e2117858119 (2022).
DeLellis, P. et al. Model-based feedback control of live zebrafish behavior via interaction with a robotic replica. IEEE Trans. Robot. 36, 28–41 (2020).
Landgraf, T. et al. Animal-in-the-loop: using interactive robotic conspecifics to study social behavior in animal groups. Annu. Rev. Contr. Robot. 4, 487–507 (2021).
Chen, P. & Hong, W. Neural circuit mechanisms of social behavior. Neuron 98, 16–30 (2018).
Pezzulo, G. et al. The body talks: sensorimotor communication and its brain and kinematic signatures. Phys. Life Rev. 28, 1–21 (2019).
Klein, B. A., Stein, J. & Taylor, R. C. Robots in the service of animal behavior. Commun. Integr. Biol. 5, 466–472 (2012).
Halloy, J. et al. Social integration of robots into groups of cockroaches to control self-organized choices. Science 318, 1155–1158 (2007).
Benelli, G. et al. Behavioral asymmetries in ticks—lateralized questing of Ixodes ricinus to a mechatronic apparatus delivering host-borne cues. Acta Trop. 178, 176–181 (2018).
Landgraf, T. et al. Dancing honey bee robot elicits dance-following and recruits foragers. Preprint at https://arxiv.org/abs/1803.07126 (2018).
Mariano, P. et al. Evolving robot controllers for a bio-hybrid system. In Proc. ALIFE 2018: The 2018 Conference on Artificial Life 155–162 (ASME, 2018).
Barmak, R. et al. A robotic honeycomb for interaction with a honeybee colony. Sci. Robot. 8, eadd7385 (2023).
Barmak, R. et al. Biohybrid superorganisms—on the design of a robotic system for thermal interactions with honeybee colonies. IEEE Access 12, 50849–50871 (2024).
Papaspyros, V. et al. A biohybrid interaction framework for the integration of robots in animal societies. IEEE Access 11, 67640–67659 (2023).
Romano, D. & Stefanini, C. Robot-fish interaction helps to trigger social buffering in neon tetras: the potential role of social robotics in treating anxiety. Int. J. Soc. Robot. 14, 963–972 (2022).
Siddall, R. Ethorobotic rats for rodent behavioral research: design considerations. Front. Behav. Neurosci. 17, 1281494 (2023).
Lai, A. T. et al. A robot-rodent interaction arena with adjustable spatial complexity for ethologically relevant behavioral studies. Cell Rep. 43, 113671 (2024).
Heath, S. et al. PiRat: an autonomous framework for studying social behaviour in rats and robots. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 7601–7608 (IEEE, 2018).
Del Angel Ortiz, R., Contreras, C. M., Gutiérrez-Garcia, A. G. & González, M. F. M. Social interaction test between a rat and a robot: a pilot study. Int. J. Adv. Robot. Syst. 13, 4 (2016).
Gianelli, S., Harland, B. & Fellous, J.-M. A new rat-compatible robotic framework for spatial navigation behavioral experiments. J. Neurosci. Methods 294, 40–50 (2018).
Quinn, L. K. et al. When rats rescue robots. Anim. Behav. Cogn. 5, 368–379 (2018).
Shi, Q. et al. Development of a hybrid wheel-legged mobile robot WR-3 designed for the behavior analysis of rats. Adv. Robot. 25, 2255–2272 (2011).
Shi, Q. et al. Modulation of rat behaviour by using a rat-like robot. Bioinspir. Biomim. 8, 046002 (2013).
Shi, Q. et al. Behavior modulation of rats to a robotic rat in multi-rat interaction. Bioinspir. Biomim. 10, 056011 (2015).
Krause, J., Winfield, A. F. & Deneubourg, J. L. Interactive robots in experimental biology. Trends Ecol. Evol. 26, 369–375 (2011).
Whishaw, I. Q. & Kolb, B. in The Laboratory Rat 215–242 (Academic Press, 2020).
Chen, Z. et al. ARBUR, a machine learning-based analysis system for relating behaviors and ultrasonic vocalizations of rats. iScience 27, 109998 (2024).
Kisko, T. M., Himmler, B. T., Himmler, S. M., Euston, D. R. & Pellis, S. M. Are 50-kHz calls used as play signals in the playful interactions of rats? II. Evidence from the effects of devocalization. Behav. Process. 111, 25–33 (2015).
Kisko, T. M., Euston, D. R. & Pellis, S. M. Are 50-kHz calls used as play signals in the playful interactions of rats? III. The effects of devocalization on play with unfamiliar partners as juveniles and as adults. Behav. Process. 113, 113–121 (2015).
Brudzynski, S. M. Ethotransmission: communication of emotional states through ultrasonic vocalization in rats. Curr. Opin. Neurobiol. 23, 310–317 (2013).
Kisko, T. M., Wöhr, M., Pellis, V. C. & Pellis, S. M. From play to aggression: high-frequency 50-kHz ultrasonic vocalizations as play and appeasement signals in rats. In Social Behavior from Rodents to Humans: Neural Foundations and Clinical Implications 91–108 (Springer, 2017).
Wang, L. et al. Incorporating neuro-inspired adaptability for continual learning in artificial intelligence. Nat. Mach. Intell. 5, 1356–1368 (2023).
Kar, K., Kornblith, S. & Fedorenko, E. Interpretability of artificial neural network models in artificial intelligence versus neuroscience. Nat. Mach. Intell. 4, 1065–1067 (2022).
Biswas, D. et al. Mode switching in organisms for solving explore-versus-exploit problems. Nat. Mach. Intell. 5, 1285–1296 (2023).
Shi, Q. et al. Implementing rat-like motion for a small-sized biomimetic robot based on extraction of key movement joints. IEEE Trans. Robot. 37, 747–762 (2021).
Li, C. et al. Design and optimization of a lightweight and compact waist mechanism for a robotic rat. Mech. Mach. Theory 146, 103723 (2020).
Li, C. et al. Identification of rat ultrasonic vocalizations from mixed sounds of a robotic rat in a noisy environment. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 7294–7299 (IEEE, 2019).
Litvin, Y., Blanchard, D. C. & Blanchard, R. J. Rat 22 kHz ultrasonic vocalizations as alarm cries. Behav. Brain Res. 182, 166–172 (2007).
Schweinfurth, M. K. The social life of Norway rats (Rattus norvegicus). eLife 9, e54020 (2020).
Neunuebel, J. P., Taylor, A. L., Arthur, B. J. & Egnor, S. R. Female mice ultrasonically interact with males during courtship displays. eLife 4, e06203 (2015).
Simola, N. & Granon, S. Ultrasonic vocalizations as a tool in studying emotional states in rodent models of social behavior and brain disease. Neuropharmacology 159, 107420 (2019).
Shi, Q. et al. Development of a small-sized quadruped robotic rat capable of multimodal motions. IEEE Trans. Robot. 38, 3027–3043 (2022).
Bing, Z. et al. Lateral flexion of a compliant spine improves motor performance in a bioinspired mouse robot. Sci. Robot. 8, eadg7165 (2023).
Wang, R. et al. Bioinspired soft spine enables small-scale robotic rat to conquer challenging environments. Soft Robot. 11, 70–84 (2023).
Shi, Q. et al. Design and control of a biomimetic robotic rat for interaction with laboratory rats. IEEE/ASME Trans. Mech. 20, 1832–1842 (2015).
Shi, Q. et al. A modified robotic rat to study rat-like pitch and yaw movements. IEEE/ASME Trans. Mech. 23, 2448–2458 (2018).
Guo, X. et al. Real-time pose estimation of rats based on stereo vision embedded in a robotic rat. In 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 4690–4695 (IEEE, 2021).
Seffer, D., Schwarting, R. K. W. & Wöhr, M. Pro-social ultrasonic communication in rats: insights from playback studies. J. Neurosci. Methods 234, 73–81 (2014).
Berz, A. C., Wöhr, M. & Schwarting, R. Response calls evoked by playback of natural 50-kHz ultrasonic vocalizations in rats. Front. Behav. Neurosci. 15, 812142 (2021).
Mitri, S., Wischmann, S., Floreano, D. & Keller, L. Using robots to understand social behaviour. Biol. Rev. 88, 31–39 (2013).
Leonardis, E. J. et al. Interactive neurorobotics: behavioral and neural dynamics of agent interactions. Front. Psychol. 13, 897603 (2022).
Jiang, P., Ergu, D., Liu, F., Cai, Y. & Ma, B. A review of Yolo algorithm developments. Proc. Comput. Sci. 199, 1066–1073 (2022).
Rublee, E., Rabaud, V., Konolige, K. & Bradski, G. ORB: an efficient alternative to SIFT or SURF. In IEEE International Conference on Computer Vision 2564–2571 (IEEE, 2011).
Chen, C. et al. A real-time motion detection and object tracking framework for future robot-rat interaction. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 7404–7409 (IEEE, 2021).
Xie, H. et al. A motion generation strategy of robotic rat using imitation learning for behavioral interaction. IEEE Robot. Autom. Lett. 7, 7351–7358 (2022).
Jin, Y., Liu, X., Shao, Y., Wang, H. & Yang, W. High-speed quadrupedal locomotion by imitation-relaxation reinforcement learning. Nat. Mach. Intell. 4, 1198–1208 (2022).
Jia, G. & Chen, Z. Bit-SMuRo/Imitation-Learning. Zenodo https://doi.org/10.5281/zenodo.13968598 (2024).
Acknowledgements
This study was supported in part by the National Natural Science Foundation of China (grant nos. 62022014 (to Q.S.) and 62088101 (to Q.H.)), the Science and Technology Innovation Program of Beijing Institute of Technology (grant no. 2022CX01010 (to Q.S.)) and the National Science and Technology Key Major Projects (STI2030-Major Projects grant no. 2022ZD02068000 (to Z.Q.)). We thank Q. Zhou and H. Wen for technical help in the visualization and statistical analysis. We thank C. Dong and D. Jiang for technical help in data mapping and motion control.
Author information
Authors and Affiliations
Contributions
Q.H. and Q.S. conceived and supervised the project. G.J., Z.C. and Q.S. implemented the methods, conducted the experiments, processed the data, analysed the results and wrote the paper. Y.Z. assisted in conducting the experiments. Experiments were supported by Z.Q., X.C., Z.B. and A.K. All authors contributed to discussions. All authors edited and approved the paper.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Machine Intelligence thanks Cesare Stefanini and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Extended data
Extended Data Fig. 1 Autonomous control system that integrates visual perception, target tracking, and behavioral decisions.
a, Predefined social information (Tb, Ts, Ta) adjusts the robot’s goals to perform specific interaction tasks. b, Recognizing and locating the freely behaving rat. Based on onboard binocular images, keypoint features of the rat are detected and matched to estimate its location via visual triangulation. c, SMuRo tracks the rat in real time. The robot’s joints and wheels are coordinated to track the rat based on the updated visual and proprioceptive state, and the tracking strategy adapts to the current robot–rat relative position to improve interaction efficiency. d, Generating the corresponding templates of social behavior. The prediction module is pre-trained offline on demonstration data of social behaviors in rats, and a strategy network generates the robot’s joint movements. Once the relative distance decreases to Ts, the interaction pattern is launched and the motion controller deploys the generated whole-body joint and wheeled movements according to the current interaction type Tb.
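The visual triangulation step described in panel b can be illustrated with a minimal sketch. For a rectified binocular pair, the depth of a matched keypoint follows from its horizontal disparity as Z = f·B/d. The focal length and baseline values below are hypothetical placeholders, not the robot’s actual calibration:

```python
def triangulate_depth(x_left, x_right, focal_px, baseline_m):
    """Estimate the depth (in metres) of a matched keypoint.

    Assumes a rectified stereo pair, so the disparity between the two
    image coordinates is purely horizontal. Calibration values here are
    illustrative, not taken from the robot described in the paper.
    """
    disparity = x_left - x_right  # pixels; positive for points in front
    if disparity <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    return focal_px * baseline_m / disparity

# A keypoint seen at x = 340 px (left) and x = 300 px (right),
# with a hypothetical focal length of 800 px and a 6 cm baseline:
depth = triangulate_depth(340.0, 300.0, 800.0, 0.06)  # -> 1.2 m
```

In practice the matched keypoints would come from a feature detector such as ORB (ref. 65 lists the related code), but the geometric back-end reduces to this disparity-to-depth relation.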
Extended Data Fig. 2 SMuRo performs rat-like pitching movement, emulating rats’ classic rearing behavior.
a, Video snapshot sequence of robot and rat movements. b, Evaluation of the dynamic movement parameters of SMuRo performing the rat-like pitching movement, compared with those of rats: (i) the hip pitch joint, (ii) the waist pitch joint and (iii) the head pitch joint.
Extended Data Fig. 3 SMuRo performs rat-like yawing motions, emulating rats’ anogenital self-sniffing behavior.
a, Video snapshot sequence of robot and rat motions. b, Evaluation of the dynamic movement parameters of SMuRo performing the rat-like yawing movement, compared with those of rats: (i) the waist yaw joint; (ii) the head yaw joint.
Extended Data Fig. 4 The learned trajectories (a) and the trajectories executed by the robot (b) across four templates of social behavior.
From left to right, MAF (movement synthesis of moving away, approaching, and following), PIN (pinning), POU (pouncing), and SNC (social nose contact). All trajectories are color-coded.
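How such behavior templates might be distilled from animal demonstrations can be sketched, in a deliberately simplified form, as behavioral cloning: a policy is regressed from demonstrated state–action pairs. The sketch below fits a per-joint linear gain by least squares; the dimensions, diagonal structure and data are illustrative assumptions, not the authors’ network:

```python
def clone_linear_policy(states, actions):
    """Fit a per-dimension linear gain a_j = k_j * s_j by least squares.

    A tiny behavioral-cloning baseline: each robot joint command is
    regressed on one demonstrator state feature. The diagonal structure
    and the dimensions are illustrative assumptions only.
    """
    n_dims = len(states[0])
    gains = []
    for j in range(n_dims):
        num = sum(s[j] * a[j] for s, a in zip(states, actions))
        den = sum(s[j] * s[j] for s in states)
        gains.append(num / den)
    return gains

# Synthetic demonstrations where each joint command is exactly half the
# observed joint angle; the fitted gains recover that relation.
demos_s = [[0.1, 0.2], [0.4, -0.3], [-0.2, 0.5]]
demos_a = [[0.05, 0.1], [0.2, -0.15], [-0.1, 0.25]]
gains = clone_linear_policy(demos_s, demos_a)  # -> approx. [0.5, 0.5]
```

A learned template in the paper is far richer (whole-body joint trajectories produced by a strategy network), but the same supervised mapping from demonstrated states to joint commands underlies the idea.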
Extended Data Fig. 5 Different interaction patterns of SMuRo are associated with distinct behavioral reactions in rats.
a, Trajectories of the robot (gray lines) overlapped with the locations of interaction attempts (red for successful, yellow for failed) during the robot–rat PIN (pinning), POU (pouncing) and SNC (social nose contact) interactions. Square arena size: 1 m × 1 m. b, Trajectories of the rat (blue lines) overlapped with the locations of active contact by the rat (blue circles). c, The average speeds of the rat and the robot in each session across interaction patterns. Data are presented as mean ± s.d. for n = 3 independent experiments.
Supplementary information
Supplementary Information (PDF)
Supplementary Figs. 1–12, Tables 1–4, Notes and Glossary.
Supplementary Video 1 (MP4)
Summary of this work: the main contributions, presented in English for a broad readership in artificial intelligence, robotics and neuroscience.
Supplementary Video 2 (MP4)
SMuRo performs rat-like pitching movement: X-ray fluoroscopy video (top left) of a rat performing the pitching movement (rearing behaviour) from the side view. SMuRo can express similar pitching movements.
Supplementary Video 3 (MP4)
SMuRo performs rat-like yawing movement: X-ray fluoroscopy video (top left) of a rat performing the yawing movement (anogenital self-sniffing behaviour) from the top view. SMuRo can express similar yawing movements.
Supplementary Video 4 (MP4)
Autonomous robot–rat PIN interaction induces an aversive state in the rat: top-view and side-view video streams of the interacting robot and rat in the square arena (left). Robot control GUI (top right), which includes the video stream from the binocular camera mounted on the robot head (top) and the interactive area (bottom) for setting the interaction parameters (task goal: Tb = PIN; Ta = body; Ts = 100 mm) for the robot (left; automatic control mode) before the interaction experiments and for manually manoeuvring the robot (right; manual control mode). Behavioural ethogram (top; red represents the PIN interaction) and the de-noised USVs (bottom) of the experiments in a 2.5 s sliding window (bottom right); the right boundary represents the current time (t = 0 s). The beginning of the experiment and three examples are shown.
Supplementary Video 5 (MP4)
Autonomous robot–rat POU interaction induces an appetitive state in the rat: top-view and side-view video streams of the interacting robot and rat in the square arena (left). Robot control GUI (top right), which includes the video stream from the binocular camera mounted on the robot head (top) and the interactive area (bottom) for setting the interaction parameters (task goal: Tb = POU; Ta = body; Ts = 100 mm) for the robot (left; automatic control mode) before the interaction experiments and for manually manoeuvring the robot (right; manual control mode). Behavioural ethogram (top; red represents the POU interaction) and the de-noised USVs (bottom) of the experiments in a 0.5 s sliding window (bottom right); the right boundary represents the current time (t = 0 s). The beginning of the experiment and three examples are shown.
Supplementary Video 6 (MP4)
Autonomous robot–rat SNC interaction induces an appetitive state in the rat: top-view and side-view video streams of the interacting robot and rat in the square arena (left). Robot control GUI (top right), which includes the video stream from the binocular camera mounted on the robot head (top) and the interactive area (bottom) for setting the interaction parameters (task goal: Tb = SNC; Ta = body; Ts = 100 mm) for the robot (left; automatic control mode) before the interaction experiments and for manually manoeuvring the robot (right; manual control mode). Behavioural ethogram (top; red represents the SNC interaction) and the de-noised USVs (bottom) of the experiments in a 0.5 s sliding window (bottom right); the right boundary represents the current time (t = 0 s). The beginning of the experiment and three examples are shown.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Jia, G., Chen, Z., Zhang, Y. et al. Modulating emotional states of rats through a rat-like robot with learned interaction patterns. Nat Mach Intell 6, 1580–1593 (2024). https://doi.org/10.1038/s42256-024-00939-y
This article is cited by
- Memetic robots. Nature Machine Intelligence (2024).