Abstract
Developing robotic hands that adapt to real-world dynamics remains a fundamental challenge in robotics and machine intelligence. Despite notable advances in replicating human-hand kinematics and control algorithms, robotic systems still struggle to match human capabilities in dynamic environments, primarily due to inadequate tactile feedback. To bridge this gap, we present F-TAC Hand, a biomimetic hand featuring high-resolution tactile sensing (0.1-mm spatial resolution) across 70% of its surface area. Through optimized hand design, we overcome traditional challenges in integrating high-resolution tactile sensors while preserving the full range of motion. The hand, powered by our generative algorithm that synthesizes human-like hand configurations, demonstrates robust grasping capabilities in dynamic real-world conditions. Extensive evaluation across 600 real-world trials demonstrates that this tactile-embodied system significantly outperforms non-tactile-informed alternatives in complex manipulation tasks (P < 0.0001). These results provide empirical evidence for the critical role of rich tactile embodiment in developing advanced robotic intelligence, offering promising perspectives on the relationship between physical sensing capabilities and intelligent behaviour.
Main
Precise sensory–motor control in real-world scenarios is fundamental to machine intelligence and embodied artificial intelligence (AI)1,2. A hallmark challenge in this field is the control of dextrous robotic hands3. Despite advances in mechatronic systems and sophisticated finger designs that enable enhanced dexterity4, the limited availability of rich sensory feedback fundamentally restricts their ability to adapt during dynamic interactions5,6. Understanding and addressing this sensory limitation is crucial for deploying robotic hands in real-world scenarios that demand nuanced control and rapid adaptation.
The robotics community has long recognized this challenge, approaching it through increasingly sophisticated hardware and control strategies. On the hardware front, researchers have developed intricate mechanical designs that closely mimic human-hand kinematics7,8,9,10,11, primarily relying on proprioceptive sensing for joint-level feedback. These hardware advances, often combined with visual perception, have enabled various control paradigms: from planning-based methods that execute precise finger gaiting12,13 to learning-based approaches that develop control policies through training14,15,16 and recently to large language models that provide high-level task reasoning17. However, a fundamental limitation persists: without the direct sensation of local contacts—crucial information for both modelling and control—these systems fail to handle unexpected physical interactions5.
The solution may lie in understanding human hand control, which achieves remarkably precise control through a sophisticated tactile perception system. This biological system comprises two key elements: a dense array of tactile sensors embedded throughout the skin18,19,20 and specialized neural processing in the primary somatosensory cortex that rapidly interprets and integrates this massive sensory input20,21,22. This combination enables humans to instantly detect and respond to subtle contact changes during manipulation, a capability that current robotic systems have yet to replicate.
Drawing direct inspiration from this biological architecture, we present F-TAC Hand (Full-Hand Tactile-Embedded Biomimetic Hand), a system that bridges the sensory gap in robotic manipulation. The core innovation lies in its comprehensive tactile sensing capability, featuring high-resolution coverage (0.1-mm spatial resolution) across 70% of the hand surface. This is achieved through the effective integration of 17 vision-based tactile sensors in six optimized configurations, where sensor covers serve dual purposes as both sensing elements and structural components. The hand maintains full human-like dexterity, demonstrated by its high Kapandji score23 and ability to perform all 33 human grasp types24. Complementing this hardware, we developed a generative algorithm that produces human-like hand configurations, creating a rich knowledge base for object interaction. The integration enables closed-loop tactile-informed control that processes high-dimensional contact data for precise, adaptive manipulation.
To rigorously validate F-TAC Hand’s capabilities, we focused on multi-object grasping—a task that epitomizes the challenges of dextrous manipulation5,25. While single-object manipulation has been successfully addressed by one-degree-of-freedom (1-DoF) parallel grippers26,27,28, simultaneous manipulation of multiple objects presents two distinct challenges: it requires both precise contact detection across the entire hand and strategic motion adjustments to prevent object collisions. Through comprehensive tactile sensing, F-TAC Hand directly addresses these challenges. Extensive evaluation across 600 real-world trials demonstrates significant performance improvements over non-tactile alternatives (P < 0.0001), particularly in scenarios involving real-world execution noise and dynamic object interactions.
Our work advances the field through two primary contributions: a practical demonstration that full-hand tactile sensing can be achieved without compromising hand motion capabilities, and comprehensive empirical validation of its benefits. By solving the technical challenges that previously restricted tactile sensing to simple grippers, this research enables unprecedented investigations into sophisticated tactile-embodied intelligence6. More broadly, our results provide concrete evidence for the critical role of rich sensory feedback in intelligent behaviour, suggesting promising directions for developing embodied AI systems beyond purely computational approaches29,30.
Results
F-TAC Hand hardware
F-TAC Hand advances the state of dextrous robotic hands through its comprehensive tactile sensing capabilities while maintaining a full range of motion. The hand achieves human-like tactile coverage, with sensing elements extending across 70% of the palmar surface at a density of 10,000 taxels (tactile sensing elements, analogous to the pixels of a camera sensor) cm−2 (Fig. 1), notably surpassing current commercial solutions such as Shadow Hand, which provides only five-point feedback over less than 20% of its surface11 (comparison with other tactile arrays is available in Supplementary Section 1). This extensive coverage is achieved through an array of vision-based tactile sensors in multiple configurations (see exploded view in Extended Data Fig. 1a and physical dimensions in Supplementary Section 1), featuring specially designed covers that align with the hand’s phalanges and palm to minimize mechanical redundancy while replicating the natural kinematic structure of the human hand (Fig. 2a). Each of the five fingers incorporates three DoFs, contributing to the hand’s total 15-DoF configuration that enables human-like dexterity. A specialized electronic module enables large-scale sensor reading acquisition while minimizing space, weight and cabling requirements (Extended Data Fig. 1b). The hand’s dimensions mirror those of an adult human hand, measuring 194 mm from wrist to middle fingertip (Fig. 2a), and its modular design architecture allows for easy adaptation to different physical dimensions while maintaining functionality.
F-TAC Hand is a dextrous robotic hand featuring a high-density tactile sensing array that matches human capabilities, as benchmarked against physiological data from ref. 19. Detailed illustrations of the sensor construction and assembly process are provided in Fig. 2 and Extended Data Fig. 1. Like its biological counterpart, it leverages sophisticated tactile feedback to accomplish complex manipulation tasks, such as precise in-hand object pose arrangement, which enables the simultaneous, stable grasping of multiple items, a capability highlighted as challenging but crucial in ref. 5.
a, The seamless integration of 17 vision-based sensors in six configurations, maintaining 15 DoFs—three per finger—and adult hand dimensions. Each sensor includes a streamlined camera module for efficient tactile data acquisition in confined space. PCB, printed circuit board. b, F-TAC Hand demonstrates its strength by holding a 2.5-kg dumbbell; each phalanx contributes to a total grasping force of 10.3 N. c, Schematic representation of a finger, with Kn, θn and F denoting joint stiffness, rotation angle and cable force, respectively. Offsets in rotation due to cable and joint alignment are also shown. d, Top-down-view comparison of F-TAC Hand and human finger flexion. e, Despite the numerous sensors, F-TAC Hand retains its mobility, as evidenced by a successful Kapandji test23, where the thumb fingertip sequentially touches specific points on the hand as numbered in the figure. PP, proximal phalanx; MP, middle phalanx; DIP, distal interphalangeal; PIP, proximal interphalangeal; MCP, metacarpophalangeal.
Building upon its extensive tactile sensing coverage, F-TAC Hand also achieves comprehensive motion capabilities that match state-of-the-art dextrous hands9,10,11. The hand implements full mobility using just five slim cables (Extended Data Fig. 1c) with substantial payload capacity (Fig. 2b). Each cable controls the flexion and extension of a finger (Extended Data Fig. 1d,e), working in concert with stiffness-tuned springs at each joint (Fig. 2c) to replicate the coordinated yet semi-independent movements characteristic of human hands31 (Fig. 2d). An additional degree of actuation enables thumb opposition, expanding the hand’s motion versatility (Extended Data Fig. 1e). Detailed fabrication procedures are provided in ‘Tactile sensor fabrication’ and ‘F-TAC Hand fabrication’ in Methods. The hand’s dexterity is demonstrated through two evaluations: the Kapandji test23, completing all ten designated thumb-to-hand contact points shown in Fig. 2e, and the successful execution of all 33 human grasp types (Fig. 3).
The workspace afforded by this design enables F-TAC Hand to perform all 33 human grasp types, as documented in ref. 24.
The hand’s tactile sensing system utilizes the photometric stereo principle32,33, converting light intensity variations into surface gradient information (Fig. 4a). Contact surface geometry is reconstructed through a two-stage process. First, an array of encoder–decoder neural networks (Fig. 4b) maps physics-based relationships between surface gradients and intensity variations for each sensor. Next, a Poisson solver generates high-fidelity surface geometries, visualized as normal maps (Fig. 4c). The detailed sensor characteristics are available in Supplementary Section 1.
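As an illustration of the second stage, the gradient-to-geometry integration can be sketched with an FFT-based Poisson solver (the classic Frankot–Chellappa method). The function below is a generic stand-in under periodic-boundary assumptions, not the paper's implementation:

```python
import numpy as np

def integrate_gradients(gx, gy):
    """Recover a height map from a surface-gradient field with an
    FFT-based Poisson solver (Frankot-Chellappa). Illustrative
    stand-in for the Poisson step: assumes periodic boundaries."""
    h, w = gx.shape
    fx = np.fft.fftfreq(w).reshape(1, w)   # horizontal spatial frequencies
    fy = np.fft.fftfreq(h).reshape(h, 1)   # vertical spatial frequencies
    gx_f, gy_f = np.fft.fft2(gx), np.fft.fft2(gy)
    dx, dy = 2j * np.pi * fx, 2j * np.pi * fy  # Fourier derivative operators
    denom = dx ** 2 + dy ** 2
    denom[0, 0] = 1.0  # avoid 0/0 at the DC term (absolute depth is lost anyway)
    z_f = (dx * gx_f + dy * gy_f) / denom
    z = np.real(np.fft.ifft2(z_f))
    return z - z.mean()  # height is recovered only up to a constant
```

Because the solver works in the frequency domain, it integrates the full 240 × 240 gradient field of one sensor in a single pair of FFTs, which is what makes per-frame reconstruction across 17 sensors tractable.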
a–d, Raw tactile sensor readings from the configuration shown in Fig. 1a (a) are processed by neural networks (b) to reconstruct contact site geometries (c), visualized as normal maps. The neural networks, trained on simulated data (d) generated by a physics-based image formation model, enable efficient and precise mapping of extensive raw data to geometric information at the contact interface. e,f, When grasping an object (e), F-TAC Hand captures detailed contact information through its advanced tactile sensing capabilities (f). g, This rich tactile feedback enables F-TAC Hand to accurately perceive and interpret object characteristics, as demonstrated by its precise estimation of in-hand object poses.
The unprecedented scale of F-TAC Hand’s tactile sensing system required development of efficient calibration solutions. We addressed this through a physics-based image formation model (detailed in Supplementary Sections 2 and 3) that generates synthetic readings of elastomer deformations during contacts (Fig. 4d). This approach enables efficient neural network training (Fig. 4b) and accurate sensor calibration.
The integration of fine-grained tactile sensing with robust motion capabilities enables F-TAC Hand to effectively grasp diverse objects, including challenging cases such as crystal balls (Fig. 4e), while simultaneously capturing detailed contact information (Fig. 4f). These sensory data enable accurate object pose estimation during manipulation (Fig. 4g). Additional demonstrations are provided in Supplementary Video.
Through this combination of dense tactile arrays and advanced motor capabilities, F-TAC Hand achieves unprecedented biomimetic fidelity, advancing both robotic manipulation capabilities and our understanding of human manual dexterity.
Powering F-TAC Hand with human-like diverse grasping
While F-TAC Hand’s high articulation enables sophisticated manipulation, it presents unique challenges in grasp planning. The increased number of DoFs makes traditional mechanical equation-based methods computationally intractable34. Learning-based alternatives35, though avoiding complex analytical solutions, require extensive training data that are both costly to collect and potentially biased by human demonstration preferences—a particular challenge for highly articulated dextrous hands.
We model the robotic grasp generation of rigid objects as sampling hand poses from a Gibbs distribution conditioned on object geometry. Each grasp is associated with an energy value derived from force closure criteria, which evaluates how well the grasp can resist external forces. Lower energy values indicate better grasping capability; see ‘Probabilistic formulation for grasp generation’ in Methods and Supplementary Section 4 for details. Due to the hand’s high number of DoFs and the non-convex nature of the problem36, we sample grasps from random initializations and apply a modified Metropolis-adjusted Langevin algorithm to reduce energy and escape local minima, converging to low-energy, high-quality grasps; see ‘Exploration algorithm for complex energy landscape’ in Methods for details. We validated the approach using a diverse test set of 23 objects, including spheres, cylinders, cuboids and irregular shapes (Fig. 5a). By executing the algorithm from various initializations, F-TAC Hand’s biomimetic kinematics (Extended Data Fig. 2) and varied object geometries result in diverse grasping poses (Supplementary Video).
a, A total of 23 objects, varying in dimensions and geometrical complexity, are chosen to assess the efficacy of the grasp generation method. b–d, The landscape of possible grasps is depicted as a disconnectivity graph, generated using the ADELM algorithm. e, For the pliers, the diversity in grasp strategies mirrors its common usage in daily human activities. f, This diversity in grasping strategies is maintained even when the object possesses complex geometric features. g, Comprehensive categorization of the generated grasps for all 23 objects, on the basis of human grasp types adopted from ref. 24. The results collectively cover all 19 common grasp types, implying the human-like diversity of the generated grasping strategies.
The resulting grasping poses are analysed through the Attraction–Diffusion Energy Landscape Mapping (ADELM) algorithm37 to visualize the complex energy landscape defined in equation (2), as shown in Fig. 5b–d. In this visualization, circles represent local minima (areas of low energy), where each local minimum contains at least one feasible grasp. The circle’s size indicates how many similar grasps exist within that local minimum. The circles are colour-coded according to the majority grasp type on the basis of ref. 24: power, precision and intermediate. The vertical connecting bars between circles represent energy barriers between different local minima, where shorter bars indicate easier transitions between grasping poses, while longer bars signify transitions that are more difficult to achieve. Direct comparisons between generated grasps and human demonstrations (Fig. 5b–d) in the boxes below validate the human-like nature of our solutions. This approach maintains its effectiveness even for challenging cases such as pliers and adversarial objects38 (Fig. 5e,f), with comprehensive energy landscapes presented in Extended Data Fig. 3.
To quantitatively assess the human-like diversity of our approach, we analysed 1,800 generated grasps according to the taxonomy of ref. 24, categorizing them into 19 common grasp types (Supplementary Section 5). The resulting distribution (Fig. 5g) demonstrates comprehensive coverage across the human grasp repertoire, from frequent strategies such as power sphere and precision sphere to specialized configurations such as distal type and palmar grasps.
Further analysis using contact maps39 reveals natural clustering patterns that align with human grasp classifications. By applying dimensionality reduction through principal component analysis and visualization via t-distributed stochastic neighbour embedding (Extended Data Fig. 4), we observe distinct groupings of power and precision grasps, with intermediate grasps appropriately positioned near the boundary defined by a radial-basis-function-kernel support vector classifier. This distribution mirrors human grasp categorization patterns, where intermediate grasps share characteristics of both primary types (computational details in Supplementary Section 6).
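The PCA stage of this analysis can be sketched in plain NumPy (an illustrative stand-in; the t-SNE embedding and RBF-kernel support vector boundary would typically follow with a library such as scikit-learn):

```python
import numpy as np

def pca_reduce(contact_maps, k=2):
    """Project flattened contact maps (one row per grasp) onto their
    top-k principal components via SVD. NumPy stand-in for the PCA
    stage only; function name and interface are our choices."""
    X = np.asarray(contact_maps, dtype=float)
    Xc = X - X.mean(axis=0)                # centre each feature
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T                   # (n_grasps, k) scores
```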
The demonstrated ability to generate diverse, human-like hand configurations provides F-TAC Hand with both optimal and near-optimal control strategies. This algorithmic foundation, working in concert with the low-level controller, enables enhanced dexterity and adaptability in real-world manipulation scenarios.
Adaptive behaviours of F-TAC Hand
The integration of advanced tactile sensing with diverse grasping strategies enables F-TAC Hand to implement a closed-loop sensory–motor feedback mechanism, allowing real-time adaptation to environmental changes. The implementation details are illustrated in Extended Data Fig. 7 and described in ‘Context-sensitive motor controls’ in Methods.
We demonstrate F-TAC Hand’s capabilities through multi-object grasping5, a critical benchmark for hand dexterity that surpasses the limitations of 1-DoF parallel grippers. This challenging task demands precise contact detection and strategic adjustments to avoid collisions—capabilities that remain elusive for current AI systems5. While recent advances13,25 show promise, managing the stochastic nature of real-world objects, especially those with complex geometries, remains challenging. F-TAC Hand overcomes these limitations through precise contact-point identification (Fig. 6a).
a, F-TAC Hand’s ability to grasp multiple objects simultaneously in a human-like manner. b, Despite execution noise, F-TAC Hand can optimize object transport through multi-object grasping. c, Real-world disturbances significantly increase second-grasp collision rates (inset: collision detection point in red), highlighting the necessity of tactile monitoring. d, Tactile-informed adaptation significantly improves second-grasp success rates. Plots c and d show 60 independent technical replicates (blue dots), each representing a unique object combination tested across 10 trials. Box plots show 25th–75th percentiles (box), median (centre line) and minimum/maximum values (whiskers). Shaded areas (orange in c, green in d) show kernel-density estimations of data distribution. The identical P values (2.1 × 10−17) displayed in the two panels derive from one-sided paired t-tests (t(59) = 11.8) comparing perfect versus real-world execution (c) and with versus without tactile-informed adaptation (d). No adjustments were made for multiple comparisons.
To evaluate real-world performance, we mounted F-TAC Hand on a Kinova Gen3 robotic arm for multi-object transport tasks (Fig. 6b). The goal is to grasp as many objects as possible in a single pass to maximize transport efficiency. While optimal strategies exist under ideal conditions (red route in Fig. 6b), real-world variables—such as imperfect robot positioning and object perception—require adaptive motor control. The other coloured routes in Fig. 6b illustrate actual scenarios where F-TAC Hand encountered unexpected object positions but leveraged its comprehensive tactile sensing capabilities (Fig. 4g) to assess situations and dynamically switch to alternative strategies that accommodate available space, even if theoretically suboptimal. Additional demonstrations of adaptive behaviours, including responses to finger impairments and ball size adaptation, are shown in Extended Data Fig. 5 and Supplementary Video.
We quantified the impact of tactile sensing through extensive experiments involving 60 object combinations across 600 real-world trials. Initial grasps were programmed from a disembodied AI perspective, using theoretically optimal strategies without considering environmental dynamics. Each combination underwent ten real-world trials, with tactile feedback assessing in-hand object positions and potential collision risks. Collision detection involves two steps: using tactile information to estimate the grasped object’s pose, as shown in Fig. 4g, and then checking whether the grasped object’s geometry intersects that of the next target object. The observed collision rate in real-world execution (M = 0.465, s.d. = 0.306) differed significantly from theoretical predictions (M = 0.000, s.d. = 0.000), highlighting the substantial gap between simulation and reality, t(59) = 11.8, P = 2.1 × 10−17 (Fig. 6c).
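The geometric step of the check can be sketched as follows, with spheres standing in for the estimated in-hand object and the next target (function names and the safety margin are our illustrative choices, not the paper's implementation):

```python
import numpy as np

def sphere_sdf(points, centre, radius):
    """Signed distance from query points to a sphere (negative = inside)."""
    return np.linalg.norm(points - centre, axis=-1) - radius

def second_grasp_collides(held_centre, held_radius, target_surface_pts,
                          margin=0.005):
    """Flag a collision if any surface sample of the next target object
    penetrates (or comes within `margin` metres of) the held object.
    Spheres and the 5-mm margin are illustrative stand-ins; the real
    check uses the tactile pose estimate and full object geometry."""
    d = sphere_sdf(target_surface_pts, held_centre, held_radius)
    return bool(np.any(d < margin))
```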
F-TAC Hand’s adaptive capabilities become particularly evident when comparing tactile-informed versus non-tactile-informed control. Upon detecting collision risks, the system rapidly (~100 ms) switches to alternative strategies that might be suboptimal in theory but practical in reality. In scenarios with potential collisions, the non-tactile-informed approach inevitably fails, while the tactile-informed approach maintains productivity through adaptive replanning. The tactile-informed approach achieved perfect adaptation (M = 1.000, s.d. = 0.000) compared with significantly lower success rates without tactile feedback (M = 0.535, s.d. = 0.306), t(59) = 11.8, P = 2.1 × 10−17 (Fig. 6d). Notably, in collision-free scenarios, the two approaches demonstrate comparable execution times, with tactile-informed collision checking adding only ~1 s of processing time, indicating that tactile sensing provides critical robustness with minimal computational overhead during normal operations. The detailed method and the logic chain are available in ‘Context-sensitive motor controls’ in Methods.
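For reference, the reported statistic is a one-sided paired t-test; the sketch below computes the t statistic in plain NumPy (illustrative only, with synthetic inputs in place of the paper's per-trial data; converting t to a P value needs the Student-t survival function, for example scipy.stats.t.sf):

```python
import math
import numpy as np

def paired_t_one_sided(a, b):
    """One-sided paired t statistic for H1: mean(a - b) > 0.
    Returns (t, dof). Illustrative stand-in; the paper's trial data
    are not reproduced here."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / math.sqrt(n))
    return t, n - 1
```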
Discussion
F-TAC Hand represents a substantial advance in robotic sensory–motor integration, achieving unprecedented integration of comprehensive tactile sensing with human-like dexterity. Its high-density tactile coverage (70% of palmar surface, 10,000 taxels cm−2) substantially exceeds current robotic hand capabilities. This exceptional sensing is achieved through the effective integration of vision-based tactile sensors, physics-based calibration methods and specialized electronics—all while maintaining full motion capabilities.
Recent advances in tactile sensing33,40,41,42,43,44,45,46,47,48 have primarily focused on parallel grippers. While these sensor-equipped grippers demonstrate enhanced capabilities in specific tasks—such as cable following26, surface following27 and articulated object manipulation28—their low-DoF mechanical structure fundamentally limits their dexterity for complex manipulation.
In contrast, F-TAC Hand’s integration of comprehensive tactile feedback with high articulation enables more sophisticated manipulation, as demonstrated by its successful multi-object grasping under uncertain conditions. The closed-loop sensory–motor feedback enables context-sensitive adaptations, significantly improving performance in dynamic real-world scenarios. This combination of sensing and adaptability is essential for practical robotics applications requiring safe and efficient environmental interaction.
The design philosophy behind F-TAC Hand emphasizes replicability, aiming to catalyse broader research in tactile-enabled manipulation. Its achievement of human-like capabilities opens possibilities in prosthetics, teleoperation, collaborative robotics and human–robot interaction. The hardware’s compact, modular architecture facilitates efficient data acquisition and calibration while being adaptable to various robotic platforms. The training-free stochastic optimization approach for grasp generation remains platform independent, enabling rapid deployment across different hand designs (Extended Data Fig. 6). While our current implementation assumes known object geometry, this was a deliberate scope decision to focus on tactile-informed adaptive control rather than geometry reconstruction. Pre-grasping geometry acquisition could be readily integrated using existing vision techniques49,50, and real-time reconstruction during manipulation represents a promising direction for future work. This combination of diverse grasping capabilities and environmental adaptability makes F-TAC Hand particularly suited for complex manipulation tasks.
Beyond technical achievements, our results suggest that practical AI requires tight integration between sensory processing and strategic adaptation. The demonstrated importance of comprehensive tactile feedback in achieving human-like dexterity aligns with cognitive and neuroscientific perspectives that emphasize the essential role of physical interaction in intelligence51,52,53,54.
Methods
Tactile sensor fabrication
The tactile sensor design for F-TAC Hand’s distal phalanx (Extended Data Fig. 1a) addresses key challenges in miniaturization and integration. A custom camera module using a single flexible flat cable for both power and data transmission resolves traditional cabling constraints. The sensor housing’s U-shaped clevis and tang structure enables the interconnection necessary for anthropomorphic articulation.
Contact detection relies on analysing elastomer surface deformation through reflected light intensity. To achieve uniform illumination in the confined phalanx space, we developed a specialized Lambertian membrane. This membrane combines an air-brushed spherical aluminium film (mill-resistant matte oil with 1-μm spherical aluminium powder) with a clear silicone base (Smooth-On Solaris parts A&B, 1:1 ratio). The illumination system comprises surface-mounted Lumileds LUXEON 2835 Color Line light-emitting diodes (red, green, blue and white) arranged around an acrylic support, enhanced by light-diffuser films. An OV2640 image sensor with a 160° wide-angle lens provides colour-compatible imaging, while 7-mm × 7-mm × 4-mm heat sinks ensure thermal stability.
The complete sensing system architecture (Extended Data Fig. 1b) integrates these components with a custom control module. The module interfaces with cameras through digital video ports, maintaining 240-px × 240-px image buffers. Spatial resolution of 0.1 mm per pixel is achieved, verified through known-object calibration. An expanded Serial Peripheral Interface bus coordinates sequential camera captures, with USB connectivity for PC data transmission and U2D2 protocol for servo control.
The tactile components are integrated into anatomically scaled phalanx covers matching adult hand dimensions. This modular, single-cable design overcomes traditional challenges in implementing high-resolution, extensive tactile sensing in robotic hands.
F-TAC Hand fabrication
F-TAC Hand’s structure (Extended Data Fig. 1c) integrates 17 compact vision-based sensors in six configurations to achieve human-hand proportions. The four fingers—index, middle, ring and little—share a common architecture (Extended Data Fig. 1d) with three serial revolute joints: metacarpophalangeal, proximal interphalangeal and distal interphalangeal, each offering 0–90° range. These joints utilize aluminium shafts supported by deep-groove ball bearings, with screw-bushing fixation and torsion springs maintaining a 0° rest position.
The thumb design (Extended Data Fig. 1e) features an additional carpometacarpal joint DoF, enabling 90° motion range with a 45° offset from its proximal interphalangeal joint axis. The two-part palm base facilitates assembly and houses compact tactile sensors, with the upper region incorporating dual cameras in a single sensor for enhanced perception (Extended Data Fig. 1c).
Finger actuation employs a cable-driven mechanism, with a single cable routed along both sides of each finger’s phalanges, converging at the palm base. Torsion springs facilitate a return to rest position upon cable relaxation. Each finger is powered by a DYNAMIXEL XC330X-T288-T servo motor. For experimental validation, F-TAC Hand mounts onto a 7-DoF Kinova Gen3 manipulator.
Probabilistic formulation for grasp generation
The generation of grasp configurations for multifingered robotic hands presents notable challenges, particularly when maximizing dextrous capabilities. Instead of relying on data-driven approaches that demand extensive annotated datasets, we formulate grasp generation as a Gibbs distribution sampling problem:

$$P(H\mid O)=\frac{1}{Z}\exp\bigl(-E(H,O)\bigr),\qquad(1)$$
where H = (T, q) represents the hand’s pose and joint configurations, O denotes the target object, E(H, O) defines the grasping energy function and Z is the intractable normalizing constant. The hand’s surface geometry S(H) is computed through forward kinematics.
This energy function combines two weighted components—grasp quality energy $E_{\rm grasp}$ and physical plausibility energy $E_{\rm phy}$:

$$E(H,O)=E_{\rm grasp}(H,O)+\lambda\,E_{\rm phy}(H,O),\qquad(2)$$

where $\lambda$ weights the physical plausibility term.
To assess the quality of the grasp, we use force closure criteria to define $E_{\rm grasp}(H,O)$:

$$E_{\rm grasp}(H,O)=FC({\bf x},O),\qquad(3)$$

where ${\bf x}=\{x_i\}$ represents frictional contact points on $S(H)$, and $FC({\bf x},O)$ assesses force closure formation on the object.
$E_{\rm phy}$ enforces physical constraints by penalizing hand–object penetration and joint limit violations:

$$E_{\rm phy}(H,O)=\sum_{v\in S(H)}\max\bigl(-d_O^{\rm SDF}(v),\,0\bigr)+\sum_{j=1}^{J}\Bigl[\max\bigl(q_j-q_j^{\max},\,0\bigr)+\max\bigl(q_j^{\min}-q_j,\,0\bigr)\Bigr],\qquad(4)$$

where $d_O^{\rm SDF}(v)$ defines the signed distance function from point $v$ to object $O$, and $[q_j^{\min},q_j^{\max}]$ specifies joint limits for each of the $J$ joints.
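A minimal sketch of such a penalty, assuming simple hinge (linear) costs for penetration and joint-limit violations (the exact weighting used in the paper may differ):

```python
import numpy as np

def e_phy(sdf_values, q, q_min, q_max):
    """Hinge-style plausibility penalty: hand-surface points with a
    negative signed distance to the object (penetration) and joint
    angles outside [q_min, q_max] each incur a linear cost. One common
    form consistent with the text; weights are omitted for clarity."""
    penetration = np.maximum(-np.asarray(sdf_values, float), 0.0).sum()
    q = np.asarray(q, float)
    limit_violation = (np.maximum(q - np.asarray(q_max, float), 0.0)
                       + np.maximum(np.asarray(q_min, float) - q, 0.0)).sum()
    return penetration + limit_violation
```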
This probabilistic formulation enables scalable generation of diverse, effective grasp configurations.
Exploration algorithm for complex energy landscape
The nonlinearity of hand kinematics and contact-point selection creates a complex energy landscape for E, making naive gradient-based sampling prone to suboptimal local minima. We address this through a modified Metropolis-adjusted Langevin algorithm that alternates between contact-point sampling and gradient-based pose optimization.
The algorithm initializes with random hand pose H and contact points x ⊂ S(H). Through L iterations, it updates H and x to maximize P(H, O). Each iteration stochastically chooses between updating the hand pose via Langevin dynamics or replacing a contact point with a uniform sample from the hand surface. Updates undergo Metropolis–Hastings acceptance criteria, favouring lower-energy configurations.
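The pose-update branch of one such iteration can be sketched as follows, with toy `energy` and `grad` callables standing in for E(H, O) and its gradient (the contact-point resampling branch is omitted; names and step size are our illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def mala_step(h, energy, grad, step=0.1):
    """One Metropolis-adjusted Langevin update of pose parameters h
    under a target density proportional to exp(-energy(h))."""
    noise = rng.normal(size=h.shape)
    prop = h - step * grad(h) + np.sqrt(2.0 * step) * noise

    def log_q(a, b):
        # log transition density of proposing a from b (up to a constant)
        return -np.sum((a - b + step * grad(b)) ** 2) / (4.0 * step)

    # Metropolis-Hastings correction keeps exp(-energy) invariant
    log_alpha = (energy(h) - energy(prop)) + (log_q(h, prop) - log_q(prop, h))
    return prop if np.log(rng.uniform()) < log_alpha else h
```

Run on a toy quadratic energy, the chain converges to samples from the corresponding Gaussian; in the full system the accepted states are low-energy hand configurations.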
This combination of stochastic updates enables escape from local minima, while Metropolis acceptance guides sampling toward low-energy configurations. An algorithm efficiency analysis is detailed in Supplementary Section 7.
Context-sensitive motor controls
Extended Data Fig. 7 demonstrates adaptive control in a four-ball transport scenario, where ball repositories are weighted and combined by volume. Initially, at t1, F-TAC Hand plans to grasp a golf ball and softball using its little finger and remaining digits. To illustrate the control mechanism, we introduce a manual perturbation during golf-ball acquisition, causing F-TAC Hand to secure the golf ball with its index finger at t2. The occupation of the index finger invalidates the planned softball grasp (light grey in Extended Data Fig. 7), necessitating a strategy revision. Through comprehensive tactile sensing, F-TAC Hand detects the situation and adapts by executing an alternative approach—grasping a yoga ball using its thumb, index and middle fingers. While this solution was initially considered suboptimal, it demonstrates the system’s capacity for real-time adaptation to unexpected conditions.
Reporting summary
Further information on research design is available in the Nature Portfolio Reporting Summary linked to this article.
Data availability
The data that support the findings of this study are available from Zenodo55 (https://doi.org/10.5281/zenodo.15193164).
Code availability
The code used for performing grasp synthesis and training calibration models is available from Zenodo55 (https://doi.org/10.5281/zenodo.15193164).
References
Brooks, R. A. Intelligence without representation. Artif. Intell. 47, 139–159 (1991).
Segal, M. A more human approach to artificial intelligence. Nature 571, S18 (2019).
Billard, A. G. In good hands: a case for improving robotic dexterity. Science 386, eadu2950 (2024).
Ma, R. R. & Dollar, A. M. On dexterity and dexterous manipulation. In Proc. IEEE International Conference on Robotics and Automation (IEEE, 2011).
Billard, A. & Kragic, D. Trends and challenges in robot manipulation. Science 364, eaat8414 (2019).
Lepora, N. F. The future lies in a pair of tactile hands. Sci. Robot. 9, eadq1501 (2024).
Jacobsen, S. C., Wood, J. E., Knutti, D. F. & Biggers, K. B. The UTAH/M.I.T. dextrous hand: work in progress. Int. J. Robot. Res. 3, 21–50 (1984).
Deimel, R. & Brock, O. A novel type of compliant and underactuated robotic hand for dexterous grasping. Int. J. Robot. Res. 35, 161–185 (2016).
Hughes, J. A. E., Maiolino, P. & Iida, F. An anthropomorphic soft skeleton hand exploiting conditional models for piano playing. Sci. Robot. 3, eaau3098 (2018).
De Pascali, C., Naselli, G. A., Palagi, S., Scharff, R. B. N. & Mazzolai, B. 3D-printed biomimetic artificial muscles using soft actuators that contract and elongate. Sci. Robot. 7, eabn4155 (2022).
Dexterous Hand Series (Shadow Robot, accessed 15 December 2024); https://www.shadowrobot.com/dexterous-hand-series/
Morgan, A. S., Hang, K., Wen, B., Bekris, K. & Dollar, A. M. Complex in-hand manipulation via compliance-enabled finger gaiting and multi-modal planning. IEEE Robot. Autom. Lett. 7, 4821–4828 (2022).
Li, Y. et al. Grasp multiple objects with one hand. IEEE Robot. Autom. Lett. 9, 4027–4034 (2024).
OpenAI: Andrychowicz, M. et al. Learning dexterous in-hand manipulation. Int. J. Robot. Res. 39, 3–20 (2020).
Qin, Y. et al. DexMV: imitation learning for dexterous manipulation from human videos. In Proc. European Conference on Computer Vision 570–587 (Springer, 2022).
Chen, T. et al. Visual dexterity: in-hand reorientation of novel and complex object shapes. Sci. Robot. 8, eadc9244 (2023).
Ma, Y. J. et al. Eureka: human-level reward design via coding large language models. In Proc. International Conference on Learning Representations (ICLR, 2024).
Westling, G. & Johansson, R. S. Factors influencing the force control during precision grip. Exp. Brain Res. 53, 277–284 (1984).
Vallbo, A. B. et al. Properties of cutaneous mechanoreceptors in the human hand related to touch sensation. Hum. Neurobiol. 3, 3–14 (1984).
Johansson, R. S. & Flanagan, J. R. Coding and use of tactile signals from the fingertips in object manipulation tasks. Nat. Rev. Neurosci. 10, 345–359 (2009).
Penfield, W. & Boldrey, E. Somatic motor and sensory representation in the cerebral cortex of man as studied by electrical stimulation. Brain 60, 389–443 (1937).
Kaas, J. H., Nelson, R. J., Sur, M., Lin, C.-S. & Merzenich, M. M. Multiple representations of the body within the primary somatosensory cortex of primates. Science 204, 521–523 (1979).
Kapandji, A. Clinical test of apposition and counter-apposition of the thumb. Ann. Chir. Main 5, 67–73 (1986).
Feix, T., Romero, J., Schmiedmayer, H.-B., Dollar, A. M. & Kragic, D. The grasp taxonomy of human grasp types. IEEE Trans. Hum. Mach. Syst. 46, 66–77 (2015).
Yao, K. & Billard, A. Exploiting kinematic redundancy for robotic grasping of multiple objects. IEEE Trans. Robot. 39, 1982–2002 (2023).
She, Y. et al. Cable manipulation with a tactile-reactive gripper. Int. J. Robot. Res. 40, 1385–1401 (2021).
Lloyd, J. & Lepora, N. F. Pose-and-shear-based tactile servoing. Int. J. Robot. Res. 43, 1024–1055 (2024).
Zhao, Z. et al. Tac-Man: tactile-informed prior-free manipulation of articulated objects. IEEE Trans. Robot. 41, 538–557 (2024).
Turing, A. M. Computing machinery and intelligence. Mind 59, 433–460 (1950).
Mitchell, M. Debates on the nature of artificial general intelligence. Science 383, eado7069 (2024).
Sancho-Bru, J. L., Perez-Gonzalez, A., Vergara-Monedero, M. & Giurintano, D. A 3-D dynamic model of human finger for studying free movements. J. Biomech. 34, 1491–1500 (2001).
Woodham, R. J. Photometric method for determining surface orientation from multiple images. Opt. Eng. 19, 139–144 (1980).
Yuan, W., Dong, S. & Adelson, E. H. GelSight: high-resolution robot tactile sensors for estimating geometry and force. Sensors 17, 2762 (2017).
Siciliano, B., Khatib, O. & Kröger, T. Springer Handbook of Robotics Vol. 200 (Springer, 2008).
Ichnowski, J., Avigal, Y., Satish, V. & Goldberg, K. Deep learning can accelerate grasp-optimized motion planning. Sci. Robot. 5, eabd7710 (2020).
Liu, T., Liu, Z., Jiao, Z., Zhu, Y. & Zhu, S.-C. Synthesizing diverse and physically stable grasps with arbitrary hand structures using differentiable force closure estimator. IEEE Robot. Autom. Lett. 7, 470–477 (2021).
Hill, M., Nijkamp, E. & Zhu, S.-C. Building a telescope to look into high-dimensional image spaces. Q. Appl. Math. 77, 269–321 (2019).
Mahler, J. et al. Dex-Net 2.0: deep learning to plan robust grasps with synthetic point clouds and analytic grasp metrics. In Proc. Robotics: Science and Systems (RSS, 2017).
Li, P. et al. GenDexGrasp: generalizable dexterous grasping. In Proc. IEEE International Conference on Robotics and Automation (IEEE, 2023).
Yousef, H., Boukallel, M. & Althoefer, K. Tactile sensing for dexterous in-hand manipulation in robotics—a review. Sens. Actuators A 167, 171–187 (2011).
Fishel, J. A. & Loeb, G. E. Sensing tactile microvibrations with the BioTac—comparison with human sensitivity. In Proc. International Conference on Biomedical Robotics and Biomechatronics (IEEE, 2012).
Kim, J. et al. Stretchable silicon nanoribbon electronics for skin prosthesis. Nat. Commun. 5, 5747 (2014).
Liu, H. et al. Finger contact sensing and the application in dexterous hand manipulation. Auton. Robots 39, 25–41 (2015).
Ward-Cherrier, B. et al. The TacTip family: soft optical tactile sensors with 3D-printed biomimetic morphologies. Soft Robot. 5, 216–227 (2018).
Li, W. et al. F-TOUCH sensor: concurrent geometry perception and multi-axis force measurement. Sensors 21, 4300–4309 (2020).
Sun, H., Kuchenbecker, K. J. & Martius, G. A soft thumb-sized vision-based sensor with accurate all-round force perception. Nat. Mach. Intell. 4, 135–145 (2022).
Li, W. et al. L3 F-TOUCH: a wireless GelSight with decoupled tactile and three-axis force sensing. IEEE Robot. Autom. Lett. 8, 5148–5155 (2023).
Li, W. et al. MiniTac: an ultra-compact 8 mm vision-based tactile sensor for enhanced palpation in robot-assisted minimally invasive surgery. IEEE Robot. Autom. Lett. 9, 11170–11177 (2024).
Curless, B. & Levoy, M. A volumetric method for building complex models from range images. In Proc. Annual Conference on Computer Graphics and Interactive Techniques, 303–312 (ACM, 1996).
Wang, P. et al. NeuS: learning neural implicit surfaces by volume rendering for multi-view reconstruction. In Proc. Advances in Neural Information Processing Systems (NeurIPS, 2021).
Simon, H. A. Rational choice and the structure of the environment. Psychol. Rev. 63, 129 (1956).
Merleau-Ponty, M. Phenomenology of Perception (Routledge, 1962).
Varela, F. J., Thompson, E. & Rosch, E. The Embodied Mind, Revised Edition: Cognitive Science and Human Experience (MIT Press, 2017).
Vong, W. K., Wang, W., Orhan, A. E. & Lake, B. M. Grounded language acquisition through the eyes and ears of a single child. Science 383, 504–511 (2024).
Zhao, Z. et al. Embedding high-resolution touch across robotic hands enables adaptive human-like grasping: data and codes. Zenodo https://doi.org/10.5281/zenodo.15193164 (2025).
Ezgripper (SAKE Robotics, accessed 15 December 2024); https://sakerobotics.com/
Barrett BH8-282 3-Fingered Gripper (Barrett Technology, accessed 15 December 2024); https://advanced.barrett.com/barretthand
Allegro Hand (Wonik Robotics, accessed 15 December 2024); https://www.wonikrobotics.com/research-robot-hand
Acknowledgements
We thank Z. Chen (BIGAI) and Q. Gao (PKU) for their work on figure design. We are grateful to H. Liang (BIGAI) for his assistance with mechanical design, to M. Toszeghi (QMUL) for his proofreading efforts, to W. Yuan (UIUC), B. Dai (PKU, BIGAI) and Y. Su (BIGAI) for engaging in discussions, to Z. Qi (THU) for his assistance with grasping classification data, to Y. Niu (PKU, BIGAI) for his dedication in coding the Kinova driver, to L. Ruan (UCLA) for his assistance in voiceover, and to Y. Yang (PKU) and Y. Wang (PKU) for their contributions to the Shadow Hand hardware, including the video setup by Q. Wang (PKU). We acknowledge the support from J. Cui (BIGAI, PKU), Y. Ma (BIGAI), Y. Wu (BIGAI, PKU) and M. Han (UCLA) in creating portions of Supplementary Video. Special thanks are due to W. Zhang and L. Li from the 301 Hospital for their professional expertise in human-hand tactile sensing, and to R. Zhang and NVIDIA for their generous graphics processing units and hardware support. We are grateful to Offbeat Ripple Studio for their expertise and collaboration in producing Supplementary Video. Finally, we extend our gratitude to the National Comprehensive Experimental Base for Governance of Intelligent Society, Wuhan East Lake High-Tech Development Zone, for their generous support. This work is supported in part by the National Science and Technology Major Project (2022ZD0114900; S.-C.Z. and Y.Z.), the National Natural Science Foundation of China (62376031; Y.Z.) and the Beijing Nova Program (20230484487; Y.Z.).
Author information
Contributions
Z.Z.: building the hardware, devising control algorithms, coding, designing studies, running the F-TAC Hand adaptive behaviour study, analysing data and writing. W.L.: building tactile sensors, coding, running the F-TAC Hand adaptive behaviour study, analysing data and writing. Y.L.: devising grasp synthesis algorithms, analysing data, running the diverse grasp generation study, organizing data annotation, coding and writing. T.L.: devising grasp synthesis algorithms, coding and writing. B.L.: devising tactile sensor calibration algorithms, coding, analysing data and writing. M.W.: building the PCB and writing. K.D.: brainstorming ideas. H.L: conceiving and directing the research, and writing. Y.Z.: brainstorming ideas, writing, directing the research and providing the environment and funding support for conducting this research. Q.W.: funding support for conducting this research. K.A.: brainstorming ideas, directing the work on robotic tactile sensing and writing. S.-C.Z.: providing the environment and funding support for conducting this research.
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Machine Intelligence thanks Changhao Xu and Kuanming Yao for their contribution to the peer review of this work.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Extended data
Extended Data Fig. 1 Mechatronic design of the F-TAC Hand.
a, Exploded view of a vision-based tactile sensor as a distal phalanx. b, Electrical components and system scheme. c, Schematic of the F-TAC Hand assembly and cable-driven mechanism. d, Finger model with mechanical components. e, Thumb model with mechanical components.
Extended Data Fig. 2 Kinematic model of the F-TAC Hand.
We adopt the modified Denavit–Hartenberg (DH) convention to establish the coordinates for the palm base and finger phalanxes. The transformations between these coordinates are represented in DH tables. In these tables, ai−1 is the distance along Xi−1 between Zi−1 and Zi, αi−1 is the angle about Xi−1 between Zi−1 and Zi, di is the distance along Zi between Xi−1 and Xi, and θi is the angle about Zi between Xi−1 and Xi.
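Under the modified DH convention, the four parameters above compose into a single homogeneous transform between consecutive frames, Ti−1→i = RotX(αi−1)·TransX(ai−1)·RotZ(θi)·TransZ(di). The sketch below shows this composition; the joint values fed to the chain are placeholders for illustration, not the F-TAC Hand's actual DH table.

```python
import numpy as np

def modified_dh(a_prev, alpha_prev, d, theta):
    """Homogeneous transform from frame i-1 to frame i under the
    modified (Craig) DH convention."""
    ca, sa = np.cos(alpha_prev), np.sin(alpha_prev)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct,       -st,       0.0,  a_prev],
        [st * ca,   ct * ca, -sa,  -d * sa],
        [st * sa,   ct * sa,  ca,   d * ca],
        [0.0,       0.0,      0.0,  1.0],
    ])

# Chain per-joint transforms down one finger by matrix multiplication;
# the (a, alpha, d, theta) tuples here are placeholder values.
T = np.eye(4)
for a, alpha, d, theta in [(0.0, 0.0, 0.0, 0.3), (0.04, 0.0, 0.0, 0.5)]:
    T = T @ modified_dh(a, alpha, d, theta)
```

Multiplying the per-phalanx matrices in order yields the pose of the fingertip frame in the palm-base frame, which is how a DH table encodes the full finger kinematics.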
Extended Data Fig. 3 Multi-object landscape.
We examine the grasp relationships among six example objects (pawn, vase, multimeter, board eraser, coffee bottle and Coke can) using a large disconnectivity graph. This landscape comprises 79 basins, each categorized into one of three grasp types (Power, Intermediate, Precision) based on the majority of grasps it contains. Within each basin, similar grasp strategies are observed across different objects, as illustrated in a–f.
Extended Data Fig. 4 Extended results of the grasp generation algorithm.
Visualization of grasp samples with t-SNE reveals that most Power grasps and Precision grasps are clustered separately, with Intermediate grasps lying in between. This map indicates a strong alignment between the generated results and human definitions of grasp types.
Extended Data Fig. 5 More adaptive behaviours by the F-TAC Hand.
F-TAC Hand's stable grasping with some fingers disabled (shown in light grey), mirroring human compensation for finger injuries.
Extended Data Fig. 6 Extension to other hand topologies.
Our algorithm generalizes to various hand types without requiring specific mechanical structures or training samples. a, Eight objects are used for testing with four different hands: b, two-finger EZGripper56; c, Barrett 3-fingered Gripper57; d, four-finger Allegro Hand58; and e, anthropomorphic Shadow Hand11.
Extended Data Fig. 7 Illustration of methods to realize human-like dexterous grasping.
F-TAC Hand employs a two-stage strategy for multi-object transportation. It adjusts for in-hand position variations due to perturbations, dynamically adapting its second-stage strategy to prevent collisions and maximize efficiency (light grey in the grasping repository indicates grasping strategies rendered infeasible at the current time).
Supplementary information
Supplementary Information
Supplementary Figs. 1–4, Tables 1–4 and Sections 1–7.
Supplementary Video 1
F-TAC Hand introduction.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Zhao, Z., Li, W., Li, Y. et al. Embedding high-resolution touch across robotic hands enables adaptive human-like grasping. Nat Mach Intell 7, 889–900 (2025). https://doi.org/10.1038/s42256-025-01053-3
This article is cited by
- A comprehensive review of piezoelectric BaTiO3-based polymer composites for smart tactile sensing. Emergent Materials (2025).