Review

Anthropomorphic Soft Hand: Dexterity, Sensing, and Machine Learning

by Yang Wang 1, Tianze Hao 2,3, Yibo Liu 1, Huaping Xiao 1,4,*, Shuhai Liu 1,4 and Hongwu Zhu 1
1 College of Mechanical and Transportation Engineering, China University of Petroleum-Beijing, Beijing 102249, China
2 Tianjin Key Laboratory for Advanced Mechatronic System Design and Intelligent Control, School of Mechanical Engineering, Tianjin University of Technology, Tianjin 300384, China
3 National Demonstration Center for Experimental Mechanical and Electrical Engineering Education, School of Mechanical Engineering, Tianjin University of Technology, Tianjin 300384, China
4 Center of Advanced Oil and Gas Equipment, China University of Petroleum-Beijing, Beijing 102249, China
* Author to whom correspondence should be addressed.
Actuators 2024, 13(3), 84; https://doi.org/10.3390/act13030084
Submission received: 1 January 2024 / Revised: 28 January 2024 / Accepted: 8 February 2024 / Published: 21 February 2024
(This article belongs to the Section Actuators for Robotics)

Abstract

Humans possess dexterous hands that surpass those of other animals, enabling them to perform intricate, complex movements. Soft hands, known for their inherent flexibility, aim to replicate the functionality of human hands. This article provides an overview of the development processes and key directions in soft hand evolution. Starting from basic multi-finger grippers, these hands have made significant advancements in the field of robotics. By mimicking the shape, structure, and functionality of human hands, soft hands can partially replicate human-like movements, offering adaptability and operability during grasping tasks. In addition to mimicking human hand structure, advancements in flexible sensor technology enable soft hands to exhibit touch and perceptual capabilities similar to humans, enhancing their performance in complex tasks. Furthermore, integrating machine learning techniques has significantly promoted the advancement of soft hands, making it possible for them to intelligently adapt to a variety of environments and tasks. It is anticipated that these soft hands, designed to mimic human dexterity, will become a focal point in robotic hand development. They hold significant application potential for industrial flexible gripping solutions, medical rehabilitation, household services, and other domains, offering broad market prospects.

1. Introduction

Over millions of years of evolutionary development, humans have developed remarkably dexterous hands that are unmatched by those of other animals. The complex structure of the human hand, encompassing an assemblage of bones, ligaments, and muscles, collaborates synergistically to combine robustness and delicate control, enabling both forceful and intricate maneuvers. In robotics, robots are developed to either replace or assist humans in executing specific tasks and operations. Robotic hands and end effectors are crucial components of robotic systems that directly interact with the external environment and are responsible for executing various actions, grasping, and manipulation tasks.
To enhance the flexibility of mechanical hands, traditional rigid robotic hands incorporate numerous motors, linkages, gears, and springs to achieve the desired functionality. However, this approach significantly increases the complexity of the structure. Although rigid humanoid hands are precise and responsive, they can cause irreparable damage when handling fragile objects [1,2,3]. Additionally, the potential weight of rigid structures can pose an injury risk during human–robot interactions. In contrast, anthropomorphic soft hands leverage the inherent flexibility of their materials, offering remarkable advantages, such as compliance and high resistance to external impact, compression, torsion, and collisions [4,5,6]. These benefits stemming from their “soft” characteristics have prompted increased research in this area.
Soft hands have received more attention due to their innate compliance, permitting interaction with the environment via techniques akin to those of natural organisms. The inherent compliance reduces the stringent requirements for complex and precise kinematic modeling and high-resolution sensor feedback, simplifying traditional grasping problems encountered by rigid robotic hands. The grasp of soft hands can be effectively controlled by adjusting the input pressure. The passive adaptability and compliance in their structure significantly simplify grasp planning problems [7,8]. Therefore, soft hands are a promising choice for human-centered machine grasping tasks, offering performance benefits such as interaction safety, grasp reliability, and cost-effectiveness.
The recreation of humanoid dexterous hands is an ongoing pursuit in the field of robotics. By imitating the shape, structure, and functionality of human hands, soft humanoid dexterous hands can partially achieve human-like movements, providing broad adaptability and operability in grasping and hand manipulation [9,10]. It is foreseeable that soft humanoid dexterous hands will become a focal point in the development of robotic hands, finding increasingly widespread applications and market prospects in areas such as flexible grasping solutions in industrial production [11,12], medical rehabilitation [13,14,15,16], and home services [17,18].
However, there are several differences between human and soft hands. First, soft hands are still unable to match the flexibility and adaptability of human hands. Human hands possess exceptional perception and adjustment abilities, adapting to various shapes and environments, while soft hands face challenges when dealing with objects in complex three-dimensional (3D) spaces [19,20]. Second, the control systems of soft hands are relatively complex, potentially requiring advanced algorithms and sensing technologies to achieve fine manipulation similar to human hands [21,22]. The high coordination and flexibility exhibited by human hands during various tasks may necessitate further research and technological innovation for soft hands, while durability and stability present additional challenges [23,24], requiring attention.
This review aims to provide an overview of the main directions in soft hand development, address the challenges and current issues in the research of soft humanoid dexterous hands, and explore potential future trends. The crucial factors that determine whether a soft hand can match the capabilities of a human hand are its gripping/manipulation performance and perception capability. Therefore, this review primarily focuses on the evolution of gripping/manipulation performance and advancements in tactile perception of soft hands. Additionally, the inclusion of machine learning in soft robots is discussed, as it enables them to possess more sophisticated perception abilities and improved gripping/manipulation strategies, allowing them to intelligently adapt to various environments and tasks. Since the materials, manufacturing processes, and driving methods for soft hands have been widely investigated [25,26,27,28,29], these aspects are not extensively discussed in this review. Popular materials, manufacturing methodologies, driving forces, and control policies used in soft hands are listed in Table 1.

2. Dexterity and Gripping/Manipulation Performance

The grasping and manipulation capabilities of the human hand are highly complex and remarkable, showcasing the unique adaptability and flexibility developed throughout human evolution. First, the five fingers of the human hand possess independent motion capabilities, each with its own joint system that allows for individual flexion, extension, abduction, and adduction. This multi-joint structure grants the human hand exceptional dexterity, enabling the fingers to adapt to objects of various shapes and sizes (Figure 1a). The human hand also exhibits coordinated movements, allowing the fingers to work together to accomplish intricate manipulation tasks. The synergy of the fingers allows the human hand to perform precise grasping actions, such as delicately picking up small objects or adjusting the posture of larger objects. This coordination enables efficient object manipulation in various tasks. The flexibility of the human hand is driven by its highly complex biological structure and the remarkable neural control system, enabling independent, precise movements of the joints in a 3D space. The motion of these joints is intricately regulated by neural control systems, achieving highly coordinated and flexible control over the fingers. This section focuses on the dexterity and gripping/manipulation performance of different types of soft hands. The discussion categorizes these soft hands into three types based on their structure and functionality: non-anthropomorphic grippers, underactuated anthropomorphic hands, and anthropomorphic dexterous hands.

2.1. Nonanthropomorphic Grippers

As the end effector, a robot hand's complete functionality is essential in the field of automation, with grasping being the most easily achievable operation. Therefore, two-finger and multi-finger soft grippers are fundamental for simulating human hand movements.
Soft grippers can grasp and release objects while executing specific actions. Pneumatic grippers are the most commonly employed, deforming when inflated, which enables approaching and securely grasping an object. The goal of research on robots using two-finger grippers for grasping tasks is to build a grasping recognition system that is fast, accurate, and appropriate for use by robotic hands [50,51]. However, multi-finger grippers are often employed since two-finger grippers frequently find it difficult to successfully complete grasping tasks when dealing with objects of complex shapes. Due to their multi-finger design, they are capable of multi-contact grasping of target objects, improving the grasping success rate and reliability [59,60].
Two-finger and multi-finger grippers demonstrate efficiency and speed when repeatedly grasping the same type of object. Soft grippers have a lower design time and production cost than other actuators due to their comparatively simpler structure. Therefore, they are popular for simple tasks in industrial applications and include commercial products, such as Festo’s tentacle gripper and mGrip from Soft Robotics Inc. [61,62,63].
In this context, a significant emphasis has been placed on enhancing output force and actuation speed when developing novel grippers. To achieve these objectives, many studies have integrated bistable mechanisms into the soft gripper design [51,64], which are characterized by two stable equilibrium states, representing local minima in the total potential energy. The incorporation of bistable mechanisms reduces control complexity, enables fast motions, and promotes energy conservation.
However, the functionality of these grippers still suffers from a single grasping mode, enveloped grasping or pinch grasping, limiting their applications and reducing their reliability in grasping small objects. Even in dual grasping mode [50], these grippers are still limited to object grasping, lacking the capacity for intricate manipulation.

2.2. Underactuated Anthropomorphic Soft Hands

Two-finger and multi-finger grippers face challenges in adapting to situations where the objects to be grasped are constantly changing, necessitating the use of a dexterous hand. In this context, the term “dexterity” refers to the ability of the hand to exhibit a wide range of postures. A hand is considered more dexterous when it displays higher variability, increasing its grasping diversity and in-hand manipulation involving different grasping patterns. For instance, small objects can be precisely grasped with a few fingers, while larger objects may require an enveloping grip, and thin objects can be grasped via a thumb opposition grip [4,5,65].
To adapt to a wider range of object shapes and sizes, soft hands have been designed and developed, inspired by the shape of the human hand. The human hand can manipulate objects of various morphologies and materials and can tune the pose and position of objects (in-hand manipulation) with high dexterity in limited spaces. Constructing dexterous anthropomorphic hands capable of autonomously grasping and manipulating objects has been an important aim during robotic system design.
Represented by the RBO hand, these developed soft hands are primarily designed based on the principle of passive adaptation to the shape of the object [8,52]. Passive adaptability enables the hand to dynamically adjust its surface in response to contact forces, achieving a shape-matching effect with the object. Enhancing shape-matching improves the contact area between the hand and the object, eliminating the requirement for explicit sensing and control [15,66,67]. This not only increases resilience against uncertainties in hand position, finger control, and environmental factors, but also enables a passive adaptive hand to establish contact in any direction without sustaining damage [53]. Consequently, the compliant hand can effectively utilize the environment as a guide during grasping motions, further strengthening grip robustness. Therefore, passive adaptability is key in soft hand design to achieve successful grasping under uncertainty [5,68].
Traditionally, achieving dexterous grasping capabilities in robotic hands involved complex multi-joint structures and intricate actuation mechanisms. In addition to requiring sophisticated perception and control systems, these hands are expensive and challenging to design [69]. A significant direction in soft hand design is to simplify the system and improve robustness. Underactuated soft hands utilize reasonable structural designs to control hand movements with fewer degrees of freedom (DOFs) than finger joints, reducing the complexity of the entire hand system while improving reliability [70].
Unlike rigid dexterous hands, soft hands do not require additional actuators to achieve human-like bending motions. For instance, Feix et al. introduced a configuration featuring a single actuator per finger, enabling the soft hand to effectively accomplish 31 out of 33 grasp postures outlined in the widely recognized Feix taxonomy [71]. Similarly, the RBO 2 hand, with a mere 7 DOFs, successfully executes 31 out of 33 grasp postures from the Feix taxonomy [52]. Furthermore, Fras et al. proposed a biomimetic hand design where each finger possessed only one DOF yet was still capable of performing a diverse range of human-like gestures and effectively grasping various objects [67].

2.3. Dexterous Anthropomorphic Hands

Underactuated hands have shown robustness in grasping tasks, providing cost-effectiveness and system simplification. However, their operational capabilities are inherently limited due to their underactuated nature. This presents a trade-off between robustness and functionality in robotic hand design. A notable limitation of underactuated soft hands is their restricted dexterity, particularly in manipulating and repositioning objects [19,54]. Moreover, the majority of soft hands, limited by soft mechanical design and manufacturing technologies, typically only exhibit several DOFs [5,52,56]. Underactuated fingers constrain the robotic hand’s workspace, thus limiting its flexibility. This shortcoming hinders their broad application in complex, human-centric tasks. This is primarily attributed to the limited dexterity of soft actuators, which typically function as the fingers in soft hands. Most soft actuators have pre-defined motion trajectories, resulting in fixed trajectories for the fingers and overlapping motion limited to a singular point [72,73,74]. This paradox has spurred research into specific operational processes and the design of more dexterous hands, which are intelligent, multipurpose mechanical structures designed for a variety of tasks [4,6,20,75].
The common workspace between fingers is critical for in-hand manipulation. Designing multi-DOF soft fingers is a viable solution to address this, as demonstrated by integrating multiple actuators into a single finger [5,19,76,77]. Yet, this integration increases finger size and weight. Therefore, develo** compact, lightweight multi-DOF soft fingers remains a valuable goal. Additionally, biomimetic dexterity, encompassing both appearance and kinematic functionality, is a key consideration in soft hand design, aiming for the effective handling of everyday objects.
Human-like designs with additional actuators can significantly enhance soft hand flexibility. Notable examples include the BCL-13 hand [20], the BCL-26 hand [5], and particularly the Blue hand [19], which has a total of 21 DOFs and is able to perform all 33 grasp types in the Feix taxonomy and pass all Kapandji tests for thumb dexterity. Another example is the dual-mode actuators, which enable fingers to execute both bending and twisting motions [66]. With ongoing advancements in soft robotics, soft humanoid dexterous hands are evolving towards more humanoid appearances and motion characteristics, as seen in the BCL-26 hand [5], the RBO Hand 3 [6], and others [30,75,78].
The thumb and thenar muscles play a vital role in hand dexterity (Figure 1b). The thumb is opposable, meaning it can move in opposition to the other fingers, allowing for precision grasping and manipulation of objects. This opposable movement is made possible by the coordinated action of the thenar muscles, which control the movements of the thumb. The thumb is the primary contributor to hand motion, achieving nearly 40% of overall hand movements [79]. The thenar muscles (Figure 1b) are significant for thumb movement and essential for various daily actions, such as grasping, gripping, pinching, clamping, twisting, and tying [4,80]. First, the thumb can oppose the other four fingers, which is a prerequisite for grasping objects [30]. Second, the thumb can simultaneously translate, rotate, and flex, which is an ability that other primate hands are incapable of [81]. During the object-grasping process, the thumb adapts its position based on the shape of the object. This necessitates the carpometacarpal (CMC) joint of the thumb to perform not only flexion and extension motions but also abduction and adduction movements. These various thumb motions enable different types of grasping and pinching actions [82].
Despite the importance of the thumb, its motion differs significantly from that of the other fingers, and research on the mechanical design of the thumb is limited [83]. Many modular designs still treat the thumb the same as the other fingers, merely placing it in an opposing configuration [84], severely limiting its functionality and affecting the grasping ability of the entire hand. Recently, studies have revealed a flexible thumb with an active thenar, improving the grasping ability of the soft hand [4]. However, practical considerations, such as space constraints for actuators, weight, and cost limitations, challenge the implementation of multiple DOFs in the thumb.
Previous research has often overlooked the functionality of the palm, focusing primarily on soft finger designs. The hand muscles facilitate palm flexibility, allowing it to bend and form a concave shape, which is essential for grasping objects. The three key arches, namely the longitudinal, distal transverse, and oblique, achieve dexterous palm motion (Figure 1b) [85]. Many soft, humanoid, dexterous hands replace the palm with rigid materials, lacking actuation and limiting their grasping and manipulation capabilities. To address this, researchers have incorporated flexible actuators into the palm, enabling active palm-like functions. For instance, Wang et al. proposed an active palm with pneumatic actuation to enhance hand dexterity [78,86], while the RBO Hand 2 and 3 feature activatable anthropomorphic palms [6,52]. Experimental results have demonstrated that the active palm in anthropomorphic hands is a key factor in improving the performance of thumb opposition and envelope grasping [75,87]. A comparison of the main features of various types of soft hands is presented in Table 1. Examples of nonanthropomorphic grippers, underactuated anthropomorphic hands, and anthropomorphic dexterous hands are presented in Figure 2.

2.4. Methods to Enhance Grasping and Manipulation Performance

To augment the functionality of soft humanoid hands, researchers have embarked on innovative design explorations aimed at expanding their application range while concurrently enhancing their flexibility. A predominant challenge in the realm of soft hands is their limited output force [92,93]. This constraint directly impacts their carrying capacity, narrowing the spectrum of objects they can effectively grasp and thereby curtailing their application breadth. To address this, the integration of variable stiffness elements into soft hand systems has been proposed [94,95]. Activating these elements increases the structural stiffness of the gripper, aiding in the lifting of heavier loads. These elements bolster the carrying capacity of soft hands without markedly compromising their compliance and adaptability when inactive. Particle jamming is the most popular approach for variable stiffness due to its safety and easy availability [96,97]. Other variable stiffness methodologies include interference-based methods [98], motor-based methods [99], variable modulus-based methods [100], electromagnetic field-based methods [101], and phase change material-based methods [102]. Examples of stiffness-based methods to adjust grasping and manipulation performance are presented in Figure 3.
Despite these improvements, the maximum carrying capacity of soft hands still lags behind that of conventional rigid grippers and human hands. For instance, the pneumatic soft hand integrated with shape memory polymers, as developed by Zhang et al., can lift a 1.5 kg dumbbell using three fingers—a notable achievement for a soft gripper, yet still trailing behind its rigid counterparts (Figure 3d) [103]. The incorporation of variable stiffness elements often introduces additional actuation methods and elongates response times.
In recent years, some researchers have explored the fusion of rigid and soft structures, employing collaborative mechanisms to synergize their respective advantages, thereby achieving superior overall performance [32]. These rigid–soft coupling designs have demonstrated significant promise. For example, the single-stable rigid–soft coupling gripper proposed by Tang et al. can securely grasp an egg and stably lift an 11.4 kg dumbbell [104]. Liu et al. developed flexible hybrid pneumatic actuators for soft hand fingers [32]. The soft humanoid hand exhibits satisfactory comprehensive performance, including fast response, substantial grasping force, affordability, lightweight construction, and ease of fabrication and repair. Nonetheless, these designs confront challenges, such as not fully capitalizing on the high output force of rigid structures and the flexibility of soft structures. Moreover, current rigid–soft coupling gripper designs often feature complex and bulky structures, which substantially limit their compliance. Furthermore, high stability is imperative for robotic grippers, as soft robotic grippers frequently undergo deformation or prolonged vibrations due to external forces like gravity or impact, potentially impairing their operational efficiency and precision.
Environments containing lubricants like water or oil can significantly impact the performance of soft humanoid hands. The frictional interaction in such scenarios is crucial for stable grasping. Human fingertip skin undergoes various degrees of stretching during object grasping, with ridge patterns bolstering grasping ability [105,106]. In wet environments, fingerprints enhance the grasping area, thereby improving the success rate [107,108]. This mechanism differs from the adhesive effect observed in tree frog toe pads or gecko claw setae, which operate at the nanoscale [109,110]. Human fingerprints are relatively macroscopic and primarily leverage frictional changes to augment grasping. Hao et al. indicated that fingerprint-like surface textures significantly enhanced the pinch-grasping ability of soft humanoid hands in water and oil lubrication conditions, surpassing the performance on smooth surfaces [111]. Additionally, applying a concentrated hyaluronic acid solution to the surfaces with fingerprint-like textures enabled the soft hand to grasp a variety of common medical instruments, marking a substantial improvement over smooth surfaces [112].

3. Tactile Perception

A dexterous anthropomorphic hand lacking a tactile sensing feedback mechanism falls short in comparison to the human hand, as it lacks the ability to autonomously interact with its surroundings. This deficiency hinders the ability to regulate contact forces accurately and execute precise manipulations [113]. Substantial efforts have been directed towards developing external perception capabilities for soft hands, with tactile sensing identified as crucial for safe, refined object grasping and manipulation [114,115].
Human interaction with the external world is largely mediated through tactile sensing via the skin, the body’s largest sensory organ, which plays a vital role in survival, exploration, and response. Similarly, tactile sensors are fundamental in robotics for achieving biomimetic sensing and intelligent interaction. These sensors convert tactile stimuli into electrical signals that computers can interpret, holding significant potential in human healthcare, biomimetic robotics, and human–machine interaction. Recent advancements in flexible electronics and nanotechnology have facilitated the development of tactile sensing technologies that emulate human skin characteristics [116,117]. Embedded sensors and electronic skin (e-skin) are the most applied strategies to provide tactile perception for soft hands. Several examples of soft hands with integrated perception devices are presented in Figure 4.

3.1. Embedded Sensors

A direct approach to equipping soft hands with perception capabilities involves the integration of sensors onto their surfaces. By incorporating sensors directly onto the soft hand’s exterior, it becomes possible to capture and interpret information about the surrounding environment. These sensors can be designed to detect various stimuli such as pressure, temperature, proximity, and even tactile feedback. For instance, tactile sensors have been integrated into soft actuators and hands for closed-loop control of grasping force [30,126], while novel soft sensors capable of detecting contact forces and object curvature during grasping have been proposed [127]. Stretchable optical waveguides were applied as embedded sensors in a soft hand system to detect curvature, elongation, and force during object interaction (Figure 4a) [118]. These optoelectronic strain sensors, characterized by ease of fabrication, chemical inertness, and low hysteresis, exhibited the ability to act as multi-modal sensors. Zhou et al. designed and developed a soft hand equipped with soft sensors using 3D printing for effective integration with myoelectric control systems (Figure 4d) [121].
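To make the closed-loop idea concrete, the sketch below shows a minimal proportional-integral loop that adjusts the actuation pressure of a pneumatic finger so that the measured fingertip force tracks a setpoint. It is an illustrative example only: the read_contact_force and set_pressure interfaces, the gains, and the pressure limits are hypothetical placeholders rather than the implementation of any cited hand.

```python
# Minimal sketch of closed-loop grasp-force control for a pneumatic soft finger.
# The sensor/valve interfaces (read_contact_force, set_pressure) are hypothetical
# placeholders, not an API from any cited work.

import time

def regulate_grasp_force(read_contact_force, set_pressure,
                         target_force=1.0, kp=0.8, ki=0.2,
                         p_min=0.0, p_max=80.0, dt=0.01, duration=5.0):
    """Drive the actuation pressure (kPa) so the fingertip force (N) tracks a setpoint."""
    integral = 0.0
    pressure = p_min
    t_end = time.time() + duration
    while time.time() < t_end:
        error = target_force - read_contact_force()     # force error in N
        integral += error * dt
        pressure = max(p_min, min(p_max, kp * error + ki * integral))
        set_pressure(pressure)                           # pressure command to the valve
        time.sleep(dt)
    return pressure
```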
Marasco et al. developed an artificial sense of touch in a prosthetic hand equipped with pressure sensors, providing sensory feedback via a force-tactile interface [128]. This interface stimulated sensory nerves redirected to the residual limb skin, applying proportional pressure. Additionally, e-skin can enhance prosthetics by providing sensory perception and interaction capabilities for amputees and individuals with nerve damage (Figure 4e) [21].
A novel dual-purpose design has been proposed for soft fingertips to enhance the performance of in-hand manipulation, presenting a unique approach for delicately manipulating soft objects without causing damage [129]. This innovative approach involves equipping robotic hands with soft fingertips that possess both tactile sensing and active shape-changing capabilities. By actively manipulating embedded air cavities, the fingertip demonstrates precise control over the in-hand manipulation of soft objects, aided by pressure feedback control. Additionally, a 3D-printed soft hand has been developed for prosthetic applications, incorporating flexible sensors that are compatible with advanced myoelectric control systems such as pattern recognition control and simultaneous proportional control [130]. By integrating flexible position sensors into the fingers, the hand can monitor finger positions to prevent self-collisions, enabling smoother and more intuitive transitions between gestures. The combination of various gestures empowers the hand to perform multi-stage grasping and manipulate multiple objects simultaneously.
For sophisticated manipulation, it is necessary to replicate the object pose estimation capability observed in human fingers, which remains a fundamental yet challenging task in robotics. Various methods for hand pose estimation (HPE) [131] and hand–object pose estimation (HOPE) [132,133] have been developed, mirroring human manipulation strategies. Inspired by proprioception, embedded cameras and deep learning architectures have been proposed for object recognition in soft fingers. Liu et al. presented a soft finger design that integrated inner vision and kinesthetic sensing to replicate the object pose estimation capabilities observed in human fingers [115]. The proposed framework offered a comprehensive solution that leveraged raw images captured by the soft fingers to estimate the pose of objects held in the hand, providing an end-to-end approach for accurate object pose estimation. The framework underwent testing with seven objects, achieving impressive results with a pose error of 2.02 mm, an angle error of 11.34 degrees, and a classification accuracy of 99.05%.
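As a rough illustration of such an end-to-end pipeline, the PyTorch sketch below maps a raw in-finger camera image to a 6-D object pose and an object class using two output heads trained jointly. The layer sizes, image resolution, and seven-class head are assumptions made for illustration and do not reproduce the network of [115].

```python
# Minimal PyTorch sketch: one shared image encoder with a pose-regression head
# (x, y, z, roll, pitch, yaw) and an object-classification head.

import torch
import torch.nn as nn

class FingerPoseNet(nn.Module):
    def __init__(self, num_classes: int = 7):
        super().__init__()
        self.backbone = nn.Sequential(                   # shared image encoder
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.pose_head = nn.Linear(64, 6)                # translation (mm) + rotation (deg)
        self.class_head = nn.Linear(64, num_classes)     # object identity logits

    def forward(self, img):
        feat = self.backbone(img)
        return self.pose_head(feat), self.class_head(feat)

# Joint training loss: pose regression plus object classification.
model = FingerPoseNet()
img = torch.randn(8, 3, 128, 128)                        # batch of in-finger images
pose_pred, logits = model(img)
loss = nn.functional.mse_loss(pose_pred, torch.zeros(8, 6)) \
     + nn.functional.cross_entropy(logits, torch.zeros(8, dtype=torch.long))
```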
While these solutions demonstrate the advantages of tactile feedback in soft grippers or fingers, they primarily provide aggregated contact information. Distributed sensor networks can offer spatial information on contact events for a more comprehensive understanding. He et al. proposed a soft hand featuring an active palm and tactile perception enabled by distributed pneumatic sensors integrated into both the palm and fingers [30]. Leveraging multi-material 3D printing enabled the direct printing of tactile sensors on the hand, contrasting with traditional tactile methods that necessitated separate attachment as part of multiple fabrication steps. The presented hand successfully executed 32 out of 33 grasp postures from the Feix taxonomy and all 11 Kapandji tests.

3.2. Electronic Skin (E-Skin)

The concept of electronic skin (e-skin) draws inspiration from the sensory and mechanical properties of human skin, with the goal of develo** electronic devices that can emulate these characteristics. E-skin is designed to replicate the functionalities of human skin, such as the ability to sense touch, temperature, pressure, and even humidity. The development of e-skin involves the integration of various components, including flexible and stretchable materials, sensors, and electronic circuits. These components work together to create a thin and flexible electronic layer that can be applied to the surface of objects or even directly onto the human body. E-skin, with capabilities comparable or superior to human skin, holds vast potential in fields such as intelligent robotics [134,135], wearable medical devices [136,137], and human–machine interaction [122,138]. Additionally, e-skin can enhance prosthetics by providing sensory perception and interaction capabilities for amputees and individuals with nerve damage [21].
Although various e-skins have been developed for monitoring pressure, recent innovations involve integrating individual tactile force sensor pixels into soft robots. Yamaguchi et al. developed a soft hand with integrated e-skin pressure sensors, focusing on incorporating strain effects into both the sensor and the overall mechanical system [116]. The e-skin demonstrated insensitivity to strain resulting from structural actuation bending and the ability to monitor tactile force distribution when the soft hand held an object.
Monitoring object sliding within the soft hand is helpful for grasping by providing feedback to adjust the gripping force and prevent slippage. While human hands naturally possess this capability, current robotic hands lack it because of the complexity of integrating various sensors. Successful integration can significantly improve the performance of soft hands in human–machine interaction applications. Two primary challenges arise: integrating sensors into high-strain areas without compromising dexterity and optimizing sensor sensitivity to match execution force and detectable threshold pressure during object contact. Soft hands integrated with e-skin can monitor tactile pressure, temperature, and sliding during grasping and manipulation without compromising robot softness [113]. In addition to force measurement, object sliding or slipping when held in a soft hand can be tracked by assessing the time delay of the tactile forces detected by the sensor array.
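A simple way to exploit that time delay is to cross-correlate the force traces of neighbouring taxels: a consistent non-zero lag of the correlation peak indicates that the contact is travelling across the array. The snippet below is a minimal sketch under assumed values for the sampling rate and lag threshold, not a published slip-detection algorithm.

```python
# Minimal sketch of slip detection from a tactile array: estimate the time delay
# between the force traces of two neighbouring taxels via cross-correlation.

import numpy as np

def detect_slip(taxel_a, taxel_b, fs=1000.0, lag_threshold_s=0.002):
    """taxel_a, taxel_b: 1-D force traces (N) sampled at fs Hz from adjacent taxels."""
    a = taxel_a - np.mean(taxel_a)
    b = taxel_b - np.mean(taxel_b)
    corr = np.correlate(a, b, mode="full")
    lag_samples = np.argmax(corr) - (len(a) - 1)   # offset of the correlation peak
    lag_s = lag_samples / fs
    return abs(lag_s) > lag_threshold_s, lag_s     # non-zero lag: contact moved between taxels

# Example: a force peak reaching taxel B 5 ms after taxel A suggests slipping.
t = np.arange(0, 0.2, 0.001)
pulse = np.exp(-((t - 0.05) / 0.01) ** 2)
slipping, lag = detect_slip(pulse, np.roll(pulse, 5))
```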
Advancements have produced e-skins capable of sensing temperature, pressure, vibration, and strain, similar to human skin [139]. Various biomimetic strategies have been employed, utilizing interlocked microstructures combined with resistive [140], piezoresistive [141], ferroelectric [142], triboelectric [143], or capacitive [144] sensor arrays. These multifunctional sensing systems improve sensitivity, reduce response time, and increase linearity to enhance the perception performance of soft hands. However, distinguishing different stimuli remains challenging since soft hands are exposed to various objects in different postures and loading conditions.

3.3. Multi-Dimensional Force Perception and Multi-Modal Sensing

Human skin can perceive tactile stimuli, distinguish between normal and shear forces, and discern the temperature, hardness, and roughness of touched objects [145]. Emulating this capability in humanoid dexterous hands is crucial for accurately identifying grasped objects using tactile feedback from sensors. Therefore, designing tactile sensors with 3D force perception and multi-modal sensing capabilities is significant for enhancing the intelligence of dexterous hands. Tactile sensing is essential for providing contact properties such as pressure, motion direction, location, and surface hardness/texture [146]. This is particularly important for anthropomorphic soft hands, where internal torque sensing may not always be feasible due to design or cost constraints. The real-time measurement and differentiation of force direction, typically normal and shear, are necessary for providing slip and surface property information, which cannot be obtained through traditional wrist force, torque, and proprioceptive sensors in current robots [147]. To achieve these advanced capabilities, array-based and multi-layer-based design strategies (Figure 5) are usually applied in conjunction with carefully selected functional materials.
Boutry et al. developed an e-skin capable of differentiating normal and shear forces by mimicking the mechanism of the ridges and mechanoreceptors in human skin (Figure 4b) [123]. The capacitor sensor array was sensitive to various stimuli, and the nature of unknown stimuli could be assessed by analyzing recorded signals against a known library of stimulus–response curves. This capability was realized via a 3D configuration that replicated the interconnected dermis–epidermis interface present in human skin. Nevertheless, these systems come at the expense of hand manipulation speed and controller performance. Qu et al. proposed the utilization of a flexible triboelectric tactile sensor (FTTS) designed in a shape inspired by human fingerprints (Figure 4i) [148]. This sensor incorporated eutectic gallium–indium (EGaIn) liquid metal and silicone materials, operating on the principle of a triboelectric nanogenerator. Using three independent fingerprint-like channels, the sensor could detect pressure intensity and position, simulate passwords, identify materials, and monitor pulses.
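The library-matching step described above can be approximated by a nearest-neighbour comparison between the recorded response curve and labelled reference curves. The following sketch uses a plain Euclidean distance and a two-entry library purely for illustration; the actual stimulus library and matching procedure of [123] are not reproduced here.

```python
# Minimal sketch: classify an unknown stimulus by comparing its recorded response
# curve against a library of labelled reference curves (nearest neighbour).

import numpy as np

def classify_stimulus(response, library):
    """response: 1-D array; library: dict mapping label -> reference 1-D array (same length)."""
    best_label, best_dist = None, np.inf
    for label, ref in library.items():
        dist = np.linalg.norm(response - ref)       # Euclidean distance between curves
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Illustrative two-entry library of stimulus-response curves.
library = {"normal_force": np.linspace(0, 1, 100),
           "shear_force": np.sqrt(np.linspace(0, 1, 100))}
label = classify_stimulus(np.linspace(0, 1, 100) + 0.01, library)   # -> "normal_force"
```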
Recent popular strategies for fabricating multi-modal tactile sensors include integrating various sensing units [150,151], designing sensing arrays to decouple different types of signals [149,152], developing novel materials able to simultaneously sense multiple stimuli [153,154], and designing novel structures [143,148]. Multi-modal sensors with decoupled sensing mechanisms allow a robotic hand to acquire more precise tactile information about the target object, facilitating dexterous object manipulation. They also benefit accurate real-time health monitoring, allowing for the separate measurement of different physiological parameters, such as body temperature and movement, improving the accuracy and safety of the human–machine interface. The mechanisms and output signals of flexible sensors with capabilities of multi-dimensional force perception or multi-modal sensing are summarized in Table 2. Future robots will require complex feedback regarding force, temperature, and touching surface properties for tasks trivial to humans, like grasping a glass or inserting a key into a lock, making the development of a multi-modal complex sensing system an inspiring and challenging research topic.
The advancement of flexible electronics and nanotechnology has driven the development of tactile sensing technology, enabling it to simulate the characteristics of human skin. Tactile sensors are crucial for robots to achieve biomimetic perception and intelligent interaction, as they convert tactile stimuli into signals that computers can interpret. This technology has the potential to play an important role in human healthcare, biomimetic robots, and human–computer interaction. Soft humanoid hands require tactile perception to ensure safe interaction with objects. By using integrated tactile sensing devices, soft hands can perceive and interpret tactile information of external objects to adjust contact forces and perform precise operations. This not only improves the interaction ability between the soft hand and the environment but also increases the flexibility and reliability of the robot in various tasks. Further development of tactile sensing technology will enhance the application potential of soft humanoid hands in fields such as medical rehabilitation, home services, and industrial flexible grasping.

4. Machine Learning in Soft Hands

The remarkable adaptability of the human hand in performing various tasks is a testament to its biomechanical superiority, which is further enhanced by the learning and memory mechanisms of the nervous system. The goal of the next generation of soft hands is to achieve the capability of accurately perceiving their surroundings and making correct decisions in response to stimuli. The advancements in machine learning have proven beneficial in increasing the accuracy of tactile perception and improving the decision-making strategies of soft hands during their interactions with different environments. Similarly, soft hands can benefit from incorporating machine learning techniques to address their inherent limitations, such as nonlinearity and hysteresis, which arise from structural compliance, material viscoelasticity [166,167], inconsistent outputs during loading and unloading cycles, and higher complexity due to factors like drift and high DOFs. These factors challenge the mathematical modeling and calibration of soft hands and grippers.
Machine learning, renowned for its efficacy in solving nonlinear problems across various fields [168,169], has been increasingly applied to soft robotics. Its applications extend to soft sensor calibration [170,171], soft actuator position control [172,173], and more intricate tasks like grasping and in-hand manipulation [69]. Research indicates that machine learning-based methods have successfully mitigated many of the current challenges faced by soft robotic hands. A schematic of an anthropomorphic soft hand interacting with the environment with machine learning models is presented in Figure 6.
Sensor calibration aims to accurately estimate physical stimuli, such as strain and stress, from soft sensor inputs, such as resistance and capacitance, using Artificial Neural Network (ANN), k-Nearest Neighbors (kNN), Recurrent Neural Network (RNN) [167,174], Convolutional Neural Network (CNN), Support Vector Machine (SVM), and Long Short-Term Memory (LSTM) models. These models are widely applied to evaluate grasping information, such as object recognition, grasping stability, and hand pose [175]. To address the hysteresis characteristics in output signals, both mathematical models and data-based methods have been used for optimization [176]. Data-based approaches often employ machine learning algorithms and convex optimization to fine-tune hysteresis model parameters [167]. Luo et al. designed a bioinspired soft sensor array (BOSSA) via theoretical and experimental investigations into the triboelectric effect and cascaded electrodes [177]. Using a data-driven algorithm, the multilayer perceptron, the BOSSA was capable of environmental self-awareness, with pressure- and material-sensing abilities. Deep learning, well-established in fields like computer vision and natural language processing [178], has also been applied to soft sensor research. For instance, Navarro et al. utilized feed-forward neural networks and transfer learning to calibrate soft pneumatic mechanosensors, comparing results with those from numerical methods like the finite element method [179].
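As an illustration of recurrent calibration, the PyTorch sketch below trains an LSTM to map a time window of raw sensor channels (e.g., resistance and capacitance) to the underlying strain, letting the recurrent state absorb loading/unloading hysteresis. The channel count, hidden size, and window length are arbitrary assumptions rather than settings from the cited studies.

```python
# Minimal PyTorch sketch of learning-based soft-sensor calibration with an LSTM:
# raw reading sequences in, estimated physical stimulus (strain) out.

import torch
import torch.nn as nn

class SensorCalibrator(nn.Module):
    def __init__(self, n_channels=2, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)              # estimated strain per time step

    def forward(self, raw_seq):                      # raw_seq: (batch, time, channels)
        h, _ = self.lstm(raw_seq)
        return self.out(h).squeeze(-1)               # (batch, time)

model = SensorCalibrator()
raw = torch.randn(16, 200, 2)                        # e.g., 16 loading/unloading cycles
strain_true = torch.randn(16, 200)                   # ground-truth strain from a test rig
loss = nn.functional.mse_loss(model(raw), strain_true)
loss.backward()
```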
Inputs from sensors are processed by machine learning models, usually the feed-forward neural network (FNN), RNN, and CNN, to estimate pose/position and to generate control strategies for the proprioception and motion control of soft grippers and hands [22,180]. Machine learning has also been useful in soft robotics for generating grasping poses for unknown objects [181,182]. Demonstration learning has been combined with reinforcement learning to transfer grasping capabilities from human operators to robotic systems [183]. Dexterous robotic hands with multiple fingers are capable of an extensive range of actions, and their morphological resemblance to the human hand presents significant potential for expediting robot learning. Mandikal et al. introduced an innovative method for acquiring proficiency in robotic grasping by leveraging human–object interactions available in openly accessible videos [184]. Using deep reinforcement learning, it can easily scale to new objects without collecting human demonstrations. Learning-based techniques have also facilitated the transfer of human-operated soft hand response strategies to robots for grasping new items in human–robot handover scenarios [185]. These advancements highlight the synergy between machine learning methods and the inherent adaptability of soft hands to unknown environments.
A deep learning-based method has been developed to address the challenge of predicting whether a grasp will be successful in soft hands [186]. This framework utilizes two neural architectures: a classifier for the a posteriori detection of failure events and a predictor that uses readings from Inertial Measurement Units (IMUs) to estimate object sliding. By leveraging these neural architectures, the framework can effectively anticipate grasp failures in soft hands before they occur. Santina et al. proposed a strategy that mimics human motion to achieve autonomous soft hand grasping [182]. This approach employs a deep neural network classifier to analyze the visual information of the object to be grasped. Based on this analysis, the classifier predicts the action that a human would likely perform to achieve the desired goal. The predicted action is then used to select one of several human-inspired primitives, combining anticipatory actions with touch-based reactive grasping. Research on machine learning related to anthropomorphic soft hands is shown in Table 3.
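A grasp-failure predictor of this kind can be sketched as a small classifier over a short window of IMU readings from the fingers, outputting whether the grasp is stable or the object is beginning to slide. The network below is a deliberately simple stand-in with assumed channel counts and window length; it does not reproduce the architectures of [186].

```python
# Minimal sketch of a grasp-failure predictor: classify a window of IMU data
# (accelerometer + gyroscope channels) as "stable" or "object sliding".

import torch
import torch.nn as nn

class SlidePredictor(nn.Module):
    def __init__(self, n_channels=6, window=50):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                            # (batch, channels * window)
            nn.Linear(n_channels * window, 64), nn.ReLU(),
            nn.Linear(64, 2),                        # logits: [stable, sliding]
        )

    def forward(self, imu_window):                   # imu_window: (batch, channels, window)
        return self.net(imu_window)

model = SlidePredictor()
imu = torch.randn(32, 6, 50)                         # 50-sample IMU windows from the fingers
labels = torch.randint(0, 2, (32,))                  # 0 = stable, 1 = sliding
loss = nn.functional.cross_entropy(model(imu), labels)
```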
This section demonstrates how soft hands can partially replicate human-like movements in grasping tasks by imitating the shape, structure, and function of human hands, providing flexibility and operability. These soft hands, designed to mimic human dexterity, are expected to become the focus of robot hand development, with broad market prospects in industrial flexible gripping solutions, medical rehabilitation, home services, and other fields. Future advancements may stem from further sensor and actuator integration, aiming to develop soft hands that can rival the capabilities of biological organisms.

5. Outlook

5.1. Potential Applications

In the field of healthcare, applications of the anthropomorphic soft hand are extensive. For example, in surgery, soft robotic arms can cooperate with doctors to perform minimally invasive procedures, allowing for more precise operations and reducing harm to patients. In addition, soft robotic arms can be applied in rehabilitation assistive devices to assist patients in self-care training or to provide more intelligent assistive tools for disabled individuals, improving their quality of life.
In industrial production, anthropomorphic soft hands can collaborate with human workers to complete the assembly of small and fragile parts. This type of soft robotic arm has strong adaptability for meticulous operations, which is of great help in the production of electronic components, medical devices, and other products that require careful handling. Beyond production, soft robotic arms can also be applied in hazardous environments, such as radioactive areas or chemical processing, to avoid direct human operation in high-risk environments.
In the field of personal assistance, the anthropomorphic soft hand also has broad application prospects. For example, it can assist elderly people in their daily lives, performing simple and common actions such as picking up items and wiping the table. In addition, soft robotic arms can also be designed to take care of people with disabilities, helping them complete self-care activities and even engaging in intelligent communication and interaction.
Overall, the application of the anthropomorphic soft hand in fields such as healthcare, industry, and personal assistance will have a profound impact in practical scenarios. It will improve work efficiency, reduce human errors, promote a safer working environment, and enhance quality of life in healthcare and personal assistance. The development of soft robotic arms will strongly promote the advancement of robotics technology and become one of its key technologies in the future.

5.2. Challenges and Future Directions

As the end effector, robotic hands determine essential functions for the robotic system, like grasping and in-hand manipulation. Soft, dexterous hands adapt actively or passively to environmental changes, compensating for rigid hand limitations. These grippers and hands excel in adaptability and interaction, executing tasks gently and securely, even with delicate items, in unstructured settings. The goal of robotic gripper design is to replicate human-inspired dexterity for autonomous object manipulation. Yet, despite the advantages of soft, human-like, dexterous hands, they still present significant disparities and limitations compared to their human counterparts. Therefore, addressing these challenges during continuous soft-hand development is a priority.
Most underactuated soft grippers and hands typically exhibit a single grasping mode when handling objects of varying sizes. The dependability of grasping diverse types of objects, particularly small ones, is frequently limited by the mismatched or insufficient contact area. In addition, underactuated hands often lack sensing capabilities, which means their anti-interference ability during the grasping process is a concern.
Dexterous soft hands with multiple joints and 20 or more DOFs require complex mechanical structures, presenting a significant challenge for design because of the compact, narrow digit space, especially for the thumb. This scenario is exacerbated by integrating a considerable number of sensing units into the main configuration. Researchers face challenges in the industrialization and commercialization of dexterous soft hands due to the trade-off between their overall performance and the time and cost of design.
Compared to rigid and human hands, the load capacity of current soft hands is relatively low, hindering more extensive application. Although development in material science provides potential solutions to this problem, further investigation into improved variable stiffness methodologies to enhance grasping performance is necessary. In addition, adding friction layers with specially designed microstructures may also contribute to stable grasping in various environments.
Environmental perception plays a pivotal role in manipulation tasks. Several tactile sensors have been developed for specific parameters, demonstrating high performance comparable to or even surpassing that of human skin. Nevertheless, creating tactile sensors that encompass all the properties of human skin remains a formidable challenge. The development of enhanced decoupling mechanisms and methods is vital for multi-modal sensors to ensure the production of unaltered signals and the accurate restoration of stimuli.
Machine learning has been employed to capture human gestures/poses and control soft hands to achieve object grasping and in-hand manipulation. While numerous promising results have been reported, several challenges persist, including the proper training of general machine learning models and addressing the nonlinearities inherent in soft systems. Overcoming hurdles, such as the need for a substantial amount of data, unexpected error sources, and the necessity for real-time measurements and controls, is crucial for further advancements in this field. To complete tasks in unknown or complex environments, a soft robotic arm requires a high level of perception ability and intelligent decision-making systems. However, embedding highly integrated sensing systems and intelligent algorithms into soft materials to achieve the environmental perception, data processing, and decision execution functions of robotic arms is a complex engineering task. In particular, the ability to process large amounts of sensor data in real time and respond rapidly and accurately is currently the bottleneck of technological development.
Furthermore, a commonly overlooked issue is the fabrication errors associated with soft hands. During soft hand fabrication, 3D printing is commonly used for casting. However, the fabrication accuracy falls short when compared to rigid counterparts. These manufacturing flaws significantly reduce the deformation, bending, and output force stability of anthropomorphic soft hands. Because soft robotic arms collaborate closely with human workers, they must offer extremely high safety performance to avoid harm to people or objects during operation. However, ensuring the reliability and low failure rate of soft robotic arms, especially when they experience frequent deformation or contact with different objects and surfaces, remains a challenge in technological development. Wear, aging, and maintenance strategies for soft robotic arms are also issues that need to be addressed.
The activities of soft robotic arms typically require external energy sources, such as pneumatic or hydraulic systems, which are often bulky and inconvenient to carry. This greatly limits the application of soft robotic arms in situations where there is no fixed energy supply point. Meanwhile, the energy conversion efficiency of soft robotic arms is low when undergoing multiple bending and stretching movements, which may lead to excessive energy consumption, thereby limiting their continuous working time and practicality.
With the continuous progress of robot technology, it is expected that future robot soft hands will integrate more advanced multimodal perception systems and incorporate innovative materials and intelligent structures to improve load capacity and variable-stiffness adaptability. In addition, new design and manufacturing technologies will greatly improve the performance of soft hands and reduce production errors. By utilizing advanced machine learning and adaptive control algorithms, soft hands will be able to simulate human hand movements more naturally and be applied in increasingly expanding fields such as advanced manufacturing, service robots, medical assistance, and even disaster response scenarios. At the same time, the sustainability and maintenance issues of soft hand design will also receive attention, and improvements in user interfaces and interaction performance will make soft hands more user-friendly. These innovations will not only greatly expand the application scope of robotic hands but also have the potential to change the way humans and robots interact, improving the efficiency and safety of human–machine cooperation. In the future, robot soft hands are expected to achieve more precise operations, unlock new application prospects, and become indispensable assistants in human life and work.

Author Contributions

Conceptualization, H.X.; writing—original draft preparation, Y.W. and Y.L.; writing—review and editing, T.H., H.X., S.L. and H.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This study was funded by the Beijing Natural Science Foundation (No. 3232013).

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Rus, D.; Tolley, M.T. Design, fabrication and control of soft robots. Nature 2015, 521, 467–475. [Google Scholar] [CrossRef]
  2. Ilievski, F.; Mazzeo, A.D.; Shepherd, R.F.; Chen, X.; Whitesides, G.M. Soft robotics for chemists. Angew. Chem. 2011, 123, 1930–1935. [Google Scholar] [CrossRef]
  3. Laschi, C.; Mazzolai, B.; Cianchetti, M. Soft robotics: Technologies and systems pushing the boundaries of robot abilities. Sci. Robot. 2016, 1, eaah3690. [Google Scholar] [CrossRef] [PubMed]
  4. Hao, T.; ** and in-hand manipulation. IEEE Robot. Autom. Lett. 2018, 3, 3379–3386. [Google Scholar] [CrossRef]
  5. Gu, G.; Zhang, N.; Xu, H.; Lin, S.; Yu, Y.; Chai, G.; Ge, L.; Yang, H.; Shao, Q.; Sheng, X. A soft neuroprosthetic hand providing simultaneous myoelectric control and tactile feedback. Nat. Biomed. Eng. 2023, 7, 589–598. [Google Scholar] [CrossRef] [PubMed]
  6. Chin, K.; Hellebrekers, T.; Majidi, C. Machine learning for soft robotic sensing and control. Adv. Intell. Syst. 2020, 2, 1900171. [Google Scholar] [CrossRef]
  7. Khin, P.M.; Low, J.H.; Ang, M.H., Jr.; Yeow, C.H. Development and grasp stability estimation of sensorized soft robotic hand. Front. Robot. AI 2021, 8, 619390. [Google Scholar] [CrossRef]
  8. Park, J.; Heo, P.; Kim, J.; Na, Y. Qualitative stability analysis of soft hand exoskeleton based on tendon-driven mechanism. Int. J. Precis. Eng. Manuf. 2020, 21, 2095–2104. [Google Scholar] [CrossRef]
  9. Elango, N.; Faudzi, A.A.M. A review article: Investigations on soft materials for soft robot manipulations. Int. J. Adv. Manuf. Technol. 2015, 80, 1027–1037. [Google Scholar] [CrossRef]
  10. Coyle, S.; Majidi, C.; LeDuc, P.; Hsia, K.J. Bio-inspired soft robotics: Material selection, actuation, and design. Extrem. Mech. Lett. 2018, 22, 51–59. [Google Scholar] [CrossRef]
  11. Schmitt, F.; Piccin, O.; Barbé, L.; Bayle, B. Soft robots manufacturing: A review. Front. Robot. AI 2018, 5, 84. [Google Scholar] [CrossRef] [PubMed]
  12. Stano, G.; Percoco, G. Additive manufacturing aimed to soft robots fabrication: A review. Extrem. Mech. Lett. 2021, 42, 101079. [Google Scholar] [CrossRef]
  13. Zaidi, S.; Maselli, M.; Laschi, C.; Cianchetti, M. Actuation technologies for soft robot grippers and manipulators: A review. Curr. Robot. Rep. 2021, 2, 355–369. [Google Scholar] [CrossRef]
  14. Shorthose, O.; Albini, A.; He, L.; Maiolino, P. Design of a 3D-printed soft robotic hand with integrated distributed tactile sensing. IEEE Robot. Autom. Lett. 2022, 7, 3945–3952. [Google Scholar] [CrossRef]
  15. Serrano, D.; Copaci, D.; Arias, J.; Moreno, L.E.; Blanco, D. SMA-Based Soft Exo-Glove. IEEE Robot. Autom. Lett. 2023, 8, 5448–5455. [Google Scholar] [CrossRef]
  16. Liu, X.; Zhao, Y.; Geng, D.; Chen, S.; Tan, X.; Cao, C. Soft humanoid hands with large grasping force enabled by flexible hybrid pneumatic actuators. Soft Robot. 2021, 8, 175–185. [Google Scholar] [CrossRef]
  17. Zhang, N.; Ge, L.; Xu, H.; Zhu, X.; Gu, G. 3D printed, modularized rigid-flexible integrated soft finger actuators for anthropomorphic hands. Sens. Actuators A Phys. 2020, 312, 112090. [Google Scholar] [CrossRef]
  18. Yang, Y.; Chen, Y. Novel design and 3D printing of variable stiffness robotic fingers based on shape memory polymer. In Proceedings of the 2016 6th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob), Singapore, 26–29 June 2016; pp. 195–200. [Google Scholar]
  19. BionicSoftHand: Pneumatic Robot Hand with Artificial Intelligence. 2023. Available online: https://www.festo.com/gb/en/e/about-festo/research-and-development/bionic-learning-network/highlights-from-2018-to-2021/bionicsofthand-id68106/ (accessed on 11 January 2024).
  20. Curkovic, P.; Cubric, G. Fused Deposition Modelling for 3D printing of Soft Anthropomorphic Actuators. Int. J. Simul. Model. 2021, 20, 12. [Google Scholar] [CrossRef]
  21. Deng, E.; Tadesse, Y. A soft 3D-printed robotic hand actuated by coiled SMA. Actuators 2020, 10, 6. [Google Scholar] [CrossRef]
  22. Du, H.; Yao, Y.; Zhou, X. A facile fabricating method for smart soft robotic hand. Polym. Eng. Sci. 2023, 63, 118–125. [Google Scholar] [CrossRef]
  23. Tavakoli, M.; de Almeida, A.T. Adaptive under-actuated anthropomorphic hand: ISR-SoftHand. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 1629–1634. [Google Scholar]
  24. Zhang, C.; Li, M.; Chen, Y.; Yang, Z.; He, B.; Li, X.; ** and pinching grasping modes. IEEE/ASME Trans. Mechatron. 2020, 26, 146–155. [Google Scholar] [CrossRef]
  25. Zhang, P.; Tang, B. A two-finger soft gripper based on bistable mechanism. IEEE Robot. Autom. Lett. 2022, 7, 11330–11337. [Google Scholar] [CrossRef]
  26. Deimel, R.; Brock, O. A novel type of compliant and underactuated robotic hand for dexterous grasping. Int. J. Robot. Res. 2016, 35, 161–185. [Google Scholar] [CrossRef]
  27. Hughes, J.; Maiolino, P.; Iida, F. An anthropomorphic soft skeleton hand exploiting conditional models for piano playing. Sci. Robot. 2018, 3, eaau3098. [Google Scholar] [CrossRef] [PubMed]
  28. Salem, M.E.; Wen, R.; Xu, M.H.; Yan, L.; ** Tasks. Soft Robot. 2023, 10, 527–544. [Google Scholar] [CrossRef] [PubMed]
  29. Abondance, S.; Teeple, C.B.; Wood, R.J. A dexterous soft robotic hand for delicate in-hand manipulation. IEEE Robot. Autom. Lett. 2020, 5, 5502–5509. [Google Scholar] [CrossRef]
  30. Zhu, M.; Mori, Y.; Wakayama, T.; Wada, A.; Kawamura, S. A fully multi-material three-dimensional printed soft gripper with variable stiffness for robust grasping. Soft Robot. 2019, 6, 507–519. [Google Scholar] [CrossRef]
  31. Li, H.; Zhou, P.; Zhang, S.; Yao, J.; Zhao, Y. A high-load bioinspired soft gripper with force booster fingers. Mech. Mach. Theory 2022, 177, 105048. [Google Scholar] [CrossRef]
  32. Bao, G.; Ma, X.; Luo, X.; Shao, T.F.Z.; Zhang, L.; Yang, Q. Full compliant continuum robotic finger and its kinematic model. Iran. J. Sci. Technol. Trans. Mech. Eng. 2014, 38, 389. [Google Scholar]
  33. Yamanaka, Y.; Katagiri, S.; Nabae, H.; Suzumori, K.; Endo, G. Development of a food handling soft robot hand considering a high-speed pick-and-place task. In Proceedings of the 2020 IEEE/SICE International Symposium on System Integration (SII), Honolulu, HI, USA, 12–15 January 2020; pp. 87–92. [Google Scholar]
  34. Gao, G.; Chang, C.-M.; Gerez, L.; Liarokapis, M. A pneumatically driven, disposable, soft robotic gripper equipped with multi-stage, retractable, telescopic fingers. IEEE Trans. Med. Robot. Bionics 2021, 3, 573–582. [Google Scholar] [CrossRef]
  35. Thuruthel, T.G.; Abidi, S.H.; Cianchetti, M.; Laschi, C.; Falotico, E. A bistable soft gripper with mechanically embedded sensing and actuation for fast grasping. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Honolulu, HI, USA, 12–15 January 2020; pp. 1049–1054. [Google Scholar]
  36. Bullock, I.M.; Ma, R.R.; Dollar, A.M. A hand-centric classification of human and robot dexterous manipulation. IEEE Trans. Haptics 2012, 6, 129–144. [Google Scholar] [CrossRef] [PubMed]
  37. Li, Y.; Chen, Y.; Ren, T.; Hu, Y.; Liu, H.; Lin, S.; Yang, Y.; Li, Y.; Zhou, J. A dual-mode actuator for soft robotic hand. IEEE Robot. Autom. Lett. 2021, 6, 1144–1151. [Google Scholar] [CrossRef]
  38. Fras, J.; Althoefer, K. Soft biomimetic prosthetic hand: Design, manufacturing and preliminary examination. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1–6. [Google Scholar]
  39. Sinatra, N.R.; Teeple, C.B.; Vogt, D.M.; Parker, K.K.; Gruber, D.F.; Wood, R.J. Ultragentle manipulation of delicate structures using a soft robotic gripper. Sci. Robot. 2019, 4, eaax5425. [Google Scholar] [CrossRef] [PubMed]
  40. Andrychowicz, M.; Baker, B.; Chociej, M.; Jozefowicz, R.; McGrew, B.; Pachocki, J.; Petron, A.; Plappert, M.; Powell, G.; Ray, A. Learning dexterous in-hand manipulation. Int. J. Robot. Res. 2020, 39, 3–20. [Google Scholar] [CrossRef]
  41. Salvietti, G.; Hussain, I.; Malvezzi, M.; Prattichizzo, D. Design of the passive joints of underactuated modular soft hands for fingertip trajectory tracking. IEEE Robot. Autom. Lett. 2017, 2, 2008–2015. [Google Scholar] [CrossRef]
  42. Feix, T.; Romero, J.; Schmiedmayer, H.-B.; Dollar, A.M.; Kragic, D. The grasp taxonomy of human grasp types. IEEE Trans. Hum.-Mach. Syst. 2015, 46, 66–77. [Google Scholar] [CrossRef]
  43. Su, C.; Wang, R.; Lu, T.; Wang, S. SAU-RFC hand: A novel self-adaptive underactuated robot hand with rigid-flexible coupling fingers. Robotica 2023, 41, 511–529. [Google Scholar] [CrossRef]
  44. Hang, K.; Morgan, A.S.; Dollar, A.M. Pre-grasp sliding manipulation of thin objects using soft, compliant, or underactuated hands. IEEE Robot. Autom. Lett. 2019, 4, 662–669. [Google Scholar] [CrossRef]
  45. Tan, N.; Gu, X.; Ren, H. Simultaneous robot-world, sensor-tip, and kinematics calibration of an underactuated robotic hand with soft fingers. IEEE Access 2017, 6, 22705–22715. [Google Scholar] [CrossRef]
  46. Liu, Y.; **. Smart Mater. Struct. 2023, 32, 125012. [Google Scholar] [CrossRef]
  47. Wang, H.; Abu-Dakka, F.J.; Le, T.N.; Kyrki, V.; Xu, H. A novel soft robotic hand design with human-inspired soft palm: Achieving a great diversity of grasps. IEEE Robot. Autom. Mag. 2021, 28, 37–49. [Google Scholar] [CrossRef]
  48. Nanayakkara, V.K.; Cotugno, G.; Vitzilaios, N.; Venetsanos, D.; Nanayakkara, T.; Sahinkaya, M.N. The role of morphology of the thumb in anthropomorphic grasping: A review. Front. Mech. Eng. 2017, 3, 5. [Google Scholar] [CrossRef]
  49. Yoneda, T.; Morihiro, D.; Ozawa, R. Development of a multifingered robotic hand with the thenar grasp function. Adv. Robot. 2020, 34, 661–673. [Google Scholar] [CrossRef]
  50. Feix, T.; Pawlik, R.; Schmiedmayer, H.-B.; Romero, J.; Kragic, D. A comprehensive grasp taxonomy. In Proceedings of the Robotics, Science and Systems: Workshop on Understanding the Human Hand for Advancing Robotic Manipulation, Seattle, WA, USA, 28 June–1 July 2009; pp. 2–3. [Google Scholar]
  51. Chen, W.; Li, G.; Li, N.; Wang, W.; Yu, P.; Wang, R.; Xue, X.; Zhao, X.; Liu, L. Soft exoskeleton with fully actuated thumb movements for grasping assistance. IEEE Trans. Robot. 2022, 38, 2194–2207. [Google Scholar] [CrossRef]
  52. Zhou, H.; Mohammadi, A.; Oetomo, D.; Alici, G. A novel monolithic soft robotic thumb for an anthropomorphic prosthetic hand. IEEE Robot. Autom. Lett. 2019, 4, 602–609. [Google Scholar] [CrossRef]
  53. Chalon, M.; Dietrich, A.; Grebenstein, M. The thumb of the anthropomorphic awiwi hand: From concept to evaluation. Int. J. Humanoid Robot. 2014, 11, 1450019. [Google Scholar] [CrossRef]
  54. Sangole, A.P.; Levin, M.F. Arches of the hand in reach to grasp. J. Biomech. 2008, 41, 829–837. [Google Scholar] [CrossRef]
  55. Wang, H.; Xu, H.; Abu-Dakka, F.J.; Kyrki, V.; Yang, C.; Li, X.; Chen, S. A bidirectional soft biomimetic hand driven by water hydraulic for dexterous underwater grasping. IEEE Robot. Autom. Lett. 2022, 7, 2186–2193. [Google Scholar] [CrossRef]
  56. Pozzi, M.; Malvezzi, M.; Prattichizzo, D.; Salvietti, G. Actuated Palms for Soft Robotic Hands: Review and Perspectives. IEEE/ASME Trans. Mechatron. 2023, 1–11. [Google Scholar] [CrossRef]
  57. Lin, J.; Hu, Q.; ** modes. Sens. Actuators A Phys. 2022, 347, 113978. [Google Scholar] [CrossRef]
  58. Fei, Y.; Wang, J.; Pang, W. A novel fabric-based versatile and stiffness-tunable soft gripper integrating soft pneumatic fingers and wrist. Soft Robot. 2019, 6, 1–20. [Google Scholar] [CrossRef]
  59. Ren, T.; Li, Y.; Liu, Q.; Chen, Y.; Yang, S.X.; Yuan, H.; Li, Y.; Yang, Y. Novel Bionic Soft Robotic Hand With Dexterous Deformation and Reliable Grasping. IEEE Trans. Instrum. Meas. 2023, 72, 1–10. [Google Scholar] [CrossRef]
  60. Chen, C.; Sun, J.; Wang, L.; Chen, G.; Xu, M.; Ni, J.; Ramli, R.; Su, S.; Chu, C. Pneumatic bionic hand with rigid-flexible coupling structure. Materials 2022, 15, 1358. [Google Scholar] [CrossRef]
  61. Li, Y.; Wei, Y.; Yang, Y.; Chen, Y. A novel versatile robotic palm inspired by human hand. Eng. Res. Express 2019, 1, 015008. [Google Scholar] [CrossRef]
  62. Li, Y.; Chen, Y.; Yang, Y.; Wei, Y. Passive particle jamming and its stiffening of soft robotic grippers. IEEE Trans. Robot. 2017, 33, 446–455. [Google Scholar] [CrossRef]
  63. Lee, J.; Kim, J.; Park, S.; Hwang, D.; Yang, S. Soft robotic palm with tunable stiffness using dual-layered particle jamming mechanism. IEEE/ASME Trans. Mechatron. 2021, 26, 1820–1827. [Google Scholar] [CrossRef]
  64. Choi, W.H.; Kim, S.; Lee, D.; Shin, D. Soft, multi-DoF, variable stiffness mechanism using layer jamming for wearable robots. IEEE Robot. Autom. Lett. 2019, 4, 2539–2546. [Google Scholar] [CrossRef]
  65. In, H.; Kang, B.B.; Sin, M.; Cho, K.-J. Exo-glove: A wearable robot for the hand with a soft tendon routing system. IEEE Robot. Autom. Mag. 2015, 22, 97–105. [Google Scholar] [CrossRef]
  66. Sun, Y.; Song, Y.S.; Paik, J. Characterization of silicone rubber based soft pneumatic actuators. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 4446–4453. [Google Scholar]
  67. Hu, W.; Lum, G.Z.; Mastrangeli, M.; Sitti, M. Small-scale soft-bodied robot with multimodal locomotion. Nature 2018, 554, 81–85. [Google Scholar] [CrossRef] [PubMed]
  68. Rodrigue, H.; Wang, W.; Han, M.-W.; Kim, T.J.; Ahn, S.-H. An overview of shape memory alloy-coupled actuators and robots. Soft Robot. 2017, 4, 3–15. [Google Scholar] [CrossRef] [PubMed]
  69. Zhang, Y.F.; Zhang, N.; Hingorani, H.; Ding, N.; Wang, D.; Yuan, C.; Zhang, B.; Gu, G.; Ge, Q. Fast-response, stiffness-tunable soft actuator by hybrid multimaterial 3D printing. Adv. Funct. Mater. 2019, 29, 1806698. [Google Scholar] [CrossRef]
  70. Tang, Y.; Chi, Y.; Sun, J.; Huang, T.H.; Yin, J. Leveraging elastic instabilities for amplified performance: Spine-inspired high-speed and high-force soft robots. Sci. Adv. 2020, 6, eaaz6912. [Google Scholar] [CrossRef] [PubMed]
  71. Han, H.-Y.; Shimada, A.; Kawamura, S. Analysis of friction on human fingers and design of artificial fingers. In Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, 22–28 April 1996; pp. 3061–3066. [Google Scholar]
  72. Takahashi, A.; Yamaguchi, A.; Nonomura, Y. Friction between Two Finger Models: Effects of Fingerprints on Friction Dynamics. Chem. Lett. 2014, 43, 1899–1900. [Google Scholar] [CrossRef]
  73. Alben, S.; Witt, C.; Baker, T.V.; Anderson, E.; Lauder, G.V. Dynamics of freely swimming flexible foils. Phys. Fluids 2012, 24, 051901. [Google Scholar] [CrossRef]
  74. Marchese, A.D.; Onal, C.D.; Rus, D. Autonomous soft robotic fish capable of escape maneuvers using fluidic elastomer actuators. Soft Robot. 2014, 1, 75–87. [Google Scholar] [CrossRef]
  75. Hao, Y.; Biswas, S.; Hawkes, E.W.; Wang, T.; Zhu, M.; Wen, L.; Visell, Y. A multimodal, enveloping soft gripper: Shape conformation, bioinspired adhesion, and expansion-driven suction. IEEE Trans. Robot. 2020, 37, 350–362. [Google Scholar] [CrossRef]
  76. Seibel, A.; Yıldız, M.; Zorlubaş, B. A gecko-inspired soft passive gripper. Biomimetics 2020, 5, 12. [Google Scholar] [CrossRef]
  77. Hao, T.; ** performance of soft grippers with fingerprint-like surface texture for objects with slippery surfaces. Tribol. Int. 2023, 189, 108992. [Google Scholar] [CrossRef]
  78. Zimmermann, C.; Ceylan, D.; Yang, J.; Russell, B.; Argus, M.; Brox, T. Freihand: A dataset for markerless capture of hand pose and shape from single rgb images. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Seoul, Republic of Korea, 27 October–2 November 2019; pp. 813–822. [Google Scholar]
  79. Hasson, Y.; Varol, G.; Tzionas, D.; Kalevatykh, I.; Black, M.J.; Laptev, I.; Schmid, C. Learning joint reconstruction of hands and manipulated objects. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 11807–11816. [Google Scholar]
  80. Hampali, S.; Sarkar, S.D.; Rad, M.; Lepetit, V. Keypoint transformer: Solving joint identification in challenging hands and object interactions for accurate 3d pose estimation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 11090–11100. [Google Scholar]
  81. Shih, B.; Shah, D.; Li, J.; Thuruthel, T.G.; Park, Y.-L.; Iida, F.; Bao, Z.; Kramer-Bottiglio, R.; Tolley, M.T. Electronic skins and machine learning for intelligent soft robots. Sci. Robot. 2020, 5, eaaz9239. [Google Scholar] [CrossRef]
  82. Liu, F.; Deswal, S.; Christou, A.; Sandamirskaya, Y.; Kaboli, M.; Dahiya, R. Neuro-inspired electronic skin for robots. Sci. Robot. 2022, 7, eabl7344. [Google Scholar] [CrossRef]
  83. Wang, L.; Jiang, K.; Shen, G. Wearable, implantable, and interventional medical devices based on smart electronic skins. Adv. Mater. Technol. 2021, 6, 2100107. [Google Scholar] [CrossRef]
  84. Ma, Z.; Li, S.; Wang, H.; Cheng, W.; Li, Y.; Pan, L.; Shi, Y. Advanced electronic skin devices for healthcare applications. J. Mater. Chem. B 2019, 7, 173–197. [Google Scholar] [CrossRef]
  85. Chen, K.; Liang, K.; Liu, H.; Liu, R.; Liu, Y.; Zeng, S.; Tian, Y. Skin-Inspired Ultra-Tough Supramolecular Multifunctional Hydrogel Electronic Skin for Human–Machine Interaction. Nano-Micro Lett. 2023, 15, 102. [Google Scholar] [CrossRef] [PubMed]
  86. Chortos, A.; Liu, J.; Bao, Z. Pursuing prosthetic electronic skin. Nat. Mater. 2016, 15, 937–950. [Google Scholar] [CrossRef] [PubMed]
  87. Jason, N.N.; Ho, M.D.; Cheng, W. Resistive electronic skin. J. Mater. Chem. C 2017, 5, 5845–5866. [Google Scholar] [CrossRef]
  88. Zhong, F.; Hu, W.; Zhu, P.; Wang, H.; Ma, C.; Lin, N.; Wang, Z. Piezoresistive design for electronic skin: From fundamental to emerging applications. Opto-Electron. Adv. 2022, 5, 210029-210021–210029-210032. [Google Scholar] [CrossRef]
  89. Liu, W.; Lin, D.; Zeng, W.; Wang, Q.; Yang, J.; Peng, Z.; Zhang, Q.; Zhu, G. A multifunctional flexible ferroelectric transistor sensor for electronic skin. Adv. Mater. Interfaces 2021, 8, 2101166. [Google Scholar] [CrossRef]
  90. Rao, J.; Chen, Z.; Zhao, D.; Ma, R.; Yi, W.; Zhang, C.; Liu, D.; Chen, X.; Yang, Y.; Wang, X. Tactile electronic skin to simultaneously detect and distinguish between temperature and pressure based on a triboelectric nanogenerator. Nano Energy 2020, 75, 105073. [Google Scholar] [CrossRef]
  91. Zhang, J.; Wan, L.; Gao, Y.; Fang, X.; Lu, T.; Pan, L.; Xuan, F. Highly stretchable and self-healable MXene/polyvinyl alcohol hydrogel electrode for wearable capacitive electronic skin. Adv. Electron. Mater. 2019, 5, 1900285. [Google Scholar] [CrossRef]
  92. Dargahi, J.; Najarian, S. Human tactile perception as a standard for artificial tactile sensing—A review. Int. J. Med. Robot. Comput. Assist. Surg. 2004, 1, 23–35. [Google Scholar] [CrossRef]
  93. Haddadin, S.; De Luca, A.; Albu-Schäffer, A. Robot collisions: A survey on detection, isolation, and identification. IEEE Trans. Robot. 2017, 33, 1292–1312. [Google Scholar] [CrossRef]
  94. Li, Q.; Natale, L.; Haschke, R.; Cherubini, A.; Ho, A.-V.; Ritter, H. Tactile sensing for manipulation. Int. J. Humanoid Robot. 2018, 15, 1802001. [Google Scholar] [CrossRef]
  95. Qu, X.; Xue, J.; Liu, Y.; Rao, W.; Liu, Z.; Li, Z. Fingerprint-shaped triboelectric tactile sensor. Nano Energy 2022, 98, 107324. [Google Scholar] [CrossRef]
  96. Harada, S.; Kanao, K.; Yamamoto, Y.; Arie, T.; Akita, S.; Takei, K. Fully printed flexible fingerprint-like three-axis tactile and slip force and temperature sensors for artificial skin. ACS Nano 2014, 8, 12851–12857. [Google Scholar] [CrossRef]
  97. Zhu, L.; Wang, Y.; Mei, D.; Ding, W.; Jiang, C.; Lu, Y. Fully elastomeric fingerprint-shaped electronic skin based on tunable patterned graphene/silver nanocomposites. ACS Appl. Mater. Interfaces 2020, 12, 31725–31737. [Google Scholar] [CrossRef]
  98. Qiu, Y.; Tian, Y.; Sun, S.; Hu, J.; Wang, Y.; Zhang, Z.; Liu, A.; Cheng, H.; Gao, W.; Zhang, W. Bioinspired, multifunctional dual-mode pressure sensors as electronic skin for decoding complex loading processes and human motions. Nano Energy 2020, 78, 105337. [Google Scholar] [CrossRef]
  99. Duan, S.; Shi, Q.; Hong, J.; Zhu, D.; Lin, Y.; Li, Y.; Lei, W.; Lee, C.; Wu, J. Water-modulated biomimetic hyper-attribute-gel electronic skin for robotics and skin-attachable wearables. ACS Nano 2023, 17, 1355–1371. [Google Scholar] [CrossRef]
  100. Ma, Z.; Zhang, J.; Li, J.; Shi, Y.; Pan, L. Frequency-enabled decouplable dual-modal flexible pressure and temperature sensor. IEEE Electron Device Lett. 2020, 41, 1568–1571. [Google Scholar] [CrossRef]
  101. Han, S.; Alvi, N.U.H.; Granlöf, L.; Granberg, H.; Berggren, M.; Fabiano, S.; Crispin, X. A multiparameter pressure–temperature–humidity sensor based on mixed ionic–electronic cellulose aerogels. Adv. Sci. 2019, 6, 1802128. [Google Scholar] [CrossRef] [PubMed]
  102. Zhang, T.; Liu, H.; Jiang, L.; Fan, S.; Yang, J. Development of a flexible 3-D tactile sensor system for anthropomorphic artificial hand. IEEE Sens. J. 2012, 13, 510–518. [Google Scholar] [CrossRef]
  103. Lee, B.-Y.; Kim, S.; Oh, S.; Lee, Y.; Park, J.; Ko, H.; Koo, J.C.; Jung, Y.; Lim, H. Human-Inspired Tactile Perception System for Real-Time and Multimodal Detection of Tactile Stimuli. Soft Robot. 2023. [Google Scholar] [CrossRef]
  104. Sun, Z.; Wang, S.; Zhao, Y.; Zhong, Z.; Zuo, L. Discriminating soft actuators’ thermal stimuli and mechanical deformation by hydrogel sensors and machine learning. Adv. Intell. Syst. 2022, 4, 2200089. [Google Scholar] [CrossRef]
  105. Charalambides, A.; Bergbreiter, S. Rapid manufacturing of mechanoreceptive skins for slip detection in robotic grasping. Adv. Mater. Technol. 2017, 2, 1600188. [Google Scholar] [CrossRef]
  106. Wang, Y.; Ding, W.; Mei, D. Development of flexible tactile sensor for the envelop of curved robotic hand finger in grasping force sensing. Measurement 2021, 180, 109524. [Google Scholar] [CrossRef]
  107. Pannen, T.J.; Puhlmann, S.; Brock, O. A low-cost, easy-to-manufacture, flexible, multi-taxel tactile sensor and its application to in-hand object recognition. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022; pp. 10939–10944. [Google Scholar]
  108. Yuan, Z.; Shen, G.; Pan, C.; Wang, Z.L. Flexible sliding sensor for simultaneous monitoring deformation and displacement on a robotic hand/arm. Nano Energy 2020, 73, 104764. [Google Scholar] [CrossRef]
  109. Chen, H.; Song, Y.; Guo, H.; Miao, L.; Chen, X.; Su, Z.; Zhang, H. Hybrid porous micro structured finger skin inspired self-powered electronic skin system for pressure sensing and sliding detection. Nano Energy 2018, 51, 496–503. [Google Scholar] [CrossRef]
  110. Chen, H.; Miao, L.; Su, Z.; Song, Y.; Han, M.; Chen, X.; Cheng, X.; Chen, D.; Zhang, H. Fingertip-inspired electronic skin based on triboelectric sliding sensing and porous piezoresistive pressure detection. Nano Energy 2017, 40, 65–72. [Google Scholar] [CrossRef]
  111. Yang, W.; **. Curr. Robot. Rep. 2020, 1, 239–249. [Google Scholar] [CrossRef]
  112. Della Santina, C.; Arapi, V.; Averta, G.; Damiani, F.; Fiore, G.; Settimi, A.; Catalano, M.G.; Bacciu, D.; Bicchi, A.; Bianchi, M. Learning from humans how to grasp: A data-driven architecture for autonomous grasping with anthropomorphic soft hands. IEEE Robot. Autom. Lett. 2019, 4, 1533–1540. [Google Scholar] [CrossRef]
  113. Gupta, A.; Eppner, C.; Levine, S.; Abbeel, P. Learning dexterous manipulation for a soft robotic hand from human demonstrations. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016; pp. 3786–3793. [Google Scholar]
  114. Mandikal, P.; Grauman, K. DexVIP: Learning dexterous grasping with human hand pose priors from video. In Proceedings of the Conference on Robot Learning, Auckland, New Zealand, 14–18 December 2022; pp. 651–661. [Google Scholar]
  115. Choi, C.; Schwarting, W.; DelPreto, J.; Rus, D. Learning object grasping for soft robot hands. IEEE Robot. Autom. Lett. 2018, 3, 2370–2377. [Google Scholar] [CrossRef]
  116. Arapi, V.; Zhang, Y.; Averta, G.; Catalano, M.G.; Rus, D.; Della Santina, C.; Bianchi, M. To grasp or not to grasp: An end-to-end deep-learning approach for predicting grasping failures in soft hands. In Proceedings of the 2020 3rd IEEE International Conference on Soft Robotics (RoboSoft), New Haven, CT, USA, 15 May–15 July 2020; pp. 653–660. [Google Scholar]
Figure 1. Schematic representation of (a) human hand joints and (b) muscle groups and palm arches.
Figure 2. Development of various types of soft hands. Top, nonanthropomorphic grippers: (a) starfish-like grippers [2]; (b) soft gripper for both safe grasping and non-destructive firmness evaluation [88]; (c). Middle, underactuated anthropomorphic hands: (d) UC soft hand with compact twisted string actuation mechanism [89]; (e) soft hand with magnets to robustify the performance [90]; (f) soft glove with four kinds of rehabilitation postures [91]. Bottom, anthropomorphic dexterous hands: (g) BCL-26 soft hand with 16 DOFs [5]; (h) RBO Hand 3 with a dexterous opposable thumb [6]; (i) soft hand with active palm [75].
Figure 3. Rigid-flexible integration and variable stiffness design of soft grippers and hands. (a) Rigid-flexible integration configuration [33]; (b) integration of shape memory polymer for variable stiffness [103]; (c) particle jamming method for a two-fingered gripper; (d) particle jamming method for a three-fingered gripper [96].
Figure 4. Tactile perception of anthropomorphic soft hands with embedded sensors ((a) stretchable optical waveguide-based sensor [118]; (b) capacitive sensor [119]; (c) flexible optical fiber-based sensor [120]; (d) air pressure sensors [121]; (e) distributed soft sensing unit [30]) and e-skin ((f) e-skin for both pressure sensing and actuation [122]; (g) capacitive e-skin for detecting normal and tangential forces [123]; (h) mechanoreceptor- and nociceptor-based e-skin for neuromorphic tactile perception [124]; (i) fingerprint-inspired triboelectric e-skin for texture perception [125]).
Figure 5. Array-based and/or multi-layer-based design strategies for multimodal soft sensors. (a) Fingerprint-inspired array-based e-skin for pressure and position detection (the highlighted area indicates the corresponding COMSOL simulation of the applied forces) [148]; (b) soft sensor with both array and multi-layer structure for three-dimensional force, slip, and temperature perception [149].
Figure 6. Schematic of an anthropomorphic soft hand interacting with the environment through machine learning models.
Table 1. Summary of anthropomorphic soft hands.
Materials | Fabrication | Actuation | Control | DOF | Main Features | Category | Year | Ref
Dragon Skin-10, Dragon Skin-20, Dragon Skin-30, Ecoflex 00-10 | Casting molding | Pneumatic | Closed-loop control | 14 | Flexible thenar | Anthropomorphic dexterous hand | 2022 | [4]
Dragon Skin 10, Ecoflex 00-30 | Casting molding | Pneumatic | Coordinated control | 6 | Replicates human-like grasp postures | Anthropomorphic dexterous hand | 2022 | [7]
Silicone rubber, fibers | Casting molding | Pneumatic | Open-loop control | 12 | Flexible palm | Underactuated anthropomorphic hand | 2013 | [8]
Dragon Skin 30 | Casting molding | - | Open-loop control | 21 | Soft parallel palm | - | 2023 | [19]
Ecoflex 00-50, Mold Star 30, ABS | Casting molding | Pneumatic | Open-loop control | 13 | Grasping planning | Anthropomorphic dexterous hand | 2018 | [20]
Dragon Skin 10, fibers | Casting molding | Pneumatic | Open-loop control | 22 | Flexible operation function | Anthropomorphic dexterous hand | 2022 | [30]
TPU, ABS | - | Fluidic and tendon actuation | Open-loop control | 5 | SMA-based Exo-Glove | Soft Exo-Glove | 2023 | [31]
Resin, PET, nylon gauze | Planar laser cutting and stacking | Pneumatic | Open-loop control | 6 | Hybrid pneumatic actuators | Underactuated anthropomorphic hand | 2021 | [32]
VytaFlex 20, ELASTOSIL M 4601 | 3D printing, soft lithography | Pneumatic | Open-loop control | 12 | Multi-material 3D printed | Anthropomorphic dexterous hand | 2020 | [33]
Dragon Skin-10, Dragon Skin-30, fibers | Casting molding | Pneumatic | Open-loop control | 3 | SMP actuated | Dexterous finger | 2016 | [34]
Dragon Skin 10, Ecoflex 00-30, nylon thread, fibers | Casting molding | Pneumatic | Open-loop control | 12 | Highly integrated design | Anthropomorphic dexterous hand | 2023 | [35]
Vero, Agilus30 | 3D printing | Pneumatic | Closed-loop control | 5 | 3D-printed fingers | Underactuated anthropomorphic hand | 2021 | [36]
PDMS, CNTs | 3D printing | Light-driven | Open-loop control | 5 | SMA actuated | Underactuated anthropomorphic hand | 2020 | [37]
PDMS, SMA, fiberglass | Casting molding | Tendon-driven | Open-loop control | 10 | SMA actuated | Underactuated anthropomorphic hand | 2023 | [38]
TPU, SMA | 3D printing | SMA | Open-loop control | 10 | Elastic joints and soft pads | Anthropomorphic dexterous hand | 2014 | [39]
TPU | 3D printing | Pneumatic | Open-loop control | 10 | Soft-rigid hybrid fingers | Anthropomorphic dexterous hand | 2023 | [40]
Silicone rubber, ABS | - | Pneumatic | Open-loop control | 5 | Self-healing soft fingers | Underactuated anthropomorphic hand | 2017 | [41]
Dragon Skin-10, Ecoflex 00-30, fibers | Casting molding | Pneumatic | Closed-loop control | - | Deployable, atraumatic grasper | Surgical grasper | 2014 | [42]
Smooth-Sil 936, fibers | Casting molding | Fluid-driven | Open-loop control | 1 | Pneu-net actuator | Dexterous finger | 2014 | [43]
Ecoflex-30, SMA, PDMS | Casting molding | SMA | Closed-loop control | 5 | ECF jet | Underactuated anthropomorphic hand | 2011 | [44]
TPU | 3D printing | Tendon | Open-loop control | 5 | Completely soft | Soft Exo-Glove | 2021 | [45]
Electro-conjugate fluid (ECF) | - | ECF jet | Open-loop control | 5 | Planar laser cutting and stacking | Underactuated anthropomorphic hand | 2021 | [46]
NinjaFlex, particles | 3D printing | Pneumatic | Closed-loop control | 3 | Humanoid hand skeleton | Dexterous finger | 2019 | [47]
Agilus30, Vero | 3D printing | Pneumatic | Open-loop control | 5 | Pneumatic Exo-Glove | Soft Exo-Glove | 2016 | [48]
Dragon Skin 10, Ecoflex 00-30 | - | Pneumatic | Myoelectric control | 4 | 3D printed | Underactuated anthropomorphic hand | 2017 | [49]
Dragon Skin 30, paper | Casting molding | Pneumatic | Open-loop control | 4 | Dual-module pneumatic actuator | Nonanthropomorphic grippers | 2020 | [50]
Silicone, PLA | Casting molding | Tendon-driven | Open-loop control | 2 | Two-finger grip | Nonanthropomorphic grippers | 2022 | [51]
Silicone, fiber | Casting molding | Pneumatic | Open-loop control | 7 | Active palm | Underactuated anthropomorphic hand | 2016 | [52]
Vero White, Tango Black | 3D printing | - | Open-loop control | 18 | Soft-rigid hybrid hand | Anthropomorphic dexterous hand | 2018 | [53]
TPU | 3D printing | Pneumatic | Open-loop control | 9 | Hand sign language | Anthropomorphic dexterous hand | 2019 | [54]
Fiber, M4601, memory foam | Casting molding | Hydraulic | Open-loop control | 8 | Underwater gripper | Nonanthropomorphic grippers | 2016 | [55]
Tendons, Agilus Black material | 3D printing | Tendon-driven | Open-loop control | 3 | Grasping | Underactuated anthropomorphic hand | 2022 | [56]
SmoothSil-960, MoldStar-30 | Casting molding | Pneumatic | Open-loop control | 9 | Three-finger grip | Nonanthropomorphic grippers | 2023 | [57]
Smooth-Sil 945, Ecoflex 00-30 | Casting molding | Pneumatic | Open-loop control | 8 | Delicate in-hand manipulation | Soft robotic hand | 2020 | [58]
Table 2. Flexible sensors with capabilities of multi-dimensional force perception or multi-modal sensing.
Output | Mechanism | E-Skin | Ref
Force, sliding, temperature | Resistant | | [116]
Curvature, elongation, force | Optical waveguides-based | | [118]
Normal and tangential forces | Capacitive | | [123]
Force, texture | Triboelectric | | [125]
Pressure, position, material | Triboelectric | | [148]
Normal and shear force | Piezoresistive | | [155]
Temperature, vibration, shear force, normal force | Thermoresistive, piezoresistive, piezoelectric | | [156]
Strain, temperature | Resistant, thermoresistive | | [157]
Normal and shear force, sliding | Capacitive, resistant | | [158]
Three-axis force | Resistant | | [159]
Location, intensity | Piezoresistive | | [160]
Pressure, sliding | Capacitive, triboelectric | | [161]
Sliding, pressure | Piezoresistive, triboelectric | | [162]
Sliding, pressure | Piezoresistive, triboelectric | | [163]
Temperature, pressure | Thermoresistive, piezoresistive | | [164]
Strain, pressure, and temperature | Capacitive, resistant, thermoresistive | | [165]
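The multimodal entries in Table 2 exploit the fact that different transduction mechanisms respond to different stimuli, so individual quantities can be recovered from combined readings once the element is calibrated. The sketch below illustrates this decoupling for a hypothetical element that pairs a pressure- and temperature-sensitive channel with a temperature-only reference channel; the linear model and all coefficients are assumptions for illustration and do not describe any specific sensor in the table.

```python
import numpy as np

# Illustrative decoupling of a dual-modal sensing element. Assumed linear
# calibration model (not taken from any specific sensor in Table 2):
#   dR1/R1 = A_P * P + A_T * dT   (channel 1: pressure- and temperature-sensitive)
#   dR2/R2 =           B_T * dT   (channel 2: temperature-only reference)
A_P, A_T = 0.020, 0.004   # channel-1 sensitivities (1/kPa, 1/K), assumed
B_T = 0.006               # channel-2 temperature sensitivity (1/K), assumed

def decouple(drel1: float, drel2: float) -> tuple[float, float]:
    """Recover (pressure in kPa, temperature change in K) by inverting the calibration matrix."""
    calibration = np.array([[A_P, A_T],
                            [0.0, B_T]])
    pressure, d_temp = np.linalg.solve(calibration, np.array([drel1, drel2]))
    return float(pressure), float(d_temp)

if __name__ == "__main__":
    # Synthetic reading corresponding to 10 kPa of contact pressure and a 5 K rise.
    r1 = A_P * 10.0 + A_T * 5.0
    r2 = B_T * 5.0
    print(decouple(r1, r2))   # ≈ (10.0, 5.0)
```

Real sensors are rarely this linear; in practice the calibration matrix is replaced by a fitted nonlinear map or, as in several works cited above, by a learned regression model.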
Table 3. Research on machine learning related to anthropomorphic soft hands.
Machine Learning Methods | Learning Features | Implemented Functions | Performance Index | Ref
ANN | Reducing the dependence on feature engineering | Texture recognition | Accuracy: 92.5% to 93.33% | [125]
RNN | Hierarchical recurrent sensing network | Estimating pressure responses and localizing the position | Accuracy: 85% | [171]
CNN, RNN | States and actions based on feedback from the environment | Controlling the position of a soft robotic arm | Average error: 4.8 mm | [172]
Bayesian optimization | Performance of IPMC | 3D printing of IPMC actuators and motion control of the crawling robot | Maximum possible distance: 95% | [173]
DNN | Data collected by probe terminal electrodes | Position recognition and pressure sensing | Accuracy: 98% | [174]
Policy learning method | High-resolution tactile inputs | Predicting future grasp actions | Accuracy: 76–98% | [175]
Multilayer perceptron model | Contact signals | Object recognition of robot hands | Accuracy: 98.6% | [177]
ANN | Changes in airflow, volume changes | Predicting contact location and force magnitude | Average error: 1.94–2.88 mm | [179]
ANN | RGB images, RGB-D images, or point clouds | Object pose estimation for robotic grasping | - | [181]
CNN | RGB images | Generating grasping poses for unknown objects | Accuracy: 81.1% | [182]
CNN | State information | Flexible operations of RBO Hand 2 | - | [183]
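As a concrete, minimal illustration of the ANN-style entries in Table 3, the sketch below trains a small multilayer perceptron to classify synthetic tactile feature vectors into three surface classes using scikit-learn. The dataset, feature dimensionality, and resulting accuracy are synthetic and chosen purely for demonstration; they do not reproduce any result cited in the table.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic "tactile" dataset: 3 surface classes, each sample a 16-dimensional
# feature vector (e.g., statistics of a sliding-contact signal). The class means
# and noise level are invented; real studies in Table 3 use measured signals.
n_per_class, n_features, n_classes = 200, 16, 3
class_means = rng.normal(0.0, 1.0, size=(n_classes, n_features))
X = np.vstack([m + 0.6 * rng.normal(size=(n_per_class, n_features)) for m in class_means])
y = np.repeat(np.arange(n_classes), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Small MLP in the spirit of the ANN texture-recognition entries.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"Held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

Swapping the synthetic features for real sensor streams (and the MLP for a CNN or RNN when the input is spatial or temporal) recovers the general pipeline followed by most of the studies summarized above.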
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
