A new bio-inspired approach allows robots to master the delicate art of grasping fragile or slippery objects by mimicking how the human brain processes tactile sensations.
Picking up a raw egg without crushing it, or grabbing a heavy, slippery cup without dropping it, is second nature to humans. Our hands naturally adjust their grip in response to what we feel in real time. For robots, however, this kind of adaptive grasping has remained a major engineering challenge – until now.
In a study published in the journal National Science Review, a team of researchers from Tsinghua University in China has unveiled a new “sensory-control synergy” approach that teaches robots to learn these adaptive grasping skills directly from humans.
“Humans use tactile sensation to recognise grasping states in real time and instantly fine-tune grip according to grasping states,” said Professor Rong Zhu, the study’s corresponding author. “We want to give robots that similar ability — to sense, cognise, and act on tactile feeling in real time”.
The digital sense of touch
To bridge the gap between human instinct and robotic programming, the researchers first developed a special tactile glove equipped with custom-made sensors on the fingertips.
When worn by a human, the glove captures multimodal tactile data – specifically contact, slip, and pressure – during grasping actions.
However, the key innovation lies not just in collecting the data, but in how the robot is taught to understand it. Inspired by human neurocognition, the raw data is encoded into high-level semantic states, such as “stable,” “slightly unstable,” or “highly unstable”.
This method filters out unnecessary noise, such as slight deviations in hand position, allowing the robot to focus on the “feel” of the grasp rather than precise geometry.
“Instead of training the robot with thousands of precise tactile measurements while grasping various objects, we teach the robot to recognise general states of interaction,” Professor Zhu explained. “This strategy makes the approach data-efficient and highly transferable”.
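To illustrate the idea, here is a minimal Python sketch of that kind of semantic encoding. The thresholds, signal names, and state labels are assumptions chosen for illustration, not the values or implementation used in the Tsinghua study.

def encode_grasp_state(pressure: float, slip_rate: float, in_contact: bool) -> str:
    """Map raw fingertip readings to a coarse semantic grasp state (illustrative thresholds)."""
    if not in_contact:
        return "no contact"
    if slip_rate > 0.5 or pressure < 0.1:   # object sliding fast, or barely held
        return "highly unstable"
    if slip_rate > 0.1:                     # small but detectable slippage
        return "slightly unstable"
    return "stable"

# Example: a heavy cup that has just started to slip would be flagged early.
print(encode_grasp_state(pressure=0.3, slip_rate=0.2, in_contact=True))
# -> slightly unstable

Because the robot only ever reasons about a handful of coarse states like these, small variations in hand pose or sensor noise never reach the controller, which is what makes the learned skill transferable across objects.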
Fuzzy logic
To make decisions based on these “feelings,” the team developed a fuzzy logic controller that mimics human decision-making.
If the system detects that the object is “highly unstable,” the robot automatically tightens its grip. If the state is “stable,” it maintains its hold.
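The following Python sketch shows how a fuzzy controller of this kind can turn those graded “feelings” into a grip-force correction. The membership functions, state names, and correction values are illustrative assumptions, not the authors’ actual controller.

def instability_membership(slip_rate: float) -> dict:
    """Fuzzy membership degrees (0 to 1) for each grasp state; the shapes are assumed."""
    return {
        "stable":            max(0.0, 1.0 - slip_rate / 0.2),
        "slightly unstable": max(0.0, 1.0 - abs(slip_rate - 0.3) / 0.2),
        "highly unstable":   min(1.0, max(0.0, (slip_rate - 0.3) / 0.3)),
    }

def grip_force_correction(slip_rate: float) -> float:
    """Blend per-state corrections by their membership degrees (weighted-average defuzzification)."""
    corrections = {"stable": 0.0, "slightly unstable": 0.1, "highly unstable": 0.3}
    mu = instability_membership(slip_rate)
    total = sum(mu.values())
    if total == 0.0:
        return 0.0
    # The more the grasp "feels" unstable, the firmer the commanded grip becomes.
    return sum(mu[state] * corrections[state] for state in mu) / total

grip_force = 1.0
grip_force += grip_force_correction(slip_rate=0.4)   # grip tightens as slippage grows

Because the rules operate on degrees of instability rather than exact sensor values, the same controller can respond smoothly whether the object is an egg, an umbrella, or a heavy bottle.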
The system proved remarkably effective when transferred to a robotic hand equipped with tactile sensors.
Experimental validation showed that the robot achieved an average grasping success rate of 95.2 per cent across a variety of challenging items, including slippery umbrellas, fragile raw eggs, and heavy bottles.
In dynamic tests, the robot successfully resisted external pulls and prevented slips by sensing the disturbance and autonomously increasing its grip strength.
Brewing success
To demonstrate the system’s versatility, the team tasked the robot with a complex real-world challenge: hand-brewing coffee.
From locating items and scooping coffee powder to stirring and serving, the robot used its tactile-based control to handle uncertainties at every step of the process.
Remarkably, this framework can be built from small datasets collected by a single person, yet it generalises well to new, unseen objects.
“Robots learn universal grasping by understanding the sensory and control logic behind the sensing data, rather than imitating human motion trajectories,” Professor Zhu said. “We teach robots the ability to draw inferences from one instance.”