MIT Creates a Highly Sensitive Robotic Hand

A team at MIT has created a robotic hand with high-resolution touch sensing that can accurately identify an object after just one grasp. The device – called the GelSight EndoFlex – features robotic fingers with a rigid 3D-printed endoskeleton encased in a soft outer shell, with multiple high-resolution sensors embedded under its transparent silicone skin.

Previous robotic hand designs have placed all of their sensors in the fingertips, so an object must be in full contact with those fingertips to be identified. Other designs spread lower-resolution sensors along the entire length of the finger, but these cannot capture as much detail. The endoskeleton of each finger of MIT’s new hand contains a pair of GelSight touch sensors, embedded in the top and middle sections underneath the transparent skin. Each sensor uses a camera and colored LEDs to gather visual information about an object’s shape, providing continuous detection along the entire length of the finger, so each finger can capture data about many parts of an object simultaneously. An algorithm then maps the contours of the grasped object’s surface and turns them into 3D images, which are fed as inputs to a machine-learning algorithm that identifies the object.
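To make that pipeline concrete, here is a minimal sketch of how a GelSight-style sensor reading might be turned into a 3D contour map. The linear shading model, the calibration matrix, and the function names are illustrative assumptions, not the team's actual implementation.

```python
# Hedged sketch of the GelSight-style sensing pipeline described above.
# The linear RGB-to-gradient model and the crude cumulative-sum
# integration are stand-ins for illustration only.
import numpy as np

def rgb_to_gradients(image, calib):
    """Map each pixel's RGB shading (from the colored LEDs) to a
    surface gradient (dz/dx, dz/dy) via a calibrated linear model."""
    h, w, _ = image.shape
    flat = image.reshape(-1, 3).astype(np.float64)
    grads = flat @ calib            # (h*w, 3) @ (3, 2) -> (h*w, 2)
    return grads.reshape(h, w, 2)

def integrate_heightmap(grads):
    """Integrate the gradient field into a rough height map by
    cumulative summation along each axis (a simple stand-in for a
    proper surface-integration method)."""
    gx, gy = grads[..., 0], grads[..., 1]
    zx = np.cumsum(gx, axis=1)      # integrate dz/dx along rows
    zy = np.cumsum(gy, axis=0)      # integrate dz/dy along columns
    return 0.5 * (zx + zy)

# Usage with synthetic data: one 64x64 tactile image from one sensor.
rng = np.random.default_rng(0)
tactile_image = rng.random((64, 64, 3))
calibration = rng.standard_normal((3, 2)) * 0.01  # hypothetical calibration
height_map = integrate_heightmap(rgb_to_gradients(tactile_image, calibration))
print(height_map.shape)  # (64, 64) contour map passed on to the classifier
```

The key idea is that the colored LEDs shade the deformed skin differently depending on surface slope, so each pixel's color can be decoded into a local gradient and the gradients integrated into a 3D shape.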

The GelSight EndoFlex has three fingers – two fingers arranged in a Y pattern with a third finger acting as an opposing thumb. The hand captures six images when it grasps an object (two from each finger), which are used as inputs to identify the object with about 85% accuracy. The team expects this figure to improve as the technology is developed further. The researchers also plan to improve the hardware to reduce wear and tear on the silicone over time, and to add more actuation to the thumb so it can perform a wider variety of tasks.
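As a rough illustration of the classification step, the sketch below stacks the six per-grasp tactile images (two per finger) as input channels to a small convolutional network. The architecture, input resolution, and class count are assumptions for illustration; the paper's actual model may differ.

```python
# Hedged sketch: classifying an object from six stacked tactile images.
import torch
import torch.nn as nn

class GraspClassifier(nn.Module):
    """Toy CNN that treats the six single-channel height maps from one
    grasp as six input channels. Hypothetical architecture."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(6, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),     # global pooling to (batch, 64, 1, 1)
        )
        self.head = nn.Linear(64, num_classes)

    def forward(self, x):                # x: (batch, 6, H, W)
        return self.head(self.features(x).flatten(1))

# One grasp = six 64x64 images stacked along the channel axis.
grasp = torch.randn(1, 6, 64, 64)        # synthetic stand-in data
logits = GraspClassifier()(grasp)
print(logits.argmax(dim=1))              # predicted object class
```

Stacking the images as channels lets the network see all six touch readings from a single grasp at once, which is what allows identification without re-grasping the object.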

“Our goal with this work was to combine all the things that make our human hands so good into a robotic finger that can do tasks other robotic fingers can’t currently do,” said Sandra Liu, co-lead author of the paper.