A Robotic Finger Learning to Touch

7 Mar - by aiuniverse - In Data Robot

Source: machinedesign.com

Researchers at Columbia Engineering have developed a fully integrated, sensorized robot finger.

Until now, touch sensors have been difficult to integrate into robot fingers. Among the challenges were the inability to cover multi-curved surfaces, high wire counts, and the difficulty of fitting sensors into small fingertips, all of which prevented applications with dexterous hands.

What sets this robot finger apart is its ability to localize touch with very high precision (<1 mm) over large, multi-curved 3D surfaces, much like a human hand, according to the engineers. Their approach was the novel use of overlapping signals from light emitters and receivers embedded in a transparent waveguide layer that covers the functional areas of the finger.

“There has long been a gap between standalone tactile sensors and fully integrated tactile fingers—tactile sensing is still far from ubiquitous in robotic manipulation,” says Matei Ciocarlie, associate professor in the departments of mechanical engineering and computer science, who led the research in collaboration with electrical engineering professor Ioannis (John) Kymissis. “In this paper, we have demonstrated a multi-curved robotic finger with accurate touch localization and normal force detection over complex 3D surfaces.”

By measuring light transport between every emitter and receiver, the researchers showed that they can obtain a very rich signal data set that changes in response to deformation of the finger due to touch. They then demonstrated that purely data-driven deep learning methods can extract useful information from the data, including contact location and applied normal force, without the need for analytical models.
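As a rough illustration of that data-driven mapping, a single forward pass of a small fully connected network could turn the raw signal vector into a contact estimate. The network size, input dimension, and output layout below are assumptions for illustration, not the architecture used in the paper:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

rng = np.random.default_rng(42)

# ~900 pairwise light-transport signals in; (x, y, z, normal force) out.
N_SIGNALS, HIDDEN, N_OUT = 900, 64, 4
W1 = rng.normal(0, 0.05, (N_SIGNALS, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0, 0.05, (HIDDEN, N_OUT))
b2 = np.zeros(N_OUT)

def predict(signals):
    """One forward pass: signal vector -> contact location + normal force."""
    h = relu(signals @ W1 + b1)
    return h @ W2 + b2

signals = rng.normal(0.0, 1.0, N_SIGNALS)  # one frame of sensor readings
estimate = predict(signals)
print(estimate.shape)  # (4,)
```

In practice the weights would be trained on labeled touch data; the random weights here only show the shapes involved in mapping hundreds of raw signals to a handful of contact quantities.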

For their efforts, the payoff was a fully integrated, sensorized robot finger, with a low wire count. The robot finger was built using accessible manufacturing methods and designed for easy integration into dexterous hands.

The finger uses light to sense touch. The researchers shined a light from more than 30 LEDs just below the “skin” into a layer made of transparent silicone. The finger also has more than 30 photodiodes for measuring how the light bounces around. Whenever the finger touches something, its skin deforms, so light shifts around in the transparent layer underneath. Measuring how much light goes from every LED to every diode, the researchers end up with close to 1,000 signals that each contain some information about the contact that was made. Since light can also bounce around in a curved space, these signals can cover a complex 3D shape such as a fingertip, explained the researchers.
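The signal count follows directly from the pairwise arrangement: every emitter/receiver combination yields one reading per frame. A minimal sketch, assuming exactly 30 LEDs and 30 photodiodes and a simple baseline-subtraction step (both assumptions for illustration):

```python
import numpy as np

N_EMITTERS = 30   # LEDs shining into the transparent silicone layer
N_RECEIVERS = 30  # photodiodes measuring how the light bounces around

rng = np.random.default_rng(0)

# Baseline light transport: one intensity reading per emitter/receiver pair.
baseline = rng.uniform(0.2, 1.0, size=(N_EMITTERS, N_RECEIVERS))

def read_signals(transport_matrix):
    """Flatten the pairwise readings into one feature vector per frame."""
    return transport_matrix.flatten()

# A touch deforms the skin, shifting light and perturbing many pairs at once.
touched = baseline * rng.uniform(0.9, 1.1, size=baseline.shape)

features = read_signals(touched) - read_signals(baseline)
print(features.shape)  # (900,) -- "close to 1,000 signals" per contact
```

Because each contact perturbs many emitter/receiver paths simultaneously, the flattened vector carries distributed information about where and how hard the finger was touched, which is what the learning stage then decodes.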

“The human finger provides incredibly rich contact information—more than 400 tiny touch sensors in every square centimeter of skin!” said Ciocarlie. “That was the model that pushed us to try and get as much data as possible from our finger. It was critical to be sure all contacts on all sides of the finger were covered—we essentially built a tactile robot finger with no blind spots.”

The Columbia team turned to machine learning algorithms to extract and process the data. The resulting system can tell where the finger is being touched, what is touching it, and how much force is being applied.

“Our results show that a deep neural network can extract this information with very high accuracy,” said Kymissis, who is also a member of the Data Science Institute. “Our device is truly a tactile finger designed from the very beginning to be used in conjunction with AI algorithms.”

The finger can be attached and integrated into robotic hands with relative ease, said the researchers. The finger collects almost 1,000 signals, but only needs a 14-wire cable connecting it to the hand, and it needs no complex off-board electronics.

The researchers already have two dexterous hands (capable of grasping and manipulating objects) in their lab being outfitted with these fingers—one hand has three fingers and the other four. In the coming months, the team will use these hands to try to demonstrate dexterous manipulation abilities based on tactile and proprioceptive data.

“Dexterous robotic manipulation is needed now in fields such as manufacturing and logistics, and is one of the technologies that, in the longer term, are needed to enable personal robotic assistance in other areas, such as healthcare or service domains,” Ciocarlie said.
