Researchers Build Robotic Arms Combining a Low-Cost Tactile Sensor with Machine Learning
Researchers from ETH Zürich have announced that they have leveraged machine learning to build a low-cost tactile sensor. The sensor can measure force distribution at high resolution and with high accuracy.
These features allow a robot arm to grasp sensitive, fragile objects more dexterously. Enabling robotic grippers to feel is essential to making them more capable.
The human sense of touch lets people pick up fragile or slippery objects without worrying about crushing or dropping them.
If an object is about to slip through their fingers, humans adjust their grip strength accordingly. Scientists want robotic grippers that pick up products to have the same kind of feedback humans get from their sense of touch. The researchers claim the new sensor is a vital step toward a "robotic skin."
The sensor has an elastic silicone skin with colored plastic microbeads and a regular camera fastened to its underside.
The sensor is vision-based: when it touches an object, an indentation appears in the silicone skin. The contact changes the pattern of the microbeads, which is registered by the fisheye lens on the underside of the sensor.
The robotic skin the scientists came up with can differentiate between several forces acting on the sensor surface and calculate them with high degrees of resolution and accuracy.
From these changes, the researchers can determine the direction from which a force is acting. To calculate which forces push the microbeads in which directions, the team draws on machine learning trained on data from a set of controlled experiments.
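The idea of mapping observed bead displacements to an applied force can be sketched as a supervised regression problem. The following is a minimal illustrative sketch, not the authors' actual method: it assumes synthetic calibration data (bead displacements paired with known force vectors from a test rig) and fits a simple linear model, where a real system would track beads in camera images and likely use a deeper model to capture the silicone's nonlinearity.

```python
# Hypothetical sketch: learn a mapping from microbead displacements
# (as a camera would observe them) to the applied 3-D force vector.
# All names and the linear model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_beads = 16      # number of tracked microbeads
n_samples = 500   # calibration presses with known forces

# Synthetic calibration data: each press yields a 2-D displacement per
# bead, plus a ground-truth 3-D force from a (hypothetical) test rig.
true_map = rng.normal(size=(2 * n_beads, 3))
displacements = rng.normal(size=(n_samples, 2 * n_beads))
forces = displacements @ true_map + 0.01 * rng.normal(size=(n_samples, 3))

# Fit the displacement-to-force mapping by least squares.
weights, *_ = np.linalg.lstsq(displacements, forces, rcond=None)

# Estimate the force for a new bead-displacement observation.
new_obs = rng.normal(size=(1, 2 * n_beads))
predicted_force = new_obs @ weights
print(predicted_force.shape)  # one 3-D force estimate
```

With enough calibration presses, the fitted weights recover the underlying displacement-to-force relationship, which is the essence of inferring force distribution from the camera's view of the beads.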