Artificial Brain Gives Robots Unprecedented Sensing Capabilities
Robots have come a long way in their functionality, but they still lack many of the sensing capabilities that humans rely on to interact with their environments.
To solve this issue, researchers at the National University of Singapore (NUS) have created an artificial brain system called NeuTouch that mimics human neural networks to provide neuromorphic processing for robotic systems. The goal is more sophisticated sensing functionality, including what's needed to pick up, hold, and manipulate objects the way humans do.
The current problem with robotic systems is that they depend on visual processing rather than the sense of touch that humans rely on to handle and manipulate objects, said Benjamin C.K. Tee, an assistant professor in NUS Materials Science and Engineering, who co-led the development of NeuTouch with Assistant Professor Harold Soh of NUS Computer Science.
“Robots need to have a sense of touch in order to interact better with humans, but robots today still cannot feel objects very well,” he told Design News. “Touch sensing allows robots to perceive objects based on their physical properties, e.g., surface texture, weight, and stiffness. Such tactile sensing capability augments the robot’s perception of the physical world with information beyond what standard vision and auditory modalities can provide.”
Building a Complete System
The new solution builds on technology Tee and fellow researchers created last year when they developed an artificial nervous system that can give robots and prosthetic devices a sense of touch on par with or even better than human skin.
This system, called Asynchronous Coded Electronic Skin (ACES), can detect touches more than 1,000 times faster than the human sensory nervous system, as well as identify the shape, texture, and hardness of objects 10 times faster than the blink of an eye, Design News reported at the time.
NeuTouch can process sensory data from ACES using neuromorphic technology, which is an area of computing that emulates the neural structure and operation of the human brain. To do this, researchers integrated Intel’s Loihi neuromorphic research chip into the system, Tee said.
By using ACES, NeuTouch can mimic the function of the fast-adapting (FA) mechanoreceptors of a human fingertip, which capture dynamic pressure, or rapid skin deformations, Tee said.
“FA responses are crucial for dexterous manipulation tasks that require rapid detection of object slippage, object hardness, and local curvature,” he told Design News.
Testing for Results
To test the system, researchers fitted a robotic hand with ACES and used it to read braille, passing the tactile data to Loihi via the cloud, which converted the micro bumps felt by the hand into semantic meaning.
In these experiments, Loihi achieved over 92 percent accuracy in classifying the braille letters, while using 20 times less power than a conventional microprocessor.
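The paper doesn't spell out the network used, but the basic idea of event-based tactile classification can be sketched with a leaky integrate-and-fire (LIF) neuron, the building block of chips like Loihi. In this illustrative example (all parameters and patterns are assumptions, not the NUS/Intel implementation), a raised braille dot produces bursts of touch events that drive the neuron over its threshold, while smooth paper does not:

```python
# Minimal sketch of event-based tactile sensing with a leaky
# integrate-and-fire (LIF) neuron. Illustrative only: parameters and
# input patterns are made up, not taken from the NeuTouch paper.

def lif_spike_count(input_current, tau=10.0, v_thresh=1.0, dt=1.0):
    """Count output spikes of a single LIF neuron driven by an event stream."""
    v, spikes = 0.0, 0
    for i in input_current:
        v += dt * (-v / tau + i)   # leaky integration of input events
        if v >= v_thresh:          # threshold crossing -> emit a spike
            spikes += 1
            v = 0.0                # reset membrane potential after spiking
    return spikes

# Toy inputs: a "braille dot" burst pattern vs. a weak, smooth-surface signal.
dot_pattern    = [1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0]
smooth_pattern = [0.1] * 8
```

Because computation happens only when events arrive, a spiking network like this sits idle between touches, which is where the reported power savings over a conventional processor come from.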
In other tests, researchers demonstrated how they could improve the robot’s perception capabilities by combining both vision and touch data in a spiking neural network. They tasked a robot equipped with both artificial skin and vision sensors to classify various opaque containers containing differing amounts of liquid. They also tested the system’s ability to identify rotational slip, which is important for stable grasping.
In both tests, the spiking neural network that used both vision and touch data was able to classify objects and detect object slippage with 10 percent more accuracy than a system that used only vision.
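The intuition behind the accuracy gain is that touch can separate objects that look identical to a camera. The sketch below is not the researchers' trained spiking network; it is a toy nearest-centroid classifier (all features and class names are invented for illustration) showing how fusing a touch feature with vision features disambiguates two visually identical containers:

```python
import numpy as np

# Illustrative only: fuse vision and touch features by concatenation,
# then classify with a simple nearest-centroid rule. The actual system
# uses a trained spiking neural network.

def fuse_and_classify(vision, touch, centroids):
    """Pick the class whose centroid is nearest to the fused feature vector."""
    x = np.concatenate([vision, touch])
    return int(np.argmin([np.linalg.norm(x - c) for c in centroids]))

# Two opaque containers produce the same vision features but different
# touch (weight) features, so only the fused vector separates the classes.
vision_same = np.array([0.5, 0.5])
touch_light = np.array([0.2])
touch_heavy = np.array([0.9])
centroids = [np.array([0.5, 0.5, 0.2]),   # hypothetical class 0: nearly empty
             np.array([0.5, 0.5, 0.9])]   # hypothetical class 1: nearly full
```

With vision features alone, both containers would sit at the same distance from every centroid; the touch dimension is what breaks the tie.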
Moreover, NeuTouch could classify the sensory data while it was still being accumulated, unlike the conventional approach in which data is classified only after it has been fully gathered.
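That "classify while accumulating" behavior can be sketched as anytime inference: keep a running evidence score per class as events stream in, and commit to a decision as soon as the leading class clears a confidence margin over the runner-up. This is a generic illustration of the idea, not the paper's method; the event types, scores, and margin below are assumptions:

```python
# Illustrative sketch of anytime (early-exit) classification over an
# event stream. All names, scores, and thresholds are assumptions.

def classify_early(event_stream, class_scores, margin=3.0):
    """Return (predicted_class, events_consumed), stopping early if confident."""
    evidence = [0.0] * len(class_scores)
    for t, event in enumerate(event_stream, start=1):
        for c, scores in enumerate(class_scores):
            evidence[c] += scores[event]         # running per-class evidence
        ranked = sorted(evidence, reverse=True)
        if ranked[0] - ranked[1] >= margin:      # confident enough: stop early
            return evidence.index(ranked[0]), t
    # Ran out of events: fall back to the best class seen so far.
    return evidence.index(max(evidence)), len(event_stream)

# Two classes, two event types: event 0 favors class 0, event 1 favors class 1.
scores = [{0: 1.0, 1: -1.0},   # hypothetical class 0
          {0: -1.0, 1: 1.0}]   # hypothetical class 1
```

The practical payoff is latency: a robot gripper does not need the full tactile recording before reacting to, say, an object starting to slip.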
The tests also demonstrated the efficiency of neuromorphic technology; Loihi processed the sensory data 21 percent faster than a top-performing graphics processing unit (GPU) while using more than 45 times less power.
Researchers published a paper on their work online and presented their findings at the Robotics: Science and Systems conference.
Applications and Post-COVID-19 Uses
Some applications for NeuTouch include integrating the system into robot grippers to detect slip, which is key to manipulating fragile objects safely and with stability, such as in factory or supply-chain settings, Tee told Design News.
“Accurate detection of slip will allow the robot controller to re-grasp the object and remedy poor initial grasp locations,” he told us. “This feature can be applied to develop more intelligent robots to take over mundane operations such as packing of items in warehouses, which robotic arms can easily adapt to unfamiliar items and apply the appropriate amount of strength to manipulate the items without slippage.”
The system also can be used to create autonomous robots “capable of deft manipulation in (unstructured) physical spaces, since the robots have the ability to feel and better perceive their surroundings,” he added.
Moving forward, researchers plan to continue their work to develop the artificial skin for applications in the logistics and food manufacturing industries where there is a high demand for robotic automation, Tee told Design News.
This type of functionality will become especially critical in a post-COVID-19 world, for applications that reduce human contact by letting robots do the work, he said.