Look out! A low-powered solution to keep robots from crashing
One of the big regulatory hurdles to the commercial adoption of drones — the kind that fly as well as the kind that roll down sidewalks and may one day deliver you a pizza — is collision safety. A small handful of safety incidents could have major consequences for adoption, and many regulatory bodies are loath to take a chance on automated equipment designed to operate in public spaces without much precedent.
Sensor redundancy certainly helps the commercial case for autonomous mobile robots and flying delivery drones, and radar is a fantastic sensor for collision avoidance when used in concert with other sensing modalities. But radar is problematic in that it adds payload weight and increases power requirements, upsetting the careful balancing act engineers must perform, particularly when developing flying robots.
A Belgian research and innovation hub focusing on nanotechnology has developed a solution that may open the floodgates for radar-based collision avoidance in UAVs, ground drones, and robots of various stripes. Imec, headquartered in Leuven, has built what it claims is the world’s first spiking neural network-based chip for radar signal processing.
“Today, we present the world’s first chip that processes radar signals using a recurrent spiking neural network,” says Ilja Ocket, program manager of neuromorphic sensing at imec. “SNNs operate very similarly to biological neural networks, in which neurons fire electrical pulses sparsely over time, and only when the sensory input changes. As such, energy consumption can significantly be reduced. What’s more, the spiking neurons on our chip can be connected recurrently – turning the SNN into a dynamic system that learns and remembers temporal patterns. The technology we are introducing today is a major leap forward in the development of truly self-learning systems.”
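To make the idea concrete, here is a minimal sketch of the event-driven behavior Ocket describes — not imec's design, just a textbook leaky integrate-and-fire (LIF) neuron, the simplest building block of spiking networks. The neuron integrates input over time and emits a spike only when its membrane potential crosses a threshold, so a quiet or unchanging input produces little or no activity (and hence little energy use). All parameter values below are illustrative, and the recurrent connections the chip uses are omitted for brevity.

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Simulate one leaky integrate-and-fire neuron over a sequence
    of input currents. Returns a list of 0/1 spikes, one per step.
    Threshold and leak values are illustrative assumptions."""
    potential = 0.0
    spikes = []
    for x in inputs:
        potential = leak * potential + x   # leaky integration of input
        if potential >= threshold:         # fire only on threshold crossing
            spikes.append(1)
            potential = 0.0                # reset membrane after a spike
        else:
            spikes.append(0)
    return spikes

# A step change in the input triggers a spike; steady silence stays silent.
print(lif_neuron([0.0, 0.0, 0.6, 0.6, 0.6, 0.0, 0.0]))
# → [0, 0, 0, 1, 0, 0, 0]
```

The sparseness is the point: most time steps produce no spike and therefore no downstream computation, which is where the claimed power savings of neuromorphic hardware come from.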
Interestingly, the chip was initially designed to support electrocardiogram (ECG) and speech processing in power-constrained devices. But its generic architecture turned out to be easy to reconfigure to process a variety of other sensory inputs like sonar, radar, and lidar data. The use case of drones seemed to naturally suggest itself from there. The drone industry, after all, is built on a foundation of power-constrained devices. Further, those devices need to react quickly to changes in their environment and, crucially, to avoid obstacles.
“Hence, a flagship use-case for our new chip includes the creation of a low-latency, low-power anti-collision system for drones. Doing its processing close to the radar sensor, our chip should enable the radar sensing system to distinguish much more quickly – and accurately – between approaching objects. In turn, this will allow drones to nearly instantaneously react to potentially dangerous situations,” says Ilja Ocket. “One scenario we are currently exploring features autonomous drones that depend on their on-board camera and radar sensor systems for in-warehouse navigation, keeping a safe distance from walls and shelves while performing complex tasks. This technology could be used in plenty of other use-cases as well – from robotics scenarios to the deployment of automatic guided vehicles (AGVs) and even health monitoring.”
It’s a great example of the technology and sensor convergence that’s driving much of the automation revolution. It could also be an important step as companies and organizations the world over make their cases to regulatory agencies to open access to skies and public roads for commercial drones.