Engineers have given their robot a sense of touch with synthetic smart skin

Cal Jeffrey

Posts: 2,452   +562
Staff member
In a nutshell: Robots with full-sensory capabilities similar to humans are still quite far off. However, engineers at the National University of Singapore have published research taking them a step closer to that reality. Combining an event-based vision system and synthetic tactile-sensing skin with Intel's Loihi neuromorphic chip, the team produced a highly accurate model that's up to 1,000 times faster than the human nervous system.

Researchers at the National University of Singapore (NUS) have released a study showing how neuromorphic technology allowed them to develop a synthetic skin for robots that is 1,000 times faster than a human's sensory system. The tech can also identify the shape, texture, and hardness of an object 10 times more quickly than the blink of an eye.

The NUS research team leveraged Intel's Loihi self-learning neuromorphic chip to process signals from the artificial skin. The first test had a robot fitted with the skin read Braille. Loihi received the tactile data and translated the Braille with better than 92 percent accuracy. The neuromorphic architecture also consumed 20 times less power than a standard von Neumann processor.

The team then combined tactile data from the synthetic skin with visual signals from an event-based camera to identify containers holding varying amounts of liquid. The skin can detect slight slippage, which helped the robot sense weight. Adding these tactile cues to the visual input made the system 10 percent more accurate at classifying the containers than event-based vision alone.
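The fusion idea above can be sketched in a few lines. This is a purely hypothetical toy, not the NUS/Loihi pipeline: it assumes each modality produces a per-class score list and simply takes a weighted combination, showing how an ambiguous vision-only result can be resolved by a tactile slip signal.

```python
# Toy late-fusion classifier: combine per-class scores from two
# sensor modalities. Illustrative only -- the function name, weights,
# and scores are invented for this sketch, not from the paper.
def fuse_and_classify(vision_scores, touch_scores, w_vision=0.5, w_touch=0.5):
    """Weighted sum of two per-class score lists; returns the winning class index."""
    fused = [w_vision * v + w_touch * t
             for v, t in zip(vision_scores, touch_scores)]
    return fused.index(max(fused))

# Two container classes: vision alone is nearly tied, but the tactile
# slip signal (a heavier container slips more in the gripper) breaks the tie.
vision = [0.51, 0.49]
touch = [0.20, 0.80]
print(fuse_and_classify(vision, touch))  # prints 1
```

In practice a spiking network on Loihi learns this combination rather than using fixed weights, but the payoff is the same: the fused decision is more reliable than either modality on its own.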

For comparison, the researchers ran this test on both a high-end GPU and the Loihi chip. They found that the spiking neural network processed the combined data only slightly faster (21 percent) than a "top-performing" GPU, but its power consumption (1.3W) was less than the idle power draw of the GPU.

"We're excited by these results," said NUS School of Computing Assistant Professor Harold Soh. "They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve robot perception. It's a step toward building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations."

Further work in combining neuromorphic processing with sensory data could lead to robots capable of highly delicate work. They would also be able to react dynamically to changing conditions, such as adjusting grip depending on whether an object is wet, dry, heavy, light, hard, or soft.

The researchers presented their findings (virtually) this week at the Robotics: Science and Systems conference. You can watch the five-minute keynote above, or if you are into all the nuts and bolts, they posted the paper on their website.


Uncle Al

Posts: 7,237   +5,636
I hope these discoveries are being seriously looked at by the makers of artificial limbs, especially for hands and feet. Being able to regain "feeling" would be a serious game changer for these folks!