Robots That ‘Feel’ Could Begin New Era with Safer Human-Robotic Interaction

Two researchers from the National University of Singapore (NUS), who are members of the Intel Neuromorphic Research Community (INRC), presented new findings demonstrating the promise of event-based vision and touch sensing combined with Intel's neuromorphic processing for robotics. The work highlights that a sense of touch can significantly improve robots' capabilities and functionality.

Mike Davies, director of Intel's Neuromorphic Computing Lab, said, "This research from National University of Singapore provides a compelling glimpse into the future of robotics, where information is both sensed and processed in an event-driven manner combining multiple modalities. The work adds to a growing body of results showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is re-engineered in an event-based paradigm spanning sensors, data formats, algorithms, and hardware architecture."
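The core idea behind the event-based paradigm Davies describes is that sensors transmit only changes rather than full readings at a fixed rate. The sketch below is not the NUS system; it is a minimal illustrative example, with a hypothetical `to_events` function, of how a tactile sensor array might be converted to sparse events:

```python
import numpy as np

def to_events(prev, curr, threshold=0.1):
    """Emit (taxel_index, sign) events only for sensor elements whose
    reading changed by more than the threshold -- the essence of
    event-based sensing: transmit changes, not full frames."""
    diff = curr - prev
    changed = np.flatnonzero(np.abs(diff) > threshold)
    return [(int(i), 1 if diff[i] > 0 else -1) for i in changed]

# A 16-taxel "skin" patch: mostly static, one contact point changes.
prev = np.zeros(16)
curr = np.zeros(16)
curr[5] = 0.8   # pressure increase at taxel 5

events = to_events(prev, curr)
print(events)   # → [(5, 1)]
print(f"{len(events)} event(s) vs {curr.size} values in a full frame")
```

Because only changed taxels generate output, a mostly static scene produces almost no data to process, which is where the latency and power gains of event-driven pipelines come from.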

Most of today's robots rely solely on visual processing. Researchers at NUS hope to change this using their recently developed artificial skin, which can detect touch more than 1,000 times faster than the human sensory nervous system and identify the shape, texture and hardness of objects 10 times faster than the blink of an eye.

Enabling a human-like sense of touch in robotics could improve current functionality and lead to new use cases. The ability to feel and better perceive their surroundings could enable safer human-robot interaction.

While the creation of artificial skin is one step in bringing this vision to life, the system also requires a chip that can draw accurate conclusions from the skin's sensory data in real time. Assistant Professor Benjamin Tee from the NUS Department of Materials Science and Engineering and NUS Institute for Health Innovation & Technology said, "Making an ultra-fast artificial skin sensor solves about half the puzzle of making robots smarter. They also need an artificial brain that can ultimately achieve perception and learning as another critical piece in the puzzle. Our unique demonstration of an AI skin system with neuromorphic chips such as the Intel Loihi provides a major step forward towards power efficiency and scalability."

Assistant Professor Harold Soh from the Department of Computer Science at the NUS School of Computing said, "We're excited by these results. They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve robot perception. It's a step toward building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations."

You can read more details here.
