Researchers partner with Intel to create ultra-sensitive artificial skin

The field of robotics has made significant progress in recent years, but one of the main differences between us and our robot counterparts remains sensory ability. Dexterity alone isn't always enough to perform a task, because our sense of touch feeds the brain a wealth of background information. Today, researchers at the National University of Singapore (NUS) unveiled new tools that allow robots to perceive touch, potentially opening the door for them to take on more tasks.

Today's announcement has been in the works for some time. Last year, an NUS Materials Science and Engineering team led by assistant professor Benjamin Tee detailed the development of an artificial skin sensor designed to give robots and prosthetics a sense of touch. As Tee explained in today's announcement, developing the artificial skin is only part of the journey toward the team's goal, which is to actually use the skin to enhance a robot's capabilities.

Creating an ultra-fast artificial skin sensor solves only half the challenge of making robots smarter; the other key piece of the puzzle is an artificial brain that can ultimately achieve perception and learning. In this case, the brain is Intel's Loihi neuromorphic computing chip. Loihi uses a spiking neural network to process touch data collected by the artificial skin and visual data collected by event-based cameras. In one test, the NUS researchers had a robot "read" Braille letters, passing the tactile data to Loihi through the cloud. Loihi classified the letters with more than 92 percent accuracy while using 20 times less power than a standard von Neumann processor.
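To give a sense of how a spiking neural network differs from a conventional one, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the basic building block of the networks that chips like Loihi implement in hardware. This is an illustrative toy, not the NUS team's code or Intel's API: the constants (decay, weight, threshold) and the input spike train are invented for the example.

```python
# Minimal leaky integrate-and-fire neuron: a sketch of the spiking
# computation style used by neuromorphic chips. All parameters here
# are hypothetical, chosen only to make the example run.

import numpy as np

def lif_neuron(spike_train, weight=0.6, decay=0.9, threshold=1.0):
    """Integrate weighted input spikes with leak; fire an output spike
    when the membrane potential crosses the threshold, then reset."""
    potential = 0.0
    output = []
    for spike in spike_train:
        potential = decay * potential + weight * spike  # leak, then integrate
        if potential >= threshold:
            output.append(1)   # emit an output spike
            potential = 0.0    # reset after firing
        else:
            output.append(0)
    return output

# A hypothetical spike train, e.g. events from one tactile sensor element.
events = np.random.binomial(1, 0.3, size=20)
print(lif_neuron(events))
```

Because information travels as sparse, asynchronous spikes rather than dense activations, the hardware only does work when an event arrives, which is where the power savings over a conventional processor come from.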

In another test, the researchers paired the touch system with an event-based camera, allowing the robot to classify various opaque containers holding different amounts of liquid. Using the tactile and visual data, Loihi was also able to identify the rotational slip of each container, which the NUS researchers say is important for maintaining a stable grip. Finally, the researchers found that pairing event-based vision with tactile data from the artificial skin, then processing the combined information with Loihi, improved object-classification accuracy by 10 percent over a vision-only system.
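The intuition behind that accuracy gain is straightforward sensor fusion: when one modality is ambiguous (a camera cannot see through an opaque container), the other can tip the decision. The sketch below shows one simple way to combine per-class scores from a vision-only and a touch-only classifier; the class labels and probabilities are invented, and the multiplicative "late fusion" rule is a common generic technique, not necessarily the one the NUS team used.

```python
# Late fusion of two hypothetical classifiers' per-class probabilities.
# Vision alone is ambiguous here; touch cues (weight, slip) break the tie.

import numpy as np

classes = ["empty", "half-full", "full"]      # hypothetical labels
p_vision = np.array([0.40, 0.35, 0.25])       # ambiguous on its own
p_touch  = np.array([0.10, 0.70, 0.20])       # tactile cues are sharper

fused = p_vision * p_touch                    # elementwise product
fused /= fused.sum()                          # renormalize to probabilities

print("vision only:", classes[p_vision.argmax()])
print("fused      :", classes[fused.argmax()], fused.round(3))
```

Running this, vision alone picks "empty" while the fused estimate correctly favors "half-full", mirroring how adding tactile data can lift classification accuracy over a vision-only system.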

In everyday life, robots that can feel as well as see could have many applications, from caring for the elderly or the sick to working alongside people on factory lines. Giving robots a sense of touch could also open the door to automating surgical tasks, which would be a major advance in robotics.
