Scientists at the Massachusetts Institute of Technology are working on soft robotics, an approach that builds robots from flexible materials rather than traditional rigid ones. At present, the capabilities of soft robots are limited by poor perception: a robot needs to feel what it is touching and know where its fingers are, abilities that most soft robots lack.
Researchers at the Massachusetts Institute of Technology have published two new papers outlining tools that give robots a better sense of how they are interacting with objects. Specifically, the papers describe new methods that let robots feel and classify objects, and achieve a softer, more delicate touch. The soft robots have sensor-studded skins on their grippers that let them handle a range of items, from fragile objects such as crisps to heavier bottles, the team said.
One of the papers builds on research published last year by the Massachusetts Institute of Technology and Harvard University. In that study, the team developed a soft, tapered origami gripper that folds around an object and can lift items 100 times its own weight. The team has now added tactile sensors to the origami structure, made from latex bladders connected to pressure transducers.
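The paper's signal-processing details are not given here, but the first step for any pressure-based tactile skin is turning raw sensor readings into a simple grasp signature. A minimal sketch in Python (sensor counts, values, and the contact threshold are invented for illustration, not taken from the paper):

```python
# Illustrative sketch: summarize readings from a set of tactile
# pressure sensors into a simple contact "signature" for one grasp.
# All numbers and the threshold are invented for this example.

def grasp_signature(pressures, contact_threshold=0.2):
    """Summarize one grasp from a list of normalized pressure readings.

    Returns the indices of sensors reporting contact and the mean
    pressure over those contacting sensors.
    """
    contacts = [i for i, p in enumerate(pressures) if p >= contact_threshold]
    mean_pressure = (
        sum(pressures[i] for i in contacts) / len(contacts) if contacts else 0.0
    )
    return {"contacts": contacts, "mean_pressure": mean_pressure}

# Example: four of six sensors register firm contact with the object.
sig = grasp_signature([0.05, 0.4, 0.55, 0.0, 0.3, 0.25])
print(sig["contacts"])       # indices of sensors above the threshold
print(sig["mean_pressure"])  # average pressure over contacting sensors
```

A signature like this, collected over a whole grasp, is the kind of feature vector a classifier could then consume.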
The new sensors allow the robot to grasp and classify objects, giving it a better understanding of what it is holding. The sensors let the robot identify 10 objects with more than 90% accuracy, even when an object slips out of its grip. The second paper describes a gripper called “GelFlex,” which uses embedded cameras and deep learning to achieve high-resolution tactile sensing and proprioception, an awareness of its own movements. The gripper resembles a two-fingered cup gripper and drives its fingers with a tendon-driven mechanism. In tests on metal objects of various shapes, the system identified objects with more than 96% accuracy.
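Both systems ultimately map sensor features to object labels. The papers' actual models are not reproduced here, but that final classification step can be illustrated with a toy nearest-centroid classifier in Python (the object names, feature vectors, and four-sensor layout are all invented for the example):

```python
import math

# Toy nearest-centroid classifier over tactile feature vectors.
# Each centroid stands in for the average pressure pattern recorded
# while grasping one object class; every number here is invented.

CENTROIDS = {
    "chip_bag":    [0.1, 0.1, 0.1, 0.1],  # light, even contact
    "soda_bottle": [0.6, 0.6, 0.5, 0.5],  # firm, even contact
    "mug_handle":  [0.7, 0.1, 0.7, 0.1],  # firm on two sensors only
}

def classify(reading):
    """Return the object class whose centroid is closest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda name: dist(reading, CENTROIDS[name]))

print(classify([0.65, 0.55, 0.5, 0.45]))  # → soda_bottle
```

A real system would replace the hand-written centroids with a learned model (the GelFlex work uses deep networks over camera images), but the structure of the problem, mapping a sensed feature vector to the nearest known object class, is the same.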