MIT 3D-Printed Soft Robot Hand Adapts to Objects

One downside to the extra flexibility of soft robots is that they often have difficulty accurately measuring where an object is, or even whether they have successfully picked it up. Researchers at MIT have found a way to change that.

Photo Caption: Three fingers on a new soft robotic gripper each have special sensors that can estimate the size and shape of an object accurately enough to identify it from a set of multiple items. (Photo Credit: Jason Dorfman/CSAIL)

“As a human, if you’re blindfolded and you pick something up, you can feel it and still understand what it is,” says Katzschmann. “We want to develop a similar skill in robots - essentially, giving them ‘sight’ without them actually being able to see.”

The team is hopeful that, with further sensor advances, the system could eventually identify dozens of distinct objects, and be programmed to interact with them differently depending on their size, shape, and function. 
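The article does not spell out how the identification works, but matching a sensor-derived size and shape estimate against a small library of known items can be as simple as a nearest-neighbor lookup. The sketch below is a hypothetical illustration of that idea; the object names, dimensions, and distance metric are assumptions for illustration, not details of the MIT system.

```python
import math

# Known objects: name -> (estimated width in cm, estimated height in cm).
# These entries are illustrative placeholders, not data from the MIT gripper.
KNOWN_OBJECTS = {
    "soda can": (6.6, 12.2),
    "tennis ball": (6.7, 6.7),
    "CD case": (14.2, 1.0),
    "coffee mug": (8.5, 9.5),
}

def identify(width_cm: float, height_cm: float) -> str:
    """Return the known object whose size estimate is closest to the reading."""
    def distance(dims: tuple) -> float:
        w, h = dims
        return math.hypot(w - width_cm, h - height_cm)
    return min(KNOWN_OBJECTS, key=lambda name: distance(KNOWN_OBJECTS[name]))

# Example: a roughly can-sized reading from the finger sensors
print(identify(6.5, 12.0))  # -> soda can
```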

How It Works

Researchers control the gripper via a series of pistons that push pressurized air through the silicone fingers. The pistons cause little bubbles to expand in the fingers, spurring them to stretch and bend.
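As a rough illustration of that actuation idea, the hypothetical sketch below maps a desired finger bend to a piston air pressure using a simple linear model. The constants, units, and the linear relationship itself are assumptions made for illustration; the article does not describe the researchers' actual actuation behavior.

```python
# All constants and the linear pressure-to-bend model are illustrative
# assumptions, not parameters of the MIT gripper.
MAX_PRESSURE_KPA = 60.0   # assumed safe upper bound for the silicone finger
BEND_DEG_PER_KPA = 0.5    # assumed degrees of bend produced per kPa of pressure

def pressure_for_bend(target_bend_deg: float) -> float:
    """Air pressure (kPa) the piston should supply to reach a target bend angle."""
    pressure = target_bend_deg / BEND_DEG_PER_KPA
    return min(max(pressure, 0.0), MAX_PRESSURE_KPA)  # clamp to a safe range

# Example: bend one finger to roughly 25 degrees for a gentle grasp
print(pressure_for_bend(25.0))  # -> 50.0
```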

The hand can grip using two types of grasps: “enveloping grasps,” where the object is entirely contained within the gripper, and “pinch grasps,” where the object is held by the tips of the fingers.
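A minimal, hypothetical way to pick between those two grasp types is to compare an object's estimated dimensions against the gripper's opening: thin or oversized objects get pinched at the fingertips, and everything else gets enveloped. The threshold values below are illustrative assumptions, not measurements of the MIT gripper.

```python
# Illustrative thresholds; not measurements of the MIT gripper.
GRIPPER_OPENING_CM = 10.0  # assumed widest object the three fingers can enclose
FLAT_OBJECT_CM = 2.0       # assumed height below which an object is "flat"

def choose_grasp(width_cm: float, height_cm: float) -> str:
    """Pinch thin or oversized objects at the fingertips; envelop everything else."""
    if height_cm < FLAT_OBJECT_CM or width_cm > GRIPPER_OPENING_CM:
        return "pinch grasp"
    return "enveloping grasp"

print(choose_grasp(6.6, 12.2))  # soda can    -> enveloping grasp
print(choose_grasp(12.0, 0.2))  # CD or paper -> pinch grasp
```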

Outfitted for the popular Baxter manufacturing robot, the gripper significantly outperformed Baxter’s default gripper, which was unable to pick up a CD or piece of paper and was prone to completely crushing items like a soda can.

As with Rus’ previous robotic arm, the fingers are made of silicone rubber, chosen because it is relatively stiff yet flexible enough to expand under pressure from the pistons. Meanwhile, the gripper’s interface and exterior finger molds are 3D-printed, which means the system will work on virtually any robotic platform.

In the future, Rus says, the team plans to improve the existing sensors and add new ones that will allow the gripper to identify a wider variety of objects.

“If we want robots in human-centered environments, they need to be more adaptive and able to interact with objects whose shape and placement are not precisely known,” Rus says. “Our dream is to develop a robot that, like a human, can approach an unknown object, big or small, determine its approximate shape and size, and figure out how to interface with it in one seamless motion.”

This work was done in the Distributed Robotics Laboratory at MIT with support from The Boeing Company and the National Science Foundation.

Editor’s Note: This article first appeared on MIT News and was republished with permission.




About the Author

MIT News · MIT News is dedicated to communicating the news and achievements of the students at the Massachusetts Institute of Technology to the media and the public.
Contact MIT News: newsoffice@mit.edu


