Human hands can hold an infant, play a sonata or perform open heart surgery. For decades, robot grippers have strived to match our hands’ sensitivity and performance, usually unsuccessfully.
Sometimes, however, dexterity isn’t everything. It’s much easier and more efficient, for example, to scoop beach sand with a bucket than with bare hands. Humans frequently and instinctively use nearby surfaces to help grasp and hold objects, and now MIT researchers have taught robots the same trick. Instead of attempting a deft grasp that relies on advanced computer vision algorithms and elaborate robotic hand engineering, the gripper simply pushes the target object against a stationary surface, which makes it much easier to manipulate further.
Picking up an object is far more complicated for robots than we might imagine: the robot has to account for environmental geometry, friction, and the fundamental laws of physics. The MIT researchers’ novel method speeds up grasp planning from more than 10 minutes to less than a second. The research has been published in The International Journal of Robotics Research.

Previous algorithms could take hours to plan motions for a robotic gripper, relying on lengthy computations grounded in physical laws such as Newton’s laws of motion. The MIT researchers accelerated the process by exploiting the surrounding environment to help robots accomplish physical tasks. Their algorithm calculates a cone-shaped map of friction, a “motion cone,” for every possible configuration of the robotic gripper, the object it is holding, and the planes it can push against.
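The paper’s motion cones are defined over object motions in planar prehensile pushing; as a rough, simplified illustration rather than the authors’ implementation, the sketch below constructs a planar Coulomb friction cone at a single contact and checks whether a desired pushing direction lies inside it. The contact normal, friction coefficient, and query direction are all assumed values.

```python
import numpy as np

def friction_cone_edges(normal, mu):
    """Edges of a planar Coulomb friction cone: the unit contact
    normal rotated by +/- arctan(mu)."""
    half_angle = np.arctan(mu)

    def rot(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s], [s, c]])

    return rot(-half_angle) @ normal, rot(half_angle) @ normal

def inside_cone(v, edge_right, edge_left):
    """True if the 2D vector v lies inside the convex cone spanned
    counterclockwise from edge_right to edge_left (opening < 180 degrees)."""
    cross = lambda a, b: a[0] * b[1] - a[1] * b[0]
    return cross(edge_right, v) >= 0 and cross(v, edge_left) >= 0

# Toy query: can the gripper push the held object along desired_dir
# without slipping at a contact with normal n and friction coefficient mu?
n = np.array([0.0, 1.0])            # contact normal (assumed)
mu = 0.5                            # friction coefficient (assumed)
desired_dir = np.array([0.3, 1.0])  # candidate pushing direction (assumed)

e_right, e_left = friction_cone_edges(n, mu)
print(inside_cone(desired_dir / np.linalg.norm(desired_dir), e_right, e_left))  # True
```

In the paper’s approach, such cones are computed for every gripper-object-surface configuration, so planning reduces to fast cone-membership checks rather than solving the full contact mechanics for each candidate motion.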
Experimental results show the algorithm’s predictions reliably match physical outcomes in the lab, and it needs less than one second to plan motions that traditional algorithms take over 500 seconds on average to calculate.
It’s believed this three-way interaction among robot, object and surroundings could help robots perform pick-and-sort tasks more efficiently, especially in busy industrial environments. The research team hopes to extend the approach so robotic grippers can also handle different types of tools, with potential applications in manufacturing, for example.
This is not the only sorting project MIT is working on. Earlier this year, MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) developed “RoCycle,” a robot arm with soft grippers that can pick up objects from a conveyor belt and identify their materials by touch. In a mock recycling-plant setup, RoCycle classified a set of 27 objects with 85 percent accuracy.
The paper Planar In-Hand Manipulation via Motion Cones is available here.
Author: Yuqing Li | Editor: Michael Sarazen