MIT researchers are building a dataset that captures how various objects respond to being pushed. The goal? To teach future robots how to interact with objects in the real world.
The project involves letting robots learn about different shapes and weight distributions first-hand through machine learning.
"We need a lot of rich data to make sure our robots can learn," said Maria Bauza, a graduate student in the Department of Mechanical Engineering who worked on the project. "Here, we're collecting data from a real robotic system, [and] the objects are varied enough to capture the richness of the pushing phenomena. This is important to help robots understand how pushing works, and to translate that information to other similar objects in the real world."
From the release:
To capture the data, the researchers designed an automated system consisting of an industrial robotic arm with precise control, a 3D motion-tracking system, depth and traditional cameras, and software that stitches everything together. The arm pushes around modular objects that can be adjusted for weight, shape, and mass distribution. For each push, the system captures how those characteristics affect the robot's push.
The dataset, called "Omnipush," contains 250 different pushes of 250 objects, totaling roughly 62,500 unique pushes. It's already being used by researchers to, for instance, build models that help robots predict where objects will land when they're pushed.
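Predicting where an object ends up after a push is, at heart, a supervised-learning problem: the inputs describe the object and the push (its mass, where it was contacted, in what direction it was shoved), and the output is how far it slid and rotated. The sketch below is not the researchers' actual model, and it uses made-up synthetic numbers in place of the real Omnipush recordings; it's only meant to show the shape of that workflow in Python with scikit-learn.

    # Illustrative sketch only: synthetic stand-in data, not the real Omnipush recordings.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_pushes = 2000

    # Inputs for each push: object mass (kg), contact point (x, y), push angle (rad)
    mass = rng.uniform(0.1, 1.0, n_pushes)
    contact = rng.uniform(-0.05, 0.05, (n_pushes, 2))
    angle = rng.uniform(-np.pi, np.pi, n_pushes)
    X = np.column_stack([mass, contact, angle])

    # Outcomes: a made-up relationship standing in for real push physics --
    # lighter objects travel farther, and displacement follows the push direction.
    dx = 0.02 / mass * np.cos(angle)
    dy = 0.02 / mass * np.sin(angle)
    dtheta = 0.5 * (contact[:, 0] * np.sin(angle) - contact[:, 1] * np.cos(angle)) / mass
    y = np.column_stack([dx, dy, dtheta])

    # Train a regressor to map push descriptions to resulting displacements.
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    # Check how well the model predicts where held-out pushes end up.
    pred = model.predict(X_test)
    print("mean absolute error (dx, dy, dtheta):", np.abs(pred - y_test).mean(axis=0))

With real recordings, the feature columns would come from the modular objects' configurations and the motion-tracking system rather than a random-number generator, but the train-and-predict loop looks much the same.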
Because robots have no built-in context for how objects move in 3D space, this dataset gives them diverse, complex examples of physical interactions. That will be helpful when robots begin interacting with objects in the home or on the street, where it pays to know whether a gentle push is preferable to a violent shove.