MIT researchers have compiled a dataset that captures the detailed behavior of a robotic system physically pushing hundreds of different objects. Using the dataset, robots “learn” pushing dynamics. To capture the data, the researchers used an automated system consisting of an industrial robotic arm, a 3D motion-tracking system, depth and traditional cameras, and software that stitches everything together.
The arm pushes around modular objects that can be adjusted for weight, shape, and mass distribution. For each push, the system captures how those characteristics affect the object's resulting motion. The dataset, called Omnipush, is already being used to build models that help robots predict where objects will land when they're pushed.
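To make the idea of learning pushing dynamics concrete, here is a minimal sketch of fitting a predictive model from push data. The feature layout, coefficients, and synthetic data are all hypothetical and do not reflect Omnipush's actual schema or the researchers' models; the sketch simply shows the pattern of mapping push parameters to a predicted object displacement.

```python
import numpy as np

# Hypothetical sketch (not Omnipush's actual schema): learn a linear
# pushing-dynamics model that maps push parameters to the object's
# planar displacement after a push.
# Features: [push_dir_x, push_dir_y, push_distance, object_mass]
# Targets:  [dx, dy] object displacement.

rng = np.random.default_rng(0)
n = 400
X = rng.uniform(-1.0, 1.0, size=(n, 4))

# Synthetic ground truth: displacement mostly follows the push
# direction, scaled by push distance; mass damps the motion
# (a linearized toy model, invented for this example).
W_true = np.array([[0.8, 0.0],
                   [0.0, 0.8],
                   [0.3, 0.3],
                   [-0.2, -0.2]])
Y = X @ W_true + rng.normal(0.0, 0.01, size=(n, 2))

# Fit by least squares: each recorded push is one training example.
W_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Predict where an object lands for a new push.
push = np.array([[1.0, 0.0, 0.5, 0.4]])
print(push @ W_hat)  # ~ [[0.87, 0.07]]
```

In practice a dataset like Omnipush supports far richer models (e.g. neural networks over object shape and contact geometry), but the workflow is the same: many recorded pushes in, a displacement predictor out.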