A key to compiling the Omnipush dataset was building modular objects (pictured) that enabled the robotic system to capture a vast diversity of pushing behavior. The central pieces carry markers at their centers and tips so a motion-tracking system can detect their position to within a millimeter. (Image courtesy of the researchers)

MIT researchers have compiled a dataset that captures the detailed behavior of a robotic system physically pushing hundreds of different objects. Using the dataset, robots can “learn” pushing dynamics. To capture the data, the researchers built an automated system consisting of an industrial robotic arm, a 3D motion-tracking system, depth and traditional cameras, and software that stitches everything together.

The arm pushes around modular objects that can be adjusted for weight, shape, and mass distribution. For each push, the system records how those characteristics affect the object’s motion. The dataset, called Omnipush, is already being used to build models that help robots predict where objects will land when they’re pushed.
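To make the prediction task concrete, here is a minimal, hypothetical sketch of learning pushing dynamics from logged pushes. The feature names, the synthetic data, and the linear least-squares model are all illustrative assumptions; the article does not specify the researchers’ actual model or data format.

```python
# Illustrative sketch only: Omnipush-style data pairs each push action with the
# object's resulting planar displacement. We fit a linear least-squares model
# mapping (push angle, contact point, mass) -> (dx, dy, dtheta).
# All feature names and the synthetic data below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for logged pushes: one feature row per push.
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 3))            # [push_angle, contact_point, mass]
true_W = np.array([[0.5, 0.1, 0.0],
                   [0.0, 0.4, -0.2],
                   [0.3, 0.0, 0.1]])
Y = X @ true_W.T + 0.01 * rng.normal(size=(n, 3))  # [dx, dy, dtheta] with sensor noise

# Least-squares fit of the pushing dynamics: solve X @ W ≈ Y.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Predict where a new push would move the object.
x_new = np.array([[0.2, -0.5, 0.8]])
pred = x_new @ W                                   # predicted (dx, dy, dtheta)
print(pred.shape)
```

With enough logged pushes, even this simple regressor recovers the underlying dynamics; the value of a dataset like Omnipush is that it supplies that diversity of objects and pushes for far richer models.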