'Conduct-a-Bot': How to Control Drones and Robots With Gestures

Enabling robots to understand nonverbal cues like gestures is an important step toward more widespread human-robot collaboration. This system, from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), takes a step toward that goal by detecting gestures from wearable muscle and motion sensors. A user can remotely control a robot by wearing small sensors on the biceps, triceps, and forearm and making gestures. The current system, called "Conduct-a-Bot," detects eight predefined navigational gestures without requiring offline calibration or per-user training data. A new user can simply put on the sensors and start gesturing to remotely pilot a drone through hoops, for example. The system uses plug-and-play algorithms and builds up an expandable vocabulary for communicating with a robot.
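The core idea, detecting a small vocabulary of gestures from muscle (EMG) and motion sensor readings and mapping each one to a robot command, can be sketched roughly as below. This is a minimal illustration only: the gesture names, thresholds, feature choices, and command values are hypothetical assumptions, not the actual Conduct-a-Bot pipeline.

```python
# Hypothetical sketch of gesture-to-command mapping for a wearable
# sensor interface. All names and thresholds here are illustrative
# assumptions, not taken from the Conduct-a-Bot system itself.

def classify_gesture(emg_rms: float, gyro_z: float) -> str:
    """Classify a gesture from a muscle-activity level (an EMG RMS
    envelope, normalized to 0..1) and a forearm rotation rate
    (gyroscope z-axis, rad/s). Thresholds are made up for illustration."""
    if emg_rms > 0.8:        # strong contraction, e.g. a fist clench
        return "stop"
    if gyro_z > 2.0:         # fast clockwise forearm rotation
        return "rotate_right"
    if gyro_z < -2.0:        # fast counterclockwise rotation
        return "rotate_left"
    if emg_rms > 0.4:        # moderate contraction
        return "move_forward"
    return "none"            # no recognized gesture

# Map each recognized gesture to a (command, magnitude) pair that a
# drone or robot controller could consume.
GESTURE_TO_COMMAND = {
    "stop":         ("hover", 0.0),
    "rotate_right": ("yaw",   0.5),
    "rotate_left":  ("yaw",  -0.5),
    "move_forward": ("pitch", 0.3),
    "none":         ("hold",  0.0),
}

def command_for(emg_rms: float, gyro_z: float) -> tuple:
    """End-to-end: sensor features in, robot command out."""
    return GESTURE_TO_COMMAND[classify_gesture(emg_rms, gyro_z)]
```

In the real system the classification would run continuously on streaming sensor data rather than on single feature values, but the pipeline shape is the same: extract features, classify a gesture, emit a command.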