A robot is being developed that tracks facial movements to perform human tasks. The robot resembles large, squiggly arms holding tiny cameras. Sitting in a rolling office chair across from one of the arms, the robot's developer, Nathan Huber, demonstrates how it works. Huber rolls from left to right, forward and back, and the robot's camera follows his movements. He programmed the robot to track objects in stages. First, it identifies an object and locates it on the X and Y axes. Then, with a spatial-tracking algorithm, it determines how near or far the object is in order to grab it.
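The article does not say which spatial-tracking algorithm Huber uses, but one common monocular approach is the pinhole-camera model, where an object's apparent size shrinks linearly with distance. The sketch below illustrates that idea; the focal length and average face width are illustrative assumptions, not values from Huber's system.

```python
# Illustrative sketch of monocular distance estimation via the
# pinhole-camera model. The focal length (in pixels) and the real
# face width are assumed values, not from Huber's robot.

FOCAL_LENGTH_PX = 600.0    # assumed camera focal length, in pixels
REAL_FACE_WIDTH_CM = 15.0  # assumed average adult face width

def estimate_distance_cm(face_width_px: float) -> float:
    """Estimate how far a face is from the camera.

    Under the pinhole model: Z = f * W_real / w_pixels,
    so a face that appears half as wide is twice as far away.
    """
    return FOCAL_LENGTH_PX * REAL_FACE_WIDTH_CM / face_width_px

# A face appearing 150 px wide would be about 60 cm away
# under these assumed parameters.
print(round(estimate_distance_cm(150.0), 1))
```

The key property for a grasping robot is that only one calibration (focal length and a reference object size) is needed to turn a 2-D detection into a rough 3-D position.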
The computer screen connected to the robot displays the object it's following — in this case, Huber's face — outlined by a green box. “What he's trying to do is get the center of my face,” says Huber. “Within the camera frame, the green box is what the desktop recognizes as a face. I calculate the center of that square, and I have the robot calculate where I'm at in space in respect to the camera.”
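The center calculation Huber describes can be sketched in a few lines. This assumes a face detector that returns bounding boxes as `(x, y, w, h)` tuples and a 640×480 camera frame; both are assumptions for illustration, not details from his implementation.

```python
def face_center(x: float, y: float, w: float, h: float) -> tuple:
    """Center of the detector's green bounding box, in pixel coordinates.

    (x, y) is the box's top-left corner; w and h are its width and height.
    """
    return (x + w / 2, y + h / 2)

def offset_from_frame_center(center: tuple,
                             frame_w: int = 640,
                             frame_h: int = 480) -> tuple:
    """Signed offset of the face center from the middle of the frame.

    A tracking controller would steer the arm to drive this toward
    (0, 0), keeping the face centered in view. Frame size is assumed.
    """
    cx, cy = center
    return (cx - frame_w / 2, cy - frame_h / 2)

# Example: a box from a hypothetical face detector.
c = face_center(200, 140, 160, 160)
print(c)                            # (280.0, 220.0)
print(offset_from_frame_center(c))  # (-40.0, -20.0)
```

Here the face sits 40 px left of and 20 px above the frame center, so the arm would pan left and tilt up slightly to re-center it.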
Applied commercially, Huber's work could advance virtual reality, such as the technologies used to train pilots. Pilot training typically takes place in a virtual-reality environment that simulates a cockpit. A face-tracking robot could hold different parts of a physical cockpit, like a control panel or keyboard. As pilots-in-training look around the virtual interior of the cockpit, the robot tracks their facial movements and moves a control panel into their line of sight. When a person reaches out to press a button or pull a switch, a control panel held by one of the robot's arms will be right there.
In addition to virtual reality, this technology could be applied to healthcare, specifically to care for an aging population. A robot like this could help a nursing-home resident who has trouble reaching for and holding objects. “It would be ideal if this robot is sitting next to a resident, and she looks over at a cup of coffee, and the robot's able to recognize that, and is able to grab it and bring it to her,” says Huber.