Building on her experience volunteering at retirement homes, Carnegie Mellon researcher Jasmine Li decided to focus her research on assistive robotics that help people with everyday tasks. “I was interested in the side of robotics that helps people who might not be as familiar with technology,” she said. “I was thinking about the hardware side of robotics, but I ended up doing a lot more with the data collection and software — the algorithmic side.”
For her project, she worked with Ph.D. student Zheyuan Hu in the Robotic Caregiving and Human Interaction Lab led by assistant professor Zackory Erickson.
Li worked with a bimanual robot arm setup — two multijointed arms clamped to a table — that can be controlled remotely by a human using a pair of VR joysticks or operated fully autonomously via a neural network. She analyzed the robot's behavior in simulation and in real-world tasks to study how robots fail when imitating complex human activities, such as hanging shirts.
“We found that when a human tries to insert a hanger, the person will sometimes do minuscule corrections, but we had a theory that the robot might learn better if we corrected the task on a larger scale,” she said. So, rather than a tiny twist or adjustment, they guided the robotic arms to return to their original position before attempting to hang the shirt again more accurately.
With the new data collection method, training the robot became more efficient: it gathered more useful data and improved performance with fewer rounds of human teaching, Li said.
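The reset-and-retry idea can be illustrated with a toy sketch. Everything below is hypothetical and greatly simplified — a one-dimensional "reach the target" stand-in for the real hanger-insertion task, with invented function names — but it shows the contrast between tiny in-place tweaks and a large-scale correction that returns the arm to its start pose before a fresh attempt:

```python
import random

START, TARGET, TOL = 0.0, 1.0, 0.05  # toy 1-D task: move from START to TARGET

def attempt(pos, rng):
    """One noisy reach toward the target; returns final position and the
    (state, action) pairs recorded along the way."""
    traj = []
    for _ in range(10):
        action = (TARGET - pos) * 0.5 + rng.gauss(0, 0.02)  # noisy controller
        traj.append((pos, action))
        pos += action
    return pos, traj

def collect_demo(seed=0, max_retries=20):
    """Reset-and-retry collection: when an attempt misses, record a single
    large corrective action back to the start pose and demonstrate the whole
    task again, rather than logging a minuscule in-place adjustment."""
    rng = random.Random(seed)
    dataset, pos = [], START
    for _ in range(max_retries):
        pos, traj = attempt(pos, rng)
        dataset.extend(traj)
        if abs(pos - TARGET) < TOL:          # success: attempt is good data
            return dataset, True
        dataset.append((pos, START - pos))   # large-scale correction: reset
        pos = START                          # then retry from the beginning
    return dataset, False
```

Each failed attempt still contributes training pairs, and the recorded reset action teaches the policy a recovery behavior it can reuse, which is one plausible reason such large-scale corrections make data collection more efficient.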
The team also tested the method on other tasks, such as packing a burger into a takeout box and sealing an airtight container lid.
“It’s hard to train a robot to be able to complete multiple different tasks, what we call generalization,” Li said. “Robotics research, for now, focuses on training robots for specific tasks, but, eventually, everyone contributing to the research will help us get there.”