New research from the Robotics Institute aims to increase autonomy for individuals with motor impairments by introducing a head-worn device that will help them control a mobile manipulator. (Image: CMU)

More than five million people in the United States live with some form of paralysis and may encounter difficulties completing everyday tasks, like grabbing a glass of water or putting on clothes. New research from Carnegie Mellon University’s Robotics Institute (RI) aims to increase autonomy for individuals with such motor impairments by introducing a head-worn device that will help them control a mobile manipulator.

Teleoperated mobile manipulators can aid individuals in completing daily activities, but many existing technologies, like hand-operated joysticks or web interfaces, require a user to have substantial fine motor skills to control them effectively. Research led by robotics Ph.D. student Akhil Padmanabha offers a new device equipped with a hands-free microphone and a head-worn sensor that allows users to control a mobile robot via head motion and speech recognition. Head-Worn Assistive Teleoperation (HAT) requires fewer fine motor skills than other interfaces, offering an alternative for users who face constraints with the technology currently on the market.

The interface consists of a hat integrated with an absolute orientation inertial measurement unit (IMU), a microcontroller with built-in Bluetooth, and a small lithium polymer (LiPo) battery. A thin layer of neoprene foam is added over the electronics for comfort.
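The article does not describe the HAT firmware or its message format, but the idea of streaming head orientation from the hat to the robot can be sketched as follows. This is a minimal, illustrative example: the packet layout (three little-endian floats for roll, pitch, and yaw) and the function names are assumptions, not details of the actual implementation.

```python
# Minimal sketch of decoding a head-orientation packet on the robot side.
# The packet layout (three little-endian 32-bit floats: roll, pitch, yaw in
# degrees) and the function names are assumptions for illustration; the
# actual HAT firmware and message format are not described here.
import struct
from dataclasses import dataclass

@dataclass
class HeadOrientation:
    roll: float   # degrees
    pitch: float  # degrees
    yaw: float    # degrees

PACKET_FORMAT = "<fff"  # three little-endian 32-bit floats

def parse_orientation_packet(payload: bytes) -> HeadOrientation:
    """Decode one IMU reading received from the hat over Bluetooth."""
    roll, pitch, yaw = struct.unpack(PACKET_FORMAT, payload)
    return HeadOrientation(roll, pitch, yaw)

# Example: a packet encoding roll=2.0, pitch=-10.5, yaw=31.0 degrees
packet = struct.pack(PACKET_FORMAT, 2.0, -10.5, 31.0)
print(parse_orientation_packet(packet))
```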

Speech recognition, using audio captured by a wireless microphone worn by the participant, is used to select among four robot modes: drive, arm, wrist, and gripper. Signals from the head-worn interface are communicated to the mobile manipulator via Bluetooth and mapped to velocity commands for the robot’s actuators based on the current mode.

A human study was conducted with 16 healthy participants and two participants with motor impairments. Participants were asked to complete four tasks: cup retrieval, trash pickup, blanket removal, and leg cleaning.

Participants were additionally asked to complete Task 1 (cup retrieval) with a conventional web interface featuring modifiable speed control, used in conjunction with head-tracking software and a single-button mouse. The head-worn interface serves as an alternative to a web interface for people who have difficulty accessing traditional computing systems.
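To make the mode-based control described above more concrete, here is a minimal sketch of how a spoken keyword might select among the four robot modes and how head pitch and yaw might be scaled into velocity commands. The mode names come from the research description; the dead zone, gain, and command fields are illustrative assumptions rather than the researchers' actual mapping.

```python
# Minimal sketch of mode-based teleoperation: a recognized speech command
# selects one of the four robot modes, and head pitch/yaw are scaled into
# velocity commands for the actuators of the active mode. The mode names
# come from the research description; the dead zone, gain, and command
# fields are illustrative assumptions.
from dataclasses import dataclass

MODES = {"drive", "arm", "wrist", "gripper"}
DEAD_ZONE_DEG = 5.0   # ignore small head motions (assumed threshold)
GAIN = 0.02           # head angle (degrees) -> velocity units (assumed)

@dataclass
class VelocityCommand:
    mode: str
    primary: float    # e.g., forward speed, arm lift, wrist roll, gripper open/close
    secondary: float  # e.g., turning rate, arm extension, wrist yaw

def select_mode(spoken_text: str, current_mode: str) -> str:
    """Switch modes when a recognized keyword is spoken; otherwise keep the current mode."""
    for word in spoken_text.lower().split():
        if word in MODES:
            return word
    return current_mode

def head_to_velocity(pitch_deg: float, yaw_deg: float, mode: str) -> VelocityCommand:
    """Map head pitch and yaw to velocities for the actuators of the active mode."""
    def scaled(angle_deg: float) -> float:
        return 0.0 if abs(angle_deg) < DEAD_ZONE_DEG else GAIN * angle_deg
    return VelocityCommand(mode=mode, primary=scaled(pitch_deg), secondary=scaled(yaw_deg))

# Example: the user says "arm" and then tilts their head forward 15 degrees.
mode = select_mode("switch to arm", current_mode="drive")
print(head_to_velocity(pitch_deg=15.0, yaw_deg=2.0, mode=mode))
```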

In addition to Padmanabha, the research team includes Qin Wang, Daphne Han, Jashkumar Diyora, Kriti Kacker, Hamza Khalid, Liang-Jung Chen, Carmel Majidi, and Zackory Erickson. In the study, participants both with and without motor impairments performed multiple household and self-care tasks with low error rates, minimal effort, and high perceived ease of use.

For more information, contact Aaron Aupperlee at 412-268-9068.