Intricate tasks that require dexterous in-hand manipulation — rolling, pivoting, bending, and sensing friction — are a challenge for today's robots. A University of Washington team of computer scientists and engineers has built a robotic hand that performs dexterous manipulation and "learns" from its own experience. The five-fingered robot, for example, can spin a tube full of coffee beans without needing human direction.

UW computer science and engineering doctoral student Vikash Kumar custom-built this robot hand, which has 40 tendons, 24 joints, and more than 130 sensors. (Image: University of Washington)

The team first developed algorithms that allowed a computer to model highly complex five-fingered behaviors and plan movements to achieve different outcomes — like typing on a keyboard or dropping and catching a stick — in simulation.
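The article does not describe the planning algorithm itself. As an illustrative sketch only, planning a movement in simulation is often framed as searching for a control sequence that minimizes a cost relative to a desired outcome. The `simulate` function and toy dynamics below are hypothetical stand-ins, not the UW team's method:

```python
import numpy as np

def simulate(state, controls):
    # Hypothetical stand-in for a physics-simulator rollout;
    # the "dynamics" here are a toy one-dimensional linear system.
    for u in controls:
        state = 0.9 * state + u
    return state

def plan(state, goal, horizon=10, samples=256, rng=None):
    """Random-shooting planner: sample many control sequences in
    simulation and keep the one whose final state lands closest
    to the goal (a simple proxy for trajectory optimization)."""
    rng = np.random.default_rng(rng)
    best_cost, best_controls = np.inf, None
    for _ in range(samples):
        controls = rng.uniform(-1.0, 1.0, size=horizon)
        cost = abs(simulate(state, controls) - goal)
        if cost < best_cost:
            best_cost, best_controls = cost, controls
    return best_controls, best_cost

controls, cost = plan(state=0.0, goal=1.0, rng=0)
```

The same pattern scales up in practice by swapping the toy rollout for a full hand model and the scalar goal for a task cost such as keystroke accuracy or object position.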

Machine learning algorithms enable the robot hand to become progressively more adept at the tasks. Most recently, the researchers have transferred the models to work on the actual five-fingered hand hardware.

As the robot hand performs different tasks, the system collects data from its sensors and from motion-capture cameras, and uses machine learning algorithms to continually refine its models and make them more realistic.
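The refine-from-experience step can be pictured as fitting a model to logged state transitions. The sketch below is an assumption-laden toy: a linear dynamics model fit by least squares, where `true_A` stands in for the physics the robot actually experiences, far simpler than the models the UW lab would use:

```python
import numpy as np

class DynamicsModel:
    """Illustrative linear model refined from logged data; a toy
    stand-in for the learned models described in the article."""
    def __init__(self, dim):
        self.A = np.zeros((dim, dim))

    def update(self, states, next_states):
        # Least-squares fit of next_state ~= A @ state.
        sol, *_ = np.linalg.lstsq(states, next_states, rcond=None)
        self.A = sol.T

    def predict(self, state):
        return self.A @ state

# Toy "true" dynamics the robot experiences while acting.
true_A = np.array([[0.9, 0.1], [0.0, 0.8]])
rng = np.random.default_rng(0)

model = DynamicsModel(dim=2)
states = rng.normal(size=(100, 2))      # logged sensor readings
next_states = states @ true_A.T + rng.normal(scale=0.01, size=(100, 2))
model.update(states, next_states)       # refine the model from experience
```

Each new batch of sensor data would call `update` again, so the model tracks the real hardware more closely over time.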

The autonomous learning approach developed by the UW Movement Control Laboratory contrasts with robotics demonstrations that require people to program each individual movement of the robot’s hand in order to complete a single task.





This article first appeared in the July 2016 issue of Medical Design Briefs Magazine.
