Robotics researchers are developing exoskeleton legs that take steps on their own, using sophisticated artificial intelligence (A.I.) technology. The self-controlled legs may someday support the movements of the elderly and people with physical disabilities.

The system, built and tested by researchers at the University of Waterloo, combines computer vision and deep-learning A.I. to mimic a human-like gait. "Learning" from a collection of sample strolls around an environment, the system adjusts its movements based on the surroundings it senses.

“We’re giving robotic exoskeletons vision so they can control themselves,” said Brokoslaw Laschowski, a PhD candidate in systems design engineering who leads a University of Waterloo research project called ExoNet.

The A.I.-supported ExoNet system pulls from training data gathered by the team. With wearable cameras strapped to their chests (as shown in the image above), Laschowski and his fellow researchers took videos of indoor and outdoor environments.

A.I. software then processed the video feed to accurately recognize stairs, doors, and other features within the surroundings.
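To give a sense of the kind of processing involved, here is a minimal sketch of frame-by-frame environment classification in Python with PyTorch. The class labels, backbone choice, and weights file are illustrative assumptions, not the ExoNet team's published model.

```python
# Minimal sketch of classifying one camera frame into an environment
# category. Labels and the fine-tuned weights file are hypothetical.
import torch
from torchvision import models, transforms
from PIL import Image

# Hypothetical environment classes; ExoNet's actual label set differs.
CLASSES = ["level_ground", "incline_stairs", "decline_stairs", "door", "obstacle"]

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# A standard CNN backbone with its classifier head resized to our labels.
model = models.mobilenet_v2(weights=None)
model.classifier[1] = torch.nn.Linear(model.last_channel, len(CLASSES))
# model.load_state_dict(torch.load("exonet_finetuned.pt"))  # hypothetical weights
model.eval()

def classify_frame(path: str) -> str:
    """Return the predicted environment label for one camera frame."""
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(image)
    return CLASSES[int(logits.argmax(dim=1))]
```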

{youtube}https://www.youtube.com/watch?v=uqNXHu7Bgj0 {/youtube}

The achievement was detailed in the journal Frontiers in Robotics and AI. (Explore a research dataset relating to this autonomous exoskeleton project.)

The latest in a series of papers on related projects, Simulation of Stand-to-Sit Biomechanics for Robotic Exoskeletons and Prostheses with Energy Regeneration, appears in the journal IEEE Transactions on Medical Robotics and Bionics.

Motor-operated exoskeleton legs have been designed before, but the wearer has almost always required a joystick or smartphone application to control their movements.

“That can be inconvenient and cognitively demanding,” said Laschowski. “Every time you want to perform a new locomotor activity, you have to stop, take out your smartphone and select the desired mode.”

The University of Waterloo approach offers more automated control, thanks to its A.I. and computer-vision capabilities.

The next phase of the ExoNet research project will involve sending instructions to motors so that robotic exoskeletons can climb stairs, avoid obstacles, or take other appropriate actions based on analysis of the user’s current movement and the upcoming terrain.

The researchers are also working to improve the energy efficiency of motors for robotic exoskeletons by using human motion to self-charge the batteries.

“Our control approach wouldn’t necessarily require human thought,” said Laschowski, who is supervised by engineering professor John McPhee, the Canada Research Chair in Biomechatronic System Dynamics, in his Motion Research Group lab. “Similar to autonomous cars that drive themselves, we’re designing autonomous exoskeletons that walk for themselves.”

Brokoslaw Laschowski tests out the exoskeleton. (Image Credit: University of Waterloo)

In a Q&A with Tech Briefs below, Brokoslaw Laschowski explains more about the ExoNet technology, and why an exoskeleton with features similar to a self-driving car’s must also include vehicle-like safety measures.

Tech Briefs: How do you ensure safety? Can the user take control if the exoskeleton mistakenly does something dangerous? The analogy is ADAS vs. completely autonomous vehicles. How would the user control speed, stopping, and starting? How would such controls interface with the user?

Brokoslaw Laschowski: Safety is of the utmost importance. These robotic devices are designed to assist the elderly and those with physical disabilities (e.g., stroke, spinal cord injury, cerebral palsy, osteoarthritis). We can’t afford for the exoskeleton to make wrong decisions and potentially cause falls or injuries. Consequently, we’re focusing entirely on improving the classification accuracy and control by developing an environment-recognition system that allows the exoskeleton to autonomously sense and react in real time to the walking environment. We’re optimizing the system’s performance using computers and wearable prototypes with “healthy” controls before clinical testing. However, the exoskeleton user will always have the ability to take over manual control (e.g., stopping and steering).
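As a toy illustration of that manual-override guarantee, a controller can simply let any user command preempt the autonomous decision. The command names below are hypothetical, not part of the team's published interface.

```python
# Toy illustration: any user input preempts the autonomous controller.
from typing import Optional

def select_command(user_input: Optional[str], autonomous_cmd: str) -> str:
    """User commands (e.g., 'stop', 'steer_left') always take priority."""
    return user_input if user_input is not None else autonomous_cmd

print(select_command(None, "climb_stairs"))    # -> autonomous: "climb_stairs"
print(select_command("stop", "climb_stairs"))  # -> manual override: "stop"
```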

Tech Briefs: Can you take me through an application that you envision for this kind of exoskeleton? Where will this be most valuable?

Brokoslaw Laschowski: These robotic devices are designed to assist the elderly and those with physical disabilities with locomotor activities. An example application of our environment-adaptive automated control system is switching between different locomotor activities. In commercially available exoskeletons, when transitioning from level-ground walking to climbing stairs, for example, the user approaches the staircase, stops, and manually communicates the intended activity to the exoskeleton using a mobile interface, push-buttons, or other hand controls (depending on the device).

In contrast, with an autonomous control system, as the user approaches an inclined staircase, onboard sensors like inertial measurement units (IMUs) continuously sense and classify the user’s current movements, while the wearable camera system senses and classifies the upcoming terrain. The fusion of these different sensor technologies and pattern-recognition algorithms is used to predict the user’s locomotor intent and control the exoskeleton.
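A minimal sketch of that kind of sensor fusion might look like the following. The labels, confidence threshold, and decision rule are illustrative assumptions, not the team's published controller.

```python
# Illustrative fusion of an IMU-based movement estimate with a
# camera-based terrain estimate to pick a locomotion mode.
from dataclasses import dataclass

@dataclass
class SensorEstimate:
    label: str         # e.g., "level_walk" or "incline_stairs"
    confidence: float  # classifier confidence in [0, 1]

def predict_locomotor_intent(imu: SensorEstimate,
                             camera: SensorEstimate,
                             threshold: float = 0.85) -> str:
    """Switch modes only when the camera sees new terrain with high
    confidence and the IMU confirms the user is actually moving."""
    if camera.confidence >= threshold and imu.label != "standing":
        return camera.label   # upcoming terrain drives the transition
    return imu.label          # otherwise keep tracking current movement

# Example: user is walking and the camera spots stairs ahead.
intent = predict_locomotor_intent(SensorEstimate("level_walk", 0.93),
                                  SensorEstimate("incline_stairs", 0.91))
print(intent)  # -> "incline_stairs"
```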

Tech Briefs: How is the exoskeleton “trained” to operate without human thought?

Brokoslaw Laschowski: We use computer vision and deep learning for environment classification. Using millions of real-world images, our convolutional neural networks are automatically and efficiently trained to predict the different walking environments shown in the images. This information about the walking environment is subsequently used to control the robotic exoskeleton in terms of optimal path planning, obstacle avoidance, and switching between different locomotor activities (e.g., level-ground walking to climbing stairs).
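In that spirit, a minimal sketch of fine-tuning a pretrained CNN on labeled environment images might look like this. The dataset path, backbone, and hyperparameters are placeholders, not the published ExoNet training setup.

```python
# Minimal sketch: fine-tune a pretrained CNN to classify walking
# environments from camera frames. Paths and settings are hypothetical.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical folder layout: one subdirectory per environment class.
train_set = datasets.ImageFolder("exonet_frames/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights="IMAGENET1K_V1")  # pretrained backbone
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```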

Tech Briefs: What’s next for this exoskeleton? What are you working on now?

Brokoslaw Laschowski: From a safety-critical perspective, these A.I.-powered exoskeleton control systems need to perform accurately and in real time. Therefore, we’re focusing on improving the environment-classification accuracy while using neural network architectures with minimal computational and memory requirements to support onboard, real-time inference.
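One common way to trim a model's compute and memory footprint for onboard inference is post-training quantization. The sketch below applies dynamic quantization and times a single frame; the 30 ms latency budget is purely an illustrative assumption, roughly matching a 30 fps camera.

```python
# Sketch: shrink a model via dynamic quantization, then check latency.
import time
import torch
from torchvision import models

model = models.mobilenet_v2(weights=None).eval()  # lightweight backbone
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8   # quantize linear layers
)

frame = torch.randn(1, 3, 224, 224)               # stand-in camera frame
with torch.no_grad():
    start = time.perf_counter()
    quantized(frame)
    latency_ms = (time.perf_counter() - start) * 1000

# An illustrative ~30 ms budget keeps pace with a 30 fps camera feed.
print(f"inference latency: {latency_ms:.1f} ms "
      f"({'OK' if latency_ms < 30 else 'too slow'} for 30 fps)")
```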
