In a first-ever demonstration of a two-way interaction between a primate brain and a virtual body, two monkeys trained at the Duke University Center for Neuroengineering learned to employ brain activity alone to move an avatar hand and identify the texture of virtual objects.

Without moving any part of their real bodies, the monkeys used their electrical brain activity to direct the virtual hands of an avatar to the surface of virtual objects and, upon contact, were able to differentiate their textures. The texture of each virtual object was encoded as a pattern of minute electrical signals transmitted to the monkeys' brains.

Because no part of the animals' actual bodies was involved in operating the avatar, these experiments suggest that severely paralyzed patients may one day use this technology not only to regain mobility but also to have their sense of touch restored.

The findings provide further evidence that it may be possible to create a robotic exoskeleton that severely paralyzed patients could wear to explore and receive feedback from the outside world. Such an exoskeleton would be controlled directly by the patient's voluntary brain activity, allowing the patient to move autonomously.

Visit the Duke University Center for Neuroengineering for more information.

Also: Learn about a wearable, artificially intelligent, bionic device that helps paraplegics stand and walk in a straight line.
