Generating Control Commands From Gestures Sensed by EMG
- Sunday, 01 October 2006
Electrical signals from muscles involved in gestures are recognized.
An effort is under way to develop noninvasive neuro-electric interfaces through which human operators could control systems as diverse as simple mechanical devices, computers, aircraft, and even spacecraft. The basic idea is to use electrodes on the surface of the skin to acquire electromyographic (EMG) signals associated with gestures, digitize and process the EMG signals to recognize the gestures, and generate digital commands to perform the actions signified by the gestures.
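The acquire-digitize-classify-command chain described above can be sketched in code. This is a toy illustration, not the authors' implementation: the feature (RMS amplitude), the winner-take-all classifier, and all names (`sample` rate, `GESTURE_COMMANDS`, the channel layout) are assumptions made for the example.

```python
# Illustrative sketch of the EMG pipeline: digitized windows in, command out.
# All names and values are assumptions for the example, not from the article.
import math

SAMPLE_RATE_HZ = 1000          # assumed surface-EMG sampling rate
WINDOW = 256                   # samples per analysis window

def rms(window):
    """Root-mean-square amplitude, a common measure of EMG activation."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def classify(channel_windows, threshold=0.1):
    """Toy classifier: report the channel with the strongest activation,
    or None if no channel exceeds the activation threshold."""
    energies = [rms(w) for w in channel_windows]
    best = max(range(len(energies)), key=lambda i: energies[i])
    return best if energies[best] > threshold else None

# Hypothetical mapping from recognized gesture index to a digital command.
GESTURE_COMMANDS = {0: "BANK_LEFT", 1: "BANK_RIGHT", 2: "PITCH_UP", 3: "PITCH_DOWN"}

# Simulated digitized windows from four electrode pairs; channel 2 is active.
windows = [[0.01] * WINDOW for _ in range(4)]
windows[2] = [0.5 * math.sin(2 * math.pi * 60 * n / SAMPLE_RATE_HZ)
              for n in range(WINDOW)]

command = GESTURE_COMMANDS.get(classify(windows))
print(command)   # -> PITCH_UP
```

A real interface would of course use a trained statistical classifier rather than a single-channel energy threshold; the sketch only shows where each stage of the pipeline sits.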
In an experimental prototype of such an interface, the EMG signals associated with hand gestures are acquired by use of several pairs of electrodes mounted in sleeves on a subject’s forearm (see figure). The EMG signals are sampled and digitized. The resulting time-series data are fed as input to pattern-recognition software that has been trained to distinguish gestures from a given gesture set. The software implements, among other things, hidden Markov models, which are used to recognize the gestures as they are being performed in real time.
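One common way to use hidden Markov models for this kind of recognition is to train one small HMM per gesture and, at run time, score the incoming observation sequence under each model with the forward algorithm, choosing the most likely gesture. The minimal sketch below assumes discrete observation symbols (e.g., quantized EMG features) and uses toy model parameters; the gesture names and probabilities are invented for illustration.

```python
# Minimal HMM gesture recognition sketch: one discrete HMM per gesture,
# scored with the forward algorithm. Toy parameters, not the article's models.
import math

def forward_log_prob(obs, pi, A, B):
    """Log-likelihood of an observation sequence under a discrete HMM
    with initial probs pi, transition matrix A, and emission matrix B."""
    alpha = [pi[s] * B[s][obs[0]] for s in range(len(pi))]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * A[p][s] for p in range(len(pi))) * B[s][o]
                 for s in range(len(pi))]
    return math.log(sum(alpha))

# Two hypothetical 2-state HMMs over a 2-symbol alphabet: the "grip" model
# tends to emit symbol 1, the "release" model symbol 0.
MODELS = {
    "grip":    ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [[0.2, 0.8], [0.3, 0.7]]),
    "release": ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [[0.8, 0.2], [0.7, 0.3]]),
}

def recognize(obs):
    """Return the gesture whose model assigns the sequence the highest likelihood."""
    return max(MODELS, key=lambda g: forward_log_prob(obs, *MODELS[g]))

print(recognize([1, 1, 0, 1, 1]))   # mostly 1s -> grip
```

Because the forward algorithm can be updated one observation at a time, this scoring scheme fits the real-time, recognize-while-performing operation the article describes.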
Thus far, two experiments have been performed on the prototype interface to demonstrate feasibility: an experiment in synthesizing the output of a joystick and an experiment in synthesizing the output of a computer or typewriter keyboard. In the joystick experiment, the EMG signals were processed into joystick commands for a realistic flight simulator for an airplane. The acting pilot reached out into the air, grabbed an imaginary joystick, and pretended to manipulate it to achieve left and right banks and up and down pitches of the simulated airplane. In the keyboard experiment, the subject pretended to type on a numerical keypad, and the EMG signals were processed into keystrokes.
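The joystick synthesis step can be pictured as mapping muscle activation onto a continuous axis value. The following sketch is an assumption about how such a mapping might look (the function names, the use of opposing flexor/extensor activations, and the scaling are all illustrative, not taken from the article):

```python
# Hypothetical mapping from EMG activation levels to a joystick axis.
# Names and scaling are illustrative assumptions, not the article's method.

def to_axis(activation, max_activation=1.0):
    """Map a net activation level to a joystick axis value in [-1.0, 1.0]."""
    value = activation / max_activation
    return max(-1.0, min(1.0, value))

def roll_axis(flexor_rms, extensor_rms):
    """Drive one axis from opposing muscle groups: positive = right bank."""
    return to_axis(extensor_rms - flexor_rms)

print(roll_axis(0.2, 0.7))   # net extensor activity -> right bank, 0.5
```

A discrete recognizer (as in the keypad experiment) would instead emit one keystroke event per recognized gesture rather than a continuous axis value.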
The results of the experiments demonstrate the basic feasibility of this method while indicating the need for further research to reduce the incidence of errors (including confusion among gestures). Topics that must be addressed include the numbers and arrangements of electrodes needed to acquire sufficient data; refinements in the acquisition, filtering, and digitization of EMG signals; and methods of training the pattern-recognition software.
The joystick and keyboard simulations were chosen for the initial experiments because they are familiar to many computer users. It is anticipated that, ultimately, interfaces would utilize EMG signals associated with movements more nearly natural than those associated with joysticks or keyboards. Future versions of the pattern-recognition software are planned to be capable of adapting to the preferences and day-to-day variations in EMG outputs of individual users; this capability for adaptation would also make it possible to select gestures that, to a given user, feel the most nearly natural for generating control signals for a given task (provided that there are enough properly positioned electrodes to acquire the EMG signals from the muscles involved in the gestures).
This work was done by Kevin R. Wheeler and Charles Jorgensen of Ames Research Center. For further information, access the Technical Support Package (TSP) free online at www.techbriefs.com/tsp under the Bio-Medical category.
This invention has been patented by NASA (U.S. Patent No. 6,720,984). Inquiries concerning rights for the commercial use of this invention should be addressed to the Ames Technology Partnerships Division at (650) 604-2954. Refer to ARC-14494-1.