A new system from the Georgia Institute of Technology has a sound approach to recognizing tiny gestures of the hand.

With the help of a thumb ring and a smartwatch-like wristband, the “FingerPing” setup identifies 22 separate micro finger motions, including the tapping of fingertips.

Once recognized, the gestures could be programmed to enable various commands, like inputting data or playing music.

Sound waves produce specific patterns as they travel through a structure, including a hand. The patterns change as the hand moves.

“The receiver recognizes these tiny differences,” said Cheng Zhang, lead researcher and Ph.D. student in the School of Interactive Computing.

The Georgia Tech-developed thumb ring produces acoustic chirps that travel through the hand and are then picked up by receivers on the watch.

“The injected sound from the thumb will travel at different paths inside the body with different hand postures,” said Zhang. “For instance, when your hand is open, there is only one direct path from the thumb to the wrist. Any time you do a gesture where you close a loop, the sound will take a different path and that will form a unique signature.”

The system recognizes a variety of unique signatures with a high rate of accuracy, including hand poses that signify “1” through “10” in American Sign Language (ASL).

Zhang spoke with Tech Briefs about the promising applications that could result from the tiniest of gestures.

Tech Briefs: What is a “micro finger gesture” exactly?

Cheng Zhang: A micro finger gesture, by design, only requires the user to move the finger to perform gestures. The user does not need to move the wrist or the arm to conduct a gesture. Compared with arm or wrist gestures, finger gestures are more subtle and discreet, which makes them more socially appropriate in different contexts.

Tech Briefs: How is your technology designed to detect a micro finger gesture?

Zhang: FingerPing recognizes these subtle micro gestures by injecting sound waves into the body at the thumb and receiving the acoustic signals at the wrist. Depending on the path the acoustic signal travels, it presents a different acoustic signature, which can be distinguished by customized machine learning algorithms.

We observed that performing different finger poses and gestures would change the body propagation path and generate different acoustic signatures.
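To make the idea concrete, here is a minimal sketch of how a received chirp might be reduced to a compact spectral “signature” and matched against stored per-gesture templates. This is an illustration of the general approach only, not the team's actual pipeline: the band-averaged spectrum feature and the nearest-template classifier are stand-ins for the customized machine learning algorithms Zhang mentions.

```python
import numpy as np

def frequency_signature(received, n_bins=64):
    """Summarize a received chirp as a coarse, normalized magnitude
    spectrum -- the 'acoustic signature' of the current hand pose."""
    spectrum = np.abs(np.fft.rfft(received))
    # Pool the spectrum into n_bins coarse bands for a compact feature vector.
    bands = np.array_split(spectrum, n_bins)
    sig = np.array([band.mean() for band in bands])
    return sig / (np.linalg.norm(sig) + 1e-12)

def classify(signature, templates):
    """Nearest-template classification: return the gesture label whose
    stored signature is closest in Euclidean distance."""
    return min(templates, key=lambda label: np.linalg.norm(signature - templates[label]))
```

In practice one would record several example chirps per gesture during a calibration phase, store their signatures as templates, and classify each incoming chirp against them.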

Tech Briefs: What are the challenges of detecting gestures? Is the technology ever “confused” by gestures, and if so, what causes that?

Zhang: The challenge is to recognize these highly similar micro finger gestures. For instance, tapping on the top versus the middle phalanx of a finger introduces a difference of only about 1 centimeter. The more similar the gestures are, the more likely they are to be confused with each other, because their sound propagation paths overlap more.

Tech Briefs: What is the "acoustic chirp?"

Zhang: The acoustic chirp sweeps the sound signal linearly from 20 Hz to 6,000 Hz within a short amount of time, so that we can record the frequency response at each frequency.
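A linear chirp like the one Zhang describes can be synthesized in a few lines. This is a rough illustration: the sweep range matches the 20 Hz to 6,000 Hz figure quoted above, but the duration and sample rate are illustrative assumptions, not the values used in FingerPing.

```python
import numpy as np

def linear_chirp(f0=20.0, f1=6000.0, duration=0.05, sample_rate=44100):
    """Generate a linear frequency sweep from f0 to f1 Hz over `duration` seconds.

    The instantaneous frequency rises linearly, f(t) = f0 + (f1 - f0) * t / T,
    so the phase is its integral: 2*pi*(f0*t + (f1 - f0)*t**2 / (2*T)).
    """
    t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)
    phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * duration))
    return t, np.sin(phase)

t, signal = linear_chirp()
```

Because every frequency in the band is excited in turn, comparing the received signal to the transmitted sweep yields the hand's frequency response across the whole range.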

Tech Briefs: Where do you see this technology being most useful? What kinds of commands do you imagine tied to the gestures?

Zhang: This technology is mostly useful for input on wearable devices. In the future it can be used to input numbers, text (T9 Keyboard), or provide shortcuts for applications. In addition, it can also be used to provide additional input for Virtual Reality devices without occupying the entire hand.

Tech Briefs: What was the inspiration behind this invention?

Zhang: The inspiration was to provide a technology that can recognize a broader set of fine-grained micro finger gestures, which is socially appropriate and does not require the user to hold the device in their hands.

Tech Briefs: What is most exciting to you about this technology?

Zhang: I feel most excited about the potential of the technology. We have shown that the acoustic signature of the hand can be used to reflect subtle changes of the hand. A similar phenomenon could potentially be applied to a wider set of applications, like hydration-level detection.

Another application is to provide a comprehensive set of input gestures for wearables, which is currently missing. A further promising application is to build a wearable American Sign Language translator to translate ASL in real time for a broader audience.
