Your smartwatch can count your steps, but can it tell if you’re typing on a keyboard? Or chopping a vegetable?

Researchers at Carnegie Mellon University's Human-Computer Interaction Institute (HCII) have turned a standard smartwatch into a detector of specific hand activities.

And typing and chopping are only the beginning.

After adjusting the smartwatch’s operating system, the CMU team used the device’s accelerometer to recognize movement and, in some cases, the bio-acoustic sounds associated with 25 different hand actions.

Detected behavior included petting a dog, washing dishes, cutting with scissors, and pouring a glass of water.

Why is this kind of specific understanding valuable?

Just as smartphones can now block text messages while a user is driving, future hand-sensing devices may someday learn not to interrupt someone who is, say, working in the kitchen or operating power equipment.

Sensing hand activity may also provide valuable health information, according to the lead researchers. Wearers, for example, could use the smartwatch data to monitor personal habits, such as smoking or flossing, and even spot early signs of illness.

Carnegie Mellon Ph.D. student Gierad Laput and Assistant Professor Chris Harrison envision apps that alert users to motor impairments such as those associated with Parkinson's disease.

The researchers began their study by recruiting 50 people to test the specially programmed smartwatches. Together, the subjects wore the watches for approximately 1,000 hours.

During the trial period, the watch would periodically pick up on a hand movement and then prompt the wearer to describe the detected motion, such as shaving, clapping, scratching, or putting on lipstick.

A data set gradually took shape, ultimately allowing the system to detect 25 specific hand activities with 93.6 percent accuracy.

For the smartwatch to detect actions, the device was worn on the wrist of the wearer’s active hand. Future experiments will explore which activities can be detected from the passive arm.

“No easy way around this, but the technique works quite well for hand activities that require two hands,” Laput writes on his website.

Harrison and Laput presented their findings earlier this month at CHI 2019, the Association for Computing Machinery's Conference on Human Factors in Computing Systems.

"The 25 hand activities we evaluated are a small fraction of the ways we engage our arms and hands in the real world," Laput said.

The CMU system detects chopping. (Image Credit: Gierad Laput)

In an interview with Tech Briefs, Laput explains what kinds of “wild ideas” he imagines as the detection system gets even better.

Tech Briefs: Why is it especially valuable to have a device that knows what the hand is doing?

Gierad Laput: We saw an opportunity with smartwatches — they are highly capable computers that are sitting on our wrists, yet they know nothing about what our hands are doing. This is a missed opportunity.

Philosopher Immanuel Kant argued that “the hands are the visible part of the brain.” Indeed, what the hands are doing offers insight into human activity, which can lead to applications that are more assistive, more accommodating, and can facilitate skill acquisition, monitor degradation, and help nudge people towards healthier habits.

Tech Briefs: What kinds of hand activity can be detected?

Laput: We make a distinction between “atomic events” (ones that cannot be broken down into distinct stages, like “chopping”) and “compound events” (combinations of events, such as “cooking” or “eating”).

In our paper, we list 25 atomic hand activities (see the image above).
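The paper itself focuses on recognizing atomic events. As a rough illustration of the distinction (not a method from the paper), a compound event could be inferred when certain atomic detections co-occur within a span of time; the activity names and the 10-minute window below are assumptions made for the example.

```python
# Rough illustration (not from the paper): infer a compound event when a set
# of atomic events is detected within a time window. The rule contents and
# the 10-minute window are assumptions for this sketch.
from datetime import timedelta

COMPOUND_RULES = {
    "cooking": {"chopping", "pouring"},  # hypothetical rule
}

def infer_compound(detections, window=timedelta(minutes=10)):
    """detections: list of (timestamp, atomic_label) pairs, sorted by time."""
    compounds = []
    for i, (start, _) in enumerate(detections):
        # Collect every atomic label seen within the window starting here.
        seen = {label for t, label in detections[i:] if t - start <= window}
        for name, required in COMPOUND_RULES.items():
            if required <= seen:
                compounds.append((start, name))
    return compounds
```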

Next: Playing the piano. (Image Credit: Gierad Laput)

Tech Briefs: What part of the technology is enabling such fine-grained activity detection?

Laput: We overclocked the smartwatch accelerometer to operate in high-speed mode (from 100 Hz to 4,000 Hz). Using signal processing and machine learning, we were able to extract patterns and discriminate hand activities from the data.
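The basic idea, windowing the high-rate accelerometer signal, extracting spectral and statistical features, and feeding them to a classifier, can be sketched roughly as follows. This is a minimal illustration, not the authors' pipeline; the window length, feature set, and random-forest classifier are assumptions.

```python
# Minimal sketch (not the authors' actual pipeline): classify hand activities
# from high-rate wrist accelerometer data. Window length, feature choices,
# and the classifier below are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

SAMPLE_RATE = 4000          # Hz, accelerometer running in high-speed mode
WINDOW = SAMPLE_RATE        # 1-second analysis windows (assumption)

def featurize(window):
    """Turn one (WINDOW, 3) accelerometer window into a feature vector."""
    # Acceleration magnitude reduces dependence on watch orientation.
    mag = np.linalg.norm(window, axis=1)
    # Spectral features: high-rate sampling captures fine vibrations.
    spectrum = np.abs(np.fft.rfft(mag * np.hanning(len(mag))))
    # Pool the spectrum into coarse bands plus simple time-domain statistics.
    bands = spectrum[: len(spectrum) // 16 * 16].reshape(16, -1).mean(axis=1)
    stats = [mag.mean(), mag.std(), mag.min(), mag.max()]
    return np.concatenate([bands, stats])

def train(windows, labels):
    """windows: list of (WINDOW, 3) arrays; labels: activity names."""
    X = np.stack([featurize(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf

def predict(clf, window):
    """Return the predicted activity label for one accelerometer window."""
    return clf.predict(featurize(window)[None, :])[0]
```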

Tech Briefs: What is most challenging about detecting hand motion? What has prevented this kind of fine-grained detection up to now?

Laput: The most challenging aspects of hand-motion detection are the diversity of hand activities across users and the implications for battery life.

It is also important to note that, culturally, people wear watches on their non-dominant arm. As a result, two-handed activities are easier to detect.

Tech Briefs: What applications are possible? What would be the immediate, obvious applications, and what are some of your favorite, more “far out” ideas that you can envision down the road?

Laput: The most obvious are “personal informatics” applications — for example, how often did I brush my teeth?

Wild ideas include:

  • Interactive experiences that limit interruptions when the system knows your hands are busy
  • Habit-building applications, for example, an alert when you have been typing too much
  • Health applications. Imagine being able to detect, say, the onset of tremors.

What kind of “wild ideas” do you envision with hand-motion detection? Share your questions and comments below.