New research has shown that future wearable devices, such as smartwatches, could use ultrasound imaging to sense hand gestures. The research team, led by Professor Mike Fraser with Asier Marzo and Jess McIntosh from the Bristol Interaction Group at the University of Bristol in the U.K., worked together with University Hospitals Bristol NHS Foundation Trust.
Computers are growing in number, and wearable computers, such as smartwatches, are gaining popularity. Devices around the home, such as Wi-Fi light bulbs and smart thermostats, are also becoming more common. However, current technology limits how we can interact with these devices.
Hand gestures have been suggested as an intuitive and easy way of interacting with and controlling smart devices in different surroundings. For instance, a gesture could be used to dim the lights in the living room or to open or close a window. Hand gesture recognition can be achieved in many ways, but the placement of the sensor is a major restriction and often rules out certain techniques. However, with smartwatches becoming the leading wearable device, sensors can now be placed in the watch itself to sense hand movement.
The research team proposes that ultrasonic imaging of the forearm could be used to recognize hand gestures. Ultrasonic imaging is already well established in medicine, for example in pregnancy scans and in imaging muscle and tendon movement, and the researchers saw its potential as a way of understanding hand movement.
The team used image processing algorithms and machine learning to classify muscle movement as gestures. The researchers also carried out a user study to find the best sensor placement for this technique.
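The general shape of such a pipeline, extracting a feature vector from each ultrasound frame and classifying it against per-gesture training examples, can be sketched as follows. This is a minimal illustration, not the team's actual method: the nearest-centroid classifier, the row-brightness feature, and all names here are assumptions for the sake of the example.

```python
# Hypothetical sketch: classify forearm ultrasound frames into gestures.
# Feature extraction and classifier choice are illustrative assumptions,
# not the researchers' published pipeline.

def row_profile(frame):
    """Collapse a 2D intensity frame into per-row mean brightness,
    a crude stand-in for muscle cross-section features."""
    return [sum(row) / len(row) for row in frame]

def train_centroids(labelled_frames):
    """Average the feature vectors of each gesture's training frames."""
    totals, counts = {}, {}
    for label, frame in labelled_frames:
        feats = row_profile(frame)
        if label not in totals:
            totals[label], counts[label] = [0.0] * len(feats), 0
        totals[label] = [t + f for t, f in zip(totals[label], feats)]
        counts[label] += 1
    return {lab: [t / counts[lab] for t in tot] for lab, tot in totals.items()}

def classify(frame, centroids):
    """Assign the gesture whose centroid is nearest in feature space."""
    feats = row_profile(frame)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(feats, centroids[lab])))

# Toy frames: "fist" brightens the top rows, "open" the bottom rows.
fist = [[9, 9], [9, 9], [1, 1], [1, 1]]
open_hand = [[1, 1], [1, 1], [9, 9], [9, 9]]
model = train_centroids([("fist", fist), ("open", open_hand)])
print(classify([[8, 9], [9, 8], [2, 1], [1, 2]], model))  # fist
```

A real system would replace the toy feature with richer image descriptors and a trained model, but the train-then-match structure is the same.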
The team's findings showed very high recognition accuracy. Importantly, the sensing method worked well at the wrist, which is ideal because it would allow future wearable devices, such as smartwatches, to incorporate this ultrasonic technique to sense gestures.