
WHO
Carnegie Mellon University researchers have created EgoTouch, a tool that uses AI to let users control AR/VR interfaces by touching their skin with a finger.
WHAT
The new generation of augmented and virtual reality controllers may not just fit in the palm of your hand. They could be the palm of your hand. A research team at Carnegie Mellon University's Human-Computer Interaction Institute (HCII) has introduced EgoTouch, a tool that uses AI to let users control AR/VR interfaces by touching their skin with a finger.

The team's ultimate goal was to design a controller that provides tactile feedback using only the sensors that come with a standard AR/VR headset. OmniTouch, a previous method developed by Chris Harrison, an Associate Professor in the HCII and Director of the Future Interfaces Group, came close, but it required a special, clunky, depth-sensing camera. Vimal Mollyn, a Ph.D. student advised by Harrison, had the idea of using a machine learning algorithm to train normal cameras to recognize touch. He collected the training data for EgoTouch with a custom touch sensor that ran along the underside of the index finger and the palm.

EgoTouch detects touch with more than 96 percent accuracy and a false positive rate of around 5 percent. It recognizes pressing down, lifting up, and dragging, and the model can also classify whether a touch was light or hard with 98 percent accuracy.
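The underlying idea, using ground-truth labels from a worn touch sensor to supervise a model that only sees the headset camera, can be illustrated with a minimal sketch. Everything below (the feature shapes, the synthetic placeholder data, and the off-the-shelf classifier) is an illustrative assumption, not the team's actual pipeline.

```python
# Hypothetical sketch: supervise a camera-based touch classifier with
# ground-truth labels from an on-skin touch sensor. All names, shapes,
# and the model choice are illustrative assumptions, not the EgoTouch
# implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in for per-frame features extracted from the headset camera
# (e.g. image patches around the fingertip). Here: random vectors.
n_frames, n_features = 5000, 64
camera_features = rng.normal(size=(n_frames, n_features))

# Stand-in for the custom touch sensor worn along the index finger and
# palm, which provides a ground-truth label for every camera frame:
# 0 = hovering (no contact), 1 = touching the skin.
touch_labels = rng.integers(0, 2, size=n_frames)

X_train, X_test, y_train, y_test = train_test_split(
    camera_features, touch_labels, test_size=0.2, random_state=0
)

# Train an off-the-shelf classifier on (camera feature, sensor label) pairs.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Evaluate touch/no-touch accuracy on held-out frames.
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

At deployment time the worn sensor would no longer be needed; only the standard headset camera and the trained model remain, which is the point of the approach.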
WHERE
Carnegie Mellon University, Pittsburgh, PA
WHY
Detecting variations in touch could enable developers to mimic touchscreen gestures on our skin. The system could turn AR/VR users' palms into touch-sensitive interfaces.
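As a loose illustration of what "touchscreen gestures on skin" could mean in practice, the sketch below maps a press-down / drag / lift-up sequence, plus a light-or-hard estimate, onto familiar gesture names. The event types, fields, and mapping rules are hypothetical, not part of EgoTouch.

```python
# Hypothetical mapping from low-level touch events to touchscreen-style
# gestures. Event names and the mapping logic are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    kind: str    # "press", "drag", or "lift"
    x: float     # estimated contact position on the palm
    y: float
    hard: bool   # True if the touch was classified as a hard press

def events_to_gesture(events: list[TouchEvent]) -> str:
    """Collapse a press ... lift sequence into a touchscreen-style gesture."""
    if not events or events[0].kind != "press" or events[-1].kind != "lift":
        return "none"
    if any(e.kind == "drag" for e in events):
        return "swipe"
    return "force-tap" if events[0].hard else "tap"

# Example: a light press followed immediately by a lift reads as a tap.
print(events_to_gesture([TouchEvent("press", 0.2, 0.5, False),
                         TouchEvent("lift", 0.2, 0.5, False)]))
```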
WHEN
Mollyn is exploring ways to use night vision cameras and nighttime illumination to enable the EgoTouch system to work in the dark. He's also collaborating with researchers to extend this touch-detection method to surfaces other than the skin.
For more information, contact Aaron Aupperlee at