Ultrasonic Gesture Recognizer for Portable Electronic Devices
Researchers at the University of California, Berkeley have developed an ultrasonic gesture recognition system that can track a user's movements and translate them into inputs to an electronic device. Their ultrasonic 3D range sensor system uses batch-fabricated micromachined aluminum nitride ultrasonic transducer arrays and custom CMOS electronics. This technology may be useful in the development of practical gesture-controlled computer interfaces. Optical 3D imagers for gesture recognition, such as Microsoft Kinect, suffer from large size and high power consumption. Their performance depends on ambient illumination, and they generally cannot operate in sunlight. Ultrasonic gesture recognition systems measure reflected sound waves to turn gestures into inputs, offering performance comparable to optical systems with lower power consumption and fewer environmental restrictions.
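The core measurement behind this approach is pulse-echo ranging: the sensor emits an ultrasonic pulse and times how long the echo takes to return. A minimal sketch of that conversion (not the Berkeley code; the function name and example values are illustrative):

```python
# Hypothetical sketch of pulse-echo ultrasonic ranging.
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 degrees C

def echo_time_to_distance(round_trip_s: float) -> float:
    """Convert a round-trip echo time (seconds) to target distance in meters.

    The pulse travels to the object and back, so the one-way
    distance is half the round-trip path.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# An object about 17 cm from the sensor returns an echo after ~1 ms:
print(echo_time_to_distance(0.001))  # ~0.17 m
```

Because sound travels roughly a million times slower than light, these round-trip times fall in the easily measurable millisecond range, which is one reason acoustic time-of-flight hardware can be simpler and lower-power than optical time-of-flight.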
Transcript
00:00:29 We are very excited by the gesture recognizer. It combines technology developed at UC Berkeley in the Sensor and Actuator Center and at the Swarm Lab. By focusing on the power and size constraints of mobile devices, we've created a new and natural way of interacting with electronics. For a long time, people have used light to detect objects, like a web camera works, but
00:00:54 using light and cameras to detect objects in 3D requires a lot of power and computation. Now, inspired by medical ultrasound, we've made micromachined ultrasonic transducers that operate in air. The best part is the sensor is really small and really low-power: whereas the camera takes one watt to record video, our sensor takes 400 microwatts to do 3D range-finding.
00:01:18 That's low enough power to run the system for 30 hours on a battery this small. It's so small you might not even be able to see it. We've built a prototype unit that demonstrates this technology. This is the ultrasound chip, and this is the custom chip that sends and receives electrical signals from the ultrasound chip. We use an array of tiny ultrasound transducers to send a pulse
00:01:39 of sound waves into the environment. We drive the sensor and it moves up and down, pushing the air back and forth and creating sound waves. The sound waves travel out through the chip and away from the transducers. Those waves bounce off objects in the environment, and the echoes return to the transducer array, which measures the time it took for the echoes to return. From the time of flight,
00:01:59 we find the location of the objects relative to the sensor. This allows us to enable new user interfaces, like flipping through a photo gallery without even touching the screen. This is the type of technology we expect to move beyond the borders of the university and into numerous applications that will leverage the small size and low power dissipation of
00:02:19 this technology. We believe that by improving the way we interact with our devices and the way we interact with each other, we can make a better, smarter, more connected world.
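The transcript describes locating objects from per-element echo times across the transducer array. One standard way to do this is multilateration: each element's range defines a sphere around that element, and the target sits where the spheres intersect. A hedged sketch under assumed element positions and ranges (this is an illustration of the general technique, not the Berkeley implementation):

```python
# Illustrative time-of-flight multilateration for a small transducer array.
# Element positions and the target location below are made up for the demo.
import numpy as np

def locate(positions: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate a 3D target position from per-element ranges.

    Linearizes the sphere equations |x - p_i|^2 = r_i^2 by subtracting
    the first element's equation, then solves the resulting linear
    system for x by least squares.
    """
    p0, r0 = positions[0], ranges[0]
    A = 2.0 * (positions[1:] - p0)
    b = (np.sum(positions[1:] ** 2, axis=1) - np.dot(p0, p0)
         - ranges[1:] ** 2 + r0 ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Four elements in meters; one is offset out of plane so all three
# coordinates stay observable in this simple linearized sketch.
elements = np.array([[0.00, 0.00, 0.000],
                     [0.01, 0.00, 0.000],
                     [0.00, 0.01, 0.000],
                     [0.01, 0.01, 0.005]])
target = np.array([0.05, 0.02, 0.30])  # a hand ~30 cm above the array
ranges = np.linalg.norm(elements - target, axis=1)  # ideal noise-free ranges
print(locate(elements, ranges))  # recovers approximately (0.05, 0.02, 0.30)
```

With real echo data the ranges are noisy, so more elements and the least-squares fit help average out timing error; a planar array would additionally need a prior (e.g. targets are always in front of the chip) to resolve the mirror ambiguity about its plane.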

