EQ-Radio Can Detect Human Emotion With Wireless Signals

A new technology from MIT can infer a person's emotions from RF signals reflected off their body. EQ-Radio transmits an RF signal and analyzes its reflections off the body to recognize the person's emotional state. The key enabler underlying EQ-Radio is a new algorithm that extracts individual heartbeats from the wireless signal with an accuracy comparable to on-body ECG monitors. The extracted beats are then used to compute emotion-dependent features, which feed a machine-learning emotion classifier. Through a user study, the MIT team has demonstrated that EQ-Radio's emotion recognition accuracy is on par with state-of-the-art emotion recognition systems that require a person to be hooked up to an ECG monitor.
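The pipeline described above (wireless signal → individual heartbeats → emotion-dependent features) can be sketched in simplified form. This is an illustrative stand-in, not MIT's actual algorithm: it assumes the reflected signal's phase variation is available as a sampled waveform, isolates a plausible heartbeat band with a band-pass filter, and measures inter-beat intervals, which are the kind of timing features an emotion classifier could consume. The sampling rate, filter band, and peak spacing below are all assumptions chosen for the demo.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 100  # assumed sampling rate of the reflected-signal phase (Hz)

def bandpass(x, lo, hi, fs=FS, order=3):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def beat_intervals(phase_signal, fs=FS):
    """Isolate an assumed heartbeat band (0.8-2.5 Hz) and return
    inter-beat intervals in seconds -- a crude stand-in for
    EQ-Radio's beat-segmentation algorithm."""
    heart = bandpass(phase_signal, 0.8, 2.5, fs)
    peaks, _ = find_peaks(heart, distance=int(0.4 * fs))  # beats >= 0.4 s apart
    return np.diff(peaks) / fs

# Synthetic demo: a small 1.2 Hz "heartbeat" ripple riding on a much
# larger 0.25 Hz "breathing" wave, mimicking the mixed reflection.
t = np.arange(0, 30, 1 / FS)
signal = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)
ibis = beat_intervals(signal)
print(np.mean(ibis))  # mean interval, close to 1/1.2 Hz = ~0.83 s
```

The breathing component dominates the raw waveform, yet the filter recovers the beat timing cleanly; EQ-Radio's real contribution is doing this segmentation at ECG-comparable accuracy despite the heartbeat's tiny, variable signature in the RF reflection.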



Transcript

00:00:00 This video describes EQ-Radio, a new technology from Professor Katabi's group at MIT that can recognize people's emotions using wireless signals. Was there ever a moment when you looked at someone's face but could not figure out how they felt? What if your wireless router could tell people's emotions, even when they don't show them on their faces? This is exactly what our technology does. This is our device.

00:00:29 The device transmits a wireless signal that reflects off a person's body and comes back. It captures these reflections and analyzes them to infer the person's emotions. Specifically, our algorithms zoom in on the wireless reflections to extract the minute variations due to breathing and heartbeat. We then further analyze these reflections to extract the breathing signal and the heartbeat signal. We then zoom in more on these signals

00:00:54 to obtain individual heartbeat and breathing cycles, and feed these as features into a machine-learning algorithm to recognize that person's emotions. The device can automatically tell whether the person is excited, angry, sad, or happy. Our device can recognize emotions with an accuracy of 87%, while relying purely on wireless signals. We envision that EQ-Radio can be used in many applications. It can recognize a person's emotions

00:01:21 while they are watching a movie, and provide moviemakers with better tools to evaluate the viewing experience. It can also allow smart environments to detect emotional states like depression and alert us, so we can improve our emotional well-being. It can even enable these environments to react to our moods and adjust lighting or music accordingly. To learn more about this research, please

00:01:49 check out our website.
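The four emotions the transcript names (excited, angry, sad, happy) correspond to the four quadrants of the arousal-valence plane, a standard way to organize emotions in affective computing. As a toy illustration of that final classification step, the sketch below maps an arousal score and a valence score to one of the four labels. The scores themselves are placeholders: EQ-Radio derives its actual features from heartbeat shape and timing, and uses a trained classifier rather than fixed thresholds.

```python
# Toy quadrant classifier over the arousal-valence plane.
# (arousal > 0?, valence > 0?) -> emotion label; the quadrant
# assignments follow the standard circumplex layout.
EMOTIONS = {
    (True, True): "excited",   # high arousal, positive valence
    (True, False): "angry",    # high arousal, negative valence
    (False, False): "sad",     # low arousal, negative valence
    (False, True): "happy",    # low arousal, positive valence
}

def classify(arousal: float, valence: float) -> str:
    """Return the emotion quadrant for a (hypothetical) score pair."""
    return EMOTIONS[(arousal > 0, valence > 0)]

print(classify(0.8, 0.9))    # excited
print(classify(0.7, -0.6))   # angry
print(classify(-0.5, -0.4))  # sad
print(classify(-0.3, 0.5))   # happy
```

A real system replaces the hard thresholds with a classifier trained on labeled physiological data, which is how the 87% accuracy figure mentioned in the video would be measured.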