Hongbian Li, a research associate in Professor Nanshu Lu's lab. (Image: cockrell.utexas.edu)

Researchers at the University of Texas at Austin have modified a commercial virtual reality (VR) headset, giving it the ability to measure brain activity and examine how we react to hints, stressors, and other outside forces.

The team created a noninvasive electroencephalography (EEG) sensor and installed it in a Meta VR headset that can be worn comfortably for long periods. The sensor measures the brain’s electrical activity during immersive VR interactions.
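How does a raw EEG trace become a measure of brain activity? A common first step, in EEG work generally, is to estimate how much power the voltage signal carries in classic frequency bands such as alpha (8–12 Hz). The sketch below is purely illustrative and is not the UT Austin team's code; the sampling rate, band limits, and the random placeholder signal are all assumptions.

```python
# Illustrative only -- not the UT Austin team's pipeline. Shows how power in
# the alpha band (8-12 Hz) can be estimated from one electrode's EEG trace.
import numpy as np
from scipy.signal import welch

fs = 250                          # assumed sampling rate (Hz)
eeg = np.random.randn(fs * 60)    # placeholder: one minute of "recorded" EEG

# Estimate the power spectral density with Welch's method
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

# Integrate the PSD over the alpha band
alpha = (freqs >= 8) & (freqs <= 12)
alpha_power = np.trapz(psd[alpha], freqs[alpha])
print(f"Alpha-band power: {alpha_power:.4f} (arbitrary units)")
```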

The device could be used in many ways, Nanshu Lu, a professor in the Cockrell School of Engineering’s Department of Aerospace Engineering and Engineering Mechanics who led the research, told Tech Briefs in an exclusive interview, which can be read in full below. “This work is motivated by the widespread need for understanding users’ mental and emotional status during gaming, training, and operation of drones or robots with VR.”

The research was published in Soft Science.

The pairing of VR and EEG sensors has made its way into the commercial sphere already, but today’s devices are expensive; the team says its electrodes are more comfortable for the user, extending the potential wearing time and opening up additional applications.

Today’s best EEG devices consist of a cap covered in electrodes, but a cap does not fit well under a VR headset. Individual electrodes also struggle to get a strong reading because hair blocks them from making contact with the scalp. The most popular electrodes are rigid and comb-shaped, pushing through the hair to reach the skin, which is uncomfortable for the user.

For this project, the researchers created spongy electrodes made of soft, conductive materials that overcome those issues. The modified headset features electrodes across the top strap and forehead pad, a flexible circuit with conductive traces, and an EEG recording device attached to the back of the headset.

“We have developed soft conductive foam electrodes interconnected by a flexible circuit,” Lu said in the interview. “The wetted foams can access the scalp for EEG sensing despite hair and are intrinsically soft — hence mechanically imperceptible even after an hour of wear. Our system is also applicable to various existing VR headsets without damaging them.”

This technology will play into another major research project at UT Austin: a new robot-delivery network that will also serve as the largest study to date on human-robot interactions. Lu is part of that project, too, and the VR headsets will be worn by people either traveling alongside the robots or watching from a remote “observatory.” They will be able to see the world from the robot’s perspective, and the device will measure the mental load of that observation over long periods.
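As one hedged illustration of what measuring mental load over long sessions can look like in software: EEG workload studies often track the ratio of theta-band to alpha-band power across sliding windows, with a rising theta/alpha ratio taken as a rough proxy for cognitive load. The function below is a hypothetical sketch under those assumptions, not the project’s actual metric; the window length and sampling rate are invented for the example.

```python
# Hypothetical workload index -- not the project's actual metric. Computes the
# theta/alpha band-power ratio over consecutive windows of an EEG recording.
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    """Integrate the Welch PSD of x between lo and hi Hz."""
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), fs * 2))
    band = (freqs >= lo) & (freqs <= hi)
    return np.trapz(psd[band], freqs[band])

def workload_index(eeg, fs=250, window_s=10):
    """Theta (4-7 Hz) / alpha (8-12 Hz) power per non-overlapping window."""
    step = fs * window_s
    return [band_power(eeg[i:i + step], fs, 4, 7) /
            band_power(eeg[i:i + step], fs, 8, 12)
            for i in range(0, len(eeg) - step + 1, step)]

# Example: ten minutes of placeholder data at 250 Hz
ratios = workload_index(np.random.randn(250 * 600))
```

Windows with higher ratios would flag stretches where an observer may be working harder, which is the kind of signal a long-duration monitoring study could watch for.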

“If you can see through the eyes of the robot, it paints a clearer picture of how people are reacting to it and lets operators monitor their safety in case of potential accidents,” said Professor and co-author Luis Sentis, who’s also co-leading the robot-delivery project.

The researchers have filed preliminary patent paperwork for the EEG, and they’re open to partnering with VR companies to create a built-in version of the technology.

Here is the interview with Lu, edited for length and clarity.

Tech Briefs: I’m sure there were too many to count, but what was the biggest technical challenge you faced while developing this VR technology?

Lu: Electroencephalograms need to be measured through electrodes in direct contact with the scalp, but hair is in the way. State-of-the-art EEG-sensing VR systems use comb-shaped electrodes to reach through the hair, but they are fixed to a specialized VR headset and uncomfortable for long-term wear.

Tech Briefs: Please talk about the joint venture: how it’s coming along; how the work is intertwined; how everyone’s working together; data sharing; etc.; anything you can divulge.

Lu: Here's a report of our GCR efforts at UT. Future robots could be operated or supervised by humans wearing VR headsets. Designing human-machine interfaces that are consistent with the physical, cognitive, and sensory abilities of humans requires real-time sensing and assessment of human status. That's why our VR-EEG technology is relevant to this GCR effort.

Tech Briefs: What are your next steps? Any other future research/work/etc. on the horizon?

Lu: Expanding on this foundational success, we hope to advance our VR-EEG technology to simultaneous multichannel EEG recordings. This enhancement aims to more accurately assess a user’s vigilance and mental capacity during VR operations. Enhancing hair compatibility is another goal.

Tech Briefs: Do you have any advice for engineers aiming to bring their ideas to fruition?

Lu: Personally, I think asking the right questions and working with the right people are critical.