Hongbian Li, a Research Associate in Professor Nanshu Lu’s lab. (Image: cockrell.utexas.edu)

A research team at The University of Texas at Austin has created a noninvasive electroencephalogram (EEG) sensor and installed it in a Meta VR headset that can be worn comfortably for long periods. The EEG measures the brain's electrical activity during immersive VR interactions.

The device could be used in many ways, from helping people with anxiety, to measuring the attention or mental stress of aviators using a flight simulator, to letting a human see through the eyes of a robot.

The best EEG devices today consist of a cap covered in electrodes, but a cap does not fit well under a VR headset. Individual electrodes also struggle to get a strong reading because hair blocks them from connecting with the scalp. The most popular electrodes are rigid and comb-shaped, inserting through the hair to contact the skin, which is uncomfortable for the user.

“All of these mainstream options have significant flaws that we tried to overcome with our system,” said Hongbian Li, a research associate in Professor Nanshu Lu’s lab.

For this project, the researchers created a spongy electrode made of soft, conductive materials that overcame those issues. The modified headset features electrodes across the top strap and forehead pad, a flexible circuit with conductive traces similar to Lu’s electronic tattoos, and an EEG recording device attached to the back of the headset.

This technology will play into another major research project at UT Austin: A new robot delivery network that will also serve as the largest study to date on human-robot interactions.

Lu is a part of that project, and the VR headsets will be worn by people either traveling with robots or watching from a remote "observatory." They will be able to see from the robot's perspective, and the device will also measure the mental load of this observation over long periods.

“If you can see through the eyes of the robot, it paints a clearer picture of how people are reacting to it and lets operators monitor their safety in case of potential accidents,” said Luis Sentis, co-lead of the robot delivery project and co-author of the VR EEG paper.

To test the viability of the VR EEG headset, the researchers created a game. Working with José del R. Millán, an expert in brain-machine interfaces, they developed a driving simulation in which the user presses a button to react to turn commands.

The EEG measures the users' brain activity as they make driving decisions, in this case showing how closely the subjects are paying attention.

The researchers have filed preliminary patent paperwork for the EEG, and they are open to partnering with VR companies to create a built-in version of the technology.

For more information, contact Nat Levy at 512-471-2129.