When we talk about operating “on autopilot,” we usually mean acting with little thought or effort, often resulting in error. It turns out actual autopilot systems can cause actual pilots to behave in this way.
“Early in my career, the concern in aviation was about stress and workload, but automation was becoming more of a concern,” says Alan Pope, who began his long stint at Langley Research Center in 1980. As automated flight control systems took over more and more of the moment-to-moment operations in the cockpit, pilots could grow bored, complacent, and disengaged—conditions that are just as dangerous as being overwhelmed, Pope explains.
By the mid-1990s, researchers at NASA and the Federal Aviation Administration, including Pope, were investigating the causes of these “hazardous states of awareness” and looking for ways to objectively identify and measure a pilot’s cognitive state. “We worked on finding physiological signals that showed underload,” he says.
Pope led a team looking for a way to use electroencephalographic (EEG) data—that is, brainwave readings—to quantify a subject’s level of engagement. The result, which they described in a 1995 paper, was an “engagement index” that’s still used today: the power of high-frequency beta waves, which indicate attention, is divided by the combined power of the lower-frequency alpha and theta waves that accompany relaxation, to arrive at a measure of engagement.
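In code, the index is a one-line ratio of band powers. A minimal sketch (the numeric band-power values below are illustrative only, not from the paper):

```python
def engagement_index(beta_power, alpha_power, theta_power):
    """Engagement index from Pope's 1995 paper: beta / (alpha + theta).

    Inputs are EEG frequency-band powers (e.g., from a power spectral
    density estimate). A higher beta power relative to the combined
    alpha and theta power indicates greater engagement.
    """
    return beta_power / (alpha_power + theta_power)

# Illustrative band powers: an "engaged" reading vs. a "relaxed" one.
print(engagement_index(beta_power=8.0, alpha_power=3.0, theta_power=1.0))  # 2.0
print(engagement_index(beta_power=2.0, alpha_power=6.0, theta_power=4.0))  # 0.2
```

In practice the band powers would come from a spectral analysis of the raw EEG signal; the ratio itself is all the index requires.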
An incident later that year, in which a series of pilot errors ended with an American Airlines flight crashing into a mountain on approach to Cali, Colombia, killing 159 people, intensified interest in the subject. The crash is mentioned more than once in a 2001 paper in which Pope and two other researchers describe a collaboration between Langley and Old Dominion University that used the engagement index in an attempt to train subjects to regulate their own engagement level.
The training relied on what’s known as a biofeedback loop. The researchers let the subjects in one group watch their own rating on the engagement index, on a 1-to-6 scale, while they performed a task, and told them to try to maintain a level of 3 or 4. Other groups were provided with incorrect feedback on engagement or none at all. After a week, all the participants were asked to perform the task again. The subjects in the group that had received the biofeedback training significantly outperformed the others while remaining closer to their baseline, “normal” engagement level. They also rated their workload markedly lower than those in the other groups did.
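The feedback step described above amounts to binning the raw index onto the 1-to-6 display scale and telling the subject whether they are in the target 3-4 band. A hypothetical sketch of that loop (the bin thresholds here are invented for illustration; the 2001 paper does not specify the mapping):

```python
# Illustrative thresholds separating the six display levels.
BIN_EDGES = [0.4, 0.8, 1.2, 1.6, 2.0]

def display_level(index):
    """Map a raw engagement index onto the 1-to-6 feedback scale."""
    level = 1
    for edge in BIN_EDGES:
        if index >= edge:
            level += 1
    return level

def in_target_band(index):
    """True when the displayed level is 3 or 4, the band the
    biofeedback-group subjects were asked to maintain."""
    return display_level(index) in (3, 4)

print(display_level(1.0))   # 3
print(in_target_band(1.0))  # True
print(in_target_band(2.5))  # False
```

The control groups in the study would see either a scrambled version of this output or nothing; only the accurate-feedback group could learn the association between their internal state and the displayed level.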
They had learned to regulate responses that are normally beyond our control. The paper concluded that this “physiological self-regulation” training could help pilots manage and maintain their attention.
When a group of Harvard University graduate students founded BrainCo Inc. in Cambridge, Massachusetts, in the spring of 2015, they didn’t know anything about NASA’s work on monitoring and mitigating hazardous states of awareness. But they did know a little about EEG readings as they related to attention and biofeedback.
The fledgling company wanted to develop a practical way to use EEG readings to monitor students’ attention in the classroom and, ultimately, to help people with attention deficit hyperactivity disorder (ADHD) gain more control over their concentration. A prototype EEG-reading, Wi-Fi-connected headband had been built by the time Max Newlon was brought on board as a research scientist about a year later, but the company wanted to improve its software. Newlon was tasked with finding an algorithm to accurately determine cognitive state based on brainwave readings. He hit on the 2001 Langley/Old Dominion experiment and its use of Pope’s engagement index formula.
“I did a pretty extensive literature review and found that the NASA algorithm was the best fit for what we were doing,” he says.
BrainCo’s Focus EDU is a classroom system that lets the teacher monitor a class’s attention level in real time, as an average or as a “heat map” of the classroom, and it generates an after-class report on the group as a whole, as well as individual students’ attention levels.
An LED light on the front of each student’s headband can indicate one of three attention levels, although Newlon says the feature is generally turned off during class time. “You don’t want to distract students with their friends’ headband colors.”
Focus Family uses the same hardware with a smartphone app to create a sort of digital study buddy for use at home, Newlon says, noting that students can see their attention levels in real time and generate a report after a study session. A couple of video games are included, which users control by modulating their brainwaves. These assess abilities like sustained attention and task switching. All the results can be shared and compared in an online community.
The soon-to-be-released LUCY product uses the same technology in a slightly different form. It is intended to use biofeedback to improve focus and attention control, much like the work Pope and colleagues describe in the paper Newlon came across. It will include video games for this purpose, but the app will also let users remotely pair the EEG-reading headband with real-world objects, allowing ordinary electronics to be controlled with the mind.
BrainCo shipped its first order—20,000 Focus EDU and Family units—to a Chinese distributor in the spring of 2018. The company’s founder and several team members are from China and had connections there, Newlon says, noting that BrainCo is now working to break into the U.S. market, with pilot studies having begun in mid-2018.
The classroom version gives teachers overall student attention reports to see what’s getting kids’ attention, where they’re getting lost, and even whether they’re relaxed during breaks. Individual student reports let them see who’s having trouble paying attention and when.
“If you go up a layer, an administrator can see who’s good at engaging students and look into what they’re doing well and how that can be shared with others,” Newlon points out.
The home version can record students’ brain activity while they study to create a report that allows insight into their distractibility, when they should schedule breaks, and what subjects engage them the most.
“One thing we’re hoping to use this for is to detect users’ interest,” Newlon says. “There’s a subjective component people already experience. We want to make it visible and put a number on it so people can learn more about what’s going on in their brains.”
The app also generates tips to improve studying.
BrainCo especially hopes LUCY can benefit those with ADHD, he says. “There’s been some pretty good research on neurofeedback for ADHD, but we’re aiming to be the first to get FDA approval.” He specifies that the company wouldn’t claim to treat or cure ADHD, but that anyone with trouble controlling attention could benefit. LUCY began in-house clinical trials in early 2018.
Pope partnered with Eastern Virginia Medical School to compare the effects of video game-based biofeedback and more standard biofeedback training on 22 children with ADHD. While the video game training was rated to be more fun, a report the team gave in 2001 concluded that “both the video game and standard neurofeedback improved the functioning of children with ADHD substantially above the benefits of medication.”
“It’s a little-known secret that a lot of benefit can be drawn from biofeedback,” says Pope, who used the technique in his practice as a clinical psychologist. “If someone has information about their internal state, they can learn to change it. It requires training over time and multiple sessions, but you can learn that skill.”
“It’s almost like flexing this mental muscle over and over again,” Newlon says, noting that some studies have found that improvements in concentration can last months after training. He says the company hit on using brain waves to manipulate household objects because ADHD is most common in children. “What do kids want to do? They want to control stuff in their room. How cool is that?” To that end, the company has remotely paired prototypes with devices like lights and robots such that the devices respond differently to different attention levels.
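The pairing Newlon describes amounts to mapping the headband’s attention reading onto a device behavior. A hypothetical sketch of that idea (the `Lamp` class, thresholds, and brightness values are invented for illustration; BrainCo has not published its control scheme):

```python
class Lamp:
    """Stand-in for a paired household device."""
    def __init__(self):
        self.brightness = 0  # percent

    def set_brightness(self, pct):
        self.brightness = max(0, min(100, pct))

def update_device(lamp, attention):
    """Map an attention score in [0, 1] onto lamp brightness:
    dim when distracted, brighter as focus increases."""
    if attention < 0.3:
        lamp.set_brightness(10)
    elif attention < 0.7:
        lamp.set_brightness(50)
    else:
        lamp.set_brightness(100)

lamp = Lamp()
update_device(lamp, 0.85)
print(lamp.brightness)  # 100
```

Any device with a controllable setting could stand in for the lamp; the key idea is that the attention level, not a button press, drives the state change.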
In addition to its product line, though, BrainCo also sees a future in research. With multiple students using each device sent to schools, Newlon figures it won’t take long for the first shipment alone to net a database of EEG readings from a million users, larger than any that exists today. “We think there might be a really big opportunity here. What can we learn if we do a big-data analysis?”
He emphasizes that the company will prioritize privacy and data security, having already started working with leaders in the field to ensure all applications meet the industry’s top standards for protecting data.
The research could go in any direction. “What we really want to do is talk to the smartest people: ‘We have all this data. What questions do you have, and how can we help answer them?’” Newlon says.
“And all of this is based on the NASA engagement index.”
While NASA research on biofeedback lagged after the mid-2000s, Pope says, a new appreciation for the importance of crew cognitive states in the last few years has renewed interest in the field. Now retired and a NASA distinguished research associate, Pope is a consultant on the Crew State Monitoring project that began at Langley around 2013. The team is applying machine learning to EEG data, as well as other physiological signs like heartbeat and respiration, to further refine characterization of cognitive states and extract more information from the data.