Hardwiring the Brain
fNIRS technology creates an increasingly sophisticated connection between brain and computer.
Inside a building on the northwest side of Stanford University (USA) sits a lineup of unusual cars: a self-driving white Audi and a squat, shiny red vehicle plastered with gray sunlight-harvesting silicon squares, to name a couple. The facility, with an exterior painted green to match the surrounding California oaks, is a garage dressed up as a laboratory.
Stanford neuroscientist Jennifer Bruno runs brain experiments using the Toyota Avalon in the back. The matte-black sedan, stripped of its engine and rigged with a computer, is parked in front of enormous monitors that stretch from floor to ceiling. To run an experiment, Bruno's team turns on the monitors to project a driving simulation. Press the brake in the driver's seat, and the pixels of the avatar car respond exactly as a real car would. You're essentially playing a gigantic 270-degree arcade game. "You get a very immersive, realistic driving experience," says Bruno. It's about as realistic as driving gets without risking a collision.
In 2015, Bruno and her collaborators performed a study to observe the brain activity of humans driving a car. In a series of trials spanning about 30 minutes per person, they instructed study participants to change lanes as they drove in the simulation. In random trials, the steering wheel behaved in reverse: rotate it to the right, and the simulated car swerved to the left. Bruno's team monitored the drivers' brain activity as they adapted to the different steering settings, in hopes that the results could help the automotive industry design safer cars.
To monitor the participants' brains, Bruno's team used a device resembling a shower cap embedded with a network of LEDs and sensors. Recruits for the study donned the cap, clambered into the Toyota's driver's seat, and pretended to drive along a simulated city street or forested highway. Each LED on the cap, pressed against the wearer's scalp, beamed light onto their head with roughly the brightness of a laser pointer. Some of that light penetrated the wearer's skull and scattered back out, where detectors on the cap measured it.
The technology exploited in the cap is called functional near-infrared spectroscopy, or fNIRS (pronounced "eff-nears"). FNIRS devices use multiple wavelengths of near-infrared light to detect changes in brain levels of hemoglobin, the protein in blood that transports oxygen. When neurons fire, oxygen-rich blood flows to that area of the brain. That oxygen is bound to hemoglobin proteins, and oxygenated hemoglobin absorbs near-infrared light differently than deoxygenated hemoglobin. FNIRS devices exploit that difference to determine the amounts of oxygenated and deoxygenated hemoglobin in particular regions of the brain. Higher levels of oxygenated hemoglobin indicate more brain activity.
The technique was pioneered in 1977 by Duke University biologist Frans Jöbsis, who first identified this capability of near-infrared light by applying light-carrying optical fibers to the shaved temples of cats. Like magnetic resonance imaging (MRI), fNIRS monitors brain activity noninvasively.
But unlike MRI, which requires a hulking machine, fNIRS has gained popularity as a brain-imaging technique because of its small size. Cap- and headband-sized models are commercially available. And, unlike the MRI machine, fNIRS is not limited to use in clinics and laboratories. "The portability is super important," says Bruno. Because fNIRS has been miniaturized, researchers can, for the first time, measure people's brain activity in realistic environments rather than in artificially controlled lab settings.
Today, portable fNIRS devices consist of a network of light sources and detectors placed on a person's head. The sources beam near-infrared light, 650 to 900 nanometers in wavelength, past the scalp and into the brain. This range of wavelengths, known as the "optical window," penetrates biological tissue much more deeply than visible light. The device measures the amount of light that scatters back to each detector, and from that information it can calculate the amounts of oxygenated and deoxygenated hemoglobin in the area.
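To make that calculation concrete, here is a minimal sketch, in Python, of the modified Beer-Lambert law step that fNIRS analyses typically use to convert detected light intensities into changes in oxygenated and deoxygenated hemoglobin. The extinction coefficients, source-detector distance, and pathlength factors below are assumed, illustrative values, not parameters of any device described in this article.

```python
# Minimal sketch: turn detected light intensities at two wavelengths into
# changes in oxygenated (HbO) and deoxygenated (HbR) hemoglobin via the
# modified Beer-Lambert law. All numerical values are illustrative.
import numpy as np

# Two wavelengths inside the 650-900 nm optical window.
wavelengths_nm = [760, 850]

# Approximate molar extinction coefficients [1/(mM*cm)] for HbO and HbR
# at those wavelengths (real analyses use published tables).
eps = np.array([
    [0.59, 1.55],   # 760 nm: [HbO, HbR]
    [1.06, 0.69],   # 850 nm: [HbO, HbR]
])

source_detector_cm = 3.0      # typical source-detector separation (assumed)
dpf = np.array([6.0, 5.5])    # assumed differential pathlength factors

def hemoglobin_changes(baseline_intensity, measured_intensity):
    """Return (delta_HbO, delta_HbR) in mM from light intensities measured
    at the two wavelengths."""
    baseline_intensity = np.asarray(baseline_intensity, dtype=float)
    measured_intensity = np.asarray(measured_intensity, dtype=float)

    # Change in optical density at each wavelength.
    delta_od = -np.log10(measured_intensity / baseline_intensity)

    # delta_od = eps @ [dHbO, dHbR] * distance * DPF  ->  solve the 2x2 system.
    a = eps * (source_detector_cm * dpf)[:, None]
    delta_hbo, delta_hbr = np.linalg.solve(a, delta_od)
    return delta_hbo, delta_hbr

# Example: detected light dims slightly at 850 nm and brightens at 760 nm,
# consistent with more oxygenated blood arriving in the region.
print(hemoglobin_changes([1.00, 1.00], [1.002, 0.997]))
```

A real system applies this kind of conversion channel by channel across the whole grid of source-detector pairs, using published extinction-coefficient tables.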
Despite a decades-long history, fNIRS devices have become miniaturized only in the last handful of years, says David Boas, director of the Neurophotonics Center at Boston University and editor in chief of the SPIE journal Neurophotonics. When Boas first started working in the field in the 1990s, machines applied infrared light via long, heavy fiber-optic cables. But as electronics shrank, especially devices for digitizing optical signals, the light sources could be affixed directly to the person's head, and fNIRS systems became more portable.
David Boas and his collaborators are integrating a network of sensors and LEDs onto a mesh cap for portable brain imaging.
The fNIRS cap used in Bruno's car study, for example, hooked up to a tablet that the researchers placed in the backseat of the car. "Researchers who study brain activity are getting excited because they can have their subjects doing more natural tasks. They can even be walking around," says Boas. "It really increases the number of studies you can do." But by some definitions, Bruno's cap doesn't qualify as a wearable because it still needs to connect to a tablet. Boas's group began developing a truly wearable fNIRS device two years ago. Such designs sit mostly on the head and hook up, via wiring or even wirelessly, to a smartphone-sized control unit that can be tucked into a pocket or worn elsewhere on the body.
The devices' portability also permits Bruno to collect brain data out of the lab. She and her team have set up a station in a hospital about a 90-minute drive south of Stanford in the heavily agricultural Salinas Valley, California, where much of the U.S.'s lettuce crop is grown. They have collected fNIRS data on the brains of about 150 children and adolescents in the area to find out how pesticide use might affect their development, says Bruno.
These increasingly portable designs have enabled the budding field of neuroergonomics, "studying the brain at work in everyday life," explains biomedical engineer Hasan Ayaz of Drexel University. Neuroergonomics researchers study how the brain functions outside controlled lab settings, often to help companies improve their product designs. Ayaz has worked with a coffee machine company, for example, to observe brain activity as a user interacts with the device. The company wanted to use the data to improve its user interface, says Ayaz.
In addition, Ayaz and his collaborators think that fNIRS could help people whose jobs require them to digest and apply a lot of information quickly, such as airplane pilots, surgeons, or air traffic controllers. They have already begun studying pilots' brains using fNIRS. In one study, they split the pilots into two groups: one group flew an actual plane, while the other used a flight simulator on the ground. Both groups wore a battery-operated, headband-shaped fNIRS device, originally developed by Ayaz's lab and now commercially available via a spinoff startup, the Maryland-based fNIR Devices.
Curiously, Ayaz's group found that the brains of pilots flying the real plane were more active than those of pilots in the simulator. Researchers still don't fully understand how differently people's brains behave in natural versus controlled settings, and the study provides evidence that lab-based results do not necessarily translate to real-world situations.
One eventual goal of the studies is to use fNIRS to create a dynamic user interface for the pilots that changes depending on their mental state. Ayaz is using fNIRS to study the pilots' so-called cognitive workload, a number derived from physical signals that quantifies the pilots' brain activity. For example, if an fNIRS signal indicates that the pilot's brain is extremely active, the various devices in the cockpit could be programmed to present less information to the pilot. "We want to inform the machine about the operator's state so it can adapt itself to the user," says Ayaz.
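As a rough illustration of that idea, the toy sketch below smooths a hypothetical 0-to-1 workload index derived from fNIRS and switches a display between more and less detail. The thresholds, the class name, and the index itself are invented for illustration; this is not Ayaz's actual system.

```python
# Toy sketch of an interface that adapts to an fNIRS-derived workload index.
# The index, thresholds, and smoothing window are hypothetical.
from collections import deque

class AdaptiveCockpitDisplay:
    def __init__(self, window_size=30, high_load=0.7, low_load=0.3):
        self.recent = deque(maxlen=window_size)  # recent workload samples
        self.high_load = high_load
        self.low_load = low_load
        self.detail_level = "normal"

    def update(self, workload_index):
        """workload_index: a 0-1 value assumed to summarize prefrontal activity."""
        self.recent.append(workload_index)
        avg = sum(self.recent) / len(self.recent)

        # Show less information when the operator's brain is very active,
        # more when workload is low.
        if avg > self.high_load:
            self.detail_level = "minimal"
        elif avg < self.low_load:
            self.detail_level = "detailed"
        else:
            self.detail_level = "normal"
        return self.detail_level

display = AdaptiveCockpitDisplay()
for sample in [0.4, 0.6, 0.8, 0.9, 0.85]:
    print(sample, "->", display.update(sample))
```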
A pilot wears an fNIRS device during a flight simulation.
Researchers also want to use fNIRS signals in brain-computer interfaces (BCIs), systems that let people control machines directly with their brain activity, developed for people with ALS or other mobility impairments. Such a system could be programmed to display "yes" or "no," or to move a robotic arm, in response to specific fNIRS signals. "Right now we are at the proof-of-concept stage, demonstrating that [fNIRS BCIs] are feasible," says Ayaz. "There is still a lot of work to be done to create reliable systems for everyday use."
Ayaz's group recently performed a study in which participants listened to a story while wearing an fNIRS device. By analyzing the resulting signals, the researchers could identify which audio clip a person had heard 75 percent of the time. This cracks open the door toward BCIs capable of transcribing perceived speech, they write in a 2018 paper published in Frontiers in Neuroscience.
FNIRS technology builds on conventional BCIs based on electroencephalography (EEG). EEG devices, which can be worn on the head like Bruno's cap, measure the electrical signals produced when neurons fire. An fNIRS device, on the other hand, measures the oxygenated blood that rushes to a region of the brain a few seconds after its neurons fire. Thus, EEG measures brain activity more directly and rapidly than fNIRS.
However, fNIRS has some advantages over EEG. It is more difficult to pinpoint where in the brain electrical signals originate than it is for optical signals. EEG devices can also be more complicated to use: most EEG electrodes require a sticky gel to make good electrical contact, whereas fNIRS sensors can work directly on the skin. In addition, because of its slow response, an fNIRS signal might be more trustworthy as input for an on-off switch in a BCI, says SPIE Member Alexander von Lühmann, a biomedical engineer at the Technical University of Berlin, Germany. EEG signals change rapidly and are more likely to mistakenly toggle a machine on and off.
Ultimately, the two methods complement each other rather than compete, says von Lühmann. They measure different things, so one technique can help confirm the measurements of the other. The techniques are also sensitive to different sources of noise: fNIRS sensors pick up ambient light and fluctuations in blood pressure, while EEG devices pick up electrical signals from the wearer's contracting muscles, so each technique can compensate for the noisiness of the other. Researchers like von Lühmann are building wearable brain-imaging devices that record EEG and fNIRS signals simultaneously, in the hope that the resulting data will be cleaner to interpret. "The hardware is more complicated, but in the end you have more information that you can exploit in signal processing," says von Lühmann.
Von Lühmann has built a prototype headset that combines EEG with fNIRS, a model he calls M3BA. The device can be integrated into any wearable, such as a cap or a headband, with a movable battery pack that can rest on the user's neck. The system communicates with a computer via Bluetooth.
A big challenge is processing and interpreting the measured signals. Researchers are developing algorithmic methods to sort artifacts from real signals. Von Lühmann's hybrid device includes two accelerometers, whose measurements can be used to remove motion artifacts from raw data.
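One simple way to use such accelerometer readings is as a noise reference, regressing motion-correlated components out of the optical signal. The sketch below illustrates that idea with ordinary least squares on synthetic data; it is a simplified stand-in for the adaptive filtering used in practice, not von Lühmann's actual pipeline.

```python
# Sketch: remove motion-correlated components from an fNIRS channel by
# regressing it against accelerometer signals (ordinary least squares).
# Synthetic data; real pipelines often use adaptive or Kalman filtering.
import numpy as np

rng = np.random.default_rng(0)
n = 2000                                  # samples (e.g., 10 Hz for 200 s)
t = np.arange(n) / 10.0

brain = 0.5 * np.sin(2 * np.pi * 0.05 * t)            # slow hemodynamic signal
accel = rng.standard_normal((n, 3)).cumsum(axis=0)     # 3-axis head motion
accel -= accel.mean(axis=0)
motion_artifact = accel @ np.array([0.02, -0.015, 0.01])
measured = brain + motion_artifact + 0.05 * rng.standard_normal(n)

# Design matrix: accelerometer channels plus an intercept.
X = np.column_stack([accel, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, measured, rcond=None)
cleaned = measured - X @ coef

print("correlation with true signal, before:", np.corrcoef(measured, brain)[0, 1].round(3))
print("correlation with true signal, after: ", np.corrcoef(cleaned, brain)[0, 1].round(3))
```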
Von Lühmann's M3BA wearable device combines fNIRS with EEG imaging capability.
Signal-processing techniques vary depending on the application. For example, researchers developing BCIs do not necessarily need to understand the biological activity that produces the signals. To execute a binary command, the BCI just needs to discriminate one type of brain signal from another. To that end, von Lühmann is developing machine-learning algorithms to classify different types of signals.
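In its simplest form, that classification step is a two-class problem over features extracted from a short window of the hemodynamic signal. The sketch below illustrates the idea with linear discriminant analysis from scikit-learn; the feature choices, the synthetic training trials, and the channel count are assumptions made for the example.

```python
# Sketch of the classification step in a binary fNIRS BCI: map features
# extracted from a short window of hemoglobin signals to "yes" or "no".
# Training data here is synthetic; a real system would use recorded trials.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n_channels = 8

def extract_features(window):
    """window: (samples, channels) array of oxygenated-hemoglobin changes.
    Features: per-channel mean and slope over the window."""
    mean = window.mean(axis=0)
    slope = np.polyfit(np.arange(len(window)), window, 1)[0]
    return np.concatenate([mean, slope])

def synthetic_trial(label):
    # "Yes" trials get a small extra activation on the first few channels.
    base = 0.1 * rng.standard_normal((50, n_channels))
    if label == 1:
        base[:, :3] += np.linspace(0, 0.3, 50)[:, None]
    return base

labels = np.array([0, 1] * 40)
features = np.array([extract_features(synthetic_trial(y)) for y in labels])

clf = LinearDiscriminantAnalysis().fit(features, labels)

new_trial = synthetic_trial(1)
answer = "yes" if clf.predict([extract_features(new_trial)])[0] == 1 else "no"
print(answer)
```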
On the other hand, physiology researchers do need to trace signals to their biological origins or function. In Bruno's study, her team managed to localize changes in the drivers' brain activity as they drove. The researchers are also looking to connect brain activity to other physiological responses: they could detect drivers' pupils dilating when their brains were more active, for example. "We are still in the process of better understanding the fNIRS signals themselves," says Ayaz.
And while Bruno's cap suffices for her car study, she's still looking for a better design. FNIRS can only image the surface of the brain, so she'd like something that could see deeper. And it's still not comfortable enough for some of the studies that Bruno wants to do, such as studies of brain activity in children with autism. "For a child who has autism, a tag on a shirt can be irritating," she says. "These caps are way better than an MRI machine, but they're still little discs on your head. It gets irritating after a while."
As both hardware and software improve, fNIRS will be used not just for brain research, but also to help people do their jobs or create user-adaptable household products, says Ayaz. In the future, the technology could enable signals from your brain to enhance everyday experiences.
Sophia Chen contributes to Wired, Science, and Physics Girl. She is a freelance science writer based in Tucson, Arizona.