NICK DANLAG – STAFF WRITER
Nita Farahany started her Chautauqua lecture with an old adage: Don't ever put anything in writing you don't want to see on the front page of The New York Times. That advice, she said, is now outdated; soon, people should not even think, let alone write, anything they don't want broadcast.
Humans have thousands of thoughts a day, she said, with different neurons firing depending on which part of the brain is engaged.
“As a thought takes form, like a math calculation or a number or words, neurons are interacting in the brain, creating minuscule electrical discharges,” Farahany said. “When you’re in a dominant mental state like relaxation, hundreds of thousands of neurons are firing in the brain, creating concurrent electrical discharges and characteristic patterns that can be measured.”
At 10:30 a.m. on Aug. 19, Farahany’s pre-recorded lecture was streamed into the Amphitheater, concluding the Chautauqua Lecture Series’ Week Eight theme, “The Human Brain: Our Greatest Mystery.” It was the first Amp lecture of the season delivered virtually due to the speaker’s health concerns.
Farahany, a professor of law and philosophy at Duke University, founding director of The Duke Initiative for Science & Society, and chair of the Duke master's program in bioethics and science policy, discussed potential uses of brain monitoring and the ethical debates around companies and governments deploying it. She argued that society should strive to protect individual citizens' self-determination over the technology, while limiting organizations' ability to take away people's rights to their own thoughts.
Farahany showed the audience a recording of her own brain activity, which was an array of different colors constantly changing shades.
“What’s really interesting isn’t just how pretty my brain is. I do think I have a lovely brain,” Farahany said. “Actually, the fact is that those characteristic patterns can be decoded and parsed in great detail.”
Companies are heavily investing in brain-monitoring technologies, such as video games controlled by a person's mind, cars that alert drivers when they are drowsy, robotic limbs that move on signals from the brain, swarms of military drones directed by thought, and visual feedback that shows people when they are focused or in a state of meditation.
“This can be really powerful for things like ADHD. Training with a video game using one of these headsets, trying to get a golf ball into a hole, it turns out, can be more powerful if you’re able to complete about 20 hours of one of these games than even being on some ADHD drugs,” Farahany said.
Jack Gallant, one of her favorite neuroscientists, published a study in 2011 in which people lay in an fMRI machine and watched a series of YouTube videos. The machine recorded the subjects' brain activity, and Gallant was able to "decode" it, Farahany said, to recreate a rough image of what they were viewing.
“He essentially built a dictionary, a library, an algorithm that can start to predict if you see this type of blood-oxygenation level, this is what it means in the brain. This is where the images are that you see,” Farahany said. “He then reconstructed, just based on brain activity alone, what the images were. They are a little bit blurry. It’s not a perfect representation, but it’s pretty remarkable.”
Previously, only doctors had the technology to see into the brain’s inner workings, but now, and more so in the future, everyday citizens will have access.
“Which raises the question: Should you have direct access to this information? Should it go through an intermediary, somebody who can interpret it for you?” Farahany said. “Is there a right to self-access? If there is, is there a corollary right to be able to do more than just access your brain? Can you change it as well?”
In sports, physical enhancements such as performance-enhancing drugs are considered to give athletes an unfair advantage; the common belief is that competitions should be won through innate physical gifts and hard work.
“Is this just what we are doing in society, or what we’re doing in humanity?” Farahany said. “Aren’t we always trying to enhance our own brains? Is that just part of what it means to flourish as a human being?”
International chess competitions, Farahany said, have banned the use of memory-enhancing pills, because they give advantages to the user, and chess players are required to take drug tests before playing.
And she said sports industries are not the only ones interested in brain-monitoring technologies; many tech companies, such as Facebook, are "all-in." The social media giant recently released an update on its brain-computer interface research. In the study, researchers worked with people who already had electrodes implanted in their brains to control epilepsy, and they were able to correctly predict what a person was going to say before they said it, a capability that could help people with debilitating paralysis.
Facebook has already bought CTRL-labs, a leading company in the field, and is working on augmented reality games, which are video games that interact with real life, such as Pokémon GO; virtual reality games that use only the players' thoughts to play; and computers that require no keyboard or mouse.
Elon Musk is another figure in the brain-monitoring industry. Recently, his company Neuralink released a video of a monkey playing a simple video game through an implanted brain chip.
Some of the efforts have been notably well-intentioned, Farahany said. IKEA, for example, wanted to create affordable art rugs for art lovers, but when the products were released, lines turned into brawls, and pieces were immediately resold online for thousands of dollars, a practice known as scalping.
So, IKEA used a technology that tracked a person’s mental reaction to art, and customers could only buy pieces that they had a noticeable mental engagement with. The process worked, Farahany said, and no fights occurred and none of the art was scalped later.
Though IKEA and its customers had a good experience with the application of the technology, Farahany said neurological surveillance needs to be limited.
“I, for one, am not ready to hand over the keys to my brain to be part of the greater surveillance economy that has been expanding so rapidly in recent years,” Farahany said. “Especially since it isn’t just corporations, but also governments, (which) are all in when it comes to investments in the human brain.”
Every brain is unique, and Farahany said each person has a “biometric identity” that could be used in the future to unlock a laptop, or for government identification purposes.
Some governments are already implementing the technology. In one classroom in China, students wore headsets with lights that shone based on how focused they were. The lights were not only visible on the students' heads; the headsets were also connected to a console in front of the teacher, so students could be easily monitored. The data was shared with the school and parents. The Chinese government also had access to the data, but Farahany said it was unclear how the information was being used.
“How does that affect human development? How does it quell the possibility of any dissent or resistance, the ability to fantasize?” Farahany said. “How would that impact your ability to truly flourish, to grow, to think something novel and different?”
She said this will likely have a deep impact on children’s creativity and imaginations, and that most innovators have had their greatest ideas while their minds were wandering.
“I worry that we may have a slight increase in productivity; that is, the efficiency of the number of hours a person spends paying attention, and a plummeting result of the quality of their output as creativity starts to decline,” Farahany said.
Society, Farahany said, needs more protection around brain monitoring. Currently, there are no explicit protections around neurological surveillance in the U.S. Constitution.
But she also stressed people's right to choose what kind of technology they would like for themselves, such as a keyboard-less computer or a more effective ADHD treatment.
“The time has come to recognize cognitive liberty so that we can embrace the promise of neurotechnology, while safeguarding human flourishing,” Farahany said.
As part of the Q-and-A session, which Farahany conducted live from her home in North Carolina, Matt Ewalt, vice president and Emily and Richard Smucker Chair for Education, asked Farahany to talk more about IKEA using brain monitoring to sell art. When she discussed IKEA earlier in the lecture, as the audience reacted, an alarm went off in the distance.
“I also feel like,” Ewalt said, “that was Chautauqua just raising an alarm because of our love for the arts, as well as the implications of what that means, and measuring what loving art means, and who gets to define that.”
Farahany said the technology was first developed for museums: Scientists tracked the brain patterns of people viewing different works of art, then asked them afterward how they felt about each. They noticed an innate reaction within the brain for pieces a person liked. The technology was also used to recommend similar works of art. Farahany said it raises the interesting question of how love is defined.
“What does that do to how we think about our own appreciation of art, that you have to have an objective measure in your brain that somebody can visualize, and that that’s the true mark of what counts as appreciation?” Farahany said. “Our experience of appreciating art will start to be narrowed. It’s only if you love it that it counts. Well, actually, if I’m disgusted by it, then the artist has achieved something that they were trying to achieve as well, potentially.”