The titans of Silicon Valley say that the brain-computer interface revolution is coming, and neurotech devices will soon meld mind and machine, allowing us to communicate effortlessly with our computers — and even one another — just by using our thoughts.
But I believe these prognostications are unlikely to come to fruition anytime soon.
Elon Musk invested $100 million into his neurotech startup, Neuralink, to develop an implantable device he has referred to as a “Fitbit in your skull with tiny wires.” Kernel recently unveiled a version of its brain-recording helmet, with its founder predicting that the device would be in every home by 2033. And Facebook is working on brain-computer interface (BCI) technology to enable brain-to-text typing.
In this new world of private neurotech development, company demos are live-streamed on YouTube and have the flavor of techno-optimism that involves proclamations about a future we have yet to see — but one that we are assured will come to pass. Data are sparse; rhetoric about making the world a better place is heavy.
I am a neuroethicist, someone who studies ethical and social implications of advances in neurotechnology. So I frequently get asked about its development. Should we be worried that companies like Facebook, Neuralink, Kernel, and others — helmed by individuals who have previously launched paradigm-shifting technology — are working on capturing data from our brains?
My answer: I doubt we will have accurate, mind-reading consumer devices in the near future. There are practical constraints that will likely hamper the development and adoption of these devices. These limitations — more on them in a minute — are nowhere to be seen in media coverage or industry proclamations like Neuralink’s recent live demo, in which Musk and other employees expressed hopes that their product would relieve human suffering, provide superhuman vision, allow for telepathy, cure paralysis, solve and prevent practically every disease, upload memories, explain consciousness, remove fear, and achieve symbiosis with artificial intelligence.
Neuroscience is far from understanding how the mind works — much less having the ability to decode it. In many studies of brain-computer interfaces, it appears as if a device is essentially reading someone’s thoughts, à la Uri Geller. But a deeper look often reveals a method to the magic. For example, many researchers use electrophysiological neural signals as proxies for user communication. Here’s a crude analogy: If an Apple Watch user were instructed to flick her wrist to the right as a proxy for saying “no,” and to the left for saying “yes,” we could decode her communication by using data from the watch’s accelerometer. Similarly, in BCI research, users are often instructed to pay attention to a particular part of the screen or to imagine performing a specific action as a proxy for “yes” or “no.” Impressive, but not a true decoding of thought.
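The wrist-flick analogy can be made concrete with a toy sketch. Everything below is hypothetical and for illustration only — the function name, the sign convention, and the accelerometer readings are invented, not drawn from any real device API. The point is that the “decoder” never touches a thought; it merely classifies a proxy signal the user was trained to produce.

```python
# Hypothetical sketch of proxy "decoding": the device classifies a
# trained movement (accelerometer trace), not the thought behind it.

def decode_response(x_accel_samples):
    """Map an accelerometer trace to 'yes'/'no' via a simple proxy rule.

    Assumed convention for illustration: a net flick to the left
    (negative x) means 'yes'; a net flick to the right means 'no'.
    """
    net = sum(x_accel_samples)  # crude summary of overall movement
    if net < 0:
        return "yes"      # leftward flick
    elif net > 0:
        return "no"       # rightward flick
    return "unclear"      # no decisive movement detected

# A trace dominated by leftward (negative) motion decodes as "yes"
print(decode_response([-0.2, -0.5, -0.1, 0.05]))
```

The same logic underlies many laboratory BCIs: swap the accelerometer for an EEG channel and the wrist flick for an imagined movement, and the system still classifies a proxy, not a mind.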
It is unlikely that the public will adopt a consumer device requiring neurosurgery. Today, the most accurate brain-computer interfaces use recordings from electrodes that have been implanted into the brain, rather than recordings from outside the skull. But such implantation requires neurosurgery. Although neurotech companies are working to develop safer methods of implantation — Neuralink, for example, is building a robot that can inject tiny electrode threads into the brain — there will never be a risk-free method of implanting a recording device into the human brain.
Musk has envisaged Neuralink as akin to LASIK, a cosmetic procedure for which an individual assumes a small surgical risk in order to eliminate the need to wear glasses. But a neurotech device would have to add significant value for a consumer to get one implanted in her skull.
Just because we can collect brain data doesn’t mean that data will add value for consumers. Consumer brain-recording devices have been on the market for roughly 15 years. They were initially marketed for users to control objects such as computer cursors, foam balls, and even a toy helicopter with their thoughts. Today they are more commonly advertised for “wellness.” But these devices have never become mainstream, likely because they are not entirely accurate and consumers have found little use for them.
This failure to find a meaningful use for brain data raises questions about what, exactly, consumers will do with even more neural information. Will the time-domain functional near-infrared spectroscopy measurements that Kernel is working on add enough value to our daily lives that we will be willing to walk around while wearing brain-sensing helmets?
Helmets and other headgear face an uphill battle to adoption. Consumers have shown time and again that they are reluctant to adopt products that look funny (ahem, Google Glass). The problem with brain-computer interfaces is that measuring neural signals usually requires a contraption that sits on the skull or in it. Aside from socially acceptable headwear like headphones and baseball caps, there are few contraptions that consumers are likely to readily embrace. This barrier, which is often glossed over, is readily illustrated by looking at the history of failed consumer neurotechnology devices.
It may be more taxing to control a device with a BCI than without it. As I type this, I’m not paying attention to the movement of my fingers across the keyboard. So, while typing with my brain may be a neat party trick, in a gimmicky “look, ma, no hands” sort of way, will it be practical for healthy individuals to use their focused, conscious attention to move cursors and peck out letters on a keyboard? A neurotech device would need to demonstrate superiority over current methods of human-computer interaction to gain market traction.
Though these limitations are not insurmountable, there are few, if any, brain-computer interfaces in development that circumvent them. So while much of the media coverage about these devices has centered on privacy concerns, I believe that we need to be realistic about the relative risks. The reams of personal data being gathered about each of us — from our phones, web browsers, credit cards, and smart homes — are currently being aggregated and sold with little oversight. Some of this information, particularly my search engine history, emails, and notes, is more revealing about who I am than my neural data may ever be.
The hype over brain privacy may distract us from the more pragmatic, albeit less sexy, issues that may arise in the near term. My colleague Robert Thibault and I have argued that the problem with consumer brain-computer interfaces on the market today is not that they can record rich, private, revealing information from the brain. It’s that they are falsely claiming to be able to do such things, and consumers might confuse bunk science with real science. Similarly, future devices may be oversold to consumers without clear evidence of effectiveness.
Whatever the future holds for consumer brain-computer interfaces, two groups will likely be winners as a result of Silicon Valley’s obsession with neurotechnology. The first group is scientists who study the brain — the only group that has a demonstrable interest in all kinds of neural signals. More accurate and mobile recording devices could lead to more neural data that can be measured “in the wild” instead of on undergraduates using bulky laboratory machinery. The second group is neurosurgeons and their patients, as technological innovation in neurosurgical devices may prove a side benefit of Musk’s desire to merge humanity with AI.
It will take time to realize these benefits. We’re still smack in the middle of a neurotech bubble fueled by venture capital, one that will inevitably yield an increasing number of prophecies about our sci-fi future. So the next time you hear a tech entrepreneur predict that brain-computer interfaces will liberate humanity, remember that these devices are not yet on — or in — our heads, and may not be anytime soon.
Anna Wexler is an assistant professor of medical ethics and health policy at the University of Pennsylvania’s Perelman School of Medicine.