Opinion: Applied neuroscience needs a task force to craft effective governance

In the United States alone, more than 100 million people are affected by at least one neurological disease. These conditions, which range from Alzheimer’s disease to depression and Parkinson’s disease, cost the health care system almost $800 billion per year. The toll is far higher if you add in the almost unquantifiable financial and emotional costs of diminished quality of life and caretaking.

While the brain has historically been difficult to study directly, applied neuroscience is on the verge of transformative breakthroughs that could provide enormous benefits, but also cause serious harms.

Technologies such as brain-computer interfaces provide unprecedented power to influence or control brain processes through noninvasive devices attached to the outside of the head or invasive devices such as electrodes or implants inserted into the brain. These devices can stimulate the brain to affect movement, sensations, or behavior, and may eventually repair previously permanent brain and nervous system injuries. But they may also monitor an individual’s private thoughts or emotions.

Brain-computer interfaces are not brand new — the first such device was implanted in a human brain two decades ago. Electrodes implanted in the brain have been used to treat more than 100,000 people with Parkinson’s disease or epilepsy. But the technology is on the verge of a great leap forward in both capabilities and applications as a result of public applied neuroscience efforts such as the National Institutes of Health’s BRAIN Initiative and private efforts such as Elon Musk’s Neuralink.

While applied neuroscience holds tremendous promise for treating people with various neurological conditions, it also raises longer-term concerns about potential intrusions into privacy, autonomy, and freedom. It may be possible, at some point in the future, for companies and governments to use these technologies to monitor, influence, or even control individuals’ thoughts, emotions, and behavior.

That’s why it is urgent and necessary to put in place appropriate guardrails now to prevent unethical or unsafe applications of the technology at both the national and international levels. Neurotechnology cannot afford to repeat the mistakes of social media, whose tremendous benefits are being jeopardized by the failure to address up front potential harms and misuse. At the same time, governance must be based on realistic projections of a technology’s capabilities — new revolutionary technologies are often subject to unrealistic hype and unfounded fears of dystopian futures. The establishment of realistic guardrails must also be accompanied by a high-level strategy for advancing the technology as fast as possible in order to achieve its potential benefits and improve health and wellness for many people.

To develop such a comprehensive oversight approach, we call for the creation of a high-level White House task force to craft a roadmap for the effective governance of applied neuroscience technologies. It would be similar to the Electronic Commerce Working Group established in the late 1990s during the Clinton-Gore administration to address the emergence of the internet and electronic commerce, which one of us (D.B.) chaired.

A working group on neurotechnologies should be headquartered in the White House with a small staff and a defined, limited mandate. It should operate for 12 months with an optional three-month extension. Like the Electronic Commerce Working Group, it should be chaired by the current Vice President and include representatives of key federal agencies, including the Departments of Commerce, Defense, Health and Human Services, Justice, Labor, State, and Veterans Affairs, as well as representatives from the Food and Drug Administration, National Institutes of Health, Federal Trade Commission, and Office of the U.S. Trade Representative.

The defined tasks of the working group should be to:

  • Produce a set of principles or guidelines for promoting positive civilian uses of neurotechnology as well as a set of rules to prevent adverse outcomes. These principles or guidelines would need to include processes to address factors such as privacy, autonomy, enhancement, fairness, and equity that are traditionally outside the jurisdiction of regulatory agencies.
  • Commission a study by the National Academies of Sciences, Engineering, and Medicine on the benefits and risks of neurotechnology for both civilian and national security applications.
  • Have the National Security Council create a set of options on how best to create a viable international governance framework for the optimal positive use of neurotechnology. Options should include an international convention or treaty, new transitional soft governance tools, and the use of regulations such as export controls and tariff and trade measures to ensure the ethical uses of powerful new neurotechnologies.
  • Prepare a report on how to actively engage or regulate the private sector in securing ethical uses and outcomes of neurotechnology.

President Biden must act now to both advance — and, when necessary, restrain — brain research and thereby accelerate the gains and mitigate the pains of such technologies. As sage observers from Roy Amara to Bill Gates have noted, people tend to overestimate the impacts of emerging technologies in the short term and underestimate their impacts in the long term. Applied neuroscience fits this paradigm.

The governance decisions made for neurotechnology today will significantly determine the quality and well-being of society for decades to come.

David Beier is a managing director of Bay City Capital, a former senior executive at Amgen and Genentech, and former chief domestic policy advisor to Vice President Al Gore. Lucille Nalbach Tournas is a graduate student and lecturer on global governance, law, and regulation of neurotechnologies at Arizona State University’s School of Life Sciences. Gary Marchant is a professor of law and faculty director of a program on law, science and technology at Arizona State University’s Sandra Day O’Connor College of Law.

Source: STAT