Opinion: We need a way to tell useful mental health tech from digital snake oil

Prior to the early 20th century, America had no regulation of medications or food additives. Formaldehyde was used to preserve meat, morphine was included in infant “soothing syrups,” and marketing, not science, drove the promotion of tonics and medications.

The 1906 Pure Food and Drug Act was the first in a series of consumer protection laws focused on setting standards for safe and effective medications and food additives. This act, which ultimately led to establishing the Food and Drug Administration, was also known as Wiley’s Law, in tribute to Harvey Washington Wiley, a chemist who advocated for regulations that would protect the public.

With advances in pharmacology, the time had come for an agency to inform the public about the differences between medicine and snake oil. Wiley ran clinical trials with volunteers, known popularly as the “poison squad,” to test products, and he established the first process for reviewing new medicines. In the century that followed the passage of Wiley’s Law, the FDA’s regulatory framework became the gold standard for defining the safety and efficacy of new drugs.


We have arrived at a similar “Wiley” moment today for digital mental health. With the maturation of digital diagnostics and therapeutics, as well as a new generation of evidence-based psychotherapies delivered by digital platforms, both the public and developers would benefit from a process that separates useful products from the digital equivalent of snake oil. Currently, consumers have no way to navigate the sea of options, and developers have no standards to meet. In the absence of a regulatory agency or standard-setting group, digital mental health is in its “wild west” stage, in which marketing and scale trump rigor and evidence.

The term “digital mental health” can include at least three broad areas of practice. One is the explosion of apps for mental health, ranging from software for self-management to tools connecting users to coaching. These apps are largely direct-to-consumer and may not be covered by health insurance. Another is the development of diagnostics (including wearables) and therapeutics (including virtual reality platforms) that use hardware as well as software for clinical care, mostly for mood and anxiety disorders. And finally, digital mental health can include the vast array of companies, fueled by venture capital and private equity, that have exploded during the pandemic to improve access to medication and psychotherapy. While many of these companies use apps for delivering care, their main innovation is increasing access, often using matching algorithms that promise to connect “members” to the best therapist for their needs. These companies, with thousands of providers and hundreds of thousands of patients, are arguably the largest providers of mental health care in the nation, yet none were on the map a decade ago.


Innovation in each of these three areas has delivered important new products, some of which have proven safe and effective. For instance, virtual reality-based exposure therapy for phobias, apps to manage addiction, and a video game for ADHD have all demonstrated their value in clinical studies and, in some cases, received FDA approval. Meditation apps and coaching services have arguably helped millions with insomnia or anxiety. And the democratization of care via improved access with on-demand services has no doubt brought relief to many people who were on waiting lists or were unable or unwilling to seek therapy or medication in a brick-and-mortar office.

But how can one discriminate those products with value from those without evidence for efficacy or basic privacy protections?

Right now, there is no one agency that regulates digital mental health. Instead, there is a patchwork system. The FDA has reviewed a small number of digital therapeutics, but the agency is not well-matched for software that iterates on a weekly or monthly basis. With the completion of the Digital Health Software Pre-Certification Program and the release of its guidance on Clinical Decision Support in 2022, it seems likely that most digital mental health products will not be regulated by the FDA. Moreover, the FDA does not oversee psychotherapy, either online or in person, nor does it review apps for mental health that are marketed directly to consumers. The Drug Enforcement Administration has investigated telehealth companies for allegedly over-prescribing stimulants. The Federal Trade Commission has fined companies for false marketing. And, of course, credentialing and licensing bodies exist in every state, but these agencies provide certification; they do not regulate service delivery and are not equipped to review either software or hardware.

Several groups have released standards for digital mental health, including the Agency for Healthcare Research and Quality, the Digital Therapeutics Alliance, the National Institute for Health and Care Excellence, and the American Psychiatric Association’s App Advisor. While each of these efforts shows the feasibility of reviewing digital practices and products, none has established rules of the road in this landscape, for either mental health apps or telehealth practice, and none has established a process for review. One Mind PsyberGuide serves as a consumer’s guide by reviewing and scoring digital mental health products, but this nonprofit initiative has not scaled to review the thousands of products available, nor does it serve the payer community, which may not know how to conduct due diligence on these products.

Across these efforts there are common themes for establishing efficacy and safety: the design of the products, the evidence for engagement and clinical impact, privacy protection and data management, and integration with electronic health records and broader health care. Of course, differences abound on how to define engagement or what constitutes clinical impact, which is why a common standard would be so important for this field. Note that these standards can be iterative and supportive of innovation, unlike the approval process used for drugs, which is more of a pass-fail process. To assess emerging technologies for spaceflight, NASA created the Technology Readiness Level scale, which rates a technology’s maturity as an aid to its development.

The time has come for a new Wiley Law that establishes a process to oversee the three aspects of digital mental health (and let’s toss in national credentialing of therapists, as well). A new federal regulatory agency could engage a range of stakeholders, including mental health advocates and consumer groups. And it could define the gold standards for efficacy and safety while also reviewing products to ensure that they meet these standards. That is one approach, but federal agencies are generally not adept at reviewing innovative digital products, a new agency might take years to launch, and a federal approach risks stifling innovation.

A different and perhaps more efficient approach would be to establish an industry group or a consortium of commercial payers to vet products. Payers perform due diligence on many of the start-up companies and their digital solutions already. Why not formalize and integrate this effort to create a shared oversight board? This approach would require a level of cooperation that has generally been missing in this highly competitive industry. But the Digital Medicine Society has created precisely this kind of consortium with its Digital Health Regulatory Pathways project, largely focused outside of mental health. Could a similar group be established for digital mental health, adding the ratings of consumers to the diligence of payers?

My own preference would be a public-private effort established by an authoritative federal agency, such as the National Institutes of Health or FDA, working with both industry and consumer groups. Both NIH and FDA have congressionally mandated organizations (respectively, the Foundation for NIH and the Reagan-Udall Foundation) that support public-private initiatives. Thus far, “private” has been limited to pharmaceutical and biotech companies. In the current era, when tech companies are innovating in health care and billions of venture-capital dollars are funding new companies, the range of “private” partners would need to be expanded.

The time has come to build governance and accountability for this new industry, ensuring that consumers know what they are buying and developers know what they need to be building. Wiley could hardly have known how the world of drugs and food additives would change after 1906. With the rise of generative AI, we can hardly predict where digital mental health will be in a decade (or even next year!), but surely all will benefit from a process for defining safety and efficacy for this new industry.

Thomas R. Insel is a psychiatrist and neuroscientist; co-founder of Vanna Health, Humanest Care, and Mindstrong Health; a former director of the National Institute of Mental Health (2002-2015); and author of “Healing: Our Path From Mental Illness to Mental Health.” He is an adviser or board member for Akili, Alto Neuroscience, Cerebral, Compass Pathways PLC, Embodied, Koa Health, NeuraWell Therapeutics, Owl Insights, Uplift Health, and Valera Health, as well as several nonprofit foundations.

Source: STAT