Can Digital Psychiatry Really Fill the Mental Health Care Gap?
Thousands of new tools with unproven results are entering the fold to help Americans in need
Imagine you’re in a room with a hundred American young adults, bright-eyed and bushy-tailed. Over their lifetimes, about 25 of them will have a stroke; 40 will get cancer. And an astounding half the room will develop a mental illness, if they haven’t done so already.
The United States’ mental health epidemic has been simmering for decades, with Covid-19 both illuminating and exacerbating the crisis. Given the social isolation, job insecurity and weakened support systems over the past few years, the World Health Organization estimated a 25 percent increase in anxiety and depression worldwide, with women and young people worst hit.
Much of the challenge lies in cavernous gaps in care: 158 million Americans live in areas with a shortage of mental health workers. And while the rise of telehealth and the creation of the 988 suicide and crisis lifeline have helped, they are only Band-Aids, barely holding together a system failing at its seams. “We’re at a point in the U.S. where it almost couldn’t get worse,” says Kenneth Pages, a Florida doctor and former chief of psychiatry at Tampa General Hospital. “Describe worse to me at this point.”
Daunting as they may be, these challenges have inspired a new wave of “digital psychiatry” solutions, offering automated promise where humans have fallen short. Largely developed by computer scientists and consumer tech entrepreneurs, the new field leverages smartphones and wearable sensors to provide mental health insights, attracting more than $10 billion in funding worldwide between 2020 and 2022, according to technology market intelligence firm CB Insights. John Torous, director of the Digital Psychiatry Division at Beth Israel Deaconess Medical Center in Boston, argues that “the mental health crisis we’re all talking about really requires more transformative solutions.”
Left open are big questions about this nascent field and the broader shift toward digital-first health care. What does it mean to remove humans from something as fundamentally interpersonal as our mental health? And is digital psychiatry worth all the hype?
A look at what’s out there
When you first consider digital psychiatry, you might think about the laundry list of apps ready to download on your smartphone: Calm, Headspace, Sanvello, Bearable, Happify and many others with similarly cheery names. These apps are personal assistants of sorts, helping users engage in guided meditations, mindfulness exercises, anxiety management and other activities, with customized wellness plans based on user preferences and lifestyles. While most of these apps offer free versions, accessing the full range of content—particularly the personalized tools—requires subscriptions ranging from $27.99 to $350 per year.
These companies advertise slogans like “Become the architect of your health” and promise that “You’ll be surprised at how soon you’ll start feeling a positive change,” but they are also quick to note that the apps are not meant for clinical use. “We are not a health care or medical device provider, nor should our products be considered medical advice,” Headspace emphasizes, before adding that its app “makes no claims, representations or guarantees that the Products provide a physical or therapeutic benefit.” Most others offer similar disclaimers.
That being said, some of these apps can be helpful. Recent data from nonclinical participants suggest that Calm and Headspace offer modest improvements across mindfulness, well-being, stress, anxiety and depression. Clinical psychologist Vara Saripalli says a lot of her patients already use these apps, and she even recommends some of them for patients who are anxious or want practice sorting through their thoughts and feelings. “As an adjunct where your provider is checking in about your use of one of these tools,” she continues, “that can be helpful.”
Beyond the consumer-facing apps, firms also offer software to help clinicians better care for their patients. Medical device company Neuronetics, for instance, created the TrakStar platform to help clinicians manage transcranial magnetic stimulation, an FDA-approved therapy for major depressive disorder, obsessive-compulsive disorder, migraines and smoking addiction. More specifically, TrakStar helps determine patient eligibility and insurance coverage, tracks patient-reported outcomes during treatment to assess efficacy and adverse events, and continues to monitor patients post-treatment through questionnaires in case they relapse. The platform notifies a provider if a patient gets worse so the provider can reengage with the patient.
“Many of our patients waited until they completely crashed into a deep depression in order to seek help, even if they had previously recovered,” says Cory Anderson, the company’s senior vice president of research and development and clinical. “What TrakStar is doing is monitoring these patients after their treatment to make sure they don’t crash.” He calls it an early warning system, with the ultimate goal being to expand health care capacity. In Anderson’s ideal world, mental health providers could quickly divert their attention to patients experiencing severe crises rather than being spread thin across all patients.
This mission to use digital tools to augment professional care is shared by academic researchers. Beth Israel’s Torous, for instance, invented MindLAMP, a digital psychiatry platform that collects data on sleep patterns, physical activity, physiological symptoms and call and text logs to offer patients customized mindfulness, meditation and breathing interventions. Although the app can take in data from wearable sensors, surveys and GPS tracking, Torous emphasizes that clinicians and patients collaboratively decide which particular data streams to collect and then interpret them together in the clinic. “We built it to be a more customizable, flexible way to use smartphones to augment care,” Torous says.
So far, this approach appears promising: Across India, China, Australia, Canada and the U.S., MindLAMP has been used to digitally provide therapy to patients with schizophrenia, track memory loss in patients with Alzheimer’s, and understand differences in the disease trajectories of bipolar disorder and depression. “If we can, in the future, start using algorithms—ones that are evidence-based—I think we can begin to offer people a lot more responsiveness and features on LAMP to help them feel better quicker,” says Torous.
Right now, MindLAMP is run by a research protocol without any investors, and there are no plans currently to spin the platform out into a business. “We’d like to keep it as a common tool that people can use,” Torous continues. “They can do replicable science in this space—they can add to it, augment it.” He wants to provide a free platform for other researchers to validate and build off, in a field sometimes devoid of data-driven solutions.
Like Torous, Paola Pedrelli, associate director of the Depression Clinical and Research Program at Massachusetts General Hospital, and Judith Law, CEO of Anxiety Canada, value these types of academia-led innovations in digital psychiatry. For the past seven years, Pedrelli has been working with Rosalind Picard at MIT to develop machine learning algorithms that detect the severity of depressive symptoms among patients. And since 2012, Law has been collaborating with Mayo Clinic, University of British Columbia, University of Waterloo and other institutions on MindShift CBT, an anxiety management coaching app. Grounded in cognitive behavioral therapy, the app aims to challenge users’ thoughts, beliefs and attitudes to improve their emotional well-being.
Pedrelli hopes that eventually, by collecting heart rate, sweat gland activity, temperature and movement from wearables, she and Picard will be able to prioritize patients experiencing acute relapses and proactively modify treatments before they fall into a deep depression. But in the meantime, MindShift CBT doesn’t collect physiological data and instead contains modules to educate users on anxiety and engage them in skill-building exercises to support coping. A distinctive aspect of this free app is its community forum where users can learn from others’ experiences in a moderated space, providing and receiving peer-to-peer support. According to Lance Rappaport, a clinical psychologist at the University of Windsor and senior author of an upcoming study on MindShift CBT, “anxiety, depressive symptoms and functional impairment decreased and quality of life increased” among a cohort of more than 200 people who used the app for 16 weeks.
For digital psychiatry to succeed, Law says, the field will need to build its evidence base, actually proving that these tools have a clinical benefit in users. And if they don’t, regulators may need to step in and hold companies accountable to produce the evidence. “If Calm, Headspace and all these other products, ultimately, are more interested in the evidence base versus profitability, then I think we’re headed in the right direction,” says Law.
A note of caution
Unfortunately, with billions of dollars of investor funding, some companies have tested ethical and legal boundaries in how they offer patient care. The platform Koko recently admitted to using artificial intelligence chatbots in place of humans to provide emotional support to customers without their consent. And last year, mental health telemedicine company Cerebral was placed under investigation by the Department of Justice for overprescribing the controlled substances Adderall and Xanax without requiring in-person evaluations. “Companies that are for-profit are going to cut corners,” says Saripalli. “I’m really concerned about the lack of quality of care that is going to proliferate the more these apps proliferate.”
Vanderbilt University’s Bradley Malin, an expert in biomedical informatics, offers similar concerns: “With VC support behind it, there’s this push toward quick return as fast as possible—grow, grow, grow.” With around 20,000 mental health apps currently on the market, ensuring these technologies are validated and demonstrate tangible benefit is of the utmost importance. And doing this properly requires extensive data collection, independent studies and replicated results.
But how much data is too much? Malin says, “It’s this push forward toward, ‘We don’t know what we’re looking for. And therefore, we’re just going to blitz it and collect as much as we want, and then we’re going to let the computer figure out the answer.’”
With this shotgun approach to data collection, data breaches, whether from internal mistakes or external hacking, become increasingly damaging. Cerebral had used pixel trackers—code that collects activity data—to monitor user engagement for four years; only in 2023 did the company discover that this data was being shared with Meta, TikTok and Google in a breach affecting 3.2 million patients. Similarly, a security flaw in the IT systems of Vastaamo, a Finnish psychotherapy provider once described as the “McDonald’s of psychotherapy,” allowed its entire patient database to be leaked to the internet, including email addresses, social security numbers and therapists’ notes. Around 30,000 people received ransom demands from hackers threatening to publish their private information.
And some companies have even shared data willfully. The Federal Trade Commission went after the online counseling service BetterHelp for pushing people to give sensitive health information while promising absolute privacy—but then BetterHelp handed that data over to Facebook, Pinterest, Snapchat, Criteo and other advertisers. BetterHelp has since agreed to a $7.8 million settlement for alleged data misuse. Digital psychiatry may promise mental health care from the privacy of people’s homes, but what does that privacy mean in a world of seemingly endless leaks?
According to Malin, any health care provider can look up a patient’s physical information—lab tests, imaging, vitals. But mental health information is known only to the clinicians with whom it is shared in consultation. And our thoughts are sensitive—they concern other people, events that have yet to happen and a world that only we can see. “It does make it very juicy information, for lack of a better term,” Malin adds. “The question is: How much support are you going to provide for personal rights and protection versus the end application?”
Such sensitive information leaves little room for error. “The penalties of being wrong are severe,” says Colin Walsh, an internal medicine physician at Vanderbilt. “If an algorithm says an individual is high-risk and they aren’t, they may receive an intervention that they don’t need.” Walsh brings up the example of the military, where these kinds of false positives can be career altering: “A commander might take that information and not want to send them on deployment.”
Already we’re seeing students forced to withdraw from college after university medical staff inform administrators of their conditions—and workers fired from their jobs after voluntarily disclosing their mental illnesses. With the propagation of digital psychiatry, providers, supervisors and administrators could gain access to even richer personalized data, collected through “routine” onboarding processes or employer-provided mental health services. While these data are usually de-identified, Walsh notes that they can always be re-identified. In his eyes, the rise of digital psychiatry could bring a rise in stigma and discrimination against those with psychiatric conditions.
A look ahead
For the time being, mental health professionals think it’s unlikely that digital psychiatry will fully replace human clinicians. Apart from the lack of scientific evidence supporting these technologies, apps are simply unable to provide a humanistic experience. “One of the biggest factors in successful therapy is the quality of the relationship between the individual and the therapist,” says Saripalli. “I don’t think you’re going to get a human personalized touch if this is your primary provision of treatment.”
If anything, digital psychiatry might exacerbate the very inequities it hopes to address. Indeed, left unregulated, mental health companies can profit off those who can’t afford traditional care by offering cheap, ineffective treatments. “At the higher end of the income spectrum, people are going to pay for a premium product, and there’s no question that in-person, one-on-one individualized attention is going to be superior,” Saripalli adds.
Nonetheless, everyone interviewed for this story believes that the digital psychiatry movement is far from slowing down—and that providers need to actively participate to ensure it doesn’t harm patients. While part of this movement is undoubtedly driven by hype and reckless profiteering, real potential exists for digital solutions to alleviate the burdens of a mental health care system on the brink of collapse. The question becomes how to identify these promising use cases and bring together mental health providers, data privacy officials and patients to ensure that we are progressing in an evidence-based, secure way.
“There’s this idea, ‘move fast and break things’—that old Silicon Valley mantra,” says Walsh. “In health care, that means people get hurt.”
Editor’s note, May 15, 2023: This article has been updated to add the name of the company, Neuronetics, that created the TrakStar platform.