
Therabot: Researchers develop AI tool to tackle mental health therapist shortage

"We need something different to meet this large need," says assistant professor of data science and psychiatry

By AFP
May 04, 2025
A woman takes a photo in front of heart-shaped light installations in Hong Kong, China, June 25, 2023. —Reuters

NEW YORK: Dartmouth College researchers think AI can provide trustworthy psychotherapy, setting their work apart from the unreliable and occasionally questionable mental health apps that are all over the market these days.

Therabot, the app, tackles the challenge of a severe shortage of mental health specialists.

According to Nick Jacobson, an assistant professor of data science and psychiatry at Dartmouth, even multiplying the current number of therapists tenfold would leave too few to meet demand.

"We need something different to meet this large need," Jacobson told AFP.

The Dartmouth team recently published a clinical study demonstrating Therabot's effectiveness in helping people with anxiety, depression and eating disorders.

A new trial is planned to compare Therabot’s results with conventional therapies.

The medical establishment appears receptive to such innovation.

Vaile Wright, senior director of health care innovation at the American Psychological Association (APA), described "a future where you will have an AI-generated chatbot rooted in science that is co-created by experts and developed for the purpose of addressing mental health."

Wright noted these applications "have a lot of promise, particularly if they are done responsibly and ethically," though she expressed concerns about potential harm to younger users.

Jacobson’s team has so far dedicated close to six years to developing Therabot, with safety and effectiveness as primary goals.

Michael Heinz, psychiatrist and project co-leader, believes rushing for profit would compromise safety.

The Dartmouth team is prioritising understanding how their digital therapist works and establishing trust.

They are also contemplating the creation of a nonprofit entity linked to Therabot to make digital therapy accessible to those who cannot afford conventional in-person help.

Cash cow?

Given its developers' cautious approach, Therabot could stand out in a marketplace of untested apps that claim to address loneliness, sadness and other issues.

According to Wright, many apps appear designed more to capture attention and generate revenue than improve mental health.

Such models keep people engaged by telling them what they want to hear, but young users often lack the savvy to realise they are being manipulated.

Darlene King, chair of the American Psychiatric Association's committee on mental health technology, acknowledged AI's potential for addressing mental health challenges but emphasised the need for more information before determining its true benefits and risks.

"There are still a lot of questions," King noted.

To minimise unexpected outcomes, the Therabot team went beyond mining therapy transcripts and training videos to fuel its AI app, manually creating simulated patient-caregiver conversations as well.

While the US Food and Drug Administration is theoretically responsible for regulating online mental health treatment, it does not certify medical devices or AI apps.

Instead, "the FDA may authorise their marketing after reviewing the appropriate pre-market submission," according to an agency spokesperson.

The FDA acknowledged that "digital mental health therapies have the potential to improve patient access to behavioural therapies".

Round-the-clock availability

Herbert Bay, CEO of Earkick, defends his startup’s AI therapist Panda as "super safe."

Bay says Earkick is conducting a clinical study of its digital therapist, which detects emotional crisis signs or suicidal ideation and sends help alerts.

"What happened with Character.AI couldn’t happen with us," said Bay, referring to a Florida case in which a mother claims a chatbot relationship contributed to her 14-year-old son's death by suicide.

AI, for now, is suited more for day-to-day mental health support than life-shaking breakdowns, according to Bay.

"Calling your therapist at two in the morning is just not possible," but a therapy chatbot remains always available, Bay noted.

One user named Darren, who declined to provide his last name, found ChatGPT helpful in managing his post-traumatic stress disorder, despite the OpenAI assistant not being designed specifically for mental health.

"I feel like it’s working for me," he said.

"I would recommend it to people who suffer from anxiety and are in distress."