
Artificial Intelligence is quickly becoming a part of our everyday lives. Health care is no exception: a growing number of AI-powered apps now claim to offer mental health support and advice that people could once access only through wellness and medical providers.
But is it really a good idea to replace human therapists with AI-driven chatbots?
Chatbot programs -- like ChatGPT, Woebot and Earkick -- work by analyzing user prompts such as "I am depressed," "I'm feeling anxious," and "Nobody cares about me," and responding with advice, suggestions or follow-up questions meant to calm the user or reduce stress, such as deep breathing exercises.
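To make that prompt-and-response pattern concrete, here is a minimal sketch of a keyword-matching "supportive" responder in Python. It is purely illustrative: the keywords, replies and disclaimer are invented for this example, and real products like ChatGPT, Woebot and Earkick are built on large language models rather than hand-written rules.

```python
# Purely illustrative sketch of the prompt-and-response pattern described above:
# match keywords in a user's message and reply with a canned suggestion or a
# follow-up question. This is NOT how ChatGPT, Woebot or Earkick actually work;
# all keywords and responses here are invented for the example.

RESPONSES = [
    (("depressed", "hopeless"),
     "I'm sorry you're feeling this way. Would you like to talk about what's been weighing on you?"),
    (("anxious", "panic", "stressed"),
     "That sounds hard. A slow breathing exercise can help: inhale for 4 seconds, hold for 4, exhale for 6."),
    (("nobody cares", "alone", "lonely"),
     "Feeling unseen is painful. Is there one person you could reach out to today?"),
]

DISCLAIMER = ("I'm just a program, not a therapist. "
              "If you're struggling, please talk to a human professional.")


def reply(message: str) -> str:
    """Return a canned supportive response based on simple keyword matching."""
    text = message.lower()
    for keywords, response in RESPONSES:
        if any(keyword in text for keyword in keywords):
            return f"{response}\n{DISCLAIMER}"
    # No keyword matched: fall back to an open-ended follow-up question.
    return f"Can you tell me more about how you're feeling?\n{DISCLAIMER}"


if __name__ == "__main__":
    print(reply("I'm feeling anxious about tomorrow."))
```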
The goal is to "engage in empathetic conversations, analyze what users are saying and offer responses that aim to be supportive, informative and therapeutic," Keisha Saunders-Waldron, a licensed clinical mental health counselor, told Forbes Health.
While chatbots can be a tool to fill the void left by a shortage of therapists and help people when professionals are unavailable, experts warn that they should be used to support therapy, not replace treatment.
Thomas Heston, a family medicine physician at the University of Washington, said chatbot programs are especially concerning because they can be customized to assume personas, including that of a mental health counselor, and carry on conversations that give the impression of intelligence without any real expertise in the specialty.
"Chatbot hobbyists creating these bots need to be aware that this isn't a game. Their models are being used by people with real mental health problems, and they should begin the interaction by giving the caveat: I'm just a robot. If you have real issues, talk to a human," Heston said in a statement. "I'm very optimistic about the potential of AI in health care, but it can get out of control, especially when it's providing mental health advice."
Wade Reiner, a clinical assistant professor at the University of Washington, said the greatest limitation of chatbots in mental health is that they're largely text-based, and text alone is not enough to render a judgment about a patient. Responses can also become repetitive and leave users in the same place they started.
"Clinicians need to see the patient," Reiner said. "When we see the patient, we're doing more than just listening to what they say. We're analyzing their appearance, their behavior, the flow of their thoughts. And we can ask clarifying questions."
"Bit by bit, AI may be able to do more of those analyses, but I think that will take some time. For one AI to be able to do all those things will take quite a long time," Reiner added.
Supporters of chatbot therapy say the apps make services available to people who would otherwise face financial and logistical barriers to traditional care. They also note that some people may be more willing to seek help through a chat-based app, without the judgment of a human on the other end. Even so, they caution that chatbots should be used as a form of self-help, not as an alternative to clinical treatment.
A 2022 analysis of multiple research studies found that AI-based methods can improve the outcomes of psychological interventions and reduce mental health symptoms. However, researchers agree that more robust studies are needed before AI products can be recommended as therapeutic tools.