Is ChatGPT Smart Enough To Practice Mental Health Therapy?
Online therapy is a booming industry, and now AI-powered chatbots are starting to roll into the territory.
The recent arrival of ChatGPT and its enhanced language ability to deal with human interaction might soon be a revolutionary alternative to divulging your darkest mental health issues to a human therapist.
How close is an AI-driven chatbot to replacing your human mental health specialist? A lot closer than you might think. In fact, some aspects of AI-powered therapy are already here.
Some software developers and mental health practitioners are already reaching for the pause button. Others in the medical and computer industries are pushing the panic button.
TechNewsWorld recently surveyed several dozen mental health experts and bot developers actively involved with artificial intelligence chatbot projects. Some respondents rejected the reliability of AI tools as replacements for human therapists. Others paired hopes for improved algorithms down the road with current practices they said are already useful and effective.
What Is ChatGPT?
ChatGPT reached 100 million active users in January to become the fastest-growing consumer application in history. Launched by OpenAI in November 2022 as a prototype, it is user-friendly, requiring no advanced technical knowledge or understanding of AI and conversational technology.
Since then, the beta product has become explosively popular and integrated into numerous uses involving language and computer coding.
The product’s name, ChatGPT, stands for Chat Generative Pre-trained Transformer. It is a chatbot built atop OpenAI’s GPT family of large language models. Its popularity results from its ability to deliver detailed responses and articulate answers across many knowledge domains.
Some mental health websites use chatbots to set appointments, answer FAQs about services and policies, and store customer data for later convenience, according to Carolina Estevez, a clinical psychologist at Infinite Recovery.
“However, they have not been used for the therapy itself because the technology is not sophisticated enough to capture the nuances needed. It would not be able to give sound counseling, proper diagnosis, or even determine how a person is feeling,” Estevez told TechNewsWorld.
From Chatbot to Digital Therapist
To be clear, ChatGPT is not a single entity developed exclusively for mental health services. It is an AI-based language model that is not capable of providing mental health therapy, cautioned Ryan Faber, founder of Copymatic, an AI-powered platform to create digital ads, website copy, or blog content.
His eight years of tech experience sparked curiosity about chatbot potential, which led him to pursue the innovative technology and found his company.
But using ChatGPT to author written content is one thing. Employing it to provide mental health therapy is something else entirely.
“It may be more difficult to build a rapport with a therapist when communicating through a screen. Moreover, online therapy may not be suitable for individuals with severe mental health issues or those who require more intensive treatment,” Faber told TechNewsWorld.
Ramiro Somosierra, the founder and editor of the online music magazine GearAficionado, professed to be a heavy user of ChatGPT in his daily publishing activities. He also admitted to swearing by his therapist.
“I do not believe there is a real use case for ChatGPT as a [therapy] replacement,” Somosierra told TechNewsWorld. “However, I would not put something as important as mental health in the ‘hands’ of an algorithm as imperfect as ChatGPT.”
In his experience, ChatGPT often gets basic information wrong when he asks it to write an article. He is not alone in this assessment. Content inaccuracy is a common complaint among ChatGPT users across all industries.
“So how can I make sure it would provide me with mindful interactions if I am telling it about the relationship with my father?” he quipped.
Pushing AI’s Limits for Online Therapy
Online therapy has proven to be effective because it reaches more people and is more accessible thanks to its virtual format, agreed Talin Banbassadian, director and technology team lead at project management and software development firm Vaco. As the population keeps growing and the world continues to shift toward more virtual ways of reaching clients, online therapy has become more widely adopted.
“The main driver is the pandemic. There has been an increase in mental health cases due to this event, and as we move towards a world where being vulnerable is encouraged, online therapy will continue to see growth,” Banbassadian told TechNewsWorld.
Chatbots are currently utilized where there is a systematized and predictable way of providing therapy — whether for depression, addiction, or anxiety-related issues. The most reliable providers make a human triage option available for edge cases.
One cannot stress enough the need to localize this type of automation based on cultural nuances. What will work in the West may not work everywhere, cautioned Atul Bhave, senior vice president of managed services for Vaco.
“This is an area of challenge for all enterprises engaged in scaling AI-supported services. It will remain a challenge, but overcoming it will also produce the most gains,” Bhave told TechNewsWorld.
ChatGPT struggles with content accuracy, he agreed. But the technology has sparked a revolution, and many successors will follow that truly optimize the ability of machines to learn from humans and produce predictable results.
“It is too early to expect human-level empathy, which can replace inaccurate canned information from a bot,” Bhave offered.
Pushing AI Ethics and Expectations
As an example, or maybe a warning, some people are already misusing the still-nascent ChatGPT. In January, behavioral health platform Koko cofounder Robert Morris announced on Twitter that his website provided mental health support to some 4,000 people using GPT-3 in an experiment.
Some software developers responding to our questions raised concerns about ChatGPT’s suitability and ability to replace human therapists. For example, Love2Dev Proprietor Chris Love has done several mental health-related projects involving ChatGPT.
“It is a general language model and does not specialize in therapy, nor has (it) been trained to respond like a therapist. If you wanted to ask it to help with general research in the field, it could help you with resources [and] research,” he told TechNewsWorld.
Love appreciates the mental health potential that ChatGPT can provide. However, we need to understand where the limits are, he offered.
“As far as using AI for therapy, it can add real value today. I love AI models as a way to triage tedious work and screen issues. That way, professionals can focus care on more severe cases,” Love added. “ChatGPT is not a therapist, and most likely, it will give you a disclaimer in its answers.”
Love offered an example of potential failsafe features his clients integrated into the online applications he built for them. The applications watched for certain "triggers" that either notified real medical staff or prompted the patient to call 911 for a faster response.
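The kind of trigger-based failsafe Love describes can be sketched in a few lines. The keyword lists, escalation labels, and thresholds below are purely illustrative assumptions, not taken from any real clinical product; a production system would use clinically validated screening, not simple keyword matching.

```python
# Illustrative sketch of a trigger-based failsafe: scan a user's message
# for crisis language and decide whether the bot can respond on its own
# or whether a human (or emergency guidance) is required.
# All terms and labels here are hypothetical examples.

CRISIS_TERMS = {"suicide", "kill myself", "overdose"}   # escalate to emergency guidance
URGENT_TERMS = {"hopeless", "can't go on", "panic attack"}  # route to human staff

def triage(message: str) -> str:
    """Return an escalation level for a chat message: 'emergency', 'human', or 'bot'."""
    text = message.lower()
    if any(term in text for term in CRISIS_TERMS):
        return "emergency"   # e.g., prompt the user to call 911
    if any(term in text for term in URGENT_TERMS):
        return "human"       # notify on-call clinical staff
    return "bot"             # safe for automated self-care support
```

For instance, `triage("I feel hopeless lately")` would route the conversation to a human, while routine messages stay with the bot, letting professionals focus on the severe cases Love mentions.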
Online Therapy vs. Chatbot Therapist, No Deal
Chatbots are now used as a mental health therapy tool in two key ways: providing support for self-care and delivering cognitive behavioral therapy (CBT), according to Flora Sadri-Azarbayejani, medical director at Psyclarity Health, an addiction treatment facility in the Boston area.
She explained that CBT is a form of talk therapy that helps people learn how to alter their thinking and behavior to manage their mental health better.
“Chatbots can provide users with automated, on-demand self-care support by giving information about mental health topics, suggesting lifestyle changes to improve their well-being, or providing strategies to manage stress,” Sadri-Azarbayejani told TechNewsWorld.
These types of chatbots can be an effective way to provide additional support and guidance to those who are already receiving therapy and those who may not have access to traditional therapy services.
“But because therapy is essentially about connecting with and understanding people, chatbots are still fairly limited in their capabilities,” she offered. “Even though platforms like ChatGPT can remember information from previous conversations, they are still unable to provide the same level of personalized, high-quality care that a professional therapist can.”