New Delhi, India
Artificial Intelligence is everywhere, from driving to art, so why not therapy? The world over, there is a shortage of mental health professionals even as the incidence of mental health problems rises. To fill this gap, providers of mental health services are turning to AI-powered chatbots.
But how effective are these AI therapists?
As per reports, while some chatbots can offer helpful advice, others are potentially harmful.
One chatbot that seeks to help people in crisis is called Woebot. It has been developed by Alison Darcy, a research psychologist and entrepreneur with a background in coding and therapy.
Woebot, essentially a smartphone app, is a kind of "pocket therapist" that uses text conversations to help people manage concerns like depression, anxiety, addiction, and loneliness, anytime and anywhere.
However, as with any technology, Woebot is not foolproof. When CBS News tested it out, the chatbot failed to recognise suicidal thoughts but picked up on them when they were exaggerated slightly.
The app has been available since 2017, but access is limited to employer benefit plans or referral from a health professional. It is also available free of cost through Virtua Health, a non-profit healthcare company in New Jersey.
In a conversation with CBS News, Darcy stated that with the app, she was seeking to "modernise psychotherapy".
"I think it's so interesting that our field hasn't, you know, had a great deal of innovation since the basic architecture was sort of laid down by Freud in the 1890s, right? That— that's really that sort of idea of, like, two people in a room. But that's not how we live our lives today. We have to modernise psychotherapy," she said.
The chatbot therapist has been trained on large amounts of specialised data to help the AI recognise words, phrases, and emojis associated with dysfunctional thoughts. Once it recognises these "dysfunctional thoughts", the chatbot, acting like a cognitive behavioural therapy (CBT) practitioner, challenges them.
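To make that idea concrete, the sketch below shows, in Python, the general shape of "detect a phrase associated with a dysfunctional thought, then respond with a CBT-style challenge". It is purely illustrative: the phrase lists, distortion labels, and replies are invented here, and Woebot's actual system relies on models trained on specialised clinical data rather than a hand-written keyword list. The crisis check at the top also shows why simple pattern matching can miss suicidal language, as the CBS News test found.

```python
# Illustrative sketch only; not Woebot's implementation. A real system uses
# trained models over specialised data, not hand-written keyword lists.

# Phrases loosely associated with common cognitive distortions (invented examples).
DISTORTION_PATTERNS = {
    "catastrophising": ["everything is ruined", "this is a disaster", "i can't cope"],
    "all-or-nothing": ["i always fail", "i never get it right", "no one likes me"],
}

# Crisis language should be escalated to human help, never handled by the bot alone.
CRISIS_PATTERNS = ["want to die", "kill myself", "end it all"]

# CBT-style reframing prompts keyed by distortion type (invented wording).
CBT_CHALLENGES = {
    "catastrophising": "That sounds overwhelming. What is the most likely outcome, "
                       "rather than the worst one?",
    "all-or-nothing": "You said 'always' or 'never'. Can you recall one time "
                      "when it went differently?",
}


def respond(message: str) -> str:
    text = message.lower()
    # Safety first: route crisis language to human support.
    if any(p in text for p in CRISIS_PATTERNS):
        return ("I'm concerned about your safety. Please contact a crisis "
                "helpline or emergency services now.")
    # Otherwise look for a distortion pattern and challenge it, CBT-style.
    for distortion, phrases in DISTORTION_PATTERNS.items():
        if any(p in text for p in phrases):
            return CBT_CHALLENGES[distortion]
    return "Tell me more about what's on your mind."


if __name__ == "__main__":
    print(respond("I always fail at everything"))
    print(respond("Lately everything is ruined and I can't cope"))
```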
"It's actually hard to find a CBT practitioner. And also, if you're actually not by the side of your patient when they are struggling to get out of bed in the morning or at 2:00 am when they can't sleep, and they're feeling panicked, then we're actually leaving clinical value on the table."
The downside of AI therapy
Another AI-based therapist, Tessa, developed for the National Eating Disorders Association (NEDA) as an eating disorder helpline chatbot, had to be shut down last year when it was found to be giving harmful advice to people seeking help.
The chatbot was recommending that users count calories and aim for a deficit of up to 1,000 calories per day, among other "tips" that critics and users alleged could contribute to the development of eating disorders.
As per Darcy, "there are going to be missteps if we try and move too quickly," however, she said, "We have an opportunity to develop these technologies more thoughtfully," and that she hopes that "we take it".
(With inputs from agencies)