Fifty years ago, an MIT professor created a chatbot that simulated a psychotherapist.
Named Eliza, it was able to trick some people into believing it was human. But it didn’t understand what it was told, nor did it have the capacity to learn on its own. The only test it had to pass was: Could it fool humans?
These days, with artificial intelligence advancing to drive cars, beat humans at chess and Go, and replace entire workforces, Eliza’s smoke and mirrors seems like child’s play. Researchers now build chatbots that can listen, learn and teach cognitive behavioral therapy to humans. Forget simply simulating a psychotherapist — can a chatbot do what a therapist does, or at least come close?
A San Francisco startup thinks so. Its chatbot, named Woebot, doesn’t replace therapists, but its creators believe it could be the next best thing to seeing one.
Delivered over Facebook Messenger, Woebot teaches users cognitive behavioral therapy skills, such as exercises that people can do to combat negative thinking and ways to manage mood disorders such as anxiety and depression.
Built by former Stanford researcher Alison Darcy and a team of psychologists, linguists and software engineers, Woebot has enormous ambition: to help an increasingly anxious, depressed and stressed population feel happier.
“Right now you can see a therapist, or you can access self-help books, and there’s nothing in between,” Darcy said. “The major gap we want Woebot to fill is the nothing.”
The current model for therapy, in which patients see a therapist once a week for an hour at a time, isn’t, in tech parlance, “scalable,” she said. A therapist can’t reach everyone at every minute of the day. For $39 a month, Woebot can.
“You can access it when you need it most,” Darcy said. “If it’s 2 a.m. and you’re having a panic attack, a physician isn’t going to be available at that time.”
Woebot sends users a message each day to check in with them, asks them about their mood and energy levels, and draws from cognitive behavioral therapy to combat self-defeating thinking.
With a personality that’s a cross between Kermit the Frog and Spock from “Star Trek,” the innocent, logical and often goofy chatbot acts as a nonjudgmental listener, while adding levity with the occasional dad joke (“How many therapists does it take to change a lightbulb? One, but the lightbulb has to want to change”).
There are many apps and websites that purport to improve users’ mental health, but Woebot hopes to differentiate itself in two key ways.
The first is that it’s a robot people can talk to, much as they would to a therapist (although the company stresses that Woebot is not a therapist).
The second is that it has undergone a randomized controlled trial under the supervision of Stanford University researchers and was shown to be effective in improving symptoms of anxiety and depression in college-aged users. The results were published in June in the Journal of Medical Internet Research.
By filling in the gap between self-help books and full-blown therapy, Woebot could be tapping into a significant business opportunity.
Mental disorders topped the list of the most costly health conditions in the U.S., with spending at $201 billion (more than what was spent on heart conditions and cancers), according to 2013 research by Charles Roehrig, the director of the Center for Sustainable Health Spending at Altarum Institute.
Spending included hospital care, traditional therapy and counseling sessions and prescription drugs. Alternative medicines and self-help guides were not included. The report also did not consider how much would be spent if more people had access to mental health services.
A separate market report published by MarketsandMarkets last year estimated that the cognitive assessment and training market — which chatbots such as Woebot fall under — could be worth more than $8 billion by 2021.
This doesn’t come as a surprise to mental health experts, who said the majority of people who could benefit from mental health services don’t access them because of cost, lack of availability or fear of the stigma still associated with mental illness. Even those who have health insurance can have a challenging time getting help.
“It may involve seeing a primary care doctor, asking for a referral, and then some insurance providers will do outreach so you have to see them before they determine which therapist you can see,” said Kathleen Kara Fitzpatrick, a psychologist at Stanford’s department of psychiatry and behavioral sciences who supervised Woebot’s randomized controlled trial.
The accessibility and relative anonymity of Woebot make it an attractive option. (Woebot’s team never sees any user data and has promised to never sell user information; Facebook has said it does not read the content of messages sent between people and businesses, or sell ads based on that content.)
“It’s not as specific and it doesn’t go as deep as seeing an in-person therapist,” said Nick, a 24-year-old student from Washington, D.C., who has experienced traditional therapy and started using Woebot in February. “But it’s a low-pressure way to vent, it makes you feel more at ease, and it was good for what I was going through. It was also cute and funny that it has a personality,” said Nick, who didn’t want to reveal his last name because of the sensitive nature of the topic.
Despite its scientific backing, the Woebot team is careful to remind users that Woebot is not a therapist.
Woebot’s icon is a robot. If it doesn’t understand what a person has typed, it will apologize and explain that it is only a few months old and still learning. If, after prolonged use, users don’t show signs of improvement — for instance, if they consistently rate their energy levels as low or use key words such as “sad,” “anxious” or “depressed” to describe their mood — Woebot nudges them toward seeking medical help.
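The kind of escalation rule described above — watching for consistently low energy ratings or distress keywords across check-ins before suggesting professional help — can be illustrated with a small sketch. This is a hypothetical simplification for readers curious about the mechanics, not Woebot’s actual code; the keyword list, rating scale and thresholds here are all assumptions.

```python
# Hypothetical sketch of a keyword/rating escalation rule.
# Not Woebot's actual logic; keywords and thresholds are illustrative.

DISTRESS_KEYWORDS = {"sad", "anxious", "depressed"}
LOW_ENERGY_THRESHOLD = 3   # assumed 1-10 self-rated energy scale
CONSECUTIVE_CHECKINS = 5   # how many low check-ins in a row trigger a nudge


def should_suggest_professional_help(checkins):
    """checkins: list of (energy_rating, mood_text) tuples, newest last.

    Returns True only when the last CONSECUTIVE_CHECKINS entries all
    show low energy or contain a distress keyword.
    """
    recent = checkins[-CONSECUTIVE_CHECKINS:]
    if len(recent) < CONSECUTIVE_CHECKINS:
        return False  # not enough history to judge a trend

    def is_low(energy, mood_text):
        words = set(mood_text.lower().split())
        return energy <= LOW_ENERGY_THRESHOLD or bool(words & DISTRESS_KEYWORDS)

    return all(is_low(energy, text) for energy, text in recent)
```

The design point is that a single bad day never triggers the nudge; only a sustained pattern across check-ins does, which matches the article’s description of “prolonged use” without improvement.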
Although the Food and Drug Administration has not started regulating lifestyle and wellness apps, the Federal Trade Commission and state attorneys general have been more aggressive in going after apps that purport to do or be something they’re not.
The New York attorney general cracked down on three health and fitness apps last year for their misleading claims. One app called Cardiio said that it could turn an iPhone into a personal heart rate monitor, and that the data could be used to estimate the customer’s life expectancy. As part of settlements made in March, Cardiio had to clearly state that it was not for medical use.
In 2016 the FTC went after Lumos Labs, creator of the Lumosity brain training games, for making unfounded claims that its games could help users perform better at work and school, and reduce or delay cognitive impairment associated with age and other health conditions. Lumos Labs paid a $2-million settlement and has since been more specific in its marketing materials, backing its claims with peer-reviewed research.
If federal regulators were to look into Woebot, Darcy believes the chatbot would be in the clear because it has already conducted a randomized controlled trial, and it hasn’t overpromised.
She’s also not interested in Woebot being viewed as a clinical health tool. Woebot isn’t the same as seeing a doctor; it doesn’t diagnose, and it doesn’t replace therapy, she said. But for the millions of people in the world who just need a little boost, she believes Woebot can help.
“The idea of therapy is so burdensome and loaded for some people, and we’re not that — we’re not as intensive,” Darcy said. “We have this hope that people will use us and not even realize we’re a mental health tool.”
By Tracey Lien, Los Angeles Times
©2017 Los Angeles Times, Distributed by Tribune Content Agency, LLC.