Woebot

A screenshot of Woebot.

“Beep boop!” From its cheeky greeting to its punny name, Woebot is marketed as a friendly chatbot that helps users with mental health and wellness.

It comes at a time when mental wellness is beginning to gain traction on university campuses and when students are advocating for more accessible mental health resources. Western University offers resources such as psychological services, the Peer Support Centre and seasonal Canadian Mental Health Association crisis counselling — all of which are in-person interactions.

Woebot introduces artificial intelligence to the field of mental health and raises the question of whether we are ready to talk about mental health with a machine.

Created by Woebot Labs, Inc., a San Francisco startup of Stanford psychologists and AI specialists, Woebot is a chatbot available for free on Facebook Messenger. It checks in on the user daily with questions like, “How are you feeling?” and “How would you rate your energy level right now?” Based on the user's text input, Woebot responds with more probing questions and delivers cognitive behavioural therapy exercises.

"We now have the ability to have large databases of words, so we know what those words mean. We know what the part of speech is for all of those words, so we can understand the syntactic structures of sentences. We can understand — to some limited extent — what the sentence actually means, and we have algorithms that pull all of that information together," says Robert Mercer, a professor of computer science at Western.

Mercer believes formulating creative responses to user input is a much more complex problem. Woebot is still a relatively young and primitive AI: most user input comes from selectable text options, and its responses are stock answers written in advance by humans.
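To make that description concrete, here is a minimal, made-up sketch of a scripted check-in of the sort described above: fixed questions, selectable options and stock replies written by a person. The questions, options and wording are invented for illustration; this is not Woebot's actual code or content.

```python
# A toy scripted check-in flow: the bot asks a fixed question, the user picks
# from preset options, and every reply is a pre-written stock answer.
CHECK_IN = {
    "question": "How are you feeling right now?",
    "options": {
        "1": ("Pretty good", "Glad to hear it! Want to note one thing that went well today?"),
        "2": ("Stressed", "That sounds tough. Let's try naming the thought that's bothering you."),
        "3": ("Low energy", "Thanks for sharing. A short walk or a glass of water can sometimes help."),
    },
}

def run_check_in() -> None:
    print(CHECK_IN["question"])
    for key, (label, _) in CHECK_IN["options"].items():
        print(f"  {key}. {label}")
    choice = input("> ").strip()
    # Fall back to a generic stock answer if the input isn't one of the options.
    _, reply = CHECK_IN["options"].get(choice, ("", "Sorry, I didn't catch that. Beep boop!"))
    print(reply)

if __name__ == "__main__":
    run_check_in()
```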

Despite its obvious limitations, a Stanford study conducted on Woebot gave it a glowing review. The study showed significantly reduced levels of depression and anxiety in college students who used Woebot for two weeks.

One benefit of Woebot, and arguably its biggest selling point, is accessibility. Given Western’s long wait times for health services, Woebot is an attractive option. Matthew Yip, the Peer Support Centre supervisor, says he can see how Woebot could be helpful to someone who may need someone (or something) to talk to late at night.

However, Yip stresses the value of human interaction. “I think there's still a lot to be said for human interactions,” says Yip. “Because there's more context, there's more of a baseline to read from and you can detect … things that a robot really can't.”

He sees Woebot as an extra component that can be added to students’ mental health support network but would not recommend it as a resource replacing in-person services.

Woebot seems like a harmless and potentially helpful tool for students, but there are always risks associated with using an online resource. Nandita Biswas Mellamphy, an associate professor of political science at Western, highlights that user input given to Woebot first passes through a third party: Facebook.

Biswas Mellamphy points out that Woebot has the potential to be a lucrative source of data for Facebook.

“Woebot is opening itself up to vast amounts of information that can be very lucrative in many different ways,” she says. “While Woebot is acting like a good therapeutic friend, the data being produced concerns not only your personal health information potentially, which in everyday circumstances are protected by legal and medical laws of confidentiality. But in the case of Woebot, the technological content and metadata produced within Woebot are not regulated by legal or medical laws and Facebook can do whatever it wants with it.”

Mercer agrees that there is a risk associated with using Facebook as an interface, since users don’t know what Facebook will do with their data. “But that’s not Woebot, that’s not the AI, that’s Facebook. It’s an enterprise; it’s a corporation,” he says.

Although Woebot claims it will never replace a human therapist, an artificially intelligent therapist is not impossible in the distant future. Yip brings up “cultural lag,” the idea that technology progresses faster than the social and cultural context needed to accept it. Laws regulating new technologies have been slow to adapt, as shown by Uber, self-driving cars and now Facebook and Woebot. Artificial intelligence is exciting, with new avenues for integration emerging faster than ever. If Woebot is successful, chatbot therapy may be here to stay.
