Your Next Therapist Might Be A Chatbot

If our tough soldiers will confide more in a virtual psychologist, could the same be true for you?

[Image: USC Institute for Creative Technologies]

In 1966, an MIT professor named Joseph Weizenbaum coded what was meant to be a landmark scientific joke. Called ELIZA, it was the world’s first chatbot. Linguistically simple, it was designed to probe topics the way a psychologist would. Mention your family, and it might say, “tell me more about your mother.” The psychologist bit was funny to Weizenbaum, until he found that his subjects actually asked to spend more time with ELIZA. Eventually, he even walked in on his secretary chatting with the bot, and she famously asked him, “Would you mind leaving the room, please?”
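
For readers curious what “linguistically simple” means in practice, here is a rough sketch of the keyword-and-template trick ELIZA relied on. This is a hypothetical, stripped-down illustration in Python, not Weizenbaum’s actual script, and the rules and responses are invented for the example.

import random

# A few keyword-to-response templates, in the spirit of ELIZA's Rogerian script.
# These rules are illustrative only; the real ELIZA used a richer pattern language.
RULES = {
    "mother": ["Tell me more about your mother.", "How do you feel about your mother?"],
    "family": ["Tell me more about your family.", "Who in your family are you thinking of?"],
    "sad": ["I'm sorry to hear you are sad. What do you think is causing it?"],
}

# Generic prompts used when no keyword matches.
DEFAULTS = ["Can you tell me more about that?", "Why do you say that?", "I see. Go on."]

def reply(user_input: str) -> str:
    """Return the first matching canned response, or a generic prompt."""
    text = user_input.lower()
    for keyword, responses in RULES.items():
        if keyword in text:
            return random.choice(responses)
    return random.choice(DEFAULTS)

if __name__ == "__main__":
    print(reply("I've been arguing with my family a lot."))
    # -> e.g. "Tell me more about your family."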

More than 50 years later, studies are concluding what Weizenbaum’s experiment only suggested: bots, even ones that plainly look and sound like bots, may benefit our mental health. In fact, these human facsimiles can sometimes be preferable to the less anonymous therapy we seek today.

Recently, a group of University of Southern California and Carnegie Mellon University researchers demonstrated as much with soldiers who had just returned from a year in Afghanistan. Because many soldiers who have seen combat suffer from PTSD, returning troops generally fill out a written screening, essentially a symptom checklist, called the Post-Deployment Health Assessment (PDHA). Past research has found that soldiers, knowing that a PTSD diagnosis can affect their military careers, share less on a PDHA they know will be reviewed and more on an anonymous version. Anonymity helps people share.

But the researchers believed they could get even better results if they could somehow pair that anonymity with person-to-person rapport, since it has also been found that we share more openly in response to social cues like body language and a conversational cadence of questions.

So the researchers built a 3D therapy bot. She’s essentially a psychologist rendered in video-game graphics, with brown hair, a soft, slight smile, and a racially ambiguous face. Her clothes are a simple cardigan sweater and slacks, though she’s decked out with plenty of subtle bling: what looks like diamond earrings, a solitaire diamond necklace, and a gem-studded wristwatch.

In the study, she talked to a soldier via a TV screen and asked increasingly probing questions. The conversation would begin with questions like, “Where are you from?” move on to PTSD assessments like, “Do you have trouble sleeping at night?” and finish with prompts meant to boost the soldier’s mood before he left: “Tell me what you’re most proud of.”

Along the way, crucially, the virtual psychologist nodded and asked generic follow-ups like, “Can you tell me more about that?”
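
To give a sense of that structure, here is a hypothetical sketch of such a staged interview: rapport questions first, then assessment items, then a mood-boosting close, with a generic follow-up after each answer. The specific questions come from the description above, but the flow and back-channel behavior are assumptions for illustration, not the researchers’ actual system.

# A hypothetical outline of the interview flow described above:
# rapport questions, then assessment items, then a mood-boosting close,
# with a generic follow-up probe after each answer. Not the study's real script.
RAPPORT = ["Where are you from?"]
ASSESSMENT = ["Do you have trouble sleeping at night?"]
CLOSERS = ["Tell me what you're most proud of."]
FOLLOW_UP = "Can you tell me more about that?"

def run_interview(ask):
    """Walk through the three phases, probing once after each answer.

    `ask` is any callable that poses a question and returns the reply,
    e.g. `input` for a console demo.
    """
    transcript = []
    for question in RAPPORT + ASSESSMENT + CLOSERS:
        transcript.append((question, ask(question)))
        transcript.append((FOLLOW_UP, ask(FOLLOW_UP)))
    return transcript

if __name__ == "__main__":
    # Console demo: the "virtual psychologist" reads answers from stdin.
    for q, a in run_interview(lambda question: input(question + " ")):
        print(f"Q: {q}\nA: {a}")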

Complex? Probably not at all, at least not compared to the linguistic skills of Google Now or Alexa. However, it was effective. Researchers found that, across a couple of studies, soldiers revealed more PTSD symptoms to the anonymous therapy bot than they did even to the anonymous written survey.

[Image: Woebot]
Amazingly, researchers are finding similar results across related studies. Another bit of research found that people disclosed more in text chats with bots than with real people, simply because of a bot’s inherent anonymity (err, until the advertisers get looped in, at least). Meanwhile, a psychologist chatbot startup called Woebot is betting on the measurable benefit of chatting with machines. In its own peer-reviewed study, the team behind the app found that students reported lower stress after chatting with the bot for a few weeks, compared with those who were simply pointed toward a self-help book.

One would think that a bot rendered with graphical fidelity no one could possibly mistake for real would be a lousy replacement for an actual person. And maybe it is. None of the studies above makes the bold claim that its bot could beat an anonymous, empathetic, flesh-and-blood psychologist in a head-to-head contest. But these bots, whether delivered in words or 3D graphics, don’t seem to repel us the way the uncanny valley might suggest.

So assuming bots continue to prove their worth in critical settings, maybe it’s worth considering how they could be designed into the mental health system at large, because a self-help robot could be there for you at all the times a psychologist couldn’t. Much like a good friend, you could reach it through a continuum of media. It could be texted, called, or FaceTimed at any hour of the night, never too tired to lend an ear, never too booked to take an appointment, and, hopefully, never so expensive that it keeps someone who really needs help from getting it.

About the author

Mark Wilson is a senior writer at Fast Company. He started Philanthroper.com, a simple way to give back every day.
