What Prompts Us To Use Mental Health Chatbots

I’ve written previously about the somewhat surprising popularity of telehealth platforms for mental health services, with users seeming to appreciate the relative distance that virtual care affords them. New research from North Carolina State University and Syracuse University explores some of the factors that motivate people to use chatbot technology for mental health care.

The research focused specifically on the mental health challenges surrounding mass shootings, with users’ primary motivation for enrolling being to better support their peers rather than to get help for their own problems.

“We saw a sharp increase in mass shootings in the U.S. in recent years, and that can cause increases in the need for mental health services,” the researchers explain. “And automated online chatbots are an increasingly common tool for providing mental health services – such as providing information or an online version of talk therapy. But there has been little work done on the use of chatbots to provide mental health services in the wake of a mass shooting. We wanted to begin exploring this area, and started with an assessment of what variables would encourage people to use chatbots under those circumstances.”

Seeking support

The researchers surveyed 1,114 American adults, all of whom had previously used chatbots to seek mental health support. Each was given a scenario in which a mass shooting had taken place and was asked a number of questions about the use of chatbots for mental health support in the wake of such an event.

The research revealed a number of factors that were important in driving people to use chatbots to address their own mental health needs. For instance, the speed and ease of access provided by chatbots were clear benefits, while the depth and quality of the information provided was also perceived as valuable.

There was also a clear desire for the chatbots to appear as human as possible, with participants revealing that the personal and emotional nature of the conversation led them to want the chatbots to feel lifelike. These are all somewhat expected reasons for enrolling, but rather more surprising was the desire to help others.

“We found that the motivation of helping others was twice as powerful as the motivation of helping yourself,” the researchers explain.

This kind of help includes learning from the chatbot how to support a loved one who is experiencing mental illness, helping them access similar chat services themselves, or even demonstrating to them how easy such tools are to use.

“Our study offers detailed insights into what is driving people to access mental health information on chatbot platforms after a disaster, as well as how they are using that information,” the researchers conclude. “Among other applications, these findings should be valuable for the programmers and mental healthcare providers who are responsible for developing and deploying these chatbots.”
