It’s difficult to have an impact on your employees’ mental health.
Ideally, your organization would be a friendly place to work, free of undue stress, with good wellness programs. Sadly, we know that many companies struggle to get anywhere close to that ideal. Many of the stresses on an employee’s mental health also don’t originate in the firm: it’s hard to do much about family, personal issues, political unrest, or pandemics.
This leads us to wonder if a cheap, easily deployed technology can help. We can’t send everyone to a therapist, but perhaps a chatbot can assist in some way.
Task-oriented vs “friend” chatbots
There are a number of inexpensive chatbots like Woebot and Mindshift that use cognitive behavioral therapy tools. From a technology perspective, these are not very sophisticated. They largely involve the chatbot asking something and the user picking an answer from a list. It’s much like going through a cognitive behavioral therapy workbook, with the advantage that many people will find the chatbot interface more engaging than a book.
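To make the pattern concrete, here is a minimal sketch of that scripted, pick-from-a-list exchange. The questions and options are hypothetical illustrations, not content from Woebot, Mindshift, or any real product:

```python
# A scripted chatbot: each step is a fixed question plus a fixed
# menu of answers, much like a page in a CBT workbook.
# (All prompts below are made up for illustration.)

SCRIPT = [
    ("How are you feeling right now?", ["Anxious", "Sad", "Okay"]),
    ("Would you like to try a short breathing exercise?", ["Yes", "Not now"]),
]

def run_script(choices):
    """Walk the script, pairing each question with the user's
    chosen option (given as an index into that question's menu)."""
    return [
        (question, options[idx])
        for (question, options), idx in zip(SCRIPT, choices)
    ]

# Example: the user picks "Anxious", then "Yes".
print(run_script([0, 0]))
```

The point of the sketch is how little intelligence is involved: there is no language understanding at all, just a decision tree the user clicks through.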
More spooky, and potentially more useful, are AI-based chatbots like Replika that use natural language processing. One interesting distinction between Replika and almost everything else we do in business is that Replika is not task-oriented. While a bot like Woebot might help you deal with a task (like reducing anxiety), Replika just wants to be your friend.
AI-based chatbots don’t need to have all that much intelligence to be effective. People who are dealing with stress or loneliness often just want someone who is a good listener. That can be achieved with current technology.
Are chatbots a copout?
There is a weird challenge for HR in deploying chatbots to help employees with mental health: They’re almost too easy. That is, employees can download them on their own, use their own devices, chat with them on their own, and easily afford any costs. So what’s the role for HR?
HR’s best role is simply to educate employees about these tools as a means of addressing stress. Furthermore, HR’s endorsement legitimizes these tools in a way that can encourage employees to give them a try. It’s also worth pointing out that the bar for acceptance is low: even if only some people find them helpful, that’s good enough.
It’s impossible not to get excited or scared about what AI friends may bring to the workplace in the future. These tools are continually getting technically more powerful and are also continually learning from all their interactions with humans. They could no doubt be great tools for helping people develop emotional strength and interpersonal skills. They could also become addictive or create some other harm. As professionals with an interest in humans, we should engage with chatbots now and on an ongoing basis so that we understand the potential and the risks.