AI-based chatbots offer a new form of mental health help amid shortage of therapists, but can they be trusted?
- Chatbots powered by generative AI use vast amounts of data to mimic human language. They are now being offered as stress-management and mental health tools
- Their creators don’t call what they offer therapy, to avoid regulatory oversight, but say they help with minor issues. However, professionals urge caution

Download the mental health chatbot Earkick and you’re greeted by a bandana-wearing panda who could easily fit into a kids’ cartoon.
Start talking or typing about anxiety and the app generates the kind of comforting, sympathetic statements therapists are trained to deliver. The panda might then suggest a guided breathing exercise, ways to reframe negative thoughts or stress-management tips.
It’s all part of a well-established approach used by therapists, but please don’t call it therapy, says Earkick co-founder Karin Andrea Stephan.
“When people call us a form of therapy, that’s OK, but we don’t want to go out there and tout it,” says Stephan, a former professional musician and self-described serial entrepreneur. “We just don’t feel comfortable with that.”

The question of whether these artificial intelligence-based chatbots are delivering a mental health service or are simply a new form of self-help is critical to the emerging digital health industry – and its survival.
Earkick is one of hundreds of free apps that are being pitched to address a crisis in mental health among teens and young adults.