Teens are reaching out to AI therapy bots at Character.ai


The merging of the digital world with everyday life is well underway, as a new trend shows: young people are turning to AI therapy bots on Character.ai for real-life support.

Character.ai is a platform where users can interact with millions of custom AI bots. The platform offers a wide range of AI personalities based on famous people or characters: from Harry Potter to Elon Musk to Super Mario.

One of the most popular chatbots is the “Psychologist” therapy bot, with 78 million messages in a year. Since November alone, it has received 18 million messages.

Originally developed and trained for personal use by New Zealander Sam Zaia, Psychologist has been described as a digital shrink on platforms like Reddit.



Surprised by the success, Zaia is now working on a research project about the emerging trend of AI therapy and its appeal to young people between the ages of 16 and 30 – Character.ai’s main user group.

In total, there are 475 bots on the chatbot platform offering such support in different languages. Most users seek help with mental health issues. The “Therapist” bot has received 12 million messages, and “Are you feeling OK?” 16.5 million.

Character.ai points out that most users are immersed in role-play rather than therapeutic offerings. One example is the anime and video game character Raiden Shogun, who tops the list with 282 million messages.

According to Character.ai, about 3.5 million people visit the platform every day. The company does not disclose the number of users per bot.

Character.ai’s “Psychologist” chatbot greeting, which always begins by warning that all answers are “made up.” | Image: character.ai

First therapy chatbot receives device approval in the UK

Psychotherapist Theresa Plewman expresses her skepticism to the BBC. She criticizes the bot’s hasty assumptions but stresses that its immediate and spontaneous response can help people in crisis.


Limbic Access became the first mental health chatbot to receive medical device certification and now supports the UK’s National Health Service (NHS).

Science hasn’t reached a firm verdict on AI in therapy

Researchers have yet to reach a consensus on whether, and to what extent, chatbots should be used in psychotherapy. Recently, a group of psychologists warned against it: while the models can provide psychologically useful information, the group said, they lack empathy and an understanding of the individual.

In contrast, studies indicate that ChatGPT can generate counselor responses that people find more balanced, comprehensive, empathetic, and helpful than those written by humans. Still, study participants expressed a preference for human responses.

Other studies have shown that ChatGPT can more accurately describe potential human emotions in a given scenario and outperform human primary care physicians in recommending treatment for depression. AI is also expected to help identify suicide risk factors in young people.

The German AI research and data organization LAION recently launched the open-source project Open Empathic, which aims to make AI systems more emotionally intelligent and empathetic by training them on selected data.
