Teen boys using personalised AI for therapy, research finds

Published on: October 30, 2025 | Last updated: October 30, 2025

Teenage boys are increasingly turning to AI chatbots for therapy, companionship and romantic relationships, with just over a third saying they would consider having an AI friend, new research shows.

The study found that 53 per cent of teenage boys said the online world felt more rewarding than real life, raising concerns about their emotional development and ability to form healthy relationships.

Research by Male Allies UK surveyed boys in 37 secondary schools across England, Scotland and Wales, revealing widespread use of artificial intelligence for emotional support and validation.

The findings follow an announcement by character.ai, a popular AI chatbot startup, that it will ban teenagers from engaging in open-ended conversations with its chatbots by 25 November.

The move comes after controversies including the death of a 14-year-old boy in Florida, whose mother claimed an AI chatbot had manipulated him into taking his own life.

Lee Chambers, founder and chief executive of Male Allies UK, said: “We’ve got a situation where lots of parents still think that teenagers are just using AI to cheat on their homework.

“Young people are using it a lot more like an assistant in their pocket, a therapist when they’re struggling, a companion when they want to be validated, and even sometimes in a romantic way.

“It’s that personalisation aspect – they’re saying: it understands me, my parents don’t.”

The Voice of the Boys report warned that chatbots “routinely lie about being a licensed therapist or a real person”, often with only small disclaimers saying the AI is not real — easily missed by children seeking help.

Some boys reported staying up until the early hours chatting to AI bots, while others said friends’ personalities had changed after becoming absorbed in the AI world.

Chambers said: “AI companions personalise themselves to the user based on their responses and prompts. It responds instantly.

“Real humans can’t always do that, so it is very, very validating, what it says, because it wants to keep you connected and keep you using it.”

Character.ai’s ban comes after a series of controversies for the California-based company, including a US lawsuit from a teenager’s family claiming a chatbot encouraged him to self-harm and kill his parents.

The company said it was taking “extraordinary steps” in response to “the evolving landscape around AI and teens”, including pressure from regulators “about how open-ended AI chat in general might affect teens, even when content controls work perfectly”.

Andy Burrows, chief executive of the Molly Rose Foundation – established in memory of 14-year-old Molly Russell, who took her own life after exposure to harmful online content – welcomed the move.

He said: “Character.ai should never have made its product available to children until and unless it was safe and appropriate for them to use.

“Yet again it has taken sustained pressure from the media and politicians to make a tech firm do the right thing.”

Male Allies UK expressed concern about chatbots with “therapy” or “therapist” in their names.

One popular character.ai chatbot called Psychologist received 78 million messages within a year of its creation.

The organisation also warned about the growing use of AI “girlfriends”, which allow users to customise everything from physical appearance to personality.

“If their main or only source of speaking to a girl they’re interested in is someone who can’t tell them ‘no’ and who hangs on their every word, boys aren’t learning healthy or realistic ways of relating to others,” the report stated.

“With issues around lack of physical spaces to mix with their peers, AI companions can have a seriously negative effect on boys’ ability to socialise, develop relational skills, and learn to recognise and respect boundaries.”
