
Experts have warned about rising AI chatbot use for emotional support and companionship, especially among younger people.
Systems such as ChatGPT, Claude and Copilot are increasingly used as confidants, with reports placing therapy and companionship among the top reasons people turn to them.
ChatGPT alone has around 810 million weekly active users worldwide.
Writing in the BMJ, Susan Shelmerdine and Matthew Nour said “we might be witnessing a generation learning to form emotional bonds with entities that lack capacities for human-like empathy, care, and relational attunement.”
Relational attunement means being sensitive to and in tune with another person’s emotions.
The warning comes after the US surgeon general declared a loneliness “epidemic” in 2023, describing it as a public health concern on a par with smoking and obesity.
In the UK, nearly half of adults (25.9 million) report feeling lonely occasionally, sometimes, often or always, with almost one in 10 experiencing chronic loneliness, defined as feeling lonely often or always.
Younger people aged 16-24 report some of the highest rates of loneliness.
One study found a third of teenagers use AI companions for social interaction, with one in 10 reporting that the AI conversations are more satisfying than human conversations, and one in three reporting that they would choose AI companions over humans for serious conversations.
The authors said it seems prudent to consider problematic chatbot use as a new environmental risk factor when assessing a patient with mental state disturbance.
They proposed that clinicians begin with a gentle enquiry into chatbot use, followed if necessary by more directed questions to assess compulsive use patterns, dependency and emotional attachment.
They acknowledged that AI might bring benefits, such as improving the accessibility of support for people experiencing loneliness.
The researchers said that empirical studies are needed “to characterise the prevalence and nature of risks of human-chatbot interactions, to develop clinical competencies in assessing patients’ AI use, to implement evidence based interventions for problematic dependency, and to advocate for regulatory frameworks that prioritise long term wellbeing over superficial and myopic engagement metrics.”
Evidence-based strategies for reducing social isolation and loneliness are paramount, they concluded.