Four in 10 UK adults open to AI counselling

Published: March 9, 2026 | Last updated: March 24, 2026

Four in 10 UK adults would use AI counselling, a global survey suggests, highlighting growing trust in ChatGPT for mental health support.

The study, led by Bournemouth University, surveyed nearly 31,000 adults in 35 countries about their use of artificial intelligence large language models such as ChatGPT.

It found that 41 per cent of participants in the UK, and 61 per cent globally, said they would be happy to use AI for counselling services.

Researchers said this could reflect the waiting times many people face to access the mental health services they need.

The study also found that one quarter of UK adults would be happy to delegate the role of teaching their children to AI.

Globally, 45 per cent of people said they would trust AI models to take on the role of their doctor.

Three quarters of people surveyed said they would use an AI chat tool as a companion and a friend.

Dr Ala Yankouskaya, senior lecturer in psychology at Bournemouth University, who led the study, said: “With the rapid development and mass availability of AI, more people are placing their trust in it.

“We wanted to learn more about how people would trust generative AI tools, such as ChatGPT, to carry out some of the most important roles in their daily lives.

“If someone is experiencing depression, they do not want to wait months for an appointment, so instead they can turn to AI.

“However, when I tested some of the tools myself, I found the language used very vague and confusing because the developers are careful not to jump into providing diagnoses.

“So, it is no substitute for speaking to a health professional.”

The researchers also noted that users were already familiar with NHS chatbots, which use similar AI technology, and said this could be normalising the use of AI in other apps such as ChatGPT for mental health care.

A quarter of people in the UK and half of everyone surveyed globally said they would trust AI to carry out the role of a teacher, which the research team found particularly concerning.

“It really knocked me down when I saw how many people would be willing to delegate AI to the role of teaching their children,” Dr Yankouskaya said.

“We still do not know the long-term effects that using these tools for education could have on children’s memory and cognitive functions.

“We could be heading to the stage where we are developing children who are good at putting prompts into AI tools but not as good at taking the information in.”

The researchers were also concerned about the long-term physical effects on the brain if learning information in the traditional way were replaced by excessive reliance on search tools.

They said this could shrink the hippocampus, a region of the brain involved in memory, spatial awareness and learning.

Forty-five per cent of all respondents, and 25 per cent in the UK, said they would trust AI to carry out the role of their doctor.

The numbers were higher in countries where healthcare is more expensive and harder to access.

This was less surprising to the researchers, who said people living in parts of the world where healthcare is less readily available might rely on technology for quick answers.

However, they raised concerns about the underlying algorithms, which are designed to hold the user's attention and keep them engaged in a relaxed chat.

They said this could be particularly harmful in the context of mental health advice, where traditional services might instead direct the user to specialist support such as the Samaritans.

The highest level of trust participants were willing to place in AI came in the role of friendship.

More than three quarters of people globally and more than half of people in the UK said they would talk to ChatGPT as a companion.

Researchers said this may be explained by a perceived sense of empathy from generative language tools because they are designed to adapt the tone of their responses to suit the user.

“AI tools come across as a friend who knows you well and understands you,” Dr Yankouskaya said.

“ChatGPT can remember every chat it has had with a user and it feels like a private conversation between them.

“Nowadays people can be very sensitive to being judged and AI tools are designed to be non-judgemental. This means they can provide the sense of security people need.”

Dr Yankouskaya and the team concluded that, as the prospect of AI playing a bigger role in people’s lives moves from theory to reality, there needs to be more awareness within society about how generative AI tools work and their limitations.

They added that the lack of knowledge about the long-term effects on memory means caution should be used before these tools take over roles in education in particular.
