
AI can create empathetic responses more reliably and consistently than humans, even when compared to professionals whose job relies on empathising with those in need, new research has found.
The research looked at how people evaluated empathetic responses generated by ChatGPT compared to human responses.
Across four separate experiments, participants were asked to judge the level of compassion (an important facet of empathy) in written responses to a series of positive and negative scenarios; the responses were produced by AI as well as by regular people and expert crisis responders.
In each scenario, the AI responses were preferred and rated as more compassionate and responsive, conveying greater care, validation and understanding than the human responses.
So how could a general chatbot like ChatGPT outperform professionals trained in responding with empathy?
Dariya Ovsyannikova, lab manager in Professor Michael Inzlicht’s lab at the University of Toronto Scarborough, is lead author of the study.
She pointed to AI’s ability to pick up on fine details and remain objective, which makes it particularly adept at crafting attentive communication that appears empathetic.
Empathy is an important trait not only in fostering social unity, but in helping people feel validated, understood and connected to others who empathise with them.
In clinical settings, it plays a critical role in helping people regulate emotions and feel less isolated.
But constantly expressing empathy has its costs.
Ovsyannikova, who herself has professional experience volunteering as a crisis line responder, said: “Caregivers can experience compassion fatigue.”
She added that professional caregivers, especially in mental health settings, may need to sacrifice some of their ability to empathise to avoid burnout or balance their emotional engagement effectively for each of their clients.
Humans also come with their own biases and can be emotionally affected by a particularly distressing or complex case, which also impacts their ability to be empathetic.
The researchers say that, coupled with shortages of accessible healthcare services and qualified workers and a widespread rise in mental health disorders, this means empathy is in short supply.
That doesn’t mean we should cede empathy-derived care to AI overnight, says Inzlicht, who was a co-author of the study along with PhD student Victoria Oldemburgo de Mello.
He said: “AI can be a valuable tool to supplement human empathy, but it does come with its own dangers.”
Inzlicht added that while AI may be effective at delivering surface-level compassion that people find immediately useful, a tool like ChatGPT cannot provide the deeper, more meaningful care that gets to the root of a mental health disorder.
He noted that over-reliance on AI also poses ethical concerns, namely the power it could give tech companies to manipulate those in need of care.
For example, someone feeling lonely or isolated may become reliant on talking to an AI chatbot that is constantly doling out empathy, instead of fostering meaningful connections with another human being.
Inzlicht, whose research looks at the nature of empathy and compassion, said: “If AI becomes the preferred source of empathy, people might retreat from human interactions, exacerbating the very problems we’re trying to solve, like loneliness and social isolation.”
Another issue is a phenomenon known as “AI aversion,” which is a prevailing skepticism about AI’s ability to truly understand human emotion.
While participants in the study initially ranked AI-generated responses highly when they didn’t know who had written them, that preference shifted slightly when they were told the response came from AI.
However, this bias may fade over time and experience, with Inzlicht noting that younger people who grew up interacting with AI are likely to trust it more.
Despite the critical need for empathy, Inzlicht urged a transparent and balanced approach to deployment, in which AI supplements human empathy rather than replaces it.
He said: “AI can fill gaps, but it should never replace the human touch entirely.”
