
Automated compassion: The oxymoron we’ll have to live with 


Artificial intelligence is learning to imitate human compassion – and it’s taking shape everywhere from customer service to psychiatric therapy to full imitations of our dead relatives. But can we automate empathy? Should AI be marketed as having compassion?

No matter which way you spin it, automating empathy and compassion is a contradiction in terms. Automated compassion is, by definition, imitated compassion. And a machine’s “imitation” amounts to comprehension, identification, and acknowledgement only in the sense that it has been programmed to copy human reactions.

But is it really about comparing machines to humans? Artificial intelligence is an astonishing thing all the same, and it has already shown real promise in areas such as our healthcare systems.

The idea of it pretending to be human and pretending to comprehend human suffering is an odd one – but AI may have a very useful place in the healthcare of the future, especially when a human isn’t available.

What is automated compassion exactly?

It’s largely what it says on the tin, though some aspects may not be obvious. It’s AI that analyses and reacts to perceived mental or physical distress in humans. It’s a simulation of human emotion – a false impression of it, if you will – and it can be used for Cognitive Behavioural Therapy (CBT) or as a mental health chatbot.

It can also be trained to acknowledge the woes and difficulties experienced in customer service, to improve the customer experience and make them feel understood. 
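To make the idea concrete, here is a deliberately minimal sketch of what the simplest form of “automated compassion” can amount to: pattern-matching a user’s message against distress cues and returning a templated acknowledgement. Every keyword and reply below is an invented example, not any vendor’s actual system.

```python
# Toy "automated compassion" chatbot: keyword matching plus templated
# replies. All cues and responses are hypothetical, for illustration only.

DISTRESS_CUES = {
    "anxious": "It sounds like you're feeling anxious. That must be hard.",
    "lonely": "Feeling lonely can be really painful. I'm here to listen.",
    "overwhelmed": "Being overwhelmed is exhausting. Let's take it one step at a time.",
}

FALLBACK = "Thank you for sharing that with me. Can you tell me more?"

def respond(message: str) -> str:
    """Return a templated 'empathetic' reply based on keyword matching."""
    lowered = message.lower()
    for cue, reply in DISTRESS_CUES.items():
        if cue in lowered:
            return reply
    return FALLBACK

# Prints the templated acknowledgement for the "anxious" cue.
print(respond("I've been so anxious lately"))
```

The point of the sketch is the article’s own: the program neither feels nor understands anything; it selects a pre-written reaction, which is exactly what “imitated compassion” means in practice (real systems use statistical models rather than keyword lists, but the principle is the same).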

Sentient AI “does not exist”

Gordon Midwood, co-founder of London AI firm Anything World, made the point that AI’s lack of sentience makes it hard for it to be genuinely compassionate.

He told Health Tech World: “I think currently automated compassion is something of an oxymoron given that general Artificial Intelligence does not really exist in any meaningful form. This means that a sentient artificial intelligence does not exist. 

“Given that a sentient Artificial Intelligence does not (yet) exist, any attempt at emotion is clearly simulated. And simulated feelings are not genuine feelings, so any compassion given is essentially faked compassion.”

Value in automated empathy 

He added: “However, if we take a more philosophical view, I think there is value here. If something provides compassion to someone – and AI is demonstrably doing this now through state-of-the-art services such as Hugging Face, for example, especially for a teen audience – then is it not by its very nature compassionate?

“Whether the compassion actually exists as an emotion in the provider of the compassion is therefore somewhat irrelevant, no?”

If it’s helpful, it’s helpful

Surbhi Rathore, CEO and co-founder of Symbl.ai, pointed out that despite the obvious imitation of compassion, there’s great value and use in machine learning – especially in healthcare.

She said: “Machine learning models that offer automated compassion features can help prod doctors, social workers, therapists, and those in many other people-focused professions in the right direction when it comes to recognising and alleviating mental and physical distress. 

“Compassion is an innate trait in humans, not machines, that can be fostered and grown with the intentional use of such models. 

“It hardly seems fair to say that a mechanism that seeks to train humans to be more aware of their own level of compassion is attempting to imitate it exclusively.”

She added: “In cases where AI models are offering compassion and empathy to humans—say, a robot mental health counsellor that chats with humans—we believe that the effectiveness of such an interaction doesn’t have to be limited to how ‘real’ the empathy is.

“If it’s helpful, it’s helpful.”
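Rathore’s picture of models that “prod” clinicians rather than replace them could look, in its simplest form, like a triage flag: score a session for distress cues and route it to a human for review. The cue list and threshold below are invented for illustration; a real system would use a trained model, not word matching.

```python
# Hypothetical triage sketch: flag a session transcript for human review
# when it contains several distinct distress cues. The cue set and the
# threshold are made up for this example.

DISTRESS_TERMS = {"hopeless", "worthless", "exhausted", "panic", "alone"}
REVIEW_THRESHOLD = 2  # flag if two or more distinct cues appear

def flag_for_review(transcript: str) -> bool:
    """Return True when the transcript warrants a clinician's attention."""
    words = set(transcript.lower().split())
    hits = DISTRESS_TERMS & words
    return len(hits) >= REVIEW_THRESHOLD

sessions = [
    "I feel hopeless and alone most days",
    "Work was busy but the weekend was fine",
]
flagged = [s for s in sessions if flag_for_review(s)]
```

Note that the machine’s role here ends at the flag: the “compassion” – recognising and alleviating the distress – remains the human clinician’s job, which is the division of labour Rathore describes.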

Machines may deter us from human connection

Rajesh Namase, co-founder and professional tech writer at TechRT, said that it still pays to be aware of the disadvantages.

He said: “Though automated compassion may seem like a helpful way to provide comfort, there are several disadvantages to using this method. One of the disadvantages is that the machines can never really know what another person is feeling. 

“They can only go off of what the person has told them or what they have observed. This means that they could easily miss important cues that would help them to understand how the person is feeling. 

“Another disadvantage is that people can start to rely on machines too much. If people become used to having a machine always there to listen to their problems, they may start to feel like they don’t need to talk to other people as much. This could lead to them becoming more isolated and lonely.”

“Compassionate” AI in psychiatric therapy 

Surbhi Rathore said we should expect integration between AI and humans in healthcare and mental healthcare.

She continued: “It’s not likely that machines will surpass humans in terms of effectiveness with regard to therapy. We predict there will be more of an integration between therapists and AI models as they progress in terms of overall intelligence, but to rely entirely on AI as one’s mental health support system—at least in the near future—would be a bit of a stretch. 

“In the same vein, we picture more of a happy harmony between the two with regard to CBT and trauma treatment.

“AI can pick up on subtle cues or phrases that some human therapists might miss, but the outsize benefit of a human-to-human connection in therapy won’t go away any time soon.”

The sentiment was echoed somewhat by Gordon Midwood of Anything World: “I think absolutely, with the advances in Machine Learning, effective therapy of all kinds can be performed by computers; there is no reason why not.

“However, and I think this is a point worth stressing, whilst I believe AI can play an important role in therapy, it will not and should not fully replace human involvement.”

A “healthy blend” of AI and human care

He added: “I think we will see a healthy blend of Artificial Intelligence and human care in healthcare in 30 years’ time, with AI performing a lot of the labour-intensive tasks of communication, initial diagnosis and intelligent monitoring, and with human care always a strong part of the mix.”
