
The term “AI” is damaging our relationship with tech – Dr Craig Newman


TED Talks speaker and clinical psychologist Dr Newman speaks to Health Tech World about the importance of AI in healthcare – and how what we call it matters more than you might think… 

When you think of artificial intelligence, what’s the first thing that comes to mind? If it’s a robot with arms and legs that moves on wheels and badly mimics human behaviour, then you wouldn’t be alone. But you’d also be way off the mark.

Understanding AI has never been an easy task, as the myths (and inaccuracies) around it remain rife. Even in 2022, as AI makes itself comfortable in our homes, hospitals and classrooms, many people (despite using it every day) respond to the term “AI” with resistance.

What is AI?

Artificial intelligence is an array of technologies working together to comprehend and mimic natural intelligence. Machine learning is part of the AI landscape, and is already a major part of our healthcare system and patient care. Virtual nursing assistants, robotic surgery and precision medicine are just a few of the ways in which this technology is already integrated.

Terminology and stigmatisation

Clinical psychologist Dr Craig Newman says that we are lacking a “communications research” understanding about the term AI, and that the terminology we use doesn’t necessarily fit with what is taking place within technologically facilitated healthcare.

Speaking to Health Tech World, Dr Newman said: “We can’t underestimate the language we use when we talk about e-health or AI, and what that means to people. 

“People need to be sold on the idea that AI-supported health can be, or is, better. Not as a marketing spin, but to broaden understanding overall.

“We are just inheriting tech language that describes the function and not the process.” 

Dr Newman was the Mobile Health Innovation Lead at Plymouth University Peninsula Schools of Medicine & Dentistry, and Clinical Psychologist in the Neuropsychology department at Derriford Hospital, Plymouth. 

He’s now the CEO of WeAreProject5, a clinically governed wellbeing support service for healthcare staff.

In an interview with Health Tech World, Dr Newman explained how the stigma around AI is “only getting bigger” and that it could be addressed through the language we use around it.

He added: “Often we associate AI with military narratives, or robots. I personally think it would be helpful for the language to be more empowering and less about the technological frameworks.

“I said in my TEDx talk that eHealth technology has to be received as a safety net for people; you’re not being palmed off! I see time and time again that it’s a sensible alternative: enriching care pathways.”

“We are lacking a communications research understanding about the term AI and its impact on patients, stakeholders and others in care pathways.

“The scale of difference captured in this term is vast, going from machine learning systems trying to replicate human communication to simple triage systems – there’s a common function in terms of technology but they’re different in terms of human experience and complexity of interaction.”

Improving understanding of AI in healthcare

Dr Newman pointed out that, despite a common resistance to AI, we would think nothing of turning to a book to broaden our knowledge of something. 

He continued: “Think about it – nothing is less interactive than a book. Yet when we talk about AI we sometimes activate stigma and resistance. We perhaps need to look at the language we are using. 

“For example, if you needed a therapist and I said ‘there’s no human but here’s a robot’, you are naturally going to carry some resistance towards that. Yet you think nothing of getting a book and referring to it to gain knowledge that way. The barrier likely lies in our relationship with these mediums, which in many ways is linked to the language we use (social constructions).

“AI needs to be put in the right place and better understood.”

AI in the NHS must feel like a “service, not replacement”

“I think healthcare will eventually arrive at a point where it is facilitated by an AI platform. For that to be a success, at the human level (staff and patients) we need to get to the point where we feel it’s a service and not a replacement or a poor fix for current issues.

“The use of AI still creates panic and resistance in people, understandably, given our experience of it in the media (movies and so on). But we need to observe that people probably still go home and ask Alexa to add something to their shopping list, without this fear.

“People love Alexa, don’t they? Why are they totally fine with that? Do people even associate it with AI? I think it’s because we understand how it serves us.

“The people behind Alexa were able to manifest it into our homes; it goes further than marketing. It looks like furniture. It’s clever. It interacts with us in a meaningful way.

“We call it ‘Alexa’ and we rarely think of it as coded AI. 

“We ask it to entertain us, we ask it for answers, our children ask it for jokes, we ask it to pick music for us… it feels like a shortcut to what we want – and a friendly shortcut at that.  

“It matches the idealised or fantasised version of what we think AI could be, and is.”

E-health and AI

“My feeling about e-health is that it needs to look and feel similar, but with respect to what we expect from healthcare at the idealised end of the scale.

“It’s the people who make us feel at ease and cared for when we arrive in healthcare services. AI has to find a way to create that connection, not to replace what exists, but to add to what we already experience in a way that feels close to what we wanted, even if we never sat down and thought about what we wanted.

“A friendly technology, digital personality, intelligent system that adds to what is already there and feels like a new safety net or service boost that we soon can’t imagine living without.  

“The shift needs to be slow, but a constant evolution, as we are currently dependent on the medical model and need to take baby steps on a journey towards something that could be very new.

“What we must not do, I believe, is try to replicate that too quickly.  Healthcare is not an equivalent set of functions.  It is a deeply personal, sometimes painful, often scary and perhaps shameful experience – for many.  

“It is a lifelong journey.”

Get Out, Get Love

Dr Newman is currently transforming a self-help book into an intelligent app that supports domestic abuse recovery for people of all genders, sexualities, races and ages.

The app brings together a route towards recovery, using AI to adapt around personal need and stage of progress.

He will be launching a Kickstarter in autumn 2022. Dr Newman can be contacted via www.uxclinician.com
