Why are minority patients being left behind by AI?


Published
12 months ago


Clinical practice and medical AI reportedly constitute “cycles of exclusion” for minority populations, with origins in systemic racism and implicit bias. What’s going on?
Concern is growing for the future of healthcare AI, after continued signs that it could seriously disadvantage minority groups and act in ways that reflect “implicit” racism and bias.
While there is still hope that AI will transform the way patients receive healthcare, recent research from the University of Michigan shows that AI can produce “cycles of exclusion” – essentially casting out minority populations.
The research demonstrates that in medicine, substantial disparities exist in both experiences and health outcomes for minoritised populations, with “origins in systemic racism, implicit bias, historical practice, and social determinants of health”.
It reads: “We draw on a theory of ‘exclusion cycles’ – developed in the context of nonmedical social interactions – to link known dominant-group and minoritised-group behaviors and demonstrate their self-reinforcing interactions.
Racial disparities “intractable” in medicine
“Interlinked cycles help reveal why exclusion and racial disparities are so intractable in medicine, despite efforts to reduce them on the part of physicians and health systems through strategies focused on individual parts of the cycle such as diverse workforce recruitment or implicit bias training.
“This framework highlights particular dangers that may arise through expanding use of big data and artificial intelligence (AI)–based systems in medicine, making bias especially intractable unless tackled directly and early.”
AI “disadvantaging” women
In July, scientists warned that AI had biases which needed “rooting out” – connected with disadvantaging women, as well as ethnic minorities. The Guardian reported how, without the right preparation, AI could “dramatically deepen” existing health inequalities in our society.
Ulrik Stig Hansen and Eric Landau, co-founders of Encord, told Health Tech World that there is an “urgent need” to answer questions around AI bias.
They said: “When it comes to algorithmic bias, the problem often stems from biased or unrepresentative training data. Models learn to make predictions after training and retraining on a variety of data.
AI will make mistakes
“If this data isn’t representative of the patient population that the medical AI is going to serve, then the AI will make mistakes when it’s put into practice and encounters never-before-seen cases.
“For instance, if medical AI is going to be deployed in a hospital that predominantly serves patients from minority communities, then the model needs to be trained on a vast amount of data collected from patients of similar demographics.
“Then, it must be validated with similar but never-before-seen data to ensure that it will perform as expected when put to work in the real world. For a model to deliver real business value and value for patients and clinicians, it needs to be localised by specific patient populations.”
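The co-founders’ point about unrepresentative training data can be sketched with a simple audit (a hypothetical function with made-up figures, not Encord’s tooling): compare the demographic mix of a training set against the patient population the model will actually serve, and flag groups that are under-represented before training ever begins.

```python
from collections import Counter

def representation_gap(train_groups, population_shares):
    """Compare demographic shares in the training data with the
    target patient population. Negative values flag groups that
    are under-represented relative to the population served."""
    counts = Counter(train_groups)
    total = sum(counts.values())
    return {
        group: round(counts.get(group, 0) / total - target, 3)
        for group, target in population_shares.items()
    }

# Hypothetical example: a training set skewed away from the
# population the hospital actually serves.
train = ["A"] * 80 + ["B"] * 15 + ["C"] * 5
population = {"A": 0.5, "B": 0.3, "C": 0.2}
print(representation_gap(train, population))
# {'A': 0.3, 'B': -0.15, 'C': -0.15}  -> B and C are under-sampled
```

A check like this addresses only the composition of the data; as the quote notes, the model must still be validated on similar but never-before-seen data to confirm it performs as expected in practice.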