AI medical tools found to downplay symptoms of women, ethnic minorities

Bias-reflecting LLMs lead to inferior medical advice for female, Black, and Asian patients.

Artificial intelligence tools used by doctors risk leading to worse health outcomes for women and ethnic minorities, as a growing body of research shows that many large language models downplay these patients' symptoms.

Research by the London School of Economics, for example, found that Google's Gemma model, which more than half of the UK's local authorities use to support social workers, downplayed women's physical and mental health issues compared with men's when generating and summarizing case notes.

Zack said Open Evidence, which 400,000 doctors in the US use to summarize patient histories and retrieve information, trained its models on medical journals, the US Food and Drug Administration's labels, health guidelines and expert reviews.
