Typos and slang spur AI to discourage seeking medical care

AI models change their medical recommendations when people ask them questions that include colourful language, typos, odd formatting and even gender-neutral pronouns

When artificial intelligence models were tested on simulated writing from would-be patients, they were more likely to advise against seeking medical care if the writer made typos, used emotional or uncertain language – or was female. "Insidious bias can shift the tenor and content of AI advice, and that can lead to subtle but important differences" in how medical resources are distributed, says Karandeep Singh at the University of California, San Diego, who was not involved in the study. The tests showed that these format and style changes made all the AI models between 7 and 9 per cent more likely to recommend that patients stay home rather than seek medical attention.
