AI tools downplay women’s physical and mental health issues and risk creating gender bias in care decisions
Exclusive: LSE research finds risk of gender bias in care decisions based on AI summaries of case notes
Artificial intelligence tools used by more than half of England's councils are downplaying women's physical and mental health issues and risk creating gender bias in care decisions, research has found.

The LSE study used real case notes from 617 adult social care users, which were fed into different large language models (LLMs) multiple times with only the gender swapped. It found that when Google's AI tool Gemma was used to generate and summarise the same case notes, language such as "disabled", "unable" and "complex" appeared significantly more often in descriptions of men than of women.