V.A. Uses a Suicide Prevention Algorithm To Decide Who Gets Extra Help. It Favors White Men.
An AI program designed to prevent suicide among U.S. military veterans prioritizes White men and ignores survivors of sexual violence
An artificial intelligence (AI) program designed to prevent suicide among U.S. military veterans prioritizes White men and ignores survivors of sexual violence, which affects a far greater percentage of women, an investigation by The Fuller Project has found. The VA touts its machine learning model, REACH VET, as “the nation’s first clinical use of a validated algorithm to help identify suicide risk.” Launched in 2017, the system flags 6,700 veterans a month for extra help. Joy Ilem, national legislative director for Disabled American Veterans, told The Fuller Project she was “puzzled” by the VA’s decision to exclude a long list of factors known to increase suicide risk among female veterans—including military sexual trauma, intimate partner violence, pregnancy, menopause, and firearm ownership.