The problem of algorithmic bias and military applications of AI.


Dr. Ingvild Bode argues that bias is as much a social as a technical problem; addressing it therefore requires going beyond technical solutions.

These arguments are based on the author’s presentation at the GGE side event “Fixing Gender Glitches in Military AI: Mitigating Unintended Biases and Tackling Risks”, organised by UNIDIR on 6 March 2024. Apart from some noteworthy exceptions, chiefly UNIDIR’s 2021 report “Does Military AI Have Gender?” and policy briefs published by the Observer Research Foundation and the Campaign to Stop Killer Robots, issues of bias have not been covered at length. The existing research is mostly technical in nature and focuses on specific techniques that can be applied to machine learning models and facial recognition systems, such as rebalancing data, regularization, or designing ‘fair’ algorithms through more extensive risk and harm analysis and (even) more rigorous testing and auditing.
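To make concrete what such technical mitigations look like in practice, the sketch below illustrates one of the techniques mentioned above, rebalancing data, via class reweighting in scikit-learn. It is a minimal, hypothetical example with invented toy data, not a method drawn from the research discussed here, and it also illustrates the limits of such fixes: reweighting adjusts a statistical imbalance without touching the social processes that produced the data.

# Minimal, hypothetical sketch of data rebalancing via class weighting.
# The dataset is invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils.class_weight import compute_class_weight

# Toy data: features X and a binary label y with a heavy class imbalance.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (rng.random(1000) < 0.1).astype(int)  # roughly 10% positive class

# "Balanced" weights up-weight the minority class in the loss function,
# one common way of rebalancing the training data's influence.
weights = compute_class_weight("balanced", classes=np.unique(y), y=y)
print(dict(zip(np.unique(y), weights)))

model = LogisticRegression(class_weight="balanced")
model.fit(X, y)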
