Judges are using algorithms to justify doing what they already want
Turns out deciding when to use algorithms is deeply human.
The interviews revealed recurring patterns in how judges decided whether to follow risk assessment scores, often shaped by defendants’ criminal history or social background. Judges departed from the scores partly because they identified problems with how the system weighted information for specific crimes — in intimate partner violence cases, for instance, they believed even defendants without a long criminal history could be dangerous. Megan Stevenson, an economist and criminal justice scholar at the University of Virginia School of Law, says risk assessments are something of “a technocratic toy of policymakers and academics.” The tools have seemed attractive as a way to “take the randomness and the uncertainty out of this process,” she says, but studies of their impact show they often have little effect on outcomes either way.
Or read this on The Verge