Many states and cities are putting Americans’ fates in the hands of algorithms

Derek Thompson writes:

Rachel Cicurel, a staff attorney at the Public Defender Service for the District of Columbia, was used to being outraged by the criminal-justice system. But in 2017, she saw something that shocked her conscience.

At the time, she was representing a young defendant we’ll call “D.” (For privacy reasons, we can’t share D’s name or the nature of the offense.) As the case approached sentencing, the prosecutor agreed that probation would be a fair punishment.

But at the last minute, the parties received some troubling news: D had been deemed a “high risk” for criminal activity. The report came from something called a criminal-sentencing AI—an algorithm that uses data about a defendant to estimate his or her likelihood of committing a future crime. When prosecutors saw the report, they took probation off the table, insisting instead that D be placed in juvenile detention.

Cicurel was furious. She issued a challenge to see the underlying methodology of the report. What she found made her feel even more troubled: D’s heightened risk assessment was based on several factors that seemed racially biased, including the fact that he lived in government-subsidized housing and had expressed negative attitudes toward the police. “There are obviously plenty of reasons for a black male teenager to not like police,” she told me.

When Cicurel and her team looked more closely at the assessment technology, they discovered that it hadn’t been properly validated by any scientific group or judicial organization. Its previous review had come from an unpublished graduate-student thesis. Cicurel realized that for more than a decade, juvenile defendants in Washington, D.C., had been judged, and even committed to detention facilities, because the courts had relied on a tool whose only validation in the previous 20 years had come from a college paper.

The judge in this case threw out the test. But criminal-assessment tools like this one are being used across the country, and not every defendant is lucky enough to have a public defender like Rachel Cicurel in his or her corner. [Continue reading…]
