
Courts use a ‘Minority Report’ crime prediction algorithm, and it’s incredibly racist

John Larsen
May 23, 2016 16:55

But I was far less surprised to learn that the computer, much like the system it serves, seems to hate black people.


The software assigns each defendant a risk score, and those scores are then used by judges to help with everything from bond amounts to sentencing.

It's currently used in states including Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington, and Wisconsin.

What makes the report worse — and yes, there is something worse than computers using flawed methodology to lock people up — is the racial bias.

Northpointe disputed ProPublica's findings and wouldn't release the exact algorithm it uses to compute risk scores.

So, in conclusion, a computer is incorrectly classifying individuals as high or low risk, using a formula that it won't disclose but that is objectively racist.
