But I was far less surprised to learn that the computer, much like the system it serves, seems to hate black people.
Those scores are then used by judges to help with everything from bond amounts to sentencing.
It's currently used in states including Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington, and Wisconsin.
What makes the report worse (and yes, there is something worse than computers using flawed methodology to lock people up) is the racial bias.
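ProPublica's core statistical finding concerned error rates: black defendants who did not go on to reoffend were far more likely to be wrongly labeled high risk than white defendants in the same situation. The sketch below, using entirely invented toy data (the group labels, records, and numbers are illustrative assumptions, not Northpointe's data or methodology), shows how that kind of disparity is measured by comparing false positive rates across groups.

```python
# Toy illustration of a false-positive-rate comparison across groups.
# Data is invented for demonstration; it is NOT from COMPAS or ProPublica.
records = [
    # (group, predicted_high_risk, actually_reoffended)
    ("A", True, False), ("A", True, True), ("A", False, False), ("A", True, False),
    ("B", True, False), ("B", False, False), ("B", False, True), ("B", True, True),
]

def false_positive_rate(rows):
    """Share of people who did NOT reoffend but were still scored high risk."""
    negatives = [r for r in rows if not r[2]]   # did not reoffend
    flagged = [r for r in negatives if r[1]]    # ...yet labeled high risk
    return len(flagged) / len(negatives) if negatives else 0.0

for group in ("A", "B"):
    rows = [r for r in records if r[0] == group]
    print(group, round(false_positive_rate(rows), 3))
```

If the two groups show very different false positive rates, the tool is making its worst kind of mistake (flagging someone who would not have reoffended) unevenly, which is exactly the pattern ProPublica reported.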
Northpointe disputed ProPublica's findings and wouldn't release the exact algorithm it uses to compute risk scores.
So, in conclusion: a computer is classifying people as high or low risk, often incorrectly, using a formula its maker won't disclose, and doing so in a way that is measurably biased against black defendants.