BOISE, ID. – Critics of computer algorithms that predict someone’s risk of committing a crime say they’re tinged with racial bias.

A bipartisan measure in the Idaho Legislature aims to make the algorithms transparent and ban those that show bias.

Civil rights groups, including the ACLU and NAACP, have raised concerns about discrimination in pretrial risk assessment algorithms, which analyze a person’s likelihood of not showing up for trial or committing another crime.

Republican Rep. Greg Chaney of Caldwell says some of these programs have been shown to over-predict this likelihood for African-Americans.

“The rate of error is far lower for Caucasians, and it overestimates the likelihood that they will not commit an additional crime,” he explains. “And that’s been borne out by studies of what the computer models predicted up front versus what actually happened in the real world, where these particular programs have been deployed.”
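The studies Chaney references compare what a tool predicted with what later happened, broken out by group. As a rough illustration of that kind of check, here is a minimal Python sketch; the field names and sample data are hypothetical, not drawn from any actual tool or study:

```python
# Illustrative sketch (not any vendor's actual tool): compare predicted
# risk labels with observed outcomes and measure error rates by group.
from collections import defaultdict

def false_positive_rates(records):
    """records: dicts with 'group', 'predicted_high_risk' (bool) and
    'reoffended' (bool). Returns, per group, the share of people who were
    flagged high risk but did not go on to commit another crime."""
    flagged = defaultdict(int)   # non-reoffenders flagged high risk, per group
    total = defaultdict(int)     # non-reoffenders, per group
    for r in records:
        if not r["reoffended"]:
            total[r["group"]] += 1
            if r["predicted_high_risk"]:
                flagged[r["group"]] += 1
    return {g: flagged[g] / total[g] for g in total if total[g]}

# A large gap between groups is the kind of disparity Chaney describes.
sample = [
    {"group": "A", "predicted_high_risk": True,  "reoffended": False},
    {"group": "A", "predicted_high_risk": False, "reoffended": False},
    {"group": "B", "predicted_high_risk": False, "reoffended": False},
    {"group": "B", "predicted_high_risk": False, "reoffended": False},
]
print(false_positive_rates(sample))  # {'A': 0.5, 'B': 0.0}
```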

Chaney says risk-assessment algorithms are just taking off in Idaho.

House Bill 118, with Senate support from Democratic Sen. Cherie Buckner-Webb of Boise, would require programs to be validated as “free of bias” by the Idaho Supreme Court before being used.

The algorithms evaluate risk based on factors such as the age at which a person was first arrested and any history of substance abuse, producing a score.
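As a purely illustrative sketch of how such a points-based score could work, using only the factors named above with invented weights (not the formula of any tool actually in use):

```python
# Hypothetical points-based risk score; weights and cutoffs are invented.
def risk_score(age_at_first_arrest: int, substance_abuse_history: bool) -> int:
    score = 0
    if age_at_first_arrest < 21:      # an earlier first arrest adds points
        score += 2
    if substance_abuse_history:       # any substance-abuse history adds points
        score += 2
    return score                      # higher score = higher predicted risk

print(risk_score(age_at_first_arrest=19, substance_abuse_history=True))  # 4
```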

But Chaney says much of what goes into calculating the score is secret. He compares it to dystopian science fiction.

“It’s like the ‘Minority Report’ in real life,” he states. “It’s a secretive, computerized calculation that actually is used to help determine, in many cases around the country, how many years of someone’s life they lose to incarceration.

“For that to be, one, a computerized process and, number two, a secretive process is quite frightening.”

Chaney says these types of algorithms could be useful, but only if they are completely transparent. The bill is currently awaiting a hearing.

____

Eric Tegethoff, Public News Service