It predicts crime, eh? Must be rayciss!
The never-ending quest to predict crime using AI

The practice has a long history of skewing police toward communities of color. But that hasn’t stopped researchers from building crime-predicting tools.


And as the United States faces rising rates of violent crime, another research project has emerged: a group of University of Chicago scientists unveiled an algorithm last month, boasting in a news release of its ability to predict crime with “90% accuracy.”

The algorithm identifies locations in major cities where it calculates crimes such as homicides and burglaries are likely to occur in the following week. The software can also evaluate how policing varies across neighborhoods in eight major U.S. cities, including Chicago, Los Angeles and Philadelphia.
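
To make that concrete, here is a rough Python sketch of the general idea behind place-based forecasting: bucket incidents into grid tiles and flag tiles whose recent weekly counts run high. The tile size, history window, and threshold are arbitrary assumptions for illustration; this is not the University of Chicago model, just the simplest version of the concept.

```python
# Toy place-based forecaster: NOT the UChicago algorithm, just an illustration.
# Cell size, window, and threshold below are assumptions for the example.
from collections import defaultdict
from typing import Dict, List, Tuple

Tile = Tuple[int, int]

def to_tile(lat: float, lon: float, cell_deg: float = 0.01) -> Tile:
    """Bucket a coordinate into a coarse grid cell (~1 km at mid-latitudes)."""
    return (int(lat / cell_deg), int(lon / cell_deg))

def weekly_counts(events: List[Tuple[float, float, int]]) -> Dict[Tile, List[int]]:
    """events: (lat, lon, week_index) -> per-tile list of weekly incident counts."""
    n_weeks = max(w for _, _, w in events) + 1
    counts: Dict[Tile, List[int]] = defaultdict(lambda: [0] * n_weeks)
    for lat, lon, week in events:
        counts[to_tile(lat, lon)][week] += 1
    return counts

def flag_high_risk(counts: Dict[Tile, List[int]], window: int = 4,
                   threshold: float = 2.0) -> List[Tile]:
    """Flag tiles whose recent moving average of incidents exceeds a threshold."""
    flagged = []
    for tile, series in counts.items():
        recent = series[-window:]
        if sum(recent) / len(recent) >= threshold:
            flagged.append(tile)
    return flagged

# Fabricated example data (Chicago-ish coordinates): one busy tile, one quiet one.
events = [(41.88, -87.63, w) for w in range(8) for _ in range(3)] + [(41.79, -87.60, 7)]
print(flag_high_risk(weekly_counts(events)))  # flags only the busy tile
```

The catch, of course, is that "incidents" here means whatever ends up in the police data, which is exactly where the next part comes in.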
Sounds good, right? But wait, there's more.

Historically, police data in the United States is biased, according to Southerland. Cops are more likely to arrest or charge someone with a crime in low-income neighborhoods dominated by people of color, a reality that doesn’t necessarily reflect where crime is happening, but rather where cops are spending their time.

That means most data sets of criminal activity overrepresent people of color and low-income neighborhoods. Feeding that data into an algorithm leads it to predict more criminal activity in those areas, creating a feedback loop that is racially and socioeconomically biased, Southerland added.
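
A toy simulation shows how that feedback loop can lock in. The numbers and the allocation rule below are my own assumptions, not anything from the article: two neighborhoods with identical true crime rates, a patrol that goes wherever past records say crime is, and records that only capture crime the patrol is there to see.

```python
# Toy feedback-loop simulation (assumed numbers, not from the article).
# A and B have the SAME true crime rate, but B starts with more recorded
# incidents because it was historically policed more. Each week the single
# patrol follows the data, and only patrolled crime gets recorded.
TRUE_RATE = 10                 # true weekly incidents, identical in A and B
recorded = {"A": 5, "B": 8}    # historical records already skewed toward B

for week in range(1, 11):
    target = max(recorded, key=recorded.get)   # send patrol where the data point
    recorded[target] += TRUE_RATE              # only witnessed crime is recorded
    print(f"week {week:2d}: patrol -> {target}, recorded = {recorded}")

# The patrol never leaves B: B's recorded crime grows every week while A's
# stays flat, so the data keep "confirming" the original skew.
```

That is the mechanism Southerland is describing: the algorithm isn't measuring where crime is, it's measuring where the records are, and then it steers more attention there.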
https://www.washingtonpost.com/techn...gorithms-fail/