AI experts call for an end to research into facial recognition algorithms that claim to predict crime

A coalition of artificial intelligence researchers, data scientists and sociologists has called on academia to stop publishing research that claims to use machine learning techniques such as facial recognition, trained on data like crime statistics, to predict whether an individual will commit a crime. The Coalition for Critical Technology says such work is not only scientifically unsound but also perpetuates a cycle of prejudice against Black people and people of color. Numerous studies have shown that the justice system treats these groups more harshly than white people, so any software trained on such data simply amplifies and entrenches societal bias and racism.

The coalition argues there is no way to develop a crime prediction system that is free of racial bias, because the category of criminality is itself racially biased. Studies of this kind, and the accompanying claims of accuracy, rest on the assumption that data on criminal arrests and convictions can serve as reliable, neutral indicators of underlying criminal activity. Yet these records are far from neutral.

The Coalition for Critical Technology drafted its open letter in response to news that Springer, the world's largest publisher of academic books, planned to publish such a study. The letter, which has gathered 1,700 signatures, calls on Springer to withdraw the paper and on other academic publishers to refrain from publishing similar work in the future.

The paper, titled "A Deep Neural Network Model to Predict Criminality Using Image Processing," claimed that its authors had built a facial recognition system able to predict whether a person will become a criminal with 80 percent accuracy and with no racial bias, according to a now-deleted press release. The authors include Jonathan W. Korn, a doctoral student and former NYPD officer.

In response to the open letter, Springer said it would not publish the paper, which had been slated to appear in a forthcoming conference series, according to MIT Technology Review. The publisher said the paper had been rejected following a thorough peer review process.
