A coalition of artificial intelligence researchers, data scientists, and sociologists has called on the academic world to stop publishing studies that claim to predict a person's criminality from their face, meaning algorithms trained on face scans and criminal statistics, writes The Verge.
According to the Coalition for Critical Technologies, such studies are not only scientifically unsound but also perpetuate stereotypes against Black people and people of color. Many studies have shown that the justice system treats these groups more harshly than white people. Any program trained on such data therefore reinforces society's biases and entrenches racism.
Algorithms that learn from racist data produce racist results.
The group explains this statement as follows: such research rests on the assumption that data on arrests and other criminal records can serve as reliable, neutral indicators of criminal activity. The problem is that these data are not neutral.
The open letter was a response to news that Springer, the world's largest publisher of academic literature, planned to publish a study on this topic. The letter, signed by 1700 experts, urges Springer not only to withdraw the paper but also calls on other publishers not to publish research on this subject.
The study claimed that its algorithm predicts criminality with 80% accuracy.
In response to the letter, Springer said it will not publish the article. The paper had been submitted to an upcoming conference whose proceedings Springer planned to publish, but it was rejected during the review process.
The coalition notes that this incident is just one example of a broader trend that has already emerged in fields such as data science and machine learning.
A day ago it became known that more than 1650 Google employees signed an open letter to CEO Sundar Pichai demanding that the company stop selling its technology to police forces across the United States. The employees say Google cannot help criminalize Black existence while proclaiming that Black Lives Matter. They point to the company's current Cloud contract with a New York police department, as well as to its indirect support for the Arizona Sheriff's department, which tracks people crossing the US-Mexico border. You can read more about this in our article.