Predict crimes, at what cost?

Predicting where crimes will happen is an old ambition of academics, public administrators, and police forces. But what problems lie behind this real-life Minority Report? So-called predictive policing began to gain attention in the early 2010s. More recently, American cities such as Chicago, New York, and New Orleans have tested this kind of tool in police work or still use it. The idea moved off the drawing board, and companies like PredPol and Palantir began making money selling their solutions, even running secret experiments with the technology. In 2021, the idea is arriving in Brazil.

Ignoring debates that have been under way for years in the countries where this idea is already in force, the Igarapé Institute, which describes itself as a think and do tank, wants to select two Brazilian municipalities to test a technological tool called CrimeRadar. The promise is that algorithms will predict where crimes will occur and help managers distribute vehicles and police officers across the territory. The technology requires processing an immense amount of data, but the Institute’s public call does not explain how this information will be collected and stored. Many questions were raised on social media about the tool’s surveillance potential, especially about what this type of technology would mean for the Black population in a country where eight out of ten people arrested with the help of facial recognition are Black.

The problems with predictive policing are many. The central issue with these new surveillance technologies is the data that feeds them. It has long been known that the work of Brazilian police officers, and of the criminal justice system as a whole, is structured by racism.

Black people are the ones most often stopped by the police, and the favelas and outskirts are the only parts of Brazilian cities that the police enter with armored vehicles and helicopters, feeling free to shoot. Black people make up the majority of the prison population; they are the least likely to be granted legal benefits at custody hearings; they die most often from homicide; and they are the majority of the fatal victims of the security forces. This is the data, produced by decades of racism and violence against the Black population, that will be used in crime prediction models. What will the machine learn from this history?

The discussion about racism and technology has been taken up by scholars around the world and can already be seen in documentaries such as Coded Bias, which shows how researchers at MIT and Columbia identified the perpetuation of racism in a new guise: technology. Machines learn racism and reproduce it.

The Igarapé Institute responds to these criticisms by arguing that CrimeRadar is not just a predictive policing solution but a tool with a “social impact strategy”. The institute, however, does not say what actions it takes, or how many, to “correct” the centuries-old bias of persecution of Black bodies by the forces of the State. Here the organization fails on a fundamental point that has been the hallmark of every technological solution applied in Brazil and around the world: the lack of transparency and public debate. After the backlash, the Igarapé Institute decided on Tuesday (September 28) to suspend the CrimeRadar public call.


Facial recognition technologies have spread across the country without much fanfare, precisely because of the lack of transparency with which police forces and public security secretariats handle the subject. The Panopticon project, developed by the Center for Security and Citizenship Studies (CESeC), seeks to shed light on the issue by producing and disseminating data on arrests, police stops, government contracts, and financing, so that it is possible to understand how this technology has been used and what its impacts are, mainly on the Black population.

A survey carried out in 2020 shows that at least 22 Brazilian states have tested, used, or are in the process of acquiring facial recognition systems. The project was able to track the arrest of 184 people through this technology in 2019: 90% of them were Black, and they were arrested for nonviolent crimes.

In recent years, states, municipalities, and the federal government have devoted themselves to expanding surveillance of the population. Drones, facial recognition and license-plate-reading cameras, tools that scan social media profiles, and genetic databases are some of the novelties being used to step up surveillance on the streets of Brazilian cities, especially in the outskirts and favelas, where most residents are Black.

It is therefore all the more shocking that it is not only public authorities that are interested in these flawed and racist technologies, but also parts of civil society. This year, the Central Única das Favelas (CUFA) used facial recognition to register people seeking help to mitigate the impacts of the pandemic. The system was provided by Unico, and it was not clear at the time what the company’s role in data storage and processing was. Faced with questions and criticism, CUFA decided to suspend the project, leading Unico to issue a statement asserting its compliance with the General Data Protection Law, which regulates the use of personal data in Brazil and classifies biometric information as sensitive data.

These cases demonstrate the need for dialogue between institutions and researchers who specialize in the subject. A public, informed debate about something new that is presented as “evolution” is essential to ensure that security solutions are built without further victimizing a population whose basic rights are already violated in so many ways. Algorithms must serve society; society must not serve algorithms. We are not in a science-fiction film. In the real world, we cannot afford to let these technologies become yet another source of rights violations.

Pablo Nunes holds a PhD in Political Science and is assistant coordinator of the Center for Security and Citizenship Studies (CESeC).

