Would we accept a postman reading all our letters (or WhatsApp messages) in order to prevent crime? That is the debate that will be settled in the coming weeks. The European Union, under the Spanish presidency, is debating a new regulation intended to prevent the dissemination of images of child sexual abuse.
The proposal has become the most criticized technology regulation of the last decade: from Edward Snowden to the European Data Protection Supervisor, to the Council of Europe's Commissioner for Human Rights, Dunja Mijatovic, and hundreds of academics, the vast majority of technology and human rights experts consider the proposal a threat to the confidentiality of communications and, therefore, to freedom of expression, association and movement and, in general, to our independence.
The proponents of this law, in particular Home Affairs Commissioner Ylva Johansson and the Directorate she leads within the European Commission, frame the debate as an alleged dichotomy between the protection of children and the protection of other human rights. This framing hides a fundamental question from public opinion: is this proposal actually adequate to protect children? Only if the answer is yes does the debate they raise make sense.
The first question is: can the stated objective of the proposal be achieved? Does the technology exist to properly implement this regulation? From a technical point of view, the answer is simple: no. Current detection techniques are nowhere near as accurate as would be needed to detect child sexual abuse material (CSAM): they would either miss a large amount of CSAM, or they would flag a large number of false positives (material erroneously marked as CSAM). Given the volume of material to be scanned, processing these false positives would require manpower that is currently unavailable, or would result in a large number of false accusations.
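The scale problem described above is a classic base-rate effect. A minimal sketch of the arithmetic, where every figure (message volume, prevalence, error rates) is an illustrative assumption and not a number from this article:

```python
# Base-rate arithmetic: when almost all scanned content is innocent,
# even a seemingly accurate scanner produces mostly false alarms.
# All figures below are illustrative assumptions.

messages_per_day = 10_000_000_000   # assumed volume of messages scanned daily
prevalence = 1e-6                   # assumed fraction that is actually illegal
false_positive_rate = 0.001         # assumed: 0.1% of innocent items misflagged
true_positive_rate = 0.90           # assumed: 90% of illegal items detected

illegal = messages_per_day * prevalence
innocent = messages_per_day - illegal

true_alarms = illegal * true_positive_rate
false_alarms = innocent * false_positive_rate

precision = true_alarms / (true_alarms + false_alarms)
print(f"true alarms per day:  {true_alarms:,.0f}")
print(f"false alarms per day: {false_alarms:,.0f}")
print(f"share of flags that are real: {precision:.2%}")
```

Under these assumptions, false alarms outnumber real detections roughly a thousand to one, so well under 1% of flagged material is actually illegal — which is the manpower problem the paragraph describes.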
But the problem does not end there. Not only are these techniques ineffective, they are also remarkably easy to circumvent. Multiple studies demonstrate that very simple manipulations, imperceptible to the human eye, can cause the detectors to miss genuine CSAM, or to flag harmless material as CSAM. Those who want to distribute CSAM will therefore continue to do so with impunity, while the rest of the citizenry will have all their content scanned to no benefit. There is a real risk that teenagers who share consensual sexual content, for example, will see their most intimate photos pass through the hands of the police, Europol and any official of the future European agency dedicated to this matter who has access to them. Or that parents asking about their children's illnesses will be accused of horrible crimes, with all the consequences that may entail.
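The evasion problem is easiest to see with exact-match detection, where files are compared against a blocklist of cryptographic hashes: changing a single bit of a file, invisible in the rendered image, yields a completely different hash. A minimal sketch (the byte buffer stands in for an image file):

```python
import hashlib

# Exact-match detection compares a file's cryptographic hash against a
# blocklist of known hashes. Flipping a single bit of the file produces a
# completely different hash, so the trivially modified copy is not matched.

original = bytes(1000)           # stand-in for an image file's raw bytes
modified = bytearray(original)
modified[0] ^= 1                 # flip one bit: imperceptible in the image

h_orig = hashlib.sha256(original).hexdigest()
h_mod = hashlib.sha256(bytes(modified)).hexdigest()

print(h_orig == h_mod)  # the blocklist no longer matches the modified file
```

Deployed systems therefore use perceptual hashes or machine-learning classifiers that tolerate small changes, but the studies the text refers to show that these, too, can be evaded with small, carefully chosen perturbations, and can likewise be tricked into flagging innocent images.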
From a technical point of view, there is no guarantee that the rule will have any positive effect. However many times scientists and members of civil society have demanded that the proponents show some evidence of the supposed benefits, we have seen none; on the contrary, there is ample evidence that these benefits cannot be achieved.
The second question is: is the measure proportional? Can this law be implemented without risk of causing great harm to the fundamental rights of everyone, including the very children it seeks to protect? The answer, unfortunately, is once again no.
The proposal, if implemented, will break every guarantee of confidentiality that encryption currently provides. Commissioner Johansson and the artificial intelligence companies promoting the proposal (such as Thorn or Microsoft) claim that there is no risk, ignoring the very definition of confidentiality. Saying that this legislative proposal to scan all electronic communications does not affect confidentiality and privacy is like saying that reading a letter before putting it in the envelope does not impact the confidentiality of analog communications (the envelope being what protects our correspondence).
In this context, it is worth highlighting the central role of the artificial intelligence industry's lobbies, uncovered by an investigation recently published by independent media, which reveals a convergence of private interests, Europol and certain political actors seeking to prohibit, de facto, the confidentiality of communications.
The other question that has not yet been answered is how to guarantee that the scanning capabilities will only ever be used for CSAM. The reality is that, again, technically this cannot be done. The algorithms only check whether some bits resemble other bits; they cannot decide whether something is CSAM or not. For the moment there is only a promise that the system will not be expanded, and examples from the past suggesting that expansion is likely, including revelations that Europol has already requested access to all data (illegal or otherwise) that could be collected as a result of the application of this law.
Who supports a rule as invasive as this? A large portion of children, adolescents, and even many survivors of sexual abuse do not. According to a European study, around 80% of children in the EU say they would not feel comfortable being politically active or exploring their sexuality if they knew their communications were being continuously monitored. The same study shows that two thirds of young Europeans use encrypted applications such as WhatsApp, Telegram or Signal, and that the same proportion disagree with their conversations being scanned in advance.
Abuse survivors do not support the draft regulation either: Alexander Hanff, an activist and survivor of sexual abuse, warns that Commissioner Johansson's legislative proposal will leave survivors feeling unprotected when seeking support from the authorities. Another victim of sexual abuse, Marcel Schneider, has sued Facebook for reading his private messages and thereby eliminating confidentiality for victims of abuse. Not even police forces are convinced: both the FBI in the United States and police officials in the Netherlands and Germany have warned that the system will produce more reports, many of them false alarms, and that this will make it harder to find criminals and protect victims. Unfortunately, the Spanish Government maintains a position in favor of prohibiting the confidentiality of communications by banning encryption, unexpectedly aligning itself with the MEP of the Spanish Popular Party who is leading the discussion in the European Parliament, Javier Zarzalejos.
Ultimately, the proposal in its current form guarantees no improvement and is a threat to our democracy. Children must be protected, without a doubt. The future of our society depends on them. But it must be done effectively and safely. The Commission’s proposal is neither.
Carmela Troncoso is a researcher at EPFL (the Swiss Federal Institute of Technology in Lausanne) and leader of the team of scientists that developed the privacy protocol for Covid tracing applications.
Diego Naranjo is head of public policy at the digital rights NGO EDRi.