The European Union has been debating a proposed regulation known as ChatControl for almost a year. Its stated goal is to detect child pornography, but it also risks massively violating the fundamental rights of European residents.
The proposal calls for the creation of special coordinating authorities in each Member State, empowered to order certain digital services to install automated systems that monitor user conversations in order to detect child pornographic content. Under these orders, the conversations of all users would be continuously monitored for up to 24 months and compared against databases of markers that may indicate the presence of sensitive images.
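To make the mechanism concrete, here is a minimal sketch of what such database matching might look like, assuming a simplified exact-hash comparison. Real detection systems, such as Microsoft's PhotoDNA, use perceptual hashes that also match re-encoded or resized copies; the function name and database entries here are purely hypothetical.

```python
import hashlib

# Hypothetical database of markers for previously identified images.
# Deployed systems would hold perceptual hashes (PhotoDNA-style),
# not cryptographic digests, so that near-duplicates also match.
KNOWN_MARKERS = {
    "placeholder-marker-1",  # illustrative entries, not real values
    "placeholder-marker-2",
}

def scan_attachment(data: bytes) -> bool:
    """Return True if the attachment matches a known marker."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_MARKERS

# Under the proposal, a check like this would run automatically on
# every attachment exchanged by every user of a covered service.
```

Note that a cryptographic hash only flags byte-for-byte copies; that is precisely why real systems resort to perceptual hashing, which in turn introduces the possibility of false matches discussed further below.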
The scan would cover all users indiscriminately, with no distinction made, for instance, between previously reported accounts, particular individuals, or geographical areas.
This regulation raises many questions. Although the technical solutions proposed so far could allow the data under analysis to remain encrypted, the result would still be systematic and indiscriminate surveillance of all European citizens, in violation of fundamental principles such as the presumption of innocence and proportionality in the processing of personal data. Moreover, installing these systems would effectively rule out end-to-end encryption. This kind of encryption, in which only the sender and recipient of a message can access its content, is extremely common; it is used, for instance, by WhatsApp, Signal, and iCloud. It offers users the strongest available guarantee of confidentiality precisely because no third party, not even the service provider, can read the data.
Since message scanning requires access to unencrypted content, this level of security could not be maintained.
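As an illustration of why scanning and end-to-end encryption are at odds, here is a minimal sketch using the PyNaCl library (a Python binding to libsodium). The party names and message are illustrative, not drawn from any actual service.

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave the device.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_sk, bob_sk.public_key)
ciphertext = sending_box.encrypt(b"family photo for the doctor")

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_sk, alice_sk.public_key)
plaintext = receiving_box.decrypt(ciphertext)

# The server relaying `ciphertext` holds neither private key, so it
# cannot recover the plaintext. Server-side scanning therefore requires
# either weakening this model or inspecting messages on the device
# before encryption.
```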
A paradox would likely emerge: on the one hand, ordinary users would lose access to end-to-end encryption and become vulnerable to both private and public intrusion; on the other, those who distribute child pornography would simply move to lesser-known alternative services beyond the reach of such measures, such as decentralized networks.
The hazards include both abuses and algorithmic errors, which would be magnified by the sheer volume of data the rules would cover. For instance, roughly 70 billion messages are sent and received on WhatsApp every day. Even a 1% error rate would therefore produce on the order of 700 million inaccurate reports per day, inflating costs and processing time while causing great inconvenience to the people affected.
In practice, a parent sending a photo to a physician for perfectly legitimate reasons could trigger an alarm and a pointless investigation.
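A back-of-the-envelope calculation, using the message volume and error rate cited above, shows the scale of the problem:

```python
# Figures from the text: ~70 billion WhatsApp messages per day
# and a hypothetical 1% misclassification rate.
messages_per_day = 70_000_000_000
error_rate = 0.01

false_reports_per_day = messages_per_day * error_rate
print(f"{false_reports_per_day:,.0f} erroneous reports per day")
# -> 700,000,000 erroneous reports per day
```

Even an error rate a hundred times smaller would still mean some 7 million flagged messages every day, each potentially requiring human review.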
I am not persuaded by the approach taken to the problem. Recent technical advances have fueled a drift toward massive data collection as a means of regulating and suppressing criminal activity, in the belief that technology can offer quick, easy fixes for complicated, deep-rooted problems. In practice, sifting through vast volumes of data often undermines the effectiveness of investigations, burying the authorities in irrelevant material that consumes time and resources to process.
This makes the ChatControl regulation an example of technological solutionism. The responses should be varied, without underestimating the problem of online child exploitation and the danger of grooming.
The first priorities should be a preventive strategy that invests in raising awareness among parents and children, and the prosecution of those responsible for crimes involving the exploitation of minors.