Ban Facial Recognition for Mass Surveillance, Privacy Advocates Tell Brussels
A group of 51 digital rights organizations has sent an open letter to European Commissioners urging an outright ban on the use of facial recognition technologies for mass surveillance of citizens, without exception.
The call, backed by groups from several European countries, such as Big Brother Watch UK, Algorithm Watch and the European Digital Society, was coordinated by the European Digital Rights network (EDRi) in the form of an open letter addressed to the European Commissioner for Justice, Didier Reynders. It comes a few weeks before the Commission publishes, on April 21, its long-awaited new rules on the ethical use of artificial intelligence on the continent.
The letter from these NGOs urges Brussels to enshrine stronger protection of fundamental human rights in future legislation, in particular with regard to facial recognition and other biometric technologies when these tools are used in public spaces to carry out mass surveillance.
According to this coalition of NGOs, there is no example in which the use of facial recognition for mass surveillance justifies the harm it can cause to human rights, such as the rights to privacy, data protection, non-discrimination and free expression.
Proponents of these technologies often claim they are a reasonable tool to deploy in certain circumstances, for example to monitor the public for law enforcement purposes. The signatories of the letter argue, on the contrary, that a blanket ban should be imposed on all potential use cases.
“Wherever biometric technology involves mass surveillance, we demand a ban on all uses and applications without exception,” explains Ella Jakubowska, policy and campaign manager at EDRi. “We believe that any use that indiscriminately or arbitrarily targets people in a public space will always and undoubtedly violate fundamental rights. It will never reach the threshold of necessity and proportionality.”
Surveillance at your doorstep
China offers a stark example: there, the regime uses facial recognition to conduct mass surveillance of the Muslim Uyghur population living in Xinjiang, through gate-like scanning systems that record biometric characteristics, as well as smartphone fingerprinting to track residents' movements.
Closer to home, recent research coordinated by EDRi found examples of controversial deployments of biometric technologies for mass surveillance in a large majority of EU countries.
These examples range from the use of facial recognition for queue management at airports in Rome and Brussels to the German authorities' use of the technology to monitor G20 protesters in Hamburg. The European Commission has also awarded a grant of 4.5 million euros ($5.3 million) for the deployment of a technology dubbed iBorderCtrl at certain European border crossings, which analyzes travelers' gestures to detect those who may be lying while attempting to enter an EU country illegally.
The EU takes up the case
In recent months, however, some senior European Union officials have come out in favor of legislation that would limit the reach of facial recognition technologies. In a white paper published last year, Brussels even said it was considering banning the technology altogether.
EU Digital Vice-President Margrethe Vestager has added that using facial recognition tools to automatically identify citizens is at odds with the Union's data protection regime, as it fails one of the GDPR's main requirements: obtaining a person's consent before processing their biometric data.
According to EDRi, this will not be enough to prevent the technology from interfering with human rights. The GDPR leaves room for exemptions in cases of "strict necessity", which, coupled with misapplication of the consent rule, has led to facial recognition being used to the detriment of EU citizens, as the examples uncovered by EDRi show.
NGOs say Brussels must do more
“We have evidence that the existing legal framework is poorly applied and suffers from implementation problems. So although the Commissioners seem to agree that, in principle, these technologies should be prohibited by the GDPR, this prohibition does not exist in reality,” explains Ella Jakubowska. “This is why we want the Commission to publish a more specific and clearer ban, one that builds on the existing prohibitions in general data protection law.”
EDRi and the 51 organizations that signed the open letter join a chorus of activist voices that have demanded similar action in recent years. More than 43,500 European citizens have already signed a "Reclaim Your Face" petition calling for a ban on biometric mass surveillance practices in the EU. A few months ago, the Council of Europe also called for a ban on certain applications of facial recognition when they are likely to lead to discrimination.
Pressure is therefore mounting on the European Commission as the publication of its new AI rules approaches, rules that should determine the EU's place and relevance in what is often described as a race against the Chinese and American superpowers.
An opportunity to be seized
For Ella Jakubowska, however, this is an opportunity to be seized. “These technologies are not inevitable,” she says. “We are at an important tipping point where we could in fact prevent a lot of future harm and authoritarian technological practices before they go any further. We don't have to wait for huge and disruptive impacts on people's lives to stop them. This is an incredible opportunity for civil society to intervene, at a time when we can still make a difference.”
As part of the open letter published on Tuesday, EDRi also urged the Commission to carefully consider other potentially dangerous applications of AI, and to draw additional red lines where necessary.
Among the use cases flagged as problematic, the signatories pointed to technologies that could impede access to health care, social security or justice; systems that make predictions about citizens' behavior and thoughts; and algorithms capable of manipulating individuals, posing a threat to human dignity, collective organization and democracy.