New study reveals alarming expansion of biometric mass surveillance in Europe

Press release published by the Greens/EFA group in the European Parliament, 25 October 2021.



Today, the Greens/EFA group in the European Parliament published a study on Current Practices of Biometric Mass Surveillance in the European Union (e.g. facial surveillance or behavioural analysis in public spaces). [1] The report reveals the alarming expansion of biometric mass surveillance in the EU, and how these technologies have already been put to use in cities and entire countries. It presents evaluations and warnings from data protection authorities across the Union, and shows how these have been ignored or contested by government agencies. An interactive map visualises just how widespread the testing or deployment of such technologies already is. [2]

During the study presentation earlier today, report author Francesco Ragazzi and Greens/EFA MEPs Patrick Breyer and Saskia Bricmont urged media correspondents and civil society to pay close attention to the ongoing development of overreaching, privacy-infringing and often pseudo-scientific technologies. Error-prone facial surveillance, facial recognition and behavioural analysis technology seriously threatens fundamental rights and liberties.

Patrick Breyer, MEP for the German Pirate Party and Greens/EFA coordinator of the group’s campaign for a ban on biometric mass surveillance, comments:

“This report is further proof that George Orwell’s dystopian look into the future could soon be outlived by reality. Not only does the report reveal the testing and roll-out of biometric mass surveillance technology in the EU, but it also shows the inefficacy of many of these technologies in fighting crime, with projects failing to reach their goals, experiencing high false-positive rates, wrongfully incriminating citizens, and even misidentifying hugs as suspicious behaviour. The EU needs to stop its funding of the development of such dangerous surveillance technologies immediately.”

Saskia Bricmont, Belgian MEP for Greens/EFA who commissioned the study, underlines the need for immediate action:

“It is very concerning that experiments of facial recognition in public spaces have been carried out without legal authorisation. It shows how, in spite of public discourse against their use, these technologies are being developed below the radar. It is of utmost importance to enact a clear ban and prohibit the deployment of both indiscriminate and targeted remote facial and behavioural recognition technologies in public spaces. Transparency and accountability of these technologies have to be strengthened to allow for truly effective public oversight.”

The findings of the study will be further discussed at a webinar [3] on 25 October from 16:00 to 17:00 CET, along with an interactive map on current practices of biometric mass surveillance in the European Union. After the presentation of the study’s main findings and recommendations, experts will focus on two case studies of the use of biometric mass surveillance technologies in the EU: the Zaventem Airport experiment as well as the biometric surveillance tests at Berlin Südkreuz train station. The presentations will be followed by a Q&A session with the audience.

The European Parliament is currently debating whether the new AI Act should ban biometric mass surveillance. A recent non-binding resolution spoke out in favour of a ban, but EU governments strictly oppose this. A European Citizens’ Initiative, “Reclaim Your Face”, aims to collect one million signatures in support of a ban. [4]

[1] Link to study: http://extranet.greens-efa.eu/public/media/file/1/7297
[2] Link to interactive map: https://www.greens-efa.eu/biometricsurveillance/
[3] Registration page for study presentation event:
https://www.patrick-breyer.de/event/current-practices-of-biometric-mass-surveillance-in-the-eu-study-presentation/
[4] European Citizens’ Initiative ‘Reclaim Your Face’:
https://reclaimyourface.eu/
