France: Flaws and injustices of 'predictive policing' laid bare in new report

As part of a European initiative coordinated by Statewatch, La Quadrature du Net has published an English translation of its report on the state of predictive policing in France. In light of the information gathered, and given the dangers these systems carry when they incorporate socio-demographic data as a basis for their recommendations, La Quadrature calls for a ban.

Support our work: become a Friend of Statewatch from as little as £1/€1 per month.

The report compiles data on several predictive policing software systems formerly or currently in use within French police forces. These include:

  • RTM (Risk Terrain Modelling), a “situational prevention” software program used by the Paris Police Prefecture to target intervention zones based on “environmental” data (presence of schools, shops, metro stations, etc.);
  • PredVol, a software program developed in 2015 within the government agency Etalab and tested in Val d’Oise in 2016 to assess the risk of car thefts, before being abandoned in 2017 or 2018;
  • PAVED, a software program developed from 2017 by the Gendarmerie and trialled from 2018 in various départements to assess the risk of car thefts or burglaries. In 2019, shortly before its planned nationwide rollout, the project was “paused”;
  • M-Pulse, previously named Big Data of Public Tranquility, developed by the city of Marseille in partnership with the company Engie Solutions to assess the suitability of municipal police deployments in urban public space;
  • Smart Police, an application that includes a “predictive” module, developed by the French startup Edicia which, according to its website, has sold this software suite to over 350 municipal forces.

The report criticises these systems on a number of grounds:

  • whether out of bad faith or ideological laziness, the developers of these technologies conflate correlation with causation (or at least refuse to distinguish between the two);
  • the data used to feed the systems is likely to lead to the targeting of the most precarious populations and those most exposed to structural racism;
  • the developers and promoters of the systems rely on discredited criminological theories, and do not take into account exclusion, discrimination, and the social violence of public policies;
  • by using data from areas and communities that are already over-policed, predictive policing systems produce a self-fulfilling prophecy;
  • the technologies may underpin abuses of power, for example by being used to justify granting judicial powers to administrative police officials;
  • despite the absence of any official evaluation, the available data suggests that predictive models add no value in achieving the objectives the police set for themselves;
  • the developers and users of predictive policing systems in France may be failing to comply with data protection law.

La Quadrature du Net notes that "what these systems produce is, above all, an automation of social injustice and of police violence, and an even greater dehumanization of relations between the police and the population."

For this reason, the report calls for predictive policing technologies to be banned.

Read the full report and explore other reports in our series on 'predictive policing' in Europe.


Further reading

29 April 2025

EU’s secretive “security AI” plans need critical, democratic scrutiny, says new report

The EU is secretively paving the way for police, border and criminal justice agencies to develop and use experimental “artificial intelligence” (AI) technologies, posing risks for human rights, civil liberties, transparency and accountability, says a report published today by Statewatch.

15 April 2025

Belgium: New report calls for a ban on 'predictive' policing technologies

Following an investigation carried out over the past two years, Statewatch, the Ligue des droits humains and the Liga voor mensenrechten have jointly published a report on the development of ‘predictive’ policing in Belgium. There are inherent risks in these systems, particularly when they rely on biased databases or sociodemographic statistics. The report calls for a ban on ‘predictive’ systems in law enforcement.

08 April 2025

UK government wants to legalise automated police decision-making

A proposed law in the UK would allow police decisions to be made solely by computers, with no human input. The Data Use and Access Bill would remove a safeguard in data protection law that prohibits solely automated decision-making by law enforcement agencies. Over 30 civil liberties, human rights, and racial justice organisations and experts, including Statewatch, have written to the government to demand changes.

