UK: Unmasking Facial Recognition: An exploration of the racial bias implications of facial recognition surveillance in the United Kingdom

A new report by WebRoots Democracy, a think tank focused on progressive and inclusive technology policy, examines the implications of police use of facial recognition technology for people of colour and Muslims, two social groups who are heavily monitored by the state.

The text below is taken from the full report, which is available here: Unmasking Facial Recognition (WebRoots Democracy, link)

Executive summary

The increased use of live facial recognition technology (also referred to as ‘automated facial recognition’ or simply as ‘facial recognition’) by the police in the United Kingdom has become the subject of heated discussion in recent years. The controversy has centred on two issues. The first is that the use of this technology (embedded into cameras) erodes the privacy of members of the public and is akin to fingerprinting, as each passing face is intricately analysed. During this analysis, templates of faces are created based on data points such as the distance between the eyes or the length of a nose. The second point of controversy is the ‘racial bias’ challenge of these systems. To date, this conversation has focused on the reported inability of the technology to accurately analyse the faces of people of colour, which, in a policing context, could lead to innocent people of colour being flagged as suspected criminals.
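
To make the ‘template’ idea above concrete, the sketch below shows, in simplified Python, how measurements such as the distance between the eyes and the length of the nose could be reduced to a small numeric template and compared against a watchlist entry. The landmark coordinates, field names and ratios are illustrative assumptions only, not the method of any deployed police system; real systems extract landmarks automatically and use far richer models.

    # Illustrative sketch: building a facial 'template' from landmark geometry,
    # as described above (eye distance, nose length). All coordinates are
    # hypothetical; a real system would obtain landmarks from a detector
    # (e.g. the open-source dlib/face_recognition libraries).
    import math

    def dist(a, b):
        """Euclidean distance between two (x, y) landmark points."""
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def make_template(lm):
        """Reduce raw landmarks to a small vector of ratios.

        Ratios, rather than raw pixel distances, make the template less
        sensitive to how far the face is from the camera.
        """
        eye_distance = dist(lm["left_eye"], lm["right_eye"])
        nose_length = dist(lm["nose_top"], lm["nose_tip"])
        face_width = dist(lm["left_jaw"], lm["right_jaw"])
        return (eye_distance / face_width, nose_length / face_width)

    def match_score(template_a, template_b):
        """Lower scores mean the two templates are more alike."""
        return math.dist(template_a, template_b)

    # Hypothetical landmarks for a passing face and a watchlist face.
    probe = {"left_eye": (120, 90), "right_eye": (180, 92),
             "nose_top": (150, 95), "nose_tip": (152, 140),
             "left_jaw": (95, 110), "right_jaw": (205, 112)}
    watchlist = {"left_eye": (118, 88), "right_eye": (182, 90),
                 "nose_top": (149, 93), "nose_tip": (151, 142),
                 "left_jaw": (93, 108), "right_jaw": (207, 110)}

    score = match_score(make_template(probe), make_template(watchlist))
    print(f"match score: {score:.4f}")  # a deployment compares this to a threshold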

This report, Unmasking Facial Recognition, has sought to look beyond this question of accuracy and to situate live facial recognition technology (LFR) within the broader context of racialised surveillance. It is focused on the potential implications of the police’s use of the technology for people of colour and Muslims in the UK, two groups who have historically been subjected to over-policing.

In addition to desk-based research, we conducted a policy workshop, an expert roundtable, a public seminar, and interviews with individuals working on surveillance and racialised policing. We submitted freedom of information requests to the Metropolitan Police and South Wales Police in order to obtain copies of their equality impact assessments for facial recognition deployments. Finally, to better understand the human bias within these systems, we undertook a test of a publicly available facial recognition system using the faces of 300 UK Members of Parliament, including all 64 Black, Asian and Minority Ethnic (BAME) MPs.
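
The report describes its MPs test in prose rather than code, but a rough sketch of how such a test could be run is given below, using the open-source face_recognition library purely as a stand-in for whichever publicly available system was actually tested. The file mps.csv, its columns, the tolerance value and the pass/fail bookkeeping are all assumptions made for illustration.

    # Hedged sketch of the kind of bias test described above: compare two
    # different photos of each MP, record whether the system matches them,
    # and break the results down by ethnicity. The CSV layout and the use
    # of the open-source face_recognition library are assumptions; the
    # report does not specify its test in this form.
    import csv
    from collections import defaultdict

    import face_recognition  # open-source wrapper around dlib

    def same_person_match(path_a, path_b, tolerance=0.6):
        """Return True if the system judges two photos to show one person."""
        enc_a = face_recognition.face_encodings(face_recognition.load_image_file(path_a))
        enc_b = face_recognition.face_encodings(face_recognition.load_image_file(path_b))
        if not enc_a or not enc_b:
            return None  # no face detected at all: itself a failure worth recording
        return face_recognition.compare_faces([enc_a[0]], enc_b[0], tolerance=tolerance)[0]

    results = defaultdict(lambda: {"matched": 0, "missed": 0, "undetected": 0})
    with open("mps.csv", newline="") as f:  # assumed columns: name, ethnicity, photo_a, photo_b
        for row in csv.DictReader(f):
            outcome = same_person_match(row["photo_a"], row["photo_b"])
            group = row["ethnicity"]
            if outcome is None:
                results[group]["undetected"] += 1
            elif outcome:
                results[group]["matched"] += 1
            else:
                results[group]["missed"] += 1

    # A large gap in failure rates between groups is the bias signal of interest.
    for group, counts in sorted(results.items()):
        total = sum(counts.values())
        failures = counts["missed"] + counts["undetected"]
        print(f"{group}: {failures}/{total} failed to match")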

Recommendations

This report makes one key recommendation: that authorities in the UK impose a ‘generational ban’ on the police’s use of LFR technology. In addition, we make a series of recommendations, listed below, which we believe should be put in place if police forces continue to use the technology. These recommendations are explained in the final chapter of the report (p37).

  1. A generational ban
  2. Mandatory equality impact assessments
  3. Collection and reporting of ethnicity data
  4. Publication of algorithms
  5. Regular, independent audits
  6. Diversity reporting for third-party developers
  7. Protections for religious minorities
  8. Protections for political protests
  9. A fair-trade approach
  10. A data firewall between immigration enforcement and public services

Key findings

  • The Metropolitan Police failed to undertake an equality impact assessment prior to their trials of LFR across London.
  • It is highly likely that LFR will be used disproportionately against Muslims and communities of colour.
  • It is highly likely that the expansion of LFR will bolster calls for a face veil ban in the UK.
  • There is a particular risk of ‘anti-Black’ racism within the development of LFR.
  • The use of LFR, particularly at protests, is likely to induce a ‘chilling effect’ amongst political activists.
  • Concerns over the accuracy of LFR are distracting from the wider debate on racialised surveillance.
