03 September 2020
A new report by WebRoots Democracy, a think tank focused on progressive and inclusive technology policy, looks at the implications of the police use of facial recognition technology for people of colour and Muslims - two social groups who are heavily monitored by the state.
The text below is taken from the full report, Unmasking Facial Recognition (WebRoots Democracy).
The increased use of live facial recognition technology (also referred to as ‘automated facial recognition’ or simply as ‘facial recognition’) by the police in the United Kingdom has become the subject of heated discussion in recent years. The controversy has centred on two issues. The first is that the use of this technology (embedded into cameras) erodes the privacy of members of the public and is akin to fingerprinting, as each passing face is intricately analysed. During this analysis, templates of faces are created based on data points such as the distance between the eyes or the length of a nose. The second point of controversy is the ‘racial bias’ challenge of these systems. To date, this conversation has focused on the reported inability of the technology to accurately analyse the faces of people of colour which, in a policing context, could lead to innocent people of colour being flagged as suspected criminals.
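To make the template idea concrete, the sketch below shows, in simplified form, how a face might be reduced to a vector of landmark distances and two such templates compared against a threshold. This is an illustration only: the landmark names and threshold are assumptions for demonstration, and real LFR systems use proprietary deep-learning models rather than a handful of hand-picked measurements.

```python
import math

# Illustrative sketch only. Landmark names and the threshold value are
# assumptions; deployed systems compute far richer learned representations.

def make_template(landmarks):
    """Reduce facial landmark coordinates to a vector of pairwise distances."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return [
        dist(landmarks["left_eye"], landmarks["right_eye"]),   # eye spacing
        dist(landmarks["nose_top"], landmarks["nose_tip"]),    # nose length
        dist(landmarks["mouth_left"], landmarks["mouth_right"]),
    ]

def match(template_a, template_b, threshold=5.0):
    """Flag a 'match' when two templates are closer than the threshold."""
    gap = math.sqrt(sum((x - y) ** 2 for x, y in zip(template_a, template_b)))
    return gap < threshold
```

The threshold is where the accuracy debate bites: set it loosely and more innocent passers-by are flagged; set it tightly and genuine matches are missed, and error rates can differ across demographic groups.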
This report, Unmasking Facial Recognition, has sought to look beyond this question of accuracy and to situate live facial recognition technology (LFR) within the broader context of racialised surveillance. It is focused on the potential implications of the police’s use of the technology for people of colour and Muslims in the UK, two groups who have historically been subjected to over-policing.
In addition to desk-based research, we conducted a policy workshop, an expert roundtable, a public seminar, and interviews with individuals working on surveillance and racialised policing. We submitted freedom of information requests to the Metropolitan Police and South Wales Police in order to obtain copies of their equality impact assessments for facial recognition deployments. Finally, to better understand the human bias within these systems, we undertook a test of a publicly available facial recognition system using the faces of 300 UK Members of Parliament, including all 64 Black, Asian and Minority Ethnic (BAME) MPs.
This report makes one key recommendation: that authorities in the UK impose a ‘generational ban’ on the police’s use of LFR technology. In addition, we make a series of recommendations which we believe should be put in place if police forces continue to use the technology. These recommendations are explained in the final chapter of the report (p37).
Statewatch does not have a corporate view, nor does it seek to create one; the views expressed are those of the author. Statewatch is not responsible for the content of external websites and inclusion of a link does not constitute an endorsement. Registered UK charity number: 1154784. Registered UK company number: 08480724. Registered company name: The Libertarian Research & Education Trust. Registered office: 10 Queen Street Place, London EC4R 1BE. © Statewatch ISSN 1756-851X. Personal usage as private individuals ("fair dealing") is allowed. We also welcome links to material on our site. Usage by those working for organisations is allowed only if the organisation holds an appropriate licence from the relevant reprographic rights organisation (e.g. the Copyright Licensing Agency in the UK), with such usage being subject to the terms and conditions of that licence and to local copyright law.