16 February 2026
On 12 January, Statewatch responded to the Home Office’s consultation on developing a new legal framework for governing the use of biometrics, facial recognition and similar technologies.
Image: CPOA, CC BY-ND 2.0
We welcome the opportunity to provide input to the Home Office’s Consultation on a new legal framework for law enforcement use of biometrics, facial recognition and similar technologies.
Despite its proliferation across law enforcement and the private sector, the use of facial recognition technologies remains largely unregulated due to the absence of a legal framework governing its use. Statewatch’s position is that the use of facial recognition and similar technologies should be prohibited in most cases and otherwise restricted to only very specific circumstances.
Where such technologies are in use, they must be subject to a strict legal framework that allows for proper statutory oversight. Any legal framework must recognise the far-reaching and inherently intrusive effects of facial recognition technology and be adapted accordingly. It is clear that existing regulations on biometrics and DNA are not fit for purpose and do not provide adequate safeguards for the use of facial recognition technology.
Live facial recognition (LFR)
The use of live facial recognition technology effectively inverts the long-held principle of innocent until proven guilty. Through the use of LFR, members of the public are unwittingly subjected to biometric identity checks, with their images cross-checked against custody images held in the Police National Database (PND) to identify potential suspects. The deployment of LFR is problematic on several levels:
The cumulative effect of the use of such technology equates to mass surveillance, through which individuals are automatically and unwittingly subject to suspicion. It is a flagrant assault on individual civil liberties and constitutes a disproportionate interference with basic human rights.[1] The use of LFR is fundamentally incompatible with the protection of human rights and, as has been shown, is open to abuse and very often leads to unjust outcomes. The challenges posed by LFR make it clear that its use cannot be effectively regulated and must, therefore, be prohibited.
Retrospective facial recognition technology (RFR)
The use of retrospective facial recognition technology is equally problematic, with a concerning lack of transparency[2] having emerged with respect to its use by police. Of particular concern is the use of footage obtained by police at demonstrations and protests, which can be held for anywhere between 31 days and 50 years.[3] Such practices and policies pose an explicit threat to the right to peaceful assembly and association, and to freedom of expression.
Other uses of biometric technology
The use of other biometric technologies, in particular inferential technologies, must also be explicitly prohibited. The margin of error and the invasion of individual privacy are too great to warrant any kind of exception.
In line with many civil rights activists, racial justice and equality groups and technology experts, Statewatch calls for an immediate stop to the use of biometric technologies for law enforcement purposes and urges the British government to reverse its expansion of and reliance on this technology. The far-reaching impact of such technology ultimately means that no legal framework, irrespective of how comprehensively it is conceived, can ever really be adequate to ensure sufficient safeguards and protect against its abuse.
If a decision is taken to expand the use of biometric technologies for law enforcement, we would urge government to ensure that stringent safeguards and limitations on use are provided for in any upcoming legislation. Whilst the EU’s Artificial Intelligence Act may appear tempting as a reference point, Statewatch cautions against modelling a UK legal framework on the provisions of EU legislation that ultimately fail to offer adequate protections for human rights and civil liberties. The law’s supposed safeguards are at best vague and at worst circumventable in a myriad of situations.[4]
At the very least, the rollout of facial recognition technology by the Home Office should be halted until thorough and meaningful data protection and privacy audits of police forces have been carried out. There should not be an automatic presumption that individual police forces can provide or uphold the necessary safeguards and protections. Official reviews have found, and senior officials have admitted, that police forces in England and Wales are in some cases institutionally sexist, racist and homophobic. These findings call for an urgent reassessment of the way policing works, including in relation to the protection of privacy and personal data. Rather than providing the police with new powers and access to new, invasive technologies, the government should be taking meaningful action to ensure that these deep-rooted issues are meaningfully addressed.
The use of facial recognition and similar technologies is, in and of itself, a disproportionate interference with an individual’s rights. Any justification for its use must, therefore, be set at an extremely high threshold. Broad prohibitions must apply to any use of LFR, be it in the private or public sector. Any exceptions or derogations from the prohibitions must equally be applied to all users.
If use is allowed in exceptional cases, it can only be for law enforcement purposes and must be subject to strict regulatory procedures by an independent oversight body. The oversight body must have the power to review, approve or refuse any request for the use of LFR technology as well as to investigate failures of law enforcement or other entities to comply with relevant procedures or uphold legal safeguards.
That body should be required to maintain statistics on the number of requests received from law enforcement bodies, the number and type of decisions made, and details of any investigations carried out. Those numbers and other information on the oversight body’s work should be made public in an annual report, with a presumption given to the maximum disclosure of information on what is a matter of substantial public interest and of great importance for civil liberties. Data obtained through biometric technology is highly sensitive and, if acquired, must be adequately protected. In this regard, we consider it highly unfortunate that recent changes to data protection law (through the Data Use and Access Act) have weakened previously existing safeguards on law enforcement use of data.[5] If there is to be a legal framework to regulate law enforcement use of biometric technologies, those changes should be reversed. This is particularly so with regard to the removal of the requirement to maintain logs, the easing of automated decision-making and the introduction of a new national security exemption.
More generally, it is imperative that the ICO be provided with sufficient resources to deal with complaints regarding the use of biometric technologies for law enforcement quickly and effectively.
Notes
[1] https://www.equalityhumanrights.com/met-polices-use-facial-recognition-tech-must-comply-human-rights-law-says-regulator
[2] https://libertyinvestigates.org.uk/articles/hundreds-of-thousands-of-innocent-people-on-police-databases-as-forces-expand-use-of-facial-recognition-tech/
[3] https://www.statewatch.org/news/2025/october/uk-police-footage-of-protests-can-be-held-for-decades/
[5] Changes to UK law will undermine data protection standards, posing risks to individual rights and leading to calls for the EU to review the "adequacy decisions" that deem the UK a safe destination for transfers of personal data. A letter from seven organisations, including Statewatch, calls for the EU to urgently reassess the UK's adequacy status, "to protect fundamental rights and uphold its credibility as both the guardian of the EU’s legal order and a global leader in digital rule-making." However, the EU is also currently seeking to downgrade data protection standards, for the same purpose: economic deregulation.