Liberty calls for ban on 'predictive policing'
"Police forces in the UK should abandon their tests of computer programs to predict where crimes are likely to happen and whether individuals are likely to re-offend, human rights pressure group Liberty says today. According to the group, at least 14 forces in the UK are testing or in the process of developing predictive policing systems based on machine-learning algorithms.
A highly critical report, Policing by Machine, says that such systems can entrench bias, by making decisions based on historical 'big data' about crimes. The predictions may be based on 'black box' algorithms, which are impossible to scrutinise. Although police forces generally require human oversight over such programs, in practice officers are likely to defer to the algorithm, the report warns.
'A police officer may be hesitant to overrule an algorithm which indicates that someone is high risk, just in case that person goes on to commit a crime and responsibility for this falls to them; they simply fear getting it wrong. It is incredibly difficult to design a process of human reasoning that can meaningfully run alongside a deeply complex mathematical process,' the report states."
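The feedback loop the report warns about can be illustrated with a minimal simulation. This sketch is not taken from the report or from any real policing system; it simply assumes two areas with an identical underlying crime rate, where one area starts with more recorded crime because it was historically patrolled more heavily, and patrols are then directed wherever the records are highest:

```python
# Illustrative sketch only: how predictions trained on historical records
# can entrench bias. Both areas have the SAME true crime rate; the only
# difference is the historical (biased) record count.
import random

random.seed(0)

TRUE_CRIME_RATE = 0.1          # identical in both areas
recorded = [50, 10]            # historical records: area 0 was over-policed

for _ in range(1000):
    # The "predictive" rule: patrol the area with the most recorded crime
    target = 0 if recorded[0] >= recorded[1] else 1
    # Crime is only recorded where officers are actually present
    if random.random() < TRUE_CRIME_RATE:
        recorded[target] += 1

print(recorded)  # area 0 accumulates all the new records; area 1 stays flat
```

Because the rule always sends patrols to the area with the larger record count, area 1 never generates new data, so the initial disparity is frozen in and amplified, even though the two areas are identical. This is the mechanism behind the report's claim that such systems "entrench bias".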
See: Liberty calls for ban on 'predictive policing' (Law Society Gazette, link)
The report: Policing by machine: Predictive policing and the threat to our rights (link to pdf)
Liberty's summary: Liberty report exposes police forces' use of discriminatory data to predict crime (link)
The report makes a number of recommendations, including:
- Police forces in the UK must end their use of predictive mapping programs and individual risk assessment programs.
- At the very least, police forces in the UK should fully disclose information about their use of predictive policing programs. Where decision-making is informed by predictive policing programs or algorithms, this information needs to be communicated to those directly impacted by their use, and the public at large, in a transparent and accessible way.
- Investment in digital solutions for policing should focus on developing programs that actively reduce biased approaches to policing. A human rights impact assessment should be developed in relation to new digital solutions, which should be rights-respecting by default and design.
USA: Problems with predictive policing (Statewatch News Online, August 2016)
Predictive policing in London: commercial interests trump accountability (Statewatch News Online, August 2014)
Predictive policing: mapping the future of policing? (OpenDemocracy, link)
"Predictive policing" comes to the UK (Statewatch News Online, March 2013)
© Statewatch ISSN 1756-851X. Personal usage as private individuals/"fair dealing" is allowed. We also welcome links to material on our site. Usage by those working for organisations is allowed only if the organisation holds an appropriate licence from the relevant reprographic rights organisation (eg: Copyright Licensing Agency in the UK) with such usage being subject to the terms and conditions of that licence and to local copyright law.