UK: Liberty calls for ban on 'predictive policing'

"Police forces in the UK should abandon their tests of computer programs to predict where crimes are likely to happen and whether individuals are likely to re-offend, human rights pressure group Liberty says today. According to the group, at least 14 forces in the UK are testing or in the process of developing ‘predictive policing’ systems based on machine-learning algorithms."

See: Liberty calls for ban on 'predictive policing' (Law Society Gazette, link):

"A highly critical report, Policing by Machine, says that such systems can entrench bias, by making decisions based on historical 'big data' about crimes. The predictions may be based on 'black box' algorithms, which are impossible to scrutinise. Although police forces generally require human oversight over such programs, in practice officers are likely to defer to the algorithm, the report warns.

'A police officer may be hesitant to overrule an algorithm which indicates that someone is high risk, just in case that person goes on to commit a crime and responsibility for this falls to them – they simply fear getting it wrong. It is incredibly difficult to design a process of human reasoning that can meaningfully run alongside a deeply complex mathematical process,' the report states."
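
The feedback loop the report describes can be made concrete in code. The following is a minimal, hypothetical sketch (not drawn from the report, with invented numbers and a toy two-area model): patrols are allocated in proportion to previously recorded crime, so an area that starts with more records attracts more patrols and accumulates still more records, even though underlying offending is identical in both areas.

```python
# Hypothetical illustration of the "entrenched bias" feedback loop:
# areas with more *recorded* crime attract more patrols, which in turn
# produce more records there, regardless of true offending rates.
# All figures are invented for illustration, not taken from the report.

import random

random.seed(42)

TRUE_CRIME_RATE = [0.10, 0.10]  # two areas with identical underlying offending
recorded = [30, 10]             # but area 0 starts with more *recorded* crime

for step in range(20):
    # "Predictive mapping": allocate 100 patrols in proportion to past records.
    total = sum(recorded)
    patrols = [100 * r / total for r in recorded]

    # Recorded crime grows with the amount of policing an area receives,
    # not only with actual offending.
    for area in range(2):
        detections = sum(
            1
            for _ in range(int(patrols[area]))
            if random.random() < TRUE_CRIME_RATE[area]
        )
        recorded[area] += detections

share = recorded[0] / sum(recorded)
print(f"Area 0 ends with {share:.0%} of recorded crime, "
      f"despite identical true offending rates.")
```

In this toy setup the initially over-recorded area keeps roughly three-quarters of all recorded crime indefinitely: the historical disparity is never corrected, because the data the algorithm learns from is itself a product of where police were sent.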

The report: Policing by machine: Predictive policing and the threat to our rights (link to pdf)

Liberty's summary: Liberty report exposes police forces’ use of discriminatory data to predict crime (link), which includes:

"Recommendations

The report makes a number of recommendations, including:

  • Police forces in the UK must end their use of predictive mapping programs and individual risk assessment programs.
  • At the very least, police forces in the UK should fully disclose information about their use of predictive policing programs. Where decision-making is informed by predictive policing programs or algorithms, this information needs to be communicated to those directly impacted by their use, and the public at large, in a transparent and accessible way.
  • Investment in digital solutions for policing should focus on developing programs that actively reduce biased approaches to policing. A human rights impact assessment should be developed in relation to new digital solutions, which should be rights-respecting by default and design."

Further reading

USA: Problems with predictive policing (Statewatch News Online, August 2016)

Predictive policing in London: commercial interests trump accountability (Statewatch News Online, August 2014)

Predictive policing: mapping the future of policing? (OpenDemocracy, link)

"Predictive policing" comes to the UK (Statewatch News Online, March 2013)

Our work is only possible with your support.
Become a Friend of Statewatch from as little as £1/€1 per month.