UK: Threat of legal challenge forces Home Office to abandon "racist visa algorithm"

The UK Home Office has said that it will get rid of the "streaming algorithm" used to classify visa applications and will launch a review of the system, following an application for judicial review brought by the civil society organisations Foxglove and the Joint Council for the Welfare of Immigrants (JCWI).

Little was known about how the algorithm worked, beyond the fact that it was used to scan all visa applications and then sort them:

"...into a fast lane (Green), a slow lane (Yellow), or a full digital pat-down (Red)... As far as we can tell, the algorithm is using problematic and biased criteria, like nationality, to choose which “stream” you get in. People from rich white countries get “Speedy Boarding”; poorer people of colour get pushed to the back of the queue."

Foxglove and JCWI had requested that the courts hand down "an order prohibiting the continued use of the Streaming Tool to assess visa applications, pending a substantive review of its operation."

However, the Home Office responded to the two groups on 3 August to say that it "will discontinue the use of the Streaming Tool pending a redesign of the process and the way in which visa applications are allocated for decision-making."

The Home Office's letter says it "will be undertaking Equality Impact Assessments and Data Protection Impact Assessments," and that "the intent is that the redesign will be completed as quickly as possible and at latest by 30 October 2020."

Chai Patel, Legal Policy Director for JCWI said:

“The Home Office’s own independent review of the Windrush scandal found that it was oblivious to the racist assumptions and systems it operates. This streaming tool took decades of institutionally racist practices, such as targeting particular nationalities for immigration raids, and turned them into software. The immigration system needs to be rebuilt from the ground up to monitor for such bias and to root it out.”

Foxglove said (emphasis in original):

"This marks the end of a computer system which had been used for years to process every visa application to the UK. It's great news, because the algorithm entrenched racism and bias into the visa system. The Home Office kept a secret list of suspect nationalities automatically given a ‘Red’ traffic-light risk score – people of these nationalities were likely to be denied a visa. It had got so bad that academic and nonprofit organisations told us they no longer even tried to have colleagues from certain countries visit the UK to work with them.

We also discovered that the algorithm suffered from ‘feedback loop’ problems known to plague many such automated systems - where past bias and discrimination, fed into a computer program, reinforce future bias and discrimination. Researchers documented this issue with predictive policing systems in the US, and we realised the same problem had crept in here.

It's also great news because this was the first successful judicial review of a UK government algorithmic decision-making system. More and more government departments are talking up the potential for using machine learning and artificial intelligence to aid decisions. Make no mistake: this is where government is heading, from your local council right on up to Number 10."
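The "feedback loop" Foxglove describes can be illustrated with a toy simulation (all groups, thresholds and rates are invented): when a risk score is derived from past refusal rates, each refusal it triggers feeds back into the rate, pushing already-disadvantaged groups further up the scale.

```python
# Toy simulation of the feedback loop described above. A risk score is just
# the historical refusal rate; refusals it causes are fed back into that
# history. All figures are invented for illustration.

refusal_history = {"GroupA": 0.5, "GroupB": 0.1}  # assumed starting refusal rates

def decide(group: str) -> bool:
    """Refuse whenever the group's score crosses an arbitrary threshold."""
    return refusal_history[group] > 0.3

for _ in range(3):  # each round of decisions updates the history
    for group in refusal_history:
        refused = decide(group)
        # an exponential moving average of outcomes feeds back into the score
        refusal_history[group] = 0.9 * refusal_history[group] + 0.1 * (1.0 if refused else 0.0)

# GroupA's recorded refusal rate drifts upwards; GroupB's drifts downwards,
# even though nothing about the applicants themselves has changed.
print(refusal_history)
```

The divergence is driven entirely by the initial disparity, which is the structural problem researchers identified in predictive policing systems.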

It is not only the UK authorities that are seeking to take advantage of new technologies to 'streamline' immigration proceedings.

The use of algorithms is due to become a central part of the decision-making process for EU short-stay Schengen visas, as documented in our recent report Automated suspicion: The EU's new travel surveillance initiatives.

New "screening rules" and "risk indicators" will be used to assess all visa applicants, raising many of the same risks as those set out by Foxglove and JCWI in their case against the Home Office's "streaming tool".

Further reading
