“The Home Office’s own independent review of the Windrush scandal found that it was oblivious to the racist assumptions and systems it operates. This streaming tool took decades of institutionally racist practices, such as targeting particular nationalities for immigration raids, and turned them into software. The immigration system needs to be rebuilt from the ground up to monitor for such bias and to root it out.”

Chai Patel, Legal Policy Director of JCWI

Today’s win represents the UK’s first successful court challenge to an algorithmic decision system. We had asked the Court to declare the streaming algorithm unlawful, and to order a halt to its use to assess visa applications, pending a review. The Home Office’s decision effectively concedes the claim.

“We’re delighted the Home Office has seen sense and scrapped the streaming tool. Racist feedback loops meant that what should have been a fair migration process was, in practice, just ‘speedy boarding for white people.’ What we need is democracy, not government by algorithm. Before any further systems get rolled out, let’s ask experts and the public whether automation is appropriate at all, and how historic biases can be spotted and dug out at the roots.”

Cori Crider, founder and Director of Foxglove

What does the algorithm do?

Since 2015, a Home Office algorithm has used a traffic-light system to grade every entry visa application to the UK. The tool, which the Home Office described as a digital “streaming tool,” assigns a Red, Amber or Green risk rating to applicants. Once assigned by the algorithm, this rating plays a major role in determining the outcome of the visa application.
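The Home Office never disclosed the tool’s internals, so the following sketch is purely hypothetical: the scoring factors, the thresholds and the stream() function are all invented for illustration. It shows only the general shape of a Red/Amber/Green triage step in which nationality feeds directly into the risk score, as described below.

```python
# Purely hypothetical sketch of a traffic-light "streaming" step.
# The real tool's factors and cut-offs were never disclosed; the
# scoring inputs and thresholds below are invented for illustration.

RED, AMBER, GREEN = "Red", "Amber", "Green"

def stream(application: dict, suspect_nationalities: set) -> str:
    """Assign a Red/Amber/Green rating to a visa application."""
    score = 0
    # Nationality alone raised the risk score, by design (see below).
    if application["nationality"] in suspect_nationalities:
        score += 2
    # ...other, undisclosed factors would be scored here...
    if score >= 2:
        return RED    # intensive scrutiny, far more likely to be refused
    if score >= 1:
        return AMBER
    return GREEN      # light-touch review
```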

The visa algorithm discriminated on the basis of nationality – by design. Applications made by people holding ‘suspect’ nationalities received a higher risk score. Their applications received intensive scrutiny from Home Office officials, were approached with more scepticism, took longer to determine, and were much more likely to be refused. We argued this was racial discrimination and breached the Equality Act 2010.

Entrenched bias and racism in the visa system break hearts and tear families apart, like the four siblings from Nigeria unable to travel to the UK for their sister’s wedding, or the countless skilled professionals refused visas and so unable to contribute to conferences and events in the UK just because they don’t come from a rich white country – including scores of African academics and artists denied entry for no good reason.

The streaming tool was opaque. Aside from admitting the existence of a secret list of suspect nationalities, the Home Office refused to provide meaningful information about the algorithm. It remains unclear what other factors were used to grade applications.

The algorithm suffered from a feedback loop – a vicious circle in which biased enforcement and visa statistics reinforced which countries stayed on the list of suspect nationalities. In short, applicants from suspect nationalities were more likely to have their visa applications rejected, and those rejections in turn informed which nationalities appeared on the list of ‘suspect’ nations. This loop, combined with pre-existing bias in Home Office enforcement (in which some nationalities are targeted for enforcement because they are believed to be easier to remove), amplified bias in the Home Office’s visa process. Such feedback loops are a well-documented problem with automated decision systems.
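A toy simulation can make the loop concrete. Everything here is assumed – the refusal rates, the listing threshold, and the two placeholder nationalities – but it shows how refusal statistics produced under unequal scrutiny, when fed back into the suspect list, keep the initially targeted group flagged indefinitely.

```python
# Hypothetical simulation of the feedback loop described above:
# "suspect" nationalities face extra scrutiny, so more of their
# applications are refused, and those refusal statistics are then used
# to decide who stays on the suspect list. All rates are invented.

import random

random.seed(0)

BASE_REFUSAL = 0.10      # refusal rate under neutral scrutiny (assumed)
SCRUTINY_PENALTY = 0.15  # extra refusals caused by intensive scrutiny (assumed)
LIST_THRESHOLD = 0.15    # refusal rate above which a nationality is listed

suspect = {"A"}          # nationality A starts on the list, e.g. via
                         # historically biased enforcement targeting

for year in range(5):
    refusal_rates = {}
    for nationality in ("A", "B"):
        p = BASE_REFUSAL + (SCRUTINY_PENALTY if nationality in suspect else 0)
        decisions = [random.random() < p for _ in range(1000)]
        refusal_rates[nationality] = sum(decisions) / len(decisions)
    # The list is refreshed from the very statistics the list distorted.
    suspect = {n for n, rate in refusal_rates.items() if rate > LIST_THRESHOLD}
    print(year, refusal_rates, sorted(suspect))

# Nationality A's inflated refusal rate keeps it on the list every year,
# while B never crosses the threshold: the loop locks in the initial bias.
```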

The Home Secretary is putting in place an interim process for visa applications, and has agreed to the essential legal protections we demanded – an Equality Impact Assessment and a Data Protection Impact Assessment for the new system.

source: JCWI