Government Scraps Immigration “Streaming Tool” before Judicial Review

Source: ukhumanrightsblog.com

In response to a legal challenge brought by the Joint Council for the Welfare of Immigrants (JCWI), the Home Office has scrapped an algorithm used for sorting visa applications. Represented by Foxglove, a legal non-profit specialising in data privacy law, JCWI launched judicial review proceedings, arguing that the algorithmic tool was unlawful on the grounds that it was discriminatory under the Equality Act 2010 and irrational under common law.

In a letter to Foxglove dated 3rd August, sent on behalf of the Secretary of State for the Home Department (SSHD), the Government Legal Department stated that the Home Office would stop using the algorithm, known as the “streaming tool”, “pending a redesign of the process and way in which visa applications are allocated for decision making”. The Department denied that the tool was discriminatory. During the redesign, visa application decisions would be made “by reference to person-centric attributes… and nationality will not be taken into account”.

The “streaming tool” was an algorithmic system that categorised visa applications according to how much scrutiny each application needed. It assigned each application a red, amber, or green rating: red indicated that the application’s caseworker ought to apply greater scrutiny, and would have to justify approving the application to a more senior officer. Applications with a red rating were much less likely to be successful than those rated green: around 99.5% of green-rated applications were successful, compared with only 48.59% of those rated red.

The exact weighting of the numerous factors that contributed to the streaming tool’s decision making is not known, as the architecture of the algorithm was never revealed. However, in a letter to Foxglove, the SSHD confirmed that “nationality is one of the relevant factors used by the streaming tool”. Certain nationalities are identified in the Equality Act Nationality Risk Assessment (EANRA) as “suspect”. A visa application from someone whose nationality appeared in the EANRA would automatically be given a red rating. An applicant’s nationality, even if not on the EANRA “suspect” list, could still, in conjunction with other factors, contribute to the awarding of a red or amber rating.
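The public account describes the tool only at this level of detail. Purely for illustration, a minimal sketch of how such a triage rule might behave could look like the following, where every nationality name, weight, and threshold is invented rather than drawn from the actual (undisclosed) system:

```python
# Hypothetical sketch of the streaming tool's triage logic, reconstructed
# only from the behaviour described in the proceedings. The real
# architecture, factors, and weights were never disclosed; every name
# and threshold here is an assumption made for illustration.

EANRA_SUSPECT = {"NationA", "NationB"}   # placeholder EANRA "suspect" list
ELEVATED_RISK = {"NationC"}              # nationalities weighted upward

def stream(nationality: str, other_risk: float) -> str:
    """Return a 'red', 'amber', or 'green' rating for a visa application."""
    # Behaviour reported in the proceedings: an EANRA-listed nationality
    # alone forces a red rating, regardless of any other factor.
    if nationality in EANRA_SUSPECT:
        return "red"
    # A nationality not on the list could still contribute, together with
    # other factors, to a red or amber rating; a simple additive score
    # stands in here for the undisclosed weighting.
    score = other_risk + (0.2 if nationality in ELEVATED_RISK else 0.0)
    if score > 0.7:
        return "red"
    if score > 0.4:
        return "amber"
    return "green"
```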

Nationality is protected from discrimination under Section 4 of the Equality Act. However, the Act does allow for enhanced scrutiny of visa applications on the basis of nationality if prescribed by a Ministerial Authorisation issued under Schedule 3 of the Act. The use of the streaming tool was justified with reference to such a Ministerial Authorisation, under which its only authorised function was to signify the need for a “more rigorous examination” of the application.

The Ministerial Authorisation which legitimised the streaming tool’s categorisation by nationality sets out various routes by which a specific nationality can be placed on the EANRA “suspect” list, most notably a nationality being associated with a high number of “adverse events”. These include unauthorised behaviours, such as overstaying or working without permission. Crucially, adverse events also include having a visa application refused. Given that red-rated applications were refused at a higher rate than other ratings, this risked creating a vicious circle in which certain nationalities would be locked onto the EANRA “suspect” list.
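The circularity is easiest to see in a toy simulation. The sketch below is not the Home Office’s system: the refusal rates are assumptions loosely echoing the reported 99.5% and 48.59% success figures, and the listing threshold is invented. It shows only how counting refusals as adverse events can lock a nationality onto the list:

```python
import random

# Toy simulation of the adverse-event feedback loop described above.
# All numbers are assumptions: the refusal rates loosely echo the reported
# success figures (99.5% green, 48.59% red); nothing here reflects the
# real system's internals.

REFUSAL_RATE = {"green": 0.005, "red": 0.51}  # hypothetical refusal odds
LIST_THRESHOLD = 50  # invented number of adverse events that triggers listing

def simulate(years: int, applications_per_year: int = 1000) -> None:
    adverse_events = 40   # start the nationality just below the threshold
    on_list = False
    for year in range(1, years + 1):
        rating = "red" if on_list else "green"
        refusals = sum(
            random.random() < REFUSAL_RATE[rating]
            for _ in range(applications_per_year)
        )
        # Each refusal is itself recorded as an adverse event: listing
        # causes refusals, and refusals in turn sustain the listing.
        adverse_events += refusals
        on_list = adverse_events >= LIST_THRESHOLD
        print(f"year {year}: {refusals:4d} refusals, "
              f"{adverse_events:5d} adverse events, listed={on_list}")

simulate(years=5)
```

Run for a few simulated years, the nationality crosses the threshold on ordinary refusals alone and then never leaves the list, because the red rating’s higher refusal rate generates more than enough new adverse events to sustain the listing.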

Foxglove argued that the use of the streaming tool was discriminatory and irrational. The streaming tool’s only authorised function was to classify applications according to the caseworker scrutiny they required, not to contribute to decision making. Foxglove held that the ratings nonetheless materially contributed to the decision-making process. It suggested that rating an application red would create confirmation bias, leading caseworkers to weigh evidence against the application more heavily than evidence in its favour. This, it suggested, was evidenced by the difference in success rates between red-rated and green-rated applications. Foxglove also cited a 2017 report from the Independent Chief Inspector of Borders and Immigration which stated that the streaming tool had become a “de facto decision-making tool”.

Both the confirmation bias and the report make clear how the streaming tool was used beyond its authorised bounds. As nationality was a significant factor in the streaming tool’s weighting (in many cases, the significant factor), its use was argued to be unlawful under section 4 of the Equality Act.

The vicious circle present in the streaming tool “produce[d] substantive results which [were] irrational”. Because visa application refusals counted as adverse events, and those same adverse events fed back into the algorithm’s decision making, certain nationalities were locked onto the EANRA “suspect” list. This further increased the number of adverse events associated with those nationalities, in turn entrenching their position on the EANRA list. As such, the algorithm would class applications as high risk merely because it had done so in the past. Foxglove argued that this constituted irrationality.

The function of the streaming tool highlights a wider debate surrounding the use of self-reinforcing algorithms and AI in government. Algorithms that feed their own results back into their decision making, like the streaming tool, or into their training, like systems relying on reinforcement learning, often end up shaping their own learning environments and entrenching biases. This risks manifesting in discriminatory ways.

While the streaming tool was shelved before a judicial review could be conducted, the Foxglove/JCWI case could prove an important reference point as more public services come to rely on algorithms. Foxglove also argued that the government had failed to undertake the Data Protection Impact Assessment required for the use of the streaming tool. The Home Office has committed to a swift redesign, intending to complete it by 30th October 2020 at the latest.
