Zoe Gardner interview transcript (JCWI)

Sophia Akram: So thank you so much, Zoe Gardner from JCWI, for going through JCWI’s landmark legal case with Foxglove on the Home Office’s visa streaming tool. You had a success last year, and it would be really great to hear about your experience of challenging the algorithm, so that other rights groups can learn from it.

And if you don’t mind, I’ve got a few questions to go through. So just to start off, could you briefly describe the issue we’re talking about right now? Which is, obviously, the visa streaming algorithm.

Zoe Gardner: Yeah. So, as part of its wider sort of digitisation agenda, the Home Office was using an algorithm to stream visa applications. And the algorithm was sort of showing up with red, amber or green to say whether people should have their visa automatically passed on to somebody to review it, or should be granted. And when we heard that was taking place, we understood immediately that this must have racist implications in terms of discrimination against people from certain nationalities, certain backgrounds, certain ethnicities.

And so we came up against the ordinary problem that you find across so many parts of the Home Office’s work, which is that the system they were using was completely opaque. There was no transparency about what the criteria were, what the feedback was, what the systems were, what the checks and balances were, what the safeguards were – none of that was available. So yeah, we basically took the Home Office to court saying, you have to show us how the system works if you want to prove that it isn’t causing racist discrimination.

And in sort of typical, again, Home Office defensive style, rather than reveal the algorithm and what its criteria were for marking visa applications red, amber or green, they said, “No, no, we’ll withdraw it. It certainly wasn’t racist, but we don’t want to show it to you. And we’ll stop using it.” So that was the win.

Sophia Akram: How did it even come to your attention?

Zoe Gardner: I think we were just made aware. So, the Home Office sort of plans agendas in a positive sense, like, ‘we’re moving into a 21st century system that has great digital systems’ and so on. And that raises red flags with us, because we know that the Home Office has this history of implementing huge digital processes that have poor outcomes, or that fail, or that lose people’s data or misuse people’s data. So, as soon as we read that this digitisation agenda was underway at the Home Office, we were on the alert.

And I think it was just in press reports that we heard that some journalists had come across the fact, offhand, that there was an algorithm being used to stream visa applications. And this was sort of presented as, ‘yes, of course, this is happening’, before moving on to the point of the story, which was something else – and we picked up on that and realised that there must be a problem there.

Sophia Akram: Thanks for explaining that. JCWI has obviously done great work trying to counter or challenge immigration policy in the UK. But I was wondering if this subject of data discrimination was one that your organisation typically works on? And if not, did you find any challenges in this sort of new area?

Zoe Gardner: Yeah, I think it is a challenge. It’s been a challenge for JCWI and, I think, in general for all migrants’ rights defenders, because even where we are experts on the policies that are used to produce this hostile environment for all different kinds of migrants, we’re not necessarily experts on data systems and digital systems, how they operate and what the safeguarding concerns are around them. And so it’s sort of a whole new area of expertise that we, generally speaking, don’t have. I think it’s taken some time to get to the point of understanding how those systems in and of themselves introduce new risks and new problems to a system that’s already designed to act in a hostile way towards the people that we represent.

It also introduces new areas for people to fall through the gaps, and new risks for the systems to fail. So, I think that for all of us across the sector, that’s been something that we’ve really had to get our heads around – the fact that, you know, this introduces a whole new layer of complexity to every part, because the digitisation agenda is focused on every part of Home Office systems. And I think there’s some concern that, because we don’t understand the digital systems, we’re going to lose a layer of understanding of how the policies are actually impacting people.

So we’re trying to level up in the sector, and to get some training and to understand those issues and how they’re going to play in. But I think it’s very much still an area where we’re growing and we’re learning.

Sophia Akram: Do you mind talking a little bit about how you are levelling up, and have partnerships played a key role in that? Has it been useful to partner with people like Foxglove, or other organisations?

Zoe Gardner: Yeah, definitely. Foxglove are brilliant experts in this area and in how it crosscuts a lot of different rights issues, in how these systems are set up. So again, they’ve been able to give us an insight. Partnering with them on this case was absolutely essential for us to have that insight into how the systems create a problem in and of themselves – rather than just the policies, which we understand very well – and what issues they create for migrants. So, that has been really important. Yourselves, at Open Rights Group, have provided absolutely invaluable support, especially when we were responding to the government’s consultation on this digitalisation agenda.

And we’ve also had some really fruitful partnerships with people working in other sorts of areas. So for example, we had a great conversation with a researcher who’s primarily concerned with poverty and people who are in receipt of state benefits. And he spoke to us about how the use of algorithms and automated decision making is being introduced more and more into that sphere as well, into establishing, you know, somebody’s eligibility for benefits.

And that was really interesting for us, because it’s exactly the same issue that I mentioned before – an area where people are real experts on the policy and the impacts of the policy, but where the systems and the automation that’s being brought in and laid over that policy are introducing a whole new level of complication and problems. It’s exactly the same issue facing both populations of concern. So I think, for the future, it’s an area where we should really look towards building better links, because, yeah, it’s going to have an impact in all cases. These new systems create more spaces for the most vulnerable to fall through the gaps.

Sophia Akram: Could I just ask you how the use of data or new technologies has affected the rights of migrants, from your organisation’s point of view?

Zoe Gardner: Yeah, so the use of data and technology by the Home Office is really worrying for us, for a number of reasons.

First of all, this is a department that has a really poor record of safeguarding the information and the data that it collects on its users. So we’ve seen, multiple times, data breaches, huge amounts of data lost, and data not stored adequately. I mean, the really obvious example of this was the Windrush scandal, where a lot of the people who were impacted were targeted by immigration enforcement as though they were living in the country without authorisation, when in fact they did have the papers and the right to be here – but that data had been stored very poorly, basically had been lost, by the Home Office, and that resulted in an enormous miscarriage of justice. So that’s a really dramatic example, but this is a pattern that we’ve seen again and again – the Home Office is not really a trustworthy body for this kind of really sensitive and important data.

And furthermore, when you reduce large numbers of people down to data that can be fed through an algorithm in order to make decisions – decisions that are fundamentally important to people’s life chances – you automatically lose a lot of the nuance, a lot of the protection for individual needs and particular circumstances. And what you do is you flatten people out into what the Home Office considers sort of more risky groups to give rights to, and that, unfortunately, in the world we live in, corresponds to the economic and political disadvantage that is concentrated in certain countries of the world as opposed to others. And that obviously correlates as well with race and ethnicity, and it produces a racist outcome.

Sophia Akram: Thanks so much. So you discovered the algorithm, discovered it was a problem. How did you then settle on a mechanism to bring the issue to light?

Zoe Gardner: Well, the problem with the Home Office – there are many – but one of the problems in the way that the Home Office operates is this sort of veil of secrecy, and this huge defensiveness over its systems and its processes. They don’t collect or publish good data about the impact of their policies and the impact of their actions. They don’t publicise the systems by which they make decisions, they don’t publicise the training that their decision makers undergo. So at every level, there’s just a resistance to open, transparent and accountable decision making.

So, when we realised that this algorithm was in use, it became apparent quite quickly that there wasn’t going to be any way to force the Home Office to share the criteria that the algorithm was based on without going down a legal pathway. JCWI has a long history of taking strategic legal interventions in order to protect a just, transparent and accountable immigration system, and we had a good relationship with Foxglove, who, as I say, are real experts in this area of understanding how these systems work. So the only avenue available to us in order to force a change was through challenging the use of the algorithm in the courts.

Sophia Akram: And obviously, algorithms are quite a technical tool. How did you deal with the technicalities?

Zoe Gardner: Well, it was complex. But, basically, what this algorithm did was also really quite basic – quite a basic and evidently discriminatory system. So as I say, it was a sort of red, amber, green process: you receive a visa application form, and if it comes from these countries, it’s an automatic red; if it comes from these countries, it’s an automatic amber; if it contains this information, then it moves from a red to an amber to a green.

And what we could tell that was doing was, in the first instance, obviously directly discriminating against certain types of application. But also, the algorithm wasn’t the final decision on any visa unless that visa was granted green straight away. So, certain types of application – which I’m sure we can all imagine coming from countries that are deemed low risk by the Home Office and so on – might have been given a green straight away, but if they were flagged as amber or as red, they would then be passed on to a Home Office caseworker, a decision maker, and that was how they described it, as sort of the safeguard – that person then makes a decision based on the factors of the case.

It gave no consideration to the implicit bias that would be produced in the mind of a person who was receiving that application through the visa process, flagged because there are problems here, flagged because there are risks here, and how that would impact their decision making. And they weren’t able to show that there was any training being provided that would help to override those biases and those implicit implications the system was producing. And they weren’t willing to share, either, what criteria were producing different results. But there is some evidence to see, difficult as that is because the Home Office doesn’t produce very comprehensive data.

We do have some evidence that it is significantly more likely that your visa application will be refused, whether that’s for a visitor visa or a work visa or whatever, if you come from countries in, for example, sub-Saharan Africa – actually predominantly countries that are former colonies of the UK, so there’s a sort of additional irony that goes in there. So, we could see that the outcomes were discriminatory, and that’s why we could tell that there was a problem.
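[Editor’s note: below is a minimal Python sketch of the kind of ‘traffic light’ streaming rule Gardner describes. The country lists, criteria and routing are entirely hypothetical, since the Home Office never disclosed the real ones; it only illustrates how a nationality-based red/amber/green rating could decide whether an application is fast-tracked or lands on a caseworker’s desk already flagged as risky.]

```python
# Hypothetical sketch of a red/amber/green visa "streaming" rule.
# Country lists and criteria are invented for illustration only; the real
# criteria were never published, which is what the legal challenge targeted.

from dataclasses import dataclass

AUTOMATIC_RED = {"Country A", "Country B"}     # hypothetical "high risk" nationalities
AUTOMATIC_AMBER = {"Country C", "Country D"}   # hypothetical "medium risk" nationalities


@dataclass
class VisaApplication:
    nationality: str
    has_prior_uk_visa: bool  # stand-in for "this information" that can shift a rating


def stream(application: VisaApplication) -> str:
    """Return 'red', 'amber' or 'green', starting from nationality alone."""
    if application.nationality in AUTOMATIC_RED:
        rating = "red"
    elif application.nationality in AUTOMATIC_AMBER:
        rating = "amber"
    else:
        rating = "green"

    # Certain information in the form can move an application up one band,
    # e.g. red -> amber or amber -> green.
    if application.has_prior_uk_visa:
        if rating == "red":
            rating = "amber"
        elif rating == "amber":
            rating = "green"

    return rating


def route(application: VisaApplication) -> str:
    """Green may be granted straight away; red/amber go to a caseworker
    who sees the application already flagged as risky."""
    rating = stream(application)
    if rating == "green":
        return "grant (fast-tracked)"
    return f"caseworker review, flagged {rating}"


if __name__ == "__main__":
    print(route(VisaApplication(nationality="Country A", has_prior_uk_visa=False)))
    # -> caseworker review, flagged red
```

Even in this toy version, nationality alone sets the starting band, and anything short of green reaches a human reviewer pre-labelled as a risk – the direct discrimination and implicit-bias concerns Gardner describes above.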

Sophia Akram: It definitely sounds complex for anyone, let alone anyone that hasn’t really operated in the space before. And if you reflect on the challenges that you faced with the legal response as well, do you think there are any lessons that organisations could take on board?

Zoe Gardner: Yeah, I think it would have been extremely difficult – again, it comes back to this thing I keep repeating about how the Home Office doesn’t collect and publish good and wide-ranging data on the outcomes of its policies – so, it would have been quite difficult for us to evidence, to a level that would have succeeded in court, that, say, visa decision making is racist, or is causing racial discrimination, right? It’s actually quite difficult to collect that level of evidence; the data is murky and hard to reach.

So what we did was challenge the lack of transparency of the government. We said that, you know, it was the Home Office’s responsibility to publish how their system works so that they could be held accountable, and that sort of forced the government’s hand, because they couldn’t publish it – we obviously don’t know why – but they refused to publish it, and therefore had to stop using it, rather than us proving [with] it being available to us… and us being able to prove on that basis that there was a problem.

We didn’t actually have to prove that the algorithm was producing racial discrimination, because, as long as the Home Office refused to publish it – they preferred to withdraw it rather than show us and prove to us that it didn’t – we were able to achieve the same aim. So I think sometimes you have to… I mean, that was definitely a sort of legal strategic calculation that, you know, people with a lot of expertise on this issue will have made. But I definitely think that issue of the government’s failure to be transparent, and therefore its failure to fulfil its positive obligations to avoid discrimination, was a good way of approaching it.

Sophia Akram: Thank you.