Amidst Brexit chaos, privacy advocates say politicians are turning a blind eye to AI surveillance
South Wales Police have been testing facial recognition technologies in public spaces since 2017
As Brexit discussions carry on in the U.K. House of Commons, civil liberties groups are concerned that a key debate on surveillance is being overlooked.
On Saturday, the South Wales Police will install a facial recognition system at the Principality Stadium in Cardiff, where crowds of rugby fans are gathering for the Six Nations championship.
Police say the system will scan the game's attendees in order to track down spectators who match people on their wanted list. If the technology logs a match, those suspects may well find themselves under arrest.
This is not the first time the South Wales Police have deployed their technology in public: Saturday's trial will be the 29th time they've tested facial recognition technology since 2017.
But London-based Politico EU reporter Annabelle Dickson, who has been following the technology's rise, says the trials have largely escaped the notice of British lawmakers amid ongoing debates about the country's exit from the EU.
Dickson spoke with Day 6 host Brent Bambury from London, U.K.
Here's part of their conversation.
This is the 29th time the South Wales Police have tested the technology in public. What is their endgame? What are they hoping to use it for in the future?
The police make it very clear that at the moment, as you say, that they're trialling it. They're trying to work out if this is going to be useful to use on a wider scale.
So, some of the people I spoke to were saying perhaps it could be used more widely. Say you're looking for a missing child, you might be able to live scan CCTV footage and potentially come up with a match.
We're not there at the moment, I've got to stress. This is a controlled test but that's one of the things that people who are thinking about this technology have suggested it could be used for.
"The police are saying we need the lawmakers to engage on this and think about what regulations they can put in place so that we can use this." - Annabelle Dickson, Politico EU reporter
A missing child is one thing, but looking for suspects in a crime is another. Is there a political debate around this or are people in the U.K. talking about the use of these technologies and some of the problems that the technologies have had?
Yes, they are. Civil society groups are certainly very engaged on this. There has been coverage of what South Wales Police are doing and, indeed, the London Police force are also doing this.
However, in Parliament, where, as I'm sure your listeners will have followed, they are very preoccupied with the U.K.'s departure from the European Union, there hasn't been a huge amount of debate about this.
Two or three years ago, we had the Investigatory Powers Act, which was a big sort of civil liberties debate and it was the sort of thing you'd have seen on the front of all the nationals day after day.
There were hours of debate in the Houses of Parliament and that isn't happening in these particular fields [now]. Civil liberties campaigners and even MPs … admitted that could in part be because of all the attention and time and energy that's being expended on Brexit.
Do you think that police forces see an advantage here? That they have an opening to try something that they might not have had so much freedom to do had politicians not been preoccupied by Brexit?
Funnily enough, you'd think that, but the commissioner of the Metropolitan Police (the London police force), which has also been trialling this technology, has actually said that the police want MPs to put in some sort of legal framework, because two civil liberties groups have brought court cases against South Wales Police and the Metropolitan Police to try and stop these trials.
And, actually, the police are saying, "We need the lawmakers to engage on this and think about what regulations they can put in place so that we can use this and feel like there's a legal framework to do it."
Facial recognition software sometimes has a harder time identifying people of colour. Is that a concern in any of the discussions that are being had around this?
Yes, absolutely. That's one of the concerns that people are raising.
And I think that sort of goes to the heart of why we need to have this debate because people feel that they want to know exactly how the algorithms that have trained the software have been developed.
I have to say that neither of the police forces would actually do an interview with me. They provided a certain amount of information. They said they hadn't seen any evidence that it did have any sort of racial bias or gender bias.
But frankly, we're taking their word for it.
It's early days for these technologies, but by the time the debate is over on Brexit ... won't these technologies already be embedded in people's lives in a way that it's going to be too late to remove them?
That's one of the concerns of these civil liberty campaigners that are bringing these court cases.
And I think there's a hope among those who are raising the alarm on this that these two court cases might act as a tripwire to some sort of proper debate in Parliament.
This isn't, sort of, mass surveillance, but as you say, technology develops incredibly quickly and it's not something that we can probably leave any longer to have a debate on.
This transcript has been edited for length and clarity. To hear the full interview with Annabelle Dickson, download our podcast or click 'Listen' above.