… you’re nicked! That, at least, is how the Metropolitan Police hoped to operate at Notting Hill Carnival 2016 and 2017. It didn’t quite work out like that.
It must have seemed like a good idea at the time: a device that could pick out ‘a bad ’un’ from among the milling million on the west London streets on August bank holiday. Surely no one would mind being watched by hidden cameras and then having the data points on their face matched at lightning speed against a database of known or suspected criminals. Londoners already have their every move tracked and recorded whenever they step out of their homes. Camera surveillance in Britain is the most intensive in the world – £2.2 billion a year is spent on it. Image databases cover 90% of the UK population, and police have already accessed facial images of more than 20 million people.
One suspects that the tech company salespeople touting their facial recognition widgets didn’t have to work too hard to convince the Met commanders to sign up for a trial of their automatic bad boy spotter. And Met top brass probably didn’t take long to identify the perfect event for putting their ‘secret weapon’ through its paces – Notting Hill Carnival.
For decades, the Met has used London’s carnival to test out every conceivable urban policing tactic and piece of equipment. Carnivalists can take some perverse pride in helping to rewrite the world’s public order and crowd control playbook. If you watch video of police suppressing demonstrations in New York, Moscow or Beijing, you will almost certainly be looking at tactics that were first tried out in Ladbroke Grove. Now London’s police were once again going to lead the world with this thoroughly modern marriage of live action CCTV and image database, linked by a ‘top-of-the-range’ algorithm. What could possibly go wrong?
Silkie Carlo, senior advocacy officer at human rights organisation Liberty, was present on Carnival Day last year to see for herself. The full report, published at www.libertyhumanrights.org.uk on 30 August 2017, makes alarming reading. The software proved utterly inept – it was incapable of spotting the difference between a woman and a bald man. As for black people, apparently to an algorithm ‘they all look the same’. The system produced a 98% failure rate: in other words, 98% of people the algorithm picked out as a ‘match’ with someone on the database turned out to be someone else entirely. It was, as Carlo put it, “policing led by low-quality data and low-quality algorithms”. Bear in mind that the system was not fresh out of the box; it was in its second year of ‘trial’.
Worse still, Carlo felt, was the casual attitude of the project leaders. The risk of racial bias in the system “wasn’t a concern”. The operators thought it perfectly acceptable to stop and question innocent people who had been wrongly identified by the unreliable system. The photos of all the system’s ‘suspects’ would be kept for “around three months, probably”, she was told. Innocent people theoretically have the right to demand removal of their image from the database, but are never likely to do so because they won’t have been made aware that it was collected in the first place.
Carnival-lovers are probably the last people to want to share their space with thugs and pushers; we just want to enjoy the music, the vibes, the mas, the food, the company of our friends. So a non-intrusive system that identifies the real trouble-makers ought to be welcomed. Not, though, at the expense of our basic rights and liberties – especially in light of Notting Hill’s often fraught relationship with the police. Yet the Met repeatedly dismissed concerns about accountability, legality, ethics, informed consent, independent oversight and human rights. It failed to consult with or respond to civil liberties and race equality groups such as Liberty, the Institute of Race Relations and Black Lives Matter, and it refused to suspend the trials.
It wasn’t just activists and carnival-goers who became alarmed by the Met’s intention of pressing ahead with a third year of its ‘experiment’ at Carnival 2018. In February, the Greater London Authority Oversight Committee asked London Mayor Sadiq Khan to suspend the Met’s deployment of the technology until proper legal protections were in place. The Mayor referred the matter to the London Policing Ethics Panel. Assembly Member Len Duvall said: “We believed the Met risked losing the public’s trust if it introduced intrusive technology like this, without public consent.” Information Commissioner Elizabeth Denham has threatened the Home Office with legal action over the failure to regulate the use of facial recognition software.
The pressure continued to mount and on 23 May the Metropolitan Police gave way, announcing that it would not be using facial recognition software at this year’s Notting Hill Carnival. It refused to say why it was suspending this aspect of its secret surveillance operation.
Although “delighted” by the announcement, Liberty’s Silkie Carlo warned that the Met “plans to dramatically increase use of facial recognition over the next six months”. The force has already earmarked seven events where it will use the technology this year, but refuses to give more details.
Soca News predicts that one of the seven events will occur on Friday 13 July, when President Trump comes to Britain for talks with Theresa May. Within hours of the “working visit” being announced, 80,000 people had signed up to take part in a “Carnival of Resistance”. The prospect of facial recognition systems being employed may encourage demonstrators to go for a Venetian Carnival theme and don masks to disguise their features from the hidden cameras. If so, this could inspire political mas of a kind rarely seen in Britain. As Sadiq Khan noted in a Twitter post, “President Trump will… no doubt see that Londoners hold their liberal values of freedom of speech very dear.”
That last sentence, one could argue, should apply as much to the Metropolitan Police and the Home Office as to the US President.