Live facial recognition cameras may become ‘commonplace’ as police use soars


Police believe live facial recognition cameras may become “commonplace” in England and Wales, according to internal documents, with the number of faces scanned having doubled to nearly 5m in the last year.

A joint investigation by the Guardian and Liberty Investigates highlights the speed at which the technology is becoming a staple of British policing.

Major funding is being allocated and hardware bought, while the British state is also looking to enable police forces to more easily access the full spread of its image stores, including passport and immigration databases, for retrospective facial recognition searches.

Live facial recognition involves the matching of faces caught on surveillance camera footage against a police watchlist in real time, in what campaigners liken to the continual fingerprinting of members of the public as they go about their daily lives.

Retrospective facial recognition software is used by the police to match images on databases with those caught on CCTV and other systems.
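In both live and retrospective systems, the core matching step compares a numerical "embedding" of a detected face against embeddings of watchlist images, raising an alert when similarity clears a threshold. A minimal sketch of that comparison (the embedding vectors, identities, and 0.6 threshold here are illustrative assumptions, not details of any police system):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return watchlist identities whose similarity to the probe face
    embedding meets or exceeds the alert threshold, best match first."""
    alerts = [(identity, cosine_similarity(probe, ref))
              for identity, ref in watchlist.items()]
    return sorted([a for a in alerts if a[1] >= threshold],
                  key=lambda a: a[1], reverse=True)

# Toy example: two watchlist entries, one probe face.
watchlist = {"person_a": np.array([1.0, 0.0]),
             "person_b": np.array([0.0, 1.0])}
probe = np.array([0.9, 0.1])
print(match_against_watchlist(probe, watchlist))
```

A real deployment would derive embeddings from a face-recognition model rather than toy vectors, but the threshold comparison works the same way.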

One funding document, drawn up by South Wales police as part of a proposal to put the West End of London or Cardiff rail station under live facial recognition cameras, and released by the Metropolitan police under the Freedom of Information Act, states it is believed “the use of this technology could become commonplace in our city centres and transport hubs around England and Wales”.

The first fixed live facial recognition cameras will be fitted for a trial in Croydon, south London, later this summer.

The expansion comes despite facial recognition not being referenced in any act of parliament.

Campaigners claim the police have been allowed to “self regulate” their use of the technology. Officers have in the past used a setting that was subsequently shown to disproportionately misidentify black people.

After a court of appeal judgment in 2020, which found that South Wales police’s use of live facial recognition cameras had been unlawful, the College of Policing provided guidance that “the threshold needs to be set with care to maximise the probability of returning true alerts while keeping the false alert rate to an acceptable level”.

There remains nothing in law to direct forces on the threshold or technology used.
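The trade-off the College of Policing guidance describes can be illustrated with synthetic similarity scores: raising the alert threshold cuts false alerts but also misses more genuine matches. The score distributions below are invented for illustration only, not real system data:

```python
import random

random.seed(0)
# Synthetic similarity scores: genuine matches tend to score high,
# non-matches (impostors) low. Illustrative distributions only.
genuine = [random.gauss(0.80, 0.08) for _ in range(1000)]
impostor = [random.gauss(0.40, 0.10) for _ in range(1000)]

def alert_rates(threshold: float) -> tuple[float, float]:
    """True-alert and false-alert rates at a given threshold."""
    true_alerts = sum(s >= threshold for s in genuine) / len(genuine)
    false_alerts = sum(s >= threshold for s in impostor) / len(impostor)
    return true_alerts, false_alerts

for t in (0.5, 0.6, 0.7):
    ta, fa = alert_rates(t)
    print(f"threshold={t:.1f}  true-alert={ta:.2%}  false-alert={fa:.2%}")
```

Sweeping the threshold makes the guidance's point concrete: there is no single setting that maximises true alerts and minimises false ones, only a balance that someone must choose.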

The policing minister, Diana Johnson, told parliament earlier this month that she recognised “a need to consider whether a bespoke legislative framework governing the use of live facial recognition technology for law enforcement purposes is needed”, but the Home Office is yet to provide details.

Facial recognition cameras were first trialled in London and south Wales from 2016 but the speed at which police forces are rolling out the technology has accelerated over the last 12 months.

The investigation by the Guardian and Liberty found:

  • Police forces scanned nearly 4.7m faces with live facial recognition cameras last year – more than twice as many as in 2023. Live facial recognition vans were deployed at least 256 times in 2024, according to official deployment records, up from 63 the year before.

  • A roving unit of 10 live facial recognition vans that can be sent anywhere in the country will be made available within days – increasing national capacity. Eight police forces have deployed the technology. The Met has four vans.

  • Police forces have considered fixed infrastructure creating a “zone of safety” by covering the West End of London with a network of live facial recognition cameras. Met officials said this remained a possibility.

  • Forces almost doubled the number of retrospective facial recognition searches made last year using the police national database (PND), from 138,720 in 2023 to 252,798. The PND contains custody mugshots, millions of which, belonging to people who have never been charged with or convicted of an offence, have been found to be stored unlawfully.

  • More than 1,000 facial recognition searches using the UK passport database were carried out in the last two years, and officers are increasingly searching for matches on the Home Office immigration database, with requests rising to 110 last year. Officials have concluded that using the passport database for facial recognition is “not high risk” and “is not controversial”, according to internal documents.

  • The Home Office is now working with the police to establish a new national facial recognition system, known as strategic facial matcher. The platform will be capable of searching a range of databases including custody images and immigration records.

Lindsey Chiswick, the director of intelligence at the Met and the National Police Chiefs’ Council lead on facial recognition, said surveys showed that four in five Londoners were in support of the police using innovative technology, including facial recognition cameras.

This week, a registered sex offender, David Cheneler, 73, from Lewisham, was jailed for two years after he was caught alone with a six-year-old girl by a live facial recognition camera. He had previously served nine years for 21 offences against children.


The Met arrested 587 people in 2024 with the assistance of live facial recognition cameras, of whom 424 were charged with offences.

Of those arrested, 58 were registered sex offenders in serious breach of their conditions, of whom 38 were charged.

Chiswick said: “Where there’s limited amounts of money and there’s fewer officers, but there’s more demand, and we see criminals exploiting technology to a really grand scale … We’ve got to do something different.

“There’s an opportunity out there. So policing needs to start operating a little bit differently. People talk about harnessing AI like it’s some crazy horse we want to saddle but we do need to harness the opportunities that technology and data can bring us.”

Chiswick said the Met’s policy was to take “really quite small steps and review them at every stage” but that there would be a “benefit in potentially some sort of framework or statutory guidance”.

The Met is deploying its facial recognition cameras at a setting at which testing suggests there is no statistically significant gender or ethnicity bias in misidentification rates.

Chiswick said: “I don’t want to use a biased algorithm in London. There’s no point on all counts. I think for government, there’s a question, isn’t there around artificial intelligence? And I think clearly the public sector is going to use, and want to use AI more and more.

“I think the questions around who then decides where algorithms are purchased from, what training data is used, what countries might this technology come from and then, when you use it, are you obliged to test it and if you’re obliged to test it, are you then obliged to operate at a certain setting? That’s not really questions for law enforcement.”

The Home Office declined a request for comment.
