Source: AFP
On a grey, overcast morning in December, London police deployed a state-of-the-art AI-powered camera near the train station in the suburb of Croydon and quietly scanned the faces of unsuspecting passers-by.
The use of live facial recognition (LFR) technology — which creates biometric facial signatures before instantly running them through a watchlist of suspects — led to 10 arrests for crimes including threats to kill, bank fraud, theft and possession of a crossbow.
The technology, which was used at the British Grand Prix in July and at the coronation of King Charles III in May, has proven so effective in trials that the UK government wants it to be used more.
“The development of facial recognition as a crime-fighting tool is a high priority,” Police Minister Chris Phillips told police chiefs in October, adding that the technology had “huge potential”.
“Recent developments have led to arrests that would otherwise have been impossible and there have been no false alerts,” he added.
But the call to speed up its rollout has angered some MPs, who want the government’s privacy regulator to take “tough, regulatory action” to prevent its abuse.
“Facial recognition surveillance involves the bulk processing of the sensitive biometric data of vast numbers of people — often without their knowledge,” they wrote in a letter.
“It poses a serious risk to the rights of the British public and threatens to turn our public spaces into places where people feel under constant corporate and government control.”
False matches
Lawmakers claim the technology, which has yet to be debated in parliament, has produced false matches leading to more than 65 wrongful police interventions.
One was the arrest of a 14-year-old boy in school uniform, who was surrounded by police and fingerprinted before his eventual release.
MPs said the use of the technology by private companies, meanwhile, represented a “radical transfer of power” from ordinary people to companies in private spaces, with potentially serious consequences for anyone misidentified.
Members of the public, they said, could be prevented from making essential purchases such as food, subjected to interventions, or drawn into dangerous confrontations with security personnel.
Last year, Sports Direct chain owner Frasers Group defended the use of LFR technology in its stores, saying it had “significantly” reduced theft and violence against staff.
“Walking ID Cards”
Civil liberties groups say the technology is oppressive and has no place in a democracy.
Mark Johnson, director of advocacy for Big Brother Watch, compares the technology to author George Orwell’s novel Nineteen Eighty-Four — a portrait of a totalitarian state in which the characters are under constant surveillance.
The technology, he told AFP, “is an Orwellian tool of mass surveillance that turns us all into walking ID cards.”
Activists argue the technology puts too much unmonitored power in the hands of police, who have been given increased powers to arrest protesters through the Public Order Act.
The new laws, pushed through parliament by the right-wing Tory government four days before the coronation, give police the power to stop a protest if they believe it could cause “more than minor disruption to community life”.
Critics are particularly concerned about a lack of oversight in the composition of police watchlists, saying some have been filled with protesters and people with mental health problems who are not suspected of wrongdoing.
“Off-the-shelf versions of these tools need legal and technical oversight to be used responsibly and ethically,” an activist told AFP.
“I’m concerned that police forces don’t have the resources and the ability to do that at the moment.”
Police say the details of anyone who doesn’t match a watchlist are immediately and automatically deleted.
The Home Office insists that data protection, equality and human rights laws strictly govern the use of the technology.
But that has not satisfied opponents, in a country where previous attempts to introduce mandatory ID cards have met stiff resistance.
In June 2023, the European Parliament voted to ban live facial recognition in public places.
In the UK, lawmakers opposed to the technology want to go further.
“Live facial recognition has never been given explicit approval by parliament,” said Conservative MP David Davis, who once resigned from the shadow cabinet after arguing that extending detention time limits for uncharged terror suspects was a breach of civil liberties.
“It is a suspicionless mass surveillance tool that has no place in Britain.”