United Kingdom police facial-recognition tools incorrect in more than 90pc of cases

Carla Harmon
May 17, 2018

Currently, there is no legislation in the United Kingdom that regulates the use of facial recognition systems through CCTV cameras by the police, nor is there any independent oversight for the police's use of these systems.

"One of those people matched was incorrectly on the watch list; the other was on a mental health-related watch list", it said.

Adding real-time facial recognition to our surveillance state's already worryingly militaristic arsenal would fundamentally change policing in the United Kingdom, and indeed the health of our democracy.

The UK information commissioner, Elizabeth Denham, said that police need to demonstrate both that facial recognition technology is effective and that less intrusive methods are not available: "Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place for the public".


London's Metropolitan Police used facial recognition at the 2017 Notting Hill carnival, where the system was wrong 98 per cent of the time, falsely telling officers on 102 occasions it had spotted a suspect.

Automated facial recognition (AFR) technology used by London's Metropolitan Police is designed to find persons of interest within large groups of people by comparing the biometrics of attendees caught on camera with images already stored on law enforcement databases.

Both the South Wales and Met Police forces have defended the use of the technology.

The force made no arrests using the automated facial recognition system.


There is no legal basis for the police's use of automated facial recognition: there is no law permitting or governing its use and no guidelines for the police to follow - the police seem to think they have free rein to do whatever they want with whatever new tech they can get their hands on.

A much larger trial at the UEFA Champions League final in Cardiff last year resulted in 2,297 false positives and just 173 true positives.
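The headline figure of "more than 90pc" follows directly from the trial numbers reported here. A quick sanity check of the arithmetic (the Notting Hill true-positive count is an assumption inferred from the quoted 98 per cent figure, not a number given in the article):

```python
# Error rates from the reported trial figures.

# Cardiff, UEFA Champions League final: 2,297 false positives, 173 true positives.
cardiff_false = 2297
cardiff_true = 173
cardiff_error = cardiff_false / (cardiff_false + cardiff_true)
print(f"Cardiff: {cardiff_error:.0%} of matches were wrong")  # 93%

# Notting Hill carnival 2017: 102 false alerts; assuming roughly 2 correct
# alerts, consistent with the quoted "wrong 98 per cent of the time".
nh_false = 102
nh_true = 2
nh_error = nh_false / (nh_false + nh_true)
print(f"Notting Hill: {nh_error:.0%} of matches were wrong")  # 98%
```

Both trials land comfortably above the 90 per cent error rate in the headline.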

The Met uses the technology to match people's faces against computer databases of criminals via CCTV and other cameras, and has deployed it at numerous events with very little success, according to the report, titled "Face Off: The lawless growth of facial recognition in United Kingdom policing".

South Wales Police added that a "number of safeguards" stopped police taking action against innocent people. "Firstly, the operator in the van is able to see that the person identified in the picture is clearly not the same person, and it's literally disregarded at that point", Lewis said. "Because of the poor quality, it was identifying people wrongly".

For instance, a developer of a content filtering AI system may claim that they are able to identify a high percentage of terrorist content on the web, but only when they already know that the content they're analyzing is terrorist content. A Met police spokesperson said that all alerts on its watch list were deleted after 30 days and faces that do not generate an alert are immediately deleted.
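The content-filtering example above is an instance of the base-rate problem: a system can have a high claimed detection rate yet still produce mostly false alarms when the material it is hunting for is rare. A sketch with purely illustrative numbers (none of these figures come from the article):

```python
# Base-rate sketch: high claimed accuracy, rare target content.
# All numbers below are illustrative assumptions.
total_items = 1_000_000
prevalence = 0.0001           # 1 in 10,000 items is genuinely of interest
detection_rate = 0.99         # claimed true-positive rate
false_alarm_rate = 0.01       # claimed error rate on benign content

actual = total_items * prevalence
true_flags = actual * detection_rate                        # 99 genuine hits
false_flags = (total_items - actual) * false_alarm_rate     # ~9,999 false alarms
precision = true_flags / (true_flags + false_flags)
print(f"share of flags that are genuine: {precision:.1%}")  # about 1%
```

Under these assumptions, roughly 99 out of every 100 flags would be wrong, even though the classifier is "99 per cent accurate" on content it already knows the label of.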


The Home Office said that it plans to publish its biometrics strategy in June, and it "continues to support police to respond to changing criminal activity and new demands".
