
Facial recognition tech used by UK police is making a ton of mistakes


CCTV camera (Credit: Shutterstock)

At the end of each summer for the last 14 years, the small Welsh town of Porthcawl has been invaded. Every year its population of 16,000 is swamped by up to 35,000 Elvis fans. Many of those attending the annual festival look the same: they slick back their hair, throw on oversized sunglasses and don white flares.


At the 2017 Elvis festival, impersonators were faced with something different. Police were trialling automated facial recognition technology to track down criminals. Cameras scanning the crowds spotted 17 faces that the system believed matched those stored in police databases. Ten were correct; seven people were wrongly identified.


South Wales Police has been testing an automated facial recognition system since June 2017 and has deployed it in the real world at more than ten events. In the majority of cases, the system has made more incorrect matches than correct identifications of potential suspects or offenders.


The automated facial recognition system has been used at sporting events, concerts and during coordinated police crackdowns on certain types of crime. Figures from South Wales Police, released following a Freedom of Information request, show the number of times its system made correct and incorrect matches. The police force has now also published the data on its website.


During the UEFA Champions League Final week in Wales last June, when the facial recognition cameras were used for the first time, there were 2,470 alerts of possible matches from the automated system. Of these, 2,297 turned out to be false positives and only 173 were correct – around 93 per cent of matches were incorrect. A spokesperson for the force blamed the low quality of images in its database and the fact that it was the first time the system had been used.


But experts have warned that the systems used by South Wales Police and other forces have little regulatory oversight and lack transparency about how they work. There are also questions over their accuracy. On the other hand, police say the correct matches have led to arrests and that the system is improving.


"These figures show that not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool," says Silke Carlo the director of rights group Big Brother Watch. The group is planning on launching a campaign against facial recognition tech in parliament later this month. "South Wales’ statistics show that the tech misidentifies innocent members of the public at a terrifying rate, whilst there are only a handful of occasions where it has supported a genuine policing purpose," Carlo adds.


At the Anthony Joshua versus Kubrat Pulev boxing match in October 2017 – the same month police improved the algorithm in the system they were using – five true positives were made compared to 46 false alerts, equating to a 90 per cent false positive rate. At the Wales versus Australia rugby international, six true positives were outweighed by 42 false positives.
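The percentages quoted in this piece are simple false-positive shares: false alerts divided by total alerts. Here is a minimal Python sketch of that calculation, using only the counts reported above (the event labels are shorthand for this illustration, not official dataset names):

    # Derive the false-positive share of alerts from the figures
    # reported by South Wales Police, as quoted in this article.
    events = {
        # event: (true positives, false positives)
        "UEFA Champions League final week": (173, 2297),
        "Anthony Joshua v Kubrat Pulev": (5, 46),
        "Wales v Australia rugby": (6, 42),
    }

    for name, (true_pos, false_pos) in events.items():
        total_alerts = true_pos + false_pos
        share = false_pos / total_alerts
        print(f"{name}: {total_alerts} alerts, {share:.0%} false positives")

Run as-is, this reproduces the roughly 93 and 90 per cent figures above, and puts the rugby international at around 88 per cent.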

During a royal visit from Prince Harry and Meghan Markle to Cardiff in January 2018, the automated facial recognition system used by South Wales Police didn't make any matches, correct or false. And on November 3, 2017, when police recovered the body of a man who had died after jumping into the River Taff, automated facial recognition was used to help establish his identity.


Copyright: Wired UK - http://www.wired.co.uk/
